To my taste this is not very illuminating; it feels like the answer is produced via a magic trick.
My personal preference for understanding linear recurrence relations (and understanding where/why sqrt(5) shows up for the Fibonacci sequence) is via eigenvalues and eigenvectors of a corresponding linear transformation. For example, the Fibonacci recurrence can be written in matrix form as

[F(n), F(n-1)] = [[1, 1], [1, 0]] [F(n-1), F(n-2)] (as column vectors),

and therefore the nth Fibonacci number can be computed by computing the (n-2)th power of the matrix [[1, 1], [1, 0]], and then multiplying that matrix by the column vector [1, 1] (or any other desired first terms of the sequence). The matrix power can be computed by diagonalizing, and that's where the powers of the golden ratio will come up (as eigenvalues of that matrix).
This is completely equivalent to the method in TFA, but to my mind is better motivated.
Note: it may not be immediately apparent, but all constant-coefficient linear recurrence relations of any complexity can be straightforwardly solved with this technique.
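As a concrete sketch of the matrix approach in plain Python (exact integer arithmetic; exponentiation by repeated squaring stands in for explicit diagonalization, which would drag in the irrational eigenvalues; function names are my own):

```python
def mat_mul(a, b):
    # Product of two 2x2 integer matrices, stored as tuples of rows.
    return ((a[0][0]*b[0][0] + a[0][1]*b[1][0], a[0][0]*b[0][1] + a[0][1]*b[1][1]),
            (a[1][0]*b[0][0] + a[1][1]*b[1][0], a[1][0]*b[0][1] + a[1][1]*b[1][1]))

def mat_pow(m, k):
    # k-th power by repeated squaring: O(log k) matrix multiplications.
    result = ((1, 0), (0, 1))  # 2x2 identity
    while k > 0:
        if k & 1:
            result = mat_mul(result, m)
        m = mat_mul(m, m)
        k >>= 1
    return result

def fib(n):
    # [[1,1],[1,0]]^n = [[F(n+1), F(n)], [F(n), F(n-1)]],
    # so the off-diagonal entry of the nth power is F(n).
    if n == 0:
        return 0
    return mat_pow(((1, 1), (1, 0)), n)[0][1]
```

The same skeleton solves any constant-coefficient linear recurrence of order k: replace the 2x2 matrix with the k-by-k companion matrix of the recurrence.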
I've been thinking about this a bit and concluded (surprisingly!) that it is probably not an asymptotic speed up!
First thoughts: The memoization algorithm is linear in n, and the repeated squaring algorithm is logarithmic in n, so the second one is faster.
Second thoughts: Fibonacci numbers grow exponentially, so we should account for the cost of actually adding and multiplying them. Addition is linear in the number of digits, hence linear in n (because F(n) grows exponentially, it has O(n) digits). Schoolbook multiplication is quadratic in the number of digits, hence quadratic in n. The memoization algorithm is repeated addition, so we pick up an extra linear factor: total complexity O(n^2). Repeated squaring uses multiplication, so we pick up an extra quadratic factor: total complexity O(n^2 log n). The first is faster!
Third thoughts: the fastest known algorithm for multiplying numbers is actually O(n log n) in the number of digits. That brings the second method down to O(n log^2 n), so the second is faster! That particular algorithm is probably terribly impractical, but there are multiplication algorithms that are less impractical, with an exponent smaller than 2, so still an improvement over the first method.
But without specialized multiplication routines, the first is asymptotically faster.
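For reference, here are minimal sketches of the two algorithms being compared: plain iteration (repeated big-integer addition) and a repeated-squaring variant (the standard fast-doubling identities, which perform O(log n) big-integer multiplications). The function names are my own:

```python
def fib_add(n):
    """Bottom-up iteration: n big-integer additions."""
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

def fib_double(n):
    """Fast doubling: O(log n) big-integer multiplications.

    Uses the identities F(2m) = F(m) * (2*F(m+1) - F(m)) and
    F(2m+1) = F(m)^2 + F(m+1)^2.
    """
    def helper(k):
        if k == 0:
            return (0, 1)          # (F(0), F(1))
        f, g = helper(k >> 1)      # (F(m), F(m+1)) with m = k // 2
        c = f * (2 * g - f)        # F(2m)
        d = f * f + g * g          # F(2m + 1)
        return (d, c + d) if k & 1 else (c, d)
    return helper(n)[0]
```

Which one wins in wall-clock terms depends entirely on the multiplication routine the bignum library uses, which is the point of the analysis above.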
As an unimportant bit of trivia, that's only true on (non-existent) platforms with O(1) bignum operations. The output size is exponential in the size of the input: F(n) has Theta(n) bits, while n itself takes only log(n) bits to write down, so producing any non-approximate answer on a real-world register machine requires time exponential in the input size.
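A quick empirical check of that output size (a throwaway sketch): since F(n) is roughly phi^n / sqrt(5), it should have about n * log2(phi), i.e. about 0.694 * n, bits.

```python
# Compute F(1000) by simple iteration and inspect its size in bits.
a, b = 0, 1
for _ in range(1000):
    a, b = b, a + b

# Should be about 0.694 * 1000, i.e. roughly 694 bits.
print(a.bit_length())
```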
Unfortunately, the presented method only works for the simplest linear recurrence relations.
Tangentially: look up the master theorem if you're interested in at least estimating growth rates for recurrence relations that crop up in computer science.
A powerful method in math is to assume the thing you're looking for is analytic, in which case you can write it as a power series: F(n) = a0 + a1 n + a2 n^2 + ... .
You can solve Fibonacci by writing F(n) = F(n-1) + F(n-2) as

    a0 + a1 n + a2 n^2 + ...
      = (a0 + a1 (n-1) + a2 (n-1)^2 + ...)
      + (a0 + a1 (n-2) + a2 (n-2)^2 + ...)
Expand the terms, then match the coefficients of n, n^2, n^3, etc. and solve for a0, a1, a2, etc. This works for solving differential equations too - it's called the "Method of Frobenius" (easier in diffeq because you don't have to do annoying binomial coefficient math).