Power iteration is an eigenvalue algorithm: given a diagonalizable matrix A, the algorithm produces a number λ, which is the greatest (in absolute value) eigenvalue of A, and a nonzero vector v, which is a corresponding eigenvector of λ, that is, Av = λv.
Power iteration is a very simple algorithm, but it may converge slowly. The most time-consuming operation of the algorithm is the multiplication of matrix A by a vector, so it is effective for a very large sparse matrix with an appropriate implementation. The speed of convergence is like (λ_2/λ_1)^k, where λ_2 is the eigenvalue of second largest magnitude (see the analysis below).
The power iteration algorithm starts with a vector b_0, which may be an approximation to the dominant eigenvector or a random vector. The method is described by the recurrence relation

b_{k+1} = \frac{A b_k}{\|A b_k\|}.

So, at every iteration, the vector b_k is multiplied by the matrix A and normalized.
If we assume that A has an eigenvalue that is strictly greater in magnitude than its other eigenvalues and that the starting vector b_0 has a nonzero component in the direction of an eigenvector associated with the dominant eigenvalue, then a subsequence of (b_k) converges to an eigenvector associated with the dominant eigenvalue.
Without the two assumptions above, the sequence (b_k) does not necessarily converge. In this sequence,

b_k = e^{i\phi_k} v_1 + r_k,

where v_1 is an eigenvector associated with the dominant eigenvalue and \|r_k\| \to 0. The presence of the term e^{i\phi_k} implies that (b_k) does not converge unless e^{i\phi_k} = 1. Under the two assumptions listed above, the sequence (\mu_k) defined by

\mu_k = \frac{b_k^* A b_k}{b_k^* b_k}

converges to the dominant eigenvalue λ_1.
One may compute this with the following algorithm (shown in Python with NumPy):
import numpy as np

def power_iteration(A, num_iterations: int):
    # Ideally choose a random vector to decrease the chance
    # that our vector is orthogonal to the dominant eigenvector
    b_k = np.random.rand(A.shape[1])

    for _ in range(num_iterations):
        # calculate the matrix-by-vector product A b_k
        b_k1 = np.dot(A, b_k)

        # calculate the norm
        b_k1_norm = np.linalg.norm(b_k1)

        # re-normalize the vector
        b_k = b_k1 / b_k1_norm

    return b_k

power_iteration(np.array([[0.5, 0.5], [0.2, 0.8]]), 10)

The vector b_k converges to an eigenvector associated with the dominant eigenvalue.
This algorithm is used to calculate the Google PageRank.
The method can also be used to calculate the spectral radius (the eigenvalue with the largest magnitude, for a square matrix) by computing the Rayleigh quotient

\rho(A) = \max\left\{|\lambda_1|, \dotsc, |\lambda_n|\right\} = \frac{b_k^\top A b_k}{b_k^\top b_k}.
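For instance, the dominant eigenvalue can be estimated from the iterate returned by the listing above by evaluating the Rayleigh quotient. The snippet below is a minimal sketch, not part of the original listing; the helper name rayleigh_quotient is an illustrative assumption, and the snippet assumes the power_iteration function defined above is in scope.

import numpy as np

def rayleigh_quotient(A, b):
    # b^T A b / (b^T b); for a normalized b the denominator is 1,
    # but it is kept here for robustness.
    return (b @ A @ b) / (b @ b)

A = np.array([[0.5, 0.5], [0.2, 0.8]])
b = power_iteration(A, 100)     # approximate dominant eigenvector
print(rayleigh_quotient(A, b))  # approximately 1.0, the dominant eigenvalue of this A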
Let A be decomposed into its Jordan canonical form: A = V J V^{-1}, where the first column of V is an eigenvector of A corresponding to the dominant eigenvalue λ_1. Since the dominant eigenvalue of A is unique, the first Jordan block of J is the 1 × 1 matrix [λ_1], where λ_1 is the largest eigenvalue of A in magnitude.
The starting vector b_0 can be written as a linear combination of the columns of V:

b_0 = c_1 v_1 + c_2 v_2 + \cdots + c_n v_n.
By assumption, b_0 has a nonzero component in the direction of the dominant eigenvalue, so c_1 \ne 0.
The computationally useful recurrence relation for b_{k+1} can be rewritten as

b_{k+1} = \frac{A b_k}{\|A b_k\|} = \frac{A^{k+1} b_0}{\|A^{k+1} b_0\|},

where the expression \frac{A^{k+1} b_0}{\|A^{k+1} b_0\|} is more amenable to the following analysis.
\begin{align}
b_k &= \frac{A^k b_0}{\|A^k b_0\|} \\
&= \frac{\left(V J V^{-1}\right)^k b_0}{\left\|\left(V J V^{-1}\right)^k b_0\right\|} \\
&= \frac{V J^k V^{-1} b_0}{\left\|V J^k V^{-1} b_0\right\|} \\
&= \frac{V J^k V^{-1} \left(c_1 v_1 + c_2 v_2 + \cdots + c_n v_n\right)}{\left\|V J^k V^{-1} \left(c_1 v_1 + c_2 v_2 + \cdots + c_n v_n\right)\right\|} \\
&= \frac{V J^k \left(c_1 e_1 + c_2 e_2 + \cdots + c_n e_n\right)}{\left\|V J^k \left(c_1 e_1 + c_2 e_2 + \cdots + c_n e_n\right)\right\|} \\
&= \left(\frac{\lambda_1}{|\lambda_1|}\right)^k \frac{c_1}{|c_1|}
   \frac{v_1 + \frac{1}{c_1} V \left(\frac{1}{\lambda_1} J\right)^k \left(c_2 e_2 + \cdots + c_n e_n\right)}
        {\left\|v_1 + \frac{1}{c_1} V \left(\frac{1}{\lambda_1} J\right)^k \left(c_2 e_2 + \cdots + c_n e_n\right)\right\|}
\end{align}
The expression above simplifies as k \to \infty:

\left(\frac{1}{\lambda_1} J\right)^k =
\begin{bmatrix}
[1] & & & \\
& \left(\frac{1}{\lambda_1} J_2\right)^k & & \\
& & \ddots & \\
& & & \left(\frac{1}{\lambda_1} J_m\right)^k
\end{bmatrix}
\to
\begin{bmatrix}
1 & & & \\
& 0 & & \\
& & \ddots & \\
& & & 0
\end{bmatrix}
\quad \text{as } k \to \infty.
The limit follows from the fact that every eigenvalue of \frac{1}{\lambda_1} J_i is strictly less than 1 in magnitude for i > 1, so

\left(\frac{1}{\lambda_1} J_i\right)^k \to 0 \quad \text{as } k \to \infty.
It follows that:

\frac{1}{c_1} V \left(\frac{1}{\lambda_1} J\right)^k \left(c_2 e_2 + \cdots + c_n e_n\right) \to 0 \quad \text{as } k \to \infty.
Using this fact, b_k can be written in a form that emphasizes its relationship with v_1 when k is large:

\begin{align}
b_k &= \left(\frac{\lambda_1}{|\lambda_1|}\right)^k \frac{c_1}{|c_1|}
   \frac{v_1 + \frac{1}{c_1} V \left(\frac{1}{\lambda_1} J\right)^k \left(c_2 e_2 + \cdots + c_n e_n\right)}
        {\left\|v_1 + \frac{1}{c_1} V \left(\frac{1}{\lambda_1} J\right)^k \left(c_2 e_2 + \cdots + c_n e_n\right)\right\|} \\[6pt]
&= e^{i\phi_k} \frac{c_1}{|c_1|} \frac{v_1}{\|v_1\|} + r_k
\end{align}
where e^{i\phi_k} = \left(\lambda_1 / |\lambda_1|\right)^k and \|r_k\| \to 0 as k \to \infty.
The sequence (b_k) is bounded, so it contains a convergent subsequence. Note that the eigenvector corresponding to the dominant eigenvalue is only unique up to a scalar, so although the sequence (b_k) may not converge, b_k is nearly an eigenvector of A for large k.
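As a concrete illustration of this point (a sketch added here, using an assumed test matrix that is not from the original text): for a real matrix with a negative dominant eigenvalue, the factor e^{i\phi_k} = (\lambda_1/|\lambda_1|)^k alternates between +1 and -1, so b_k flips sign at every step and does not converge, yet each iterate is still approximately an eigenvector and the Rayleigh quotient still approaches λ_1.

import numpy as np

# Assumed test matrix: dominant eigenvalue is -3, the other eigenvalue is 1.
A = np.array([[-3.0, 0.0],
              [ 0.0, 1.0]])

b = np.random.rand(2)
for _ in range(20):
    b = A @ b
    b = b / np.linalg.norm(b)

# b itself alternates in sign from step to step (it does not converge),
# but up to sign it approximates the eigenvector [1, 0], and the
# Rayleigh quotient converges to the dominant eigenvalue -3.
print(b)                    # approximately [+1, 0] or [-1, 0]
print(b @ A @ b / (b @ b))  # approximately -3.0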
Alternatively, if A is diagonalizable, then the following proof yields the same result.
Let λ1, λ2, ..., λm be the m eigenvalues (counted with multiplicity) of A and let v1, v2, ..., vm be the corresponding eigenvectors. Suppose that
λ_1 is the dominant eigenvalue, so that |\lambda_1| > |\lambda_j| for j > 1.
The initial vector b_0 can be written:

b_0 = c_1 v_1 + c_2 v_2 + \cdots + c_m v_m.
If b_0 is chosen randomly (with uniform probability), then c_1 \ne 0 with probability 1. Now,

\begin{align}
A^k b_0 &= c_1 A^k v_1 + c_2 A^k v_2 + \cdots + c_m A^k v_m \\
&= c_1 \lambda_1^k v_1 + c_2 \lambda_2^k v_2 + \cdots + c_m \lambda_m^k v_m \\
&= c_1 \lambda_1^k \left( v_1 + \frac{c_2}{c_1} \left(\frac{\lambda_2}{\lambda_1}\right)^k v_2 + \cdots + \frac{c_m}{c_1} \left(\frac{\lambda_m}{\lambda_1}\right)^k v_m \right) \\
&\to c_1 \lambda_1^k v_1 && \text{since } \left|\frac{\lambda_j}{\lambda_1}\right| < 1 \text{ for } j > 1.
\end{align}
On the other hand:
b_k = \frac{A^k b_0}{\|A^k b_0\|}.
Therefore, b_k converges to (a multiple of) the eigenvector v_1. The convergence is geometric, with ratio

\left|\frac{\lambda_2}{\lambda_1}\right|,

where λ_2 denotes the second dominant eigenvalue. Thus, the method converges slowly if there is an eigenvalue close in magnitude to the dominant eigenvalue.
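This geometric rate can be observed numerically. The sketch below uses an assumed diagonal test matrix (not from the article) with eigenvalues 2 and 1, so |\lambda_2/\lambda_1| = 0.5; it prints the ratio of successive errors against the known dominant eigenvector, which settles near 0.5.

import numpy as np

# Assumed test matrix with eigenvalues 2 and 1, so |lambda_2 / lambda_1| = 0.5.
A = np.array([[2.0, 0.0],
              [0.0, 1.0]])
v1 = np.array([1.0, 0.0])   # known dominant eigenvector

b = np.array([1.0, 1.0])
b = b / np.linalg.norm(b)
prev_err = np.linalg.norm(b - v1)
for k in range(1, 11):
    b = A @ b
    b = b / np.linalg.norm(b)
    err = np.linalg.norm(b - v1)
    print(k, err / prev_err)  # the ratio approaches 0.5 = |lambda_2 / lambda_1|
    prev_err = err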
Although the power iteration method approximates only one eigenvalue of a matrix, it remains useful for certain computational problems. For instance, Google uses it to calculate the PageRank of documents in their search engine,[2] and Twitter uses it to show users recommendations of whom to follow.[3] The power iteration method is especially suitable for sparse matrices, such as the web matrix, or as a matrix-free method that does not require storing the coefficient matrix A explicitly, but can instead access a function evaluating matrix-vector products Ax.
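To illustrate the matrix-free idea (a sketch under assumptions; the function names and the example operator are not from the article), the iteration only needs a routine that applies A to a vector, so A never has to be stored explicitly:

import numpy as np

def power_iteration_matvec(matvec, dim, num_iterations: int):
    # Same loop as in the listing above, but A is represented only by the
    # function matvec(x), which returns the product A @ x.
    b_k = np.random.rand(dim)
    for _ in range(num_iterations):
        b_k1 = matvec(b_k)
        b_k = b_k1 / np.linalg.norm(b_k1)
    return b_k

def matvec(x):
    # Hypothetical implicitly defined operator (a diagonal scaling);
    # its dominant eigenvalue is the largest weight, 2.0.
    weights = np.linspace(1.0, 2.0, x.shape[0])
    return weights * x

# After enough iterations, b concentrates on the component with the largest weight.
b = power_iteration_matvec(matvec, dim=50, num_iterations=500)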
Some of the more advanced eigenvalue algorithms can be understood as variations of the power iteration. For instance, the inverse iteration method applies power iteration to the matrix
A^{-1}. Other algorithms look at the whole subspace generated by the vectors b_k. This subspace is known as the Krylov subspace; it can be computed by the Arnoldi iteration or the Lanczos iteration.
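A minimal sketch of this idea under assumptions (the function name and the optional shift parameter are illustrative, not from the article): rather than forming A^{-1} explicitly, each step of inverse iteration solves a linear system with A, which amounts to power iteration on A^{-1} and therefore converges toward an eigenvector of the eigenvalue of A smallest in magnitude (or closest to the shift).

import numpy as np

def inverse_iteration(A, num_iterations: int, shift: float = 0.0):
    # Power iteration applied to (A - shift*I)^{-1}: instead of forming the
    # inverse, solve a linear system at every step.
    n = A.shape[1]
    M = A - shift * np.eye(n)
    b_k = np.random.rand(n)
    for _ in range(num_iterations):
        b_k1 = np.linalg.solve(M, b_k)  # same as (A - shift*I)^{-1} @ b_k
        b_k = b_k1 / np.linalg.norm(b_k1)
    return b_k

# With shift = 0, this approximates an eigenvector for the eigenvalue of A
# that is smallest in magnitude (0.3 for the example matrix used earlier).
b = inverse_iteration(np.array([[0.5, 0.5], [0.2, 0.8]]), 50)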