Conjugate gradient squared method explained
In numerical linear algebra, the conjugate gradient squared method (CGS) is an iterative algorithm for solving systems of linear equations of the form $A\mathbf{x} = \mathbf{b}$, particularly in cases where computing the transpose $A^{\mathsf{T}}$ is impractical.[1] The CGS method was developed as an improvement to the biconjugate gradient method.[2] [3] [4]

Background
A system of linear equations $A\mathbf{x} = \mathbf{b}$ consists of a known matrix $A$ and a known vector $\mathbf{b}$. To solve the system is to find the value of the unknown vector $\mathbf{x}$. A direct method for solving a system of linear equations is to take the inverse of the matrix $A$, then calculate $\mathbf{x} = A^{-1}\mathbf{b}$. However, computing the inverse is computationally expensive. Hence, iterative methods are commonly used. Iterative methods begin with a guess $\mathbf{x}^{(0)}$, and on each iteration the guess is improved. Once the difference between successive guesses is sufficiently small, the method has converged to a solution.[5] [6]

As with the conjugate gradient method, the biconjugate gradient method, and similar iterative methods for solving systems of linear equations, the CGS method can be used to find solutions to multi-variable optimisation problems, such as power-flow analysis, hyperparameter optimisation, and facial recognition.[7]
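The guess-and-improve loop described above can be made concrete with one of the simplest iterative schemes, Jacobi iteration (shown here for illustration; it is not CGS itself). This is a minimal sketch in pure Python; the example matrix, right-hand side, and tolerance are illustrative assumptions, not taken from the source.

```python
def jacobi(A, b, x0, tol=1e-12, max_iter=500):
    """Solve A x = b iteratively by Jacobi iteration.

    A is a list of rows; b and x0 are lists. Each sweep updates every
    component of the guess from the previous guess, and the loop stops
    once successive guesses are sufficiently close.
    """
    n = len(b)
    x = list(x0)
    for _ in range(max_iter):
        x_new = [
            (b[i] - sum(A[i][j] * x[j] for j in range(n) if j != i)) / A[i][i]
            for i in range(n)
        ]
        # converged when successive guesses differ by less than tol
        if max(abs(xn - xo) for xn, xo in zip(x_new, x)) < tol:
            return x_new
        x = x_new
    return x

# Example (diagonally dominant, so Jacobi converges):
# 4x + y = 1, x + 3y = 2 has the exact solution x = 1/11, y = 7/11.
x = jacobi([[4.0, 1.0], [1.0, 3.0]], [1.0, 2.0], [0.0, 0.0])
```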
Algorithm
The algorithm is as follows:[8]
- Choose an initial guess $\mathbf{x}^{(0)}$
- Compute the residual $\mathbf{r}^{(0)} = \mathbf{b} - A\mathbf{x}^{(0)}$
- Choose $\tilde{\mathbf{r}} = \mathbf{r}^{(0)}$
- For $i = 1, 2, 3, \ldots$ do:
  - $\rho_{i-1} = \tilde{\mathbf{r}}^{\mathsf{T}} \mathbf{r}^{(i-1)}$
  - If $\rho_{i-1} = 0$, the method fails.
  - If $i = 1$:
    - $\mathbf{p}^{(1)} = \mathbf{u}^{(1)} = \mathbf{r}^{(0)}$
  - Else:
    - $\beta_{i-1} = \rho_{i-1} / \rho_{i-2}$
    - $\mathbf{u}^{(i)} = \mathbf{r}^{(i-1)} + \beta_{i-1} \mathbf{q}^{(i-1)}$
    - $\mathbf{p}^{(i)} = \mathbf{u}^{(i)} + \beta_{i-1} \left( \mathbf{q}^{(i-1)} + \beta_{i-1} \mathbf{p}^{(i-1)} \right)$
  - Solve $M \hat{\mathbf{p}} = \mathbf{p}^{(i)}$, where $M$ is a pre-conditioner.
  - $\hat{\mathbf{v}} = A \hat{\mathbf{p}}$
  - $\alpha_i = \rho_{i-1} / \tilde{\mathbf{r}}^{\mathsf{T}} \hat{\mathbf{v}}$
  - $\mathbf{q}^{(i)} = \mathbf{u}^{(i)} - \alpha_i \hat{\mathbf{v}}$
  - Solve $M \hat{\mathbf{u}} = \mathbf{u}^{(i)} + \mathbf{q}^{(i)}$
  - $\mathbf{x}^{(i)} = \mathbf{x}^{(i-1)} + \alpha_i \hat{\mathbf{u}}$
  - $\hat{\mathbf{q}} = A \hat{\mathbf{u}}$
  - $\mathbf{r}^{(i)} = \mathbf{r}^{(i-1)} - \alpha_i \hat{\mathbf{q}}$
  - Check for convergence: if the method has converged, end the loop and return the result
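The steps above can be sketched in pure Python. This is a minimal, unpreconditioned version: it assumes $M = I$, so the two "Solve" steps reduce to copies. Variable names follow the algorithm; the tolerance and iteration cap are illustrative assumptions.

```python
def matvec(A, x):
    """Multiply matrix A (list of rows) by vector x."""
    return [sum(a * b for a, b in zip(row, x)) for row in A]

def dot(x, y):
    return sum(a * b for a, b in zip(x, y))

def axpy(a, x, y):
    """Return a*x + y elementwise."""
    return [a * xi + yi for xi, yi in zip(x, y)]

def cgs(A, b, x0, tol=1e-10, max_iter=100):
    """Unpreconditioned (M = I) conjugate gradient squared iteration."""
    x = list(x0)
    r = axpy(-1.0, matvec(A, x), b)      # r0 = b - A x0
    r_tilde = list(r)                    # shadow residual r~ = r0
    rho_prev, u, p, q = 1.0, None, None, None
    for i in range(1, max_iter + 1):
        rho = dot(r_tilde, r)            # rho_{i-1} = r~^T r^{(i-1)}
        if rho == 0.0:
            raise RuntimeError("CGS breakdown: rho = 0")
        if i == 1:
            u = list(r)                  # u = p = r0
            p = list(r)
        else:
            beta = rho / rho_prev
            u = axpy(beta, q, r)                  # u = r + beta*q
            p = axpy(beta, axpy(beta, p, q), u)   # p = u + beta*(q + beta*p)
        v = matvec(A, p)                 # with M = I, p_hat = p
        alpha = rho / dot(r_tilde, v)
        q = axpy(-alpha, v, u)           # q = u - alpha*v
        u_hat = axpy(1.0, u, q)          # u_hat = u + q   (M = I)
        x = axpy(alpha, u_hat, x)        # x += alpha*u_hat
        r = axpy(-alpha, matvec(A, u_hat), r)     # r -= alpha*A*u_hat
        rho_prev = rho
        if dot(r, r) ** 0.5 < tol:       # converged when ||r|| is small
            break
    return x

# Example: 4x + y = 1, x + 3y = 2 has the exact solution x = 1/11, y = 7/11.
x = cgs([[4.0, 1.0], [1.0, 3.0]], [1.0, 2.0], [0.0, 0.0])
```

In exact arithmetic this terminates in at most $n$ iterations for an $n \times n$ system; each iteration costs two matrix-vector products, which is why CGS can converge in roughly half as many iterations as BiCG when it converges at all.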
Notes and References
- [1] Noel Black and Shirley Moore. "Conjugate Gradient Squared Method". Wolfram MathWorld.
- [2] "cgs". MATLAB documentation. MathWorks.
- [3] Henk van der Vorst. "Bi-Conjugate Gradients". In Iterative Krylov Methods for Large Linear Systems. Cambridge University Press, 2003. ISBN 0-521-81828-1.
- [4] Peter Sonneveld. "CGS, A Fast Lanczos-Type Solver for Nonsymmetric Linear Systems". SIAM Journal on Scientific and Statistical Computing, 10(1):36–52, 1989. doi:10.1137/0910004.
- [5] "Iterative Methods for Linear Systems". MathWorks.
- [6] Jean Gallier. "Iterative Methods for Solving Linear Systems". University of Pennsylvania.
- [7] Alexandra Roberts, Anye Shi, and Yue Sun. "Conjugate gradient methods". Cornell University, 2023-12-26.
- [8] R. Barrett, M. Berry, T. F. Chan, J. Demmel, J. Donato, J. Dongarra, V. Eijkhout, R. Pozo, C. Romine, and H. Van der Vorst. Templates for the Solution of Linear Systems: Building Blocks for Iterative Methods, 2nd Edition. SIAM, 1994.