
In numerical linear algebra, the conjugate gradient squared method (CGS) is an iterative algorithm for solving systems of linear equations of the form $Ax = b$, particularly in cases where computing the transpose $A^T$ is impractical. [1] The CGS method was developed as an improvement to the biconjugate gradient method. [2] [3] [4]

Background

A system of linear equations $Ax = b$ consists of a known matrix $A$ and a known vector $b$. To solve the system is to find the value of the unknown vector $x$. [3] [5] A direct method for solving a system of linear equations is to take the inverse of the matrix $A$, then calculate $x = A^{-1}b$. However, computing the inverse is computationally expensive. Hence, iterative methods are commonly used. Iterative methods begin with a guess $x^{(0)}$, and on each iteration the guess is improved. Once the difference between successive guesses is sufficiently small, the method has converged to a solution. [6] [7]
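As an illustration of the difference between the two approaches, the following minimal Python sketch contrasts a direct solve via the inverse with a simple iterative scheme. The test matrix, the tolerance, and the choice of Jacobi iteration are arbitrary choices made for this example only; Jacobi iteration is not the CGS method, but it shows the pattern of refining a guess until successive guesses agree.

```python
import numpy as np

# Small illustrative system A x = b (hypothetical example data).
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
b = np.array([1.0, 2.0, 3.0])

# Direct approach: form the solution via the inverse (expensive for large A).
x_direct = np.linalg.inv(A) @ b

# Iterative approach (Jacobi iteration, used here only as a simple example):
# start from a guess and refine it until successive guesses barely change.
x = np.zeros_like(b)                       # initial guess x^(0)
D = np.diag(np.diag(A))                    # diagonal part of A
R = A - D                                  # off-diagonal part of A
for k in range(200):
    x_new = np.linalg.solve(D, b - R @ x)  # next guess x^(k+1)
    if np.linalg.norm(x_new - x) < 1e-10:  # successive guesses close: converged
        x = x_new
        break
    x = x_new

print(np.allclose(x, x_direct))            # True: both approaches agree
```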

As with the conjugate gradient method, biconjugate gradient method, and similar iterative methods for solving systems of linear equations, the CGS method can be used to find solutions to multi-variable optimisation problems, such as power-flow analysis, hyperparameter optimisation, and facial recognition. [8]

Algorithm

The algorithm is as follows: [9]

  1. Choose an initial guess $x^{(0)}$
  2. Compute the residual $r^{(0)} = b - Ax^{(0)}$
  3. Choose $\tilde{r} = r^{(0)}$
  4. For $i = 1, 2, 3, \dots$ do:
    1. $\rho_{i-1} = \tilde{r}^{\,T} r^{(i-1)}$. If $\rho_{i-1} = 0$, the method fails.
    2. If $i = 1$: set $u^{(1)} = r^{(0)}$ and $p^{(1)} = u^{(1)}$
    3. Else: set $\beta_{i-1} = \rho_{i-1}/\rho_{i-2}$, $u^{(i)} = r^{(i-1)} + \beta_{i-1} q^{(i-1)}$ and $p^{(i)} = u^{(i)} + \beta_{i-1}\left(q^{(i-1)} + \beta_{i-1} p^{(i-1)}\right)$
    4. Solve $M\hat{p} = p^{(i)}$, where $M$ is a pre-conditioner. Then compute $\hat{v} = A\hat{p}$, $\alpha_i = \rho_{i-1}/(\tilde{r}^{\,T}\hat{v})$ and $q^{(i)} = u^{(i)} - \alpha_i \hat{v}$
    5. Solve $M\hat{u} = u^{(i)} + q^{(i)}$. Then update $x^{(i)} = x^{(i-1)} + \alpha_i \hat{u}$ and $r^{(i)} = r^{(i-1)} - \alpha_i A\hat{u}$
    6. Check for convergence: if there is convergence, end the loop and return the result
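The following is a minimal Python sketch of the loop above, written under some common but not mandated choices: the shadow residual is taken as $\tilde{r} = r^{(0)}$, the pre-conditioner defaults to the identity, and convergence is tested on the relative residual norm. The function and argument names (cgs, M_solve, tol, maxiter) are illustrative, not taken from the references.

```python
import numpy as np

def cgs(A, b, x0=None, M_solve=None, tol=1e-8, maxiter=1000):
    """Conjugate gradient squared iteration following the steps listed above.

    M_solve(v) should apply the inverse of the pre-conditioner M to v;
    by default no pre-conditioning is applied.
    """
    n = len(b)
    x = np.zeros(n) if x0 is None else np.asarray(x0, dtype=float).copy()
    if M_solve is None:
        M_solve = lambda v: v                 # identity pre-conditioner

    r = b - A @ x                             # residual r^(0) = b - A x^(0)
    r_tilde = r.copy()                        # shadow residual, r~ = r^(0)
    rho_prev = 1.0                            # overwritten before first use

    for i in range(1, maxiter + 1):
        rho = r_tilde @ r
        if rho == 0.0:                        # breakdown: the method fails
            raise RuntimeError("CGS breakdown: rho = 0")
        if i == 1:
            u = r.copy()
            p = u.copy()
        else:
            beta = rho / rho_prev
            u = r + beta * q
            p = u + beta * (q + beta * p)
        p_hat = M_solve(p)                    # solve M p_hat = p
        v_hat = A @ p_hat
        alpha = rho / (r_tilde @ v_hat)
        q = u - alpha * v_hat
        u_hat = M_solve(u + q)                # solve M u_hat = u + q
        x = x + alpha * u_hat                 # update the approximate solution
        r = r - alpha * (A @ u_hat)           # update the residual
        if np.linalg.norm(r) <= tol * np.linalg.norm(b):
            return x, i                       # converged
        rho_prev = rho
    return x, maxiter
```

Library routines such as MATLAB's cgs [2] and SciPy's scipy.sparse.linalg.cgs implement the same recurrences with additional safeguards against breakdown and stagnation.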

References

  1. ^ Noel Black; Shirley Moore. "Conjugate Gradient Squared Method". Wolfram Mathworld.
  2. ^ Mathworks. "cgs". Matlab documentation.
  3. ^ a b Henk van der Vorst (2003). "Bi-Conjugate Gradients". Iterative Krylov Methods for Large Linear Systems. Cambridge University Press. ISBN 0-521-81828-1.
  4. ^ Peter Sonneveld (1989). "CGS, A Fast Lanczos-Type Solver for Nonsymmetric Linear Systems". SIAM Journal on Scientific and Statistical Computing. 10 (1): 36–52. doi:10.1137/0910004. ProQuest 921988114.
  5. ^ "Linear equations" (PDF), Matrix Analysis and Applied Linear Algebra, Philadelphia, PA: SIAM, pp. 1–40, doi:10.1137/1.9780898719512.ch1, archived from the original (PDF) on 2004-06-10, retrieved 2023-12-18.
  6. ^ "Iterative Methods for Linear Systems". Mathworks.
  7. ^ Jean Gallier. "Iterative Methods for Solving Linear Systems" (PDF). UPenn.
  8. ^ Alexandra Roberts; Anye Shi; Yue Sun. "Conjugate gradient methods". Cornell University. Retrieved 2023-12-26.
  9. ^ R. Barrett; M. Berry; T. F. Chan; J. Demmel; J. Donato; J. Dongarra; V. Eijkhout; R. Pozo; C. Romine; H. Van der Vorst (1994). Templates for the Solution of Linear Systems: Building Blocks for Iterative Methods, 2nd Edition. SIAM.