Eigenvalues and Eigenvectors



Given a square n×n matrix A defined over a field F, a (right) eigenvalue of A is a scalar µ ∈ F such that A x = µ x for some non-zero n-dimensional column vector x. Such a vector x is then called a (right) eigenvector corresponding to the (right) eigenvalue µ.
The matrix equation (A - µI) x = 0 for a given scalar µ has a non-zero solution for x if and only if det (A - µI) = 0. The eigenvalues of A are therefore the roots of the polynomial in µ,
det (A - µI), which is called the characteristic polynomial of A.
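
As a concrete illustration, the short C program below (only a sketch for this discussion, not one of the library's routines) forms the characteristic polynomial of a 2×2 matrix A = [a b; c d], namely µ² - (a + d)µ + (ad - bc), and finds its roots with the quadratic formula; the matrix used is an arbitrary example.

    #include <math.h>
    #include <stdio.h>

    /* Characteristic polynomial of a 2x2 matrix A = [a b; c d]:
       det(A - mu I) = mu^2 - (a + d) mu + (a d - b c).
       Its roots are the eigenvalues.                                  */
    int main(void)
    {
        double a = 2.0, b = 1.0, c = 1.0, d = 2.0;   /* example matrix only */

        double trace = a + d;                  /* coefficient of -mu        */
        double det   = a * d - b * c;          /* constant term             */
        double disc  = trace * trace - 4.0 * det;

        printf("characteristic polynomial: mu^2 - (%g) mu + (%g)\n",
               trace, det);

        if (disc >= 0.0) {                     /* real eigenvalues          */
            double r = sqrt(disc);
            printf("eigenvalues: %g and %g\n",
                   0.5 * (trace + r), 0.5 * (trace - r));
        } else {                               /* complex conjugate pair:
                                                  no eigenvalues over R     */
            printf("eigenvalues: %g +/- %g i (none over the real field)\n",
                   0.5 * trace, 0.5 * sqrt(-disc));
        }
        return 0;
    }

When the discriminant is negative the roots form a complex conjugate pair, which illustrates the first consequence listed below: a real matrix need not have real eigenvalues.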

Several immediate consequences of this result follow:
  • The eigenvalues need not exist in F; they are guaranteed to exist only when the field F is algebraically closed. For example, a 2×2 rotation through 90° has no real eigenvalues.

  • If an eigenvalue exists, it need not be unique.

  • Since eigenvectors corresponding to distinct eigenvalues are linearly independent, if the characteristic equation admits n distinct eigenvalues, then every column vector can be expressed as a linear combination of the corresponding eigenvectors.

  • If the characteristic equation admits n eigenvalues which are not all distinct, then a column vector may or may not be expressible as a linear combination of eigenvectors. Compare the 2×2 zero matrix with the 2×2 matrix having 1 in the upper right-hand corner and 0's elsewhere: both matrices have the same characteristic polynomial, but different minimal polynomials (the monic polynomial of least degree which is satisfied by the matrix); see the sketch following this list.
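
A minimal check of the last comparison (again only a sketch in C, not a library routine): the 2×2 matrix N with 1 in the upper right-hand corner satisfies N² = 0 but N ≠ 0, so its minimal polynomial is µ², whereas the zero matrix satisfies p(µ) = µ; both share the characteristic polynomial µ². Since N x = 0 forces the second component of x to vanish, the eigenvectors of N span only a one-dimensional subspace.

    #include <stdio.h>

    int main(void)
    {
        /* N: 1 in the upper right-hand corner, 0's elsewhere.
           Both N and the 2x2 zero matrix have trace 0 and determinant 0,
           hence the same characteristic polynomial mu^2.                 */
        double N[2][2] = { { 0.0, 1.0 }, { 0.0, 0.0 } };
        double N2[2][2];

        /* The zero matrix satisfies p(mu) = mu, so its minimal polynomial
           has degree 1 and every non-zero vector is an eigenvector for
           mu = 0.  N does not satisfy p(mu) = mu (N != 0), but N^2 = 0:  */
        for (int i = 0; i < 2; i++)
            for (int j = 0; j < 2; j++)
                N2[i][j] = N[i][0] * N[0][j] + N[i][1] * N[1][j];

        printf("N^2 = [ %g %g ; %g %g ]\n",
               N2[0][0], N2[0][1], N2[1][0], N2[1][1]);

        /* N x = 0 forces x = (x1, 0)^T, so the eigenvectors of N span
           only a one-dimensional subspace of F^2.                        */
        return 0;
    }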
For real symmetric matrices the following methods are available:

  • Rayleigh's method, which yields the dominant eigenvalue.

  • Rayleigh's quotient method, which also yields the dominant eigenvalue and is usually called after Rayleigh's method has been used to obtain an initial estimate of the dominant eigenvalue.

  • Givens' bisection algorithm, which returns all the eigenvalues but requires that the symmetric matrix first be transformed to a similar tridiagonal matrix.

  • The QL algorithm with implicit shifts of the origin, which returns all the eigenvalues and eigenvectors but requires that the symmetric matrix first be transformed to a similar tridiagonal matrix.

  • Jacobi's cyclic method, which returns all the eigenvalues and eigenvectors (a sketch of the idea follows this list).
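
The sketch below illustrates the idea behind Jacobi's cyclic method; it is not the library's routine and, to keep it short, works only for a fixed small dimension (N = 3 here, an assumption of the sketch). Each off-diagonal entry is annihilated in turn by a plane rotation A ← JᵀAJ while the rotations are accumulated in V, whose columns converge to the eigenvectors as the diagonal of A converges to the eigenvalues.

    #include <math.h>
    #include <stdio.h>

    #define N 3    /* fixed dimension for this sketch only */

    /* Cyclic Jacobi sweeps: for each off-diagonal pair (p,q), apply the
       plane rotation J that zeroes a[p][q] in A <- J^T A J, and
       accumulate V <- V J.  On return the diagonal of a holds approximate
       eigenvalues and the columns of v the corresponding eigenvectors.   */
    static void jacobi_cyclic(double a[N][N], double v[N][N], int sweeps)
    {
        for (int i = 0; i < N; i++)
            for (int j = 0; j < N; j++)
                v[i][j] = (i == j) ? 1.0 : 0.0;

        for (int sweep = 0; sweep < sweeps; sweep++)
            for (int p = 0; p < N - 1; p++)
                for (int q = p + 1; q < N; q++) {
                    if (fabs(a[p][q]) < 1.0e-15) continue;

                    /* rotation chosen so that the new a[p][q] is zero */
                    double theta = (a[q][q] - a[p][p]) / (2.0 * a[p][q]);
                    double t = (theta >= 0.0 ? 1.0 : -1.0)
                               / (fabs(theta) + sqrt(theta * theta + 1.0));
                    double c = 1.0 / sqrt(t * t + 1.0);
                    double s = t * c;

                    for (int r = 0; r < N; r++) {   /* A <- A J  (columns) */
                        double arp = a[r][p], arq = a[r][q];
                        a[r][p] = c * arp - s * arq;
                        a[r][q] = s * arp + c * arq;
                    }
                    for (int r = 0; r < N; r++) {   /* A <- J^T A  (rows)  */
                        double apr = a[p][r], aqr = a[q][r];
                        a[p][r] = c * apr - s * aqr;
                        a[q][r] = s * apr + c * aqr;
                    }
                    for (int r = 0; r < N; r++) {   /* V <- V J            */
                        double vrp = v[r][p], vrq = v[r][q];
                        v[r][p] = c * vrp - s * vrq;
                        v[r][q] = s * vrp + c * vrq;
                    }
                }
    }

    int main(void)
    {
        double a[N][N] = { { 4.0, 1.0, 1.0 },
                           { 1.0, 3.0, 0.0 },
                           { 1.0, 0.0, 2.0 } };  /* example symmetric matrix */
        double v[N][N];

        jacobi_cyclic(a, v, 10);
        for (int i = 0; i < N; i++)
            printf("eigenvalue %g, eigenvector (%g, %g, %g)\n",
                   a[i][i], v[0][i], v[1][i], v[2][i]);
        return 0;
    }

In practice the sweeps are repeated until the sum of the squares of the off-diagonal entries falls below a tolerance rather than for a fixed count, as done here for brevity.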

For arbitrary real square matrices the following methods are available:

  • The power method, which yields the dominant eigenvalue provided it is unique; in particular, this implies that it must be real (a sketch follows this list).

  • The inverse power method, which yields the eigenvalue nearest a preassigned fixed constant, provided that the eigenvalue is real.

  • Francis's double QR algorithm, which provides all the eigenvalues and eigenvectors, both real and complex, after first transforming the matrix to a similar matrix in upper Hessenberg form.
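
As a sketch of the power method (again not the library routine, with a fixed dimension and an example matrix chosen only for illustration), the iterate x is repeatedly multiplied by A and normalized; the Rayleigh quotient xᵀAx / xᵀx serves as the running estimate of the dominant eigenvalue and converges provided that eigenvalue is real and strictly largest in magnitude.

    #include <math.h>
    #include <stdio.h>

    #define N 3    /* fixed dimension for this sketch only */

    /* Power method: repeatedly form y = A x, normalize, and estimate the
       dominant eigenvalue from the Rayleigh quotient x^T A x / x^T x.
       Assumes the dominant eigenvalue is unique (hence real).            */
    static double power_method(double a[N][N], double x[N], int iters)
    {
        double lambda = 0.0;

        for (int k = 0; k < iters; k++) {
            double y[N], norm = 0.0, dot = 0.0;

            for (int i = 0; i < N; i++) {          /* y = A x              */
                y[i] = 0.0;
                for (int j = 0; j < N; j++)
                    y[i] += a[i][j] * x[j];
            }
            for (int i = 0; i < N; i++) {          /* Rayleigh quotient    */
                dot  += x[i] * y[i];
                norm += x[i] * x[i];
            }
            lambda = dot / norm;

            double ynorm = 0.0;                    /* normalize iterate    */
            for (int i = 0; i < N; i++) ynorm += y[i] * y[i];
            ynorm = sqrt(ynorm);
            for (int i = 0; i < N; i++) x[i] = y[i] / ynorm;
        }
        return lambda;
    }

    int main(void)
    {
        double a[N][N] = { { 5.0, 1.0, 0.0 },
                           { 1.0, 3.0, 1.0 },
                           { 0.0, 2.0, 1.0 } };    /* example matrix only  */
        double x[N] = { 1.0, 1.0, 1.0 };           /* arbitrary start      */

        printf("dominant eigenvalue estimate: %g\n", power_method(a, x, 50));
        return 0;
    }

The inverse power method follows the same outline but multiplies by (A - cI)⁻¹ (in practice by solving a linear system each step), so the iteration converges to the eigenvalue nearest the chosen constant c.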

Included in the section are routines to balance a matrix, which conditions the matrix so as to reduce the errors involved in calculating the eigenvalues; a routine based on Gershgorin's theorem, which gives estimates of the location of the eigenvalues (a sketch follows below); and routines to tridiagonalize a symmetric matrix and to transform a general matrix to a similar matrix in upper Hessenberg form. Also included is a routine to sort the eigenvalues and interchange the corresponding columns of the matrix containing the corresponding eigenvectors.
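
Gershgorin's theorem states that every eigenvalue of A lies in at least one of the disks in the complex plane centred at a diagonal entry a_ii with radius equal to the sum of the absolute values of the remaining entries in row i. A minimal sketch (not the library routine) that prints those disks for an example matrix:

    #include <math.h>
    #include <stdio.h>

    #define N 3    /* fixed dimension for this sketch only */

    /* Gershgorin's theorem: every eigenvalue of A lies in at least one
       disk centred at a[i][i] with radius sum over j != i of |a[i][j]|. */
    static void gershgorin_disks(double a[N][N])
    {
        for (int i = 0; i < N; i++) {
            double radius = 0.0;
            for (int j = 0; j < N; j++)
                if (j != i) radius += fabs(a[i][j]);
            printf("disk %d: centre %g, radius %g\n", i, a[i][i], radius);
        }
    }

    int main(void)
    {
        double a[N][N] = { { 4.0, 1.0, 0.5 },
                           { 1.0, 3.0, 0.0 },
                           { 0.5, 0.0, 2.0 } };   /* example matrix only */
        gershgorin_disks(a);
        return 0;
    }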
Table of Eigenvalue - Eigenvector Routines