Matrix Decompositions (SymPy)
Eigen Decomposition (diagonalization)
An $n \times n$ matrix $A$ is diagonalizable if it can be written as $A = PDP^{-1}$ for some invertible matrix $P$ and diagonal matrix $D$.
A beautiful result in linear algebra is that such a decomposition is possible if and only if $A$ has $n$ linearly independent eigenvectors; in that case the columns of $P$ are these eigenvectors and the diagonal entries of $D$ are the corresponding eigenvalues.
To diagonalize a matrix in SymPy use the .diagonalize()
method. This returns a tuple (P, D) with $A = PDP^{-1}$.
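Here is a minimal sketch of the workflow (the matrix below is just an assumed example, not the one used in this section):

from sympy import Matrix, pprint

# an assumed 2x2 example with distinct eigenvalues 2 and 5
A = Matrix([[4, 1], [2, 3]])
P, D = A.diagonalize()      # A == P*D*P**(-1)
pprint(P)
pprint(D)
pprint(P * D * P.inv())     # reproduces A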
Example
Diagonalize the matrix
Let's check that $PDP^{-1}$ does give back $A$.
We can also use this idea to create a matrix with any eigenvalues and eigenvectors we want.
For example, suppose we would like a matrix with a prescribed set of eigenvalues and eigenvectors. Put the desired eigenvectors into the columns of an invertible matrix $P$ and the desired eigenvalues on the diagonal of $D$; then $A = PDP^{-1}$ is such a matrix.
An example of such a matrix is
$$A = \begin{bmatrix} 2 & 2 & -1 \\ 5/4 & 1/2 & -5/4 \\ 3/2 & -3 & -1/2 \end{bmatrix}.$$
In this case, the chosen eigenvectors appear (up to scaling) as the columns of $P$ and the chosen eigenvalues along the diagonal of $D$.
Example
Let's check the matrix above has the desired eigenvalues and eigenvectors
This will return an answer involving floating point numbers. If we want to force SymPy to work exclusively over the rational numbers we can define the entries to be rationals.
A = Matrix([[2,2,-1],[Rational(5,4),Rational(1,2),Rational(-5,4)],[Rational(3,2),-3,Rational(-1,2)]])
pprint(A.eigenvects())
We can see that the columns are scaled versions of the eigenvectors we chose.
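To see the construction end to end, here is a minimal sketch with arbitrarily chosen eigenvalues and eigenvectors (assumed values for illustration, not the ones used above):

from sympy import Matrix, diag, pprint

# assumed, arbitrary choices for illustration
P = Matrix([[1, 1, 0], [0, 1, 1], [1, 0, 1]])   # desired eigenvectors as columns
D = diag(2, -1, 3)                              # desired eigenvalues
A = P * D * P.inv()
pprint(A)
pprint(A.eigenvects())   # recovers the chosen eigenvalues and (scaled) eigenvectors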
Diagonalization of Symmetric Matrices
If $A$ is an $n \times n$ symmetric matrix, then $A$ can be orthogonally diagonalized: $A = QDQ^{T}$,
where $Q$ is an orthogonal matrix ($Q^{-1} = Q^{T}$) whose columns are orthonormal eigenvectors of $A$, and $D$ is the diagonal matrix of the corresponding eigenvalues.
Let's summarize some important theorems about symmetric matrices:
- Eigenvectors corresponding to different eigenvalues of a symmetric matrix are orthogonal.
- Principal Axis Theorem: a matrix is symmetric if and only if it is orthogonally diagonalizable.
- All the eigenvalues of a symmetric matrix are real numbers.
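The following sketch illustrates the whole procedure on an assumed symmetric matrix (not the one from the example below): diagonalize, orthonormalize the eigenvector columns with Gram-Schmidt, and check that the resulting $Q$ orthogonally diagonalizes $A$.

from sympy import Matrix, GramSchmidt, pprint

# an assumed symmetric matrix with a repeated eigenvalue
A = Matrix([[2, 1, 1], [1, 2, 1], [1, 1, 2]])
P, D = A.diagonalize()
# Gram-Schmidt only mixes eigenvectors within the same eigenspace,
# so the orthonormalized columns are still eigenvectors of A
Q = Matrix.hstack(*GramSchmidt([P.col(i) for i in range(P.shape[1])], True))
pprint(Q.T * Q)       # the identity, so Q is orthogonal
pprint(Q.T * A * Q)   # the diagonal matrix D of eigenvalues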
Example
Orthogonally diagonalize the symmetric matrix
Solution: First we summarize how we would do this one (check the results for yourself). The eigenvalues and eigenvectors are computed with .eigenvects() as before.
In other words, $A = PDP^{-1}$, where the columns of $P$ are the eigenvectors and the diagonal of $D$ holds the corresponding eigenvalues.
If we formed the matrix $P$ from these eigenvectors directly, it would diagonalize $A$, but its columns would not all be orthonormal.
Let's now do the calculations.
We can see that one eigenvalue has both algebraic and geometric multiplicity 2, and the two eigenvectors returned for it are not orthogonal to each other.
Since the matrix $Q$ in an orthogonal diagonalization must have orthonormal columns, we can use Gram-Schmidt on the columns of $P$.
# Let's Gram-Schmidt the columns of P
vectorList = [P.col(i) for i in range(P.shape[1])]
pprint(GramSchmidt(vectorList,True)) # optional argument True normalizes vectors
Let's set $Q$ to be the matrix whose columns are these orthonormal eigenvectors.
Therefore, $Q^{T}Q = I$ and $Q^{T}AQ = D$, i.e. $A = QDQ^{T}$ is an orthogonal diagonalization of $A$.
Now that we have gotten a few of the subtleties out of the way, let's do another example and focus just on the computations. Again, this example has an eigenvalue with algebraic/geometric multiplicity 2.
Example
Orthogonally diagonalize the symmetric matrix
vectorList = [P.col(i) for i in range(P.shape[1])]
P_list = GramSchmidt(vectorList,True)
P = P_list[0]
for i in [1,2]:
    P = P.row_join(P_list[i])
P
Verify that $P$ now has orthonormal columns, i.e. $P^{T}P = I$.
Therefore, $P^{T}AP = D$ and $A = PDP^{T}$ is an orthogonal diagonalization of $A$.
Jordan Decomposition
If a matrix $A$ is not diagonalizable, it can still be written as $A = PJP^{-1}$, where $J$ is a block-diagonal Jordan matrix.
Each block of $J$ is a Jordan block: a square matrix with a single eigenvalue repeated down the diagonal and 1's on the superdiagonal.
An example of such a Jordan matrix is one containing a Jordan block of size 2.
The double eigenvalue in that block has algebraic multiplicity 2 but geometric multiplicity 1, which is exactly why the matrix is not diagonalizable.
To compute the Jordan form of a matrix use the .jordan_form()
method. It returns the tuple (P, J) with $A = PJP^{-1}$.
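A small sketch with an assumed non-diagonalizable matrix:

from sympy import Matrix, pprint

# an assumed matrix with eigenvalue 3 of algebraic multiplicity 2 but geometric multiplicity 1
A = Matrix([[5, 1], [-4, 1]])
P, J = A.jordan_form()
pprint(J)                  # a single 2x2 Jordan block for the eigenvalue 3
pprint(P * J * P.inv())    # reproduces A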
Example
Compute the Jordan form of the following matrix.
Let's check that $PJP^{-1}$ gives back $A$.
Example
Find the Jordan form of
Since the eigenvalues and the block sizes can be read off the diagonal of $J$, the Jordan form tells us the algebraic and geometric multiplicity of each eigenvalue.
Singular-Value Decomposition (svd)
The theory of orthogonal diagonalization of symmetric matrices is extremely useful and powerful. SVD extends these ideas to matrices that are not symmetric, not diagonalizable, or perhaps not even square! In short, every matrix has an svd.
Definition of SVD
Let $A$ be an $m \times n$ matrix of rank $r$. Then $A$ can be factored as $A = U\Sigma V^{T}$,
where, for the full svd:
- $U$ is an $m \times m$ (square) orthogonal matrix,
- $V$ is an $n \times n$ (square) orthogonal matrix,
- $\Sigma$ is an $m \times n$ (rectangular) diagonal matrix (of non-zero singular values $\sigma_1 \ge \sigma_2 \ge \dots \ge \sigma_r > 0$, padded with zeros).

For the condensed svd:
- $U$ is an $m \times r$ (rectangular) column orthogonal matrix,
- $V$ is an $n \times r$ (rectangular) column orthogonal matrix,
- $\Sigma$ is an $r \times r$ (square) diagonal matrix (of non-zero singular values).
Key Fact:
- first $r$ columns of $U$: column space of $A$ ($\operatorname{Col} A$)
- last $m-r$ columns of $U$: left nullspace of $A$ ($\operatorname{Null} A^{T}$)
- first $r$ columns of $V$: row space of $A$ ($\operatorname{Row} A$)
- last $n-r$ columns of $V$: nullspace of $A$ ($\operatorname{Null} A$)
The diagonal (but rectangular) matrix $\Sigma$ has the singular values $\sigma_1 \ge \sigma_2 \ge \dots \ge \sigma_r > 0$ on its main diagonal and zeros everywhere else.
Computing an SVD
To compute an svd of $A$:
- the square roots of the non-zero eigenvalues of $A^{T}A$ are the singular values $\sigma_i$. $\Sigma$ is the diagonal matrix made up of these values in descending order: $\sigma_1 \ge \sigma_2 \ge \dots \ge \sigma_r > 0$.
- the columns of $V$ are the orthonormal eigenvectors of $A^{T}A$ arranged in descending order of their eigenvalues: $v_1, v_2, \dots, v_n$.
- the columns of $U$ are the orthonormal eigenvectors of $AA^{T}$ arranged in descending order of their eigenvalues. In practice, though, they are usually computed by noticing that the first $r$ columns of $U$ are $u_i = \frac{1}{\sigma_i}Av_i$ and the last $m-r$ columns are an orthonormal basis for $\operatorname{Null}(A^{T})$. A sketch of this recipe appears after this list.
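The sketch below uses an assumed small matrix with simple non-zero eigenvalues (so no extra Gram-Schmidt step is needed within eigenspaces):

from sympy import Matrix, diag, sqrt, pprint

# an assumed 2x3 matrix of rank 2
A = Matrix([[1, 0, 1], [0, 1, 1]])
# eigenvalues/eigenvectors of A^T A, keeping the non-zero eigenvalues, largest first
pairs = sorted([(lam, vs[0]) for lam, mult, vs in (A.T * A).eigenvects() if lam != 0],
               key=lambda p: p[0], reverse=True)
sigmas = [sqrt(lam) for lam, v in pairs]                    # singular values
V = Matrix.hstack(*[v.normalized() for lam, v in pairs])    # columns v_i
U = Matrix.hstack(*[(A * V.col(i)) / sigmas[i] for i in range(len(sigmas))])  # u_i = A v_i / sigma_i
Sigma = diag(*sigmas)
pprint(U * Sigma * V.T)   # reproduces A (this is the condensed svd)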
The condensed svd tosses away:
- the last $m-r$ columns of $U$ (i.e. an orthonormal basis of $\operatorname{Null}(A^{T})$)
- the last $n-r$ columns of $V$ (i.e. an orthonormal basis of $\operatorname{Null}(A)$)
- the zero rows and columns of $\Sigma$
since these don't affect the product. See the condensed svd description above for the changes to the definitions of $U$, $\Sigma$, and $V$.
Example
The matrix in this example has rank $r$, and its full svd can be written as a product of three matrices
where both the first and third matrices are orthogonal. The condensed svd keeps only the first $r$ columns of $U$ and $V$ and the leading $r \times r$ block of $\Sigma$.
We use SymPy's .singular_value_decomposition
method to find a decomposition in Python. This function returns the condensed svd as the 3-tuple $(U, \Sigma, V)$ with $A = U\Sigma V^{T}$.
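A quick sketch of the call on an assumed rank-1 matrix:

from sympy import Matrix, pprint

# an assumed 3x2 matrix of rank 1
A = Matrix([[1, 2], [2, 4], [3, 6]])
U, S, V = A.singular_value_decomposition()   # condensed svd: A = U*S*V.T (real case)
pprint(U)   # 3x1, orthonormal column(s)
pprint(S)   # 1x1, the non-zero singular value
pprint(V)   # 2x1, orthonormal column(s)
pprint(U * S * V.T)   # reproduces A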
Warning
The matrices that .singular_value_decomposition
returns may be slightly different from what you expect.
Example
Find a singular value decomposition of the following matrix, whose rank is smaller than its number of rows (so the condensed and full svd differ).
Notice that the singular values are sorted in ascending order, and this is the condensed svd.
Let's check that this gives a decomposition of $A$.
The product is indeed $A$.
Since this is only the condensed svd, let's extend it to a full svd.
First we look at the columns of $U$: they span the column space of $A$, so the missing columns must form an orthonormal basis of the left nullspace $\operatorname{Null}(A^{T})$. The vectors spanning that nullspace supply the extra columns.
Now the columns of this matrix are not orthonormal, so we could apply Gram-Schmidt. This would require making a list of the columns, applying GramSchmidt,
then taking the resulting list and making a matrix. It is easier to just do a QR-decomposition, since the $Q$ factor consists of exactly the orthonormalized columns.
This is the extended matrix $U$.
Now let's check that the extended factors (with $\Sigma$ padded by zero rows and columns) still multiply to $A$.
The product is indeed $A$.
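The same extension can be sketched in code. This continues the assumed rank-1 matrix from the sketch above and pads $\Sigma$ with zeros so the full factors multiply back to $A$:

from sympy import Matrix, zeros, pprint

A = Matrix([[1, 2], [2, 4], [3, 6]])
U, S, V = A.singular_value_decomposition()   # condensed svd
m, n = A.shape
r = S.shape[0]

# extend U by a basis of Null(A^T) and orthonormalize via QR (the Q factor is the Gram-Schmidt output)
U_full, _ = Matrix.hstack(U, *(A.T).nullspace()).QRdecomposition()
# extend V by a basis of Null(A) in the same way
V_full, _ = Matrix.hstack(V, *A.nullspace()).QRdecomposition()
# pad Sigma with zero rows/columns to make it m x n
Sigma_full = zeros(m, n)
Sigma_full[:r, :r] = S

pprint(U_full * Sigma_full * V_full.T)   # reproduces A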
LU Decomposition
The LU decomposition of a matrix $A$ factors it, after possible row swaps, as $PA = LU$,
where $P$ is a permutation matrix, $L$ is a lower triangular matrix with 1's on the diagonal, and $U$ is an upper triangular matrix.
A permutation matrix is constructed from the identity matrix by permuting its rows.
The command LUdecomposition
returns a 3-tuple (L, U, perm), where perm
is a list of row swap index pairs. If A is the original matrix, then A=(L*U).permuteBkwd(perm), and the row permutation matrix P=eye(A.rows).permuteFwd(perm)
satisfies P*A = L*U.
Example
Find the LU-decomposition of
$$A = \begin{bmatrix} 2 & -1 & 4 \\ 4 & -1 & 6 \\ -1 & -1 & 3 \end{bmatrix}.$$
Typically, one would call the function using tuple assignment.
A = Matrix([[2,-1,4],[4,-1,6],[-1,-1,3]])
L, U, perm = A.LUdecomposition()
P = eye(A.rows).permuteFwd(perm)
P
Let's check that the product works out: $PA = LU$, or equivalently that (L*U).permuteBkwd(perm) recovers $A$.
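A quick verification sketch using the matrix from this example:

from sympy import Matrix, eye, pprint

A = Matrix([[2, -1, 4], [4, -1, 6], [-1, -1, 3]])
L, U, perm = A.LUdecomposition()
P = eye(A.rows).permuteFwd(perm)
pprint(P * A - L * U)                   # the zero matrix, so P*A = L*U
print((L * U).permuteBkwd(perm) == A)   # True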
QR Decomposition
The QR decomposition of a matrix $A$ factors it as $A = QR$,
where $Q$ has orthonormal columns and $R$ is an upper triangular matrix.
The command QRdecomposition
returns a 2-tuple (Q, R) with $A = QR$.
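A small sketch with an assumed matrix whose columns are linearly independent:

from sympy import Matrix, pprint

# an assumed 3x2 matrix with independent columns
A = Matrix([[1, 1], [1, 0], [0, 1]])
Q, R = A.QRdecomposition()
pprint(Q)         # orthonormal columns
pprint(R)         # upper triangular
pprint(Q * R)     # reproduces A
pprint(Q.T * Q)   # the identity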