Orthogonal diagonalization
I want to take a real symmetric matrix, find its eigenvalues and orthonormal bases for its eigenspaces, and then orthogonally diagonalize it. The matrix will be
$$ A= \left[ \begin{array}{rrrr} 1&1&1&-1\\ 1&1&-1&1\\ 1&-1&1&1\\ -1&1&1&1 \end{array} \right] $$ I'll enter that matrix in SymPy, and then we'll get to work. The pattern to follow for solving this type of problem is Example 11.65 in our textbook.
from sympy import *
init_printing(use_latex='mathjax')
A=Matrix([[1,1,1,-1],[1,1,-1,1],[1,-1,1,1],[-1,1,1,1]]); A
Now, of course, in SymPy we can simply ask for the eigenvectors, and the software will produce them together with the eigenvalues:
A.eigenvects()
That says that $-2$ is an eigenvalue of multiplicity $1$ (both algebraic and geometric multiplicity; recall the distinction from Definition 8.40 in the text; for real symmetric matrices the two always coincide, because we know from Theorem 11.63 that such matrices are diagonalizable).
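As an extra sanity check (my own addition, not part of the textbook recipe), we can confirm that $A$ really is symmetric and that its characteristic polynomial factors with the multiplicities just described; assuming the cells above have been run, this should display True together with $(x-2)^3(x+2)$:
# A should equal its transpose, and its characteristic polynomial should
# factor as (x - 2)**3 * (x + 2), matching the multiplicities above
x = symbols('x')
A == A.T, factor(A.charpoly(x).as_expr())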
The output of A.eigenvects() also gives us an eigenvector for that multiplicity-$1$ eigenvalue, and hence a basis for that eigenspace. Similarly, it tells us that $2$ is an eigenvalue with multiplicity $3$, and gives us a basis for the corresponding eigenspace. Now look at what Example 11.65 tells me to do:
- first find orthonormal bases for my eigenspaces;
- then stack those together as columns into a matrix $P$;
- and then the diagonal matrix I'm after will be $D=P^{-1}\cdot A\cdot P$; since the columns of $P$ are orthonormal, $P$ is an orthogonal matrix, so $P^{-1}=P^T$, which is exactly what makes this an orthogonal diagonalization (a compressed preview of the whole recipe follows this list).
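Here is that recipe compressed into a single cell, as a preview of the step-by-step work below; the names cols and P_preview are my own scratch names, and I wrap the final product in simplify just to be safe:
# preview of the recipe: orthonormalize each eigenspace basis with Gram-Schmidt,
# assemble the resulting vectors as the columns of P, and conjugate A by it
cols = []
for eigval, mult, basis in A.eigenvects():
    cols += GramSchmidt(basis, True)
P_preview = Matrix.hstack(*cols)
simplify(P_preview.T * A * P_preview)   # should come out diagonal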
I will begin with that first item, by applying SymPy's GramSchmidt function to the list of $2$-eigenvectors:
A.eigenvects()[1][2]
Those indices (which in Python start at 0, remember!) pick out the second entry in the list returned by A.eigenvects() (the one for the eigenvalue $2$), and then the third item of that tuple, namely the list of basis vectors displayed above. This is what I need to make orthonormal with Gram-Schmidt:
GramSchmidt(A.eigenvects()[1][2], True)
You can do the same for the single $(-2)$-eigenvector, since even that one isn't normalized (i.e. its length is not $1$, as it stands):
GramSchmidt(A.eigenvects()[0][2], True)
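To convince ourselves that GramSchmidt did its job (this check is my own addition, not part of Example 11.65), we can collect all four vectors into one Python list and verify that each has length $1$ and that the pairwise dot products are $0$:
# all four orthonormalized eigenvectors: the -2-eigenspace first, then the 2-eigenspace
vecs = GramSchmidt(A.eigenvects()[0][2], True) + GramSchmidt(A.eigenvects()[1][2], True)
# the table of dot products should have 1s on the diagonal and 0s everywhere else
Matrix(4, 4, lambda i, j: simplify(vecs[i].dot(vecs[j])))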
I need to collect those four vectors together as the columns of my matrix $P$ (any order will do!).
P=Matrix(); P
Now we'll start adding the vectors above as columns to this empty matrix. I learned this hstack trick by searching online.
# loop over both eigenspaces, orthonormalize each basis, and append the vectors as new columns of P
for i in range(2):
    P=Matrix.hstack(P,*GramSchmidt(A.eigenvects()[i][2], True))
P
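Before the moment of truth, it is worth checking (again, my own extra check) that the columns of $P$ really are orthonormal, i.e. that $P^T\cdot P$ is the identity matrix:
# P has orthonormal columns exactly when P^T * P is the 4x4 identity
simplify(P.T*P) == eye(4)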
Now for the moment of truth: let's check that $P^{-1}\cdot A\cdot P$ actually is diagonal!
P.inv()*A*P
Exactly as expected: it is diagonal, and its diagonal entries are the eigenvalues of the original matrix $A$ ($-2$ once and $2$ thrice, since that one had multiplicity $3$).
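And since $P$ is orthogonal, $P^{-1}=P^T$, so conjugating by $P^T$ produces the very same diagonal matrix $D$ without inverting anything; this last comparison is just my own double check that the two computations agree:
# orthogonal diagonalization: conjugating by P^T gives the same D as conjugating by P^{-1}
simplify(P.T*A*P) == simplify(P.inv()*A*P)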