Orthogonal diagonalization

I want to take a real symmetric matrix, find its eigenvalues and orthonormal bases for its eigenspaces, and then orthogonally diagonalize it. The matrix will be

$$ A= \left[ \begin{array}{rrrr} 1&1&1&-1\\ 1&1&-1&1\\ 1&-1&1&1\\ -1&1&1&1 \end{array} \right] $$ I'll implement that in SymPy, and then we will get to work. The pattern to follow for solving this type of problem is Example 11.65 in our textbook.

In [1]:
from sympy import *
init_printing(use_latex='mathjax')
In [2]:
A=Matrix([[1,1,1,-1],[1,1,-1,1],[1,-1,1,1],[-1,1,1,1]]); A
Out[2]:
$\displaystyle \left[\begin{matrix}1 & 1 & 1 & -1\\1 & 1 & -1 & 1\\1 & -1 & 1 & 1\\-1 & 1 & 1 & 1\end{matrix}\right]$

Now, of course, in SymPy we can simply ask for eigenvectors, and the software will produce them together with the eigenvalues:

In [3]:
A.eigenvects()
Out[3]:
$\displaystyle \left[ \left( -2, \ 1, \ \left[ \left[\begin{matrix}1\\-1\\-1\\1\end{matrix}\right]\right]\right), \ \left( 2, \ 3, \ \left[ \left[\begin{matrix}1\\1\\0\\0\end{matrix}\right], \ \left[\begin{matrix}1\\0\\1\\0\end{matrix}\right], \ \left[\begin{matrix}-1\\0\\0\\1\end{matrix}\right]\right]\right)\right]$

That says that $-2$ is an eigenvalue with multiplicity $1$ (algebraic and geometric multiplicity; recall the distinction from Definition 8.40 in the text; for real symmetric matrices these always coincide, because we know from Theorem 11.63 that such matrices are diagonalizable).
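One quick way to see those multiplicities directly (a cross-check of my own, not part of the textbook recipe) is to factor the characteristic polynomial: each eigenvalue appears as a root with its algebraic multiplicity as the exponent.

```python
from sympy import Matrix, Symbol, factor

# Re-enter the matrix so this snippet is self-contained
A = Matrix([[1, 1, 1, -1], [1, 1, -1, 1], [1, -1, 1, 1], [-1, 1, 1, 1]])
x = Symbol('x')
# charpoly gives det(x*I - A) as a polynomial in x; factoring it
# exposes the eigenvalues and their algebraic multiplicities at once.
# Here it factors as (x - 2)**3 * (x + 2).
p = factor(A.charpoly(x).as_expr())
print(p)
```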

A.eigenvects() also gives us an eigenvector for that multiplicity-1 eigenvalue, and hence a basis for that eigenspace. Similarly, it tells us that $2$ is an eigenvalue with multiplicity $3$, and gives us a basis for the corresponding eigenspace. Now look at what Example 11.65 tells me to do:

  • first find orthonormal bases for my eigenspaces;
  • then stack those together as columns into a matrix $P$;
  • and then the diagonal matrix I'm after will be $D=P^{-1}\cdot A\cdot P$.

I will begin with the first item, by applying SymPy's GramSchmidt function to the list of $2$-eigenvectors:
In [4]:
A.eigenvects()[1][2]
Out[4]:
$\displaystyle \left[ \left[\begin{matrix}1\\1\\0\\0\end{matrix}\right], \ \left[\begin{matrix}1\\0\\1\\0\end{matrix}\right], \ \left[\begin{matrix}-1\\0\\0\\1\end{matrix}\right]\right]$

Those indices (which in Python start at 0, remember!) first select the second entry of the A.eigenvects() output displayed above, and then its third component: the list of basis vectors for the $2$-eigenspace. This is what I need to make orthonormal with Gram-Schmidt:

In [5]:
GramSchmidt(A.eigenvects()[1][2], True)
Out[5]:
$\displaystyle \left[ \left[\begin{matrix}\frac{\sqrt{2}}{2}\\\frac{\sqrt{2}}{2}\\0\\0\end{matrix}\right], \ \left[\begin{matrix}\frac{\sqrt{6}}{6}\\- \frac{\sqrt{6}}{6}\\\frac{\sqrt{6}}{3}\\0\end{matrix}\right], \ \left[\begin{matrix}- \frac{\sqrt{3}}{6}\\\frac{\sqrt{3}}{6}\\\frac{\sqrt{3}}{6}\\\frac{\sqrt{3}}{2}\end{matrix}\right]\right]$
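As a quick sanity check (my own addition, not part of the textbook recipe), we can confirm that Gram-Schmidt really returned an orthonormal list by computing all pairwise dot products; the resulting Gram matrix should be the $3\times 3$ identity.

```python
from sympy import GramSchmidt, Matrix, eye, simplify

A = Matrix([[1, 1, 1, -1], [1, 1, -1, 1], [1, -1, 1, 1], [-1, 1, 1, 1]])
vecs = GramSchmidt(A.eigenvects()[1][2], True)
# Gram matrix of dot products: 1's on the diagonal (unit length),
# 0's off the diagonal (orthogonality); simplify clears the radicals
gram = Matrix(3, 3, lambda i, j: vecs[i].dot(vecs[j])).applyfunc(simplify)
print(gram == eye(3))
```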

You can do the same for the single $(-2)$-eigenvector, since even that one isn't normalized (i.e. its length is not $1$, as it stands):

In [6]:
GramSchmidt(A.eigenvects()[0][2], True)
Out[6]:
$\displaystyle \left[ \left[\begin{matrix}\frac{1}{2}\\- \frac{1}{2}\\- \frac{1}{2}\\\frac{1}{2}\end{matrix}\right]\right]$

I need to collect those four vectors together as columns of my matrix $P$ (any order will do!).
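Note that no Gram-Schmidt is needed *across* the two eigenspaces: for a real symmetric matrix, eigenvectors belonging to distinct eigenvalues are automatically orthogonal to each other. A quick check of my own (not in the textbook's worked example) confirms this for our matrix:

```python
from sympy import Matrix

A = Matrix([[1, 1, 1, -1], [1, 1, -1, 1], [1, -1, 1, 1], [-1, 1, 1, 1]])
v_minus = A.eigenvects()[0][2][0]  # basis vector of the (-2)-eigenspace
# Dot products against each 2-eigenvector: all should vanish
dots = [v_minus.dot(w) for w in A.eigenvects()[1][2]]
print(dots)  # [0, 0, 0]
```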

In [7]:
P=Matrix(); P
Out[7]:
$\displaystyle \left[\begin{matrix}\end{matrix}\right]$

Now we'll start adding the vectors above as columns to this empty matrix. I learned this hstack trick by searching online.
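In case Matrix.hstack is unfamiliar, here is a minimal illustration (a toy example of my own): it glues its arguments side by side, so column vectors become successive columns of the result.

```python
from sympy import Matrix

# Two column vectors become the two columns of a 2x2 matrix
a = Matrix([1, 2])
b = Matrix([3, 4])
print(Matrix.hstack(a, b))  # Matrix([[1, 3], [2, 4]])
```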

In [8]:
for i in range(2): 
    P=Matrix.hstack(P,*GramSchmidt(A.eigenvects()[i][2], True))    
In [9]:
P
Out[9]:
$\displaystyle \left[\begin{matrix}\frac{1}{2} & \frac{\sqrt{2}}{2} & \frac{\sqrt{6}}{6} & - \frac{\sqrt{3}}{6}\\- \frac{1}{2} & \frac{\sqrt{2}}{2} & - \frac{\sqrt{6}}{6} & \frac{\sqrt{3}}{6}\\- \frac{1}{2} & 0 & \frac{\sqrt{6}}{3} & \frac{\sqrt{3}}{6}\\\frac{1}{2} & 0 & 0 & \frac{\sqrt{3}}{2}\end{matrix}\right]$
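Before computing $P^{-1}\cdot A\cdot P$, it's worth verifying (an extra step of my own) that $P$ really is an orthogonal matrix, i.e. that $P^T P = I$; this is exactly the statement that its columns form an orthonormal set.

```python
from sympy import GramSchmidt, Matrix, eye, simplify

A = Matrix([[1, 1, 1, -1], [1, 1, -1, 1], [1, -1, 1, 1], [-1, 1, 1, 1]])
P = Matrix()
for i in range(2):
    P = Matrix.hstack(P, *GramSchmidt(A.eigenvects()[i][2], True))
# P is orthogonal exactly when P^T * P is the identity;
# simplify clears the leftover radical arithmetic
check = (P.T * P).applyfunc(simplify)
print(check == eye(4))  # True
```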

Now for the moment of truth: let's check that $P^{-1}\cdot A\cdot P$ actually is diagonal!

In [10]:
P.inv()*A*P
Out[10]:
$\displaystyle \left[\begin{matrix}-2 & 0 & 0 & 0\\0 & 2 & 0 & 0\\0 & 0 & 2 & 0\\0 & 0 & 0 & 2\end{matrix}\right]$

Exactly as expected: it is diagonal, and its diagonal entries are the eigenvalues of the original matrix $A$ ($-2$ once and $2$ thrice, since that one had multiplicity $3$).
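One closing remark: because $P$ is orthogonal, $P^{-1}=P^T$, so the transpose can stand in for the (more expensive) inverse in the diagonalization. A quick verification of my own:

```python
from sympy import GramSchmidt, Matrix, diag, simplify

A = Matrix([[1, 1, 1, -1], [1, 1, -1, 1], [1, -1, 1, 1], [-1, 1, 1, 1]])
P = Matrix()
for i in range(2):
    P = Matrix.hstack(P, *GramSchmidt(A.eigenvects()[i][2], True))
# For an orthogonal P, transposing does the job of inverting:
D = (P.T * A * P).applyfunc(simplify)
print(D == diag(-2, 2, 2, 2))  # True
```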