Spectral Theorem

This post is on the Spectral Theorem. This is something that I should have been clear on a long time ago, but for reasons unknown to me, I was not. I hope to be able to rectify that now. The proof was discussed today in class. I am only recording my thoughts on it.

The spectral theorem states that a self-adjoint operator on an n-dimensional complex inner product space has n mutually orthogonal eigenvectors, and that all of its eigenvalues are real.

Let V be the n-dimensional vector space under consideration, and let a,b\in V. A self-adjoint operator T is one that satisfies \langle Ta,b\rangle=\langle a,Tb\rangle for all a,b\in V. If the inner product is the conventional one in this setting, namely the sesquilinear product \langle x,y\rangle=\sum_i x_i\overline{y_i}, then the matrix of T (with respect to an orthonormal basis) has to be Hermitian. For concreteness, we're going to assume this inner product for the rest of the proof.
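As a quick sanity check, here is a minimal numpy sketch of the self-adjointness condition, using a made-up Hermitian matrix and the inner product \langle x,y\rangle=\sum_i x_i\overline{y_i} assumed above:

```python
import numpy as np

# A made-up 3x3 Hermitian matrix (equal to its conjugate transpose).
T = np.array([[2.0,      1 - 1j,  0.5j],
              [1 + 1j,   3.0,     2.0],
              [-0.5j,    2.0,     1.0]])
assert np.allclose(T, T.conj().T)  # T is Hermitian

def inner(x, y):
    """Sesquilinear inner product <x, y>, linear in the first slot."""
    return np.vdot(y, x)  # np.vdot conjugates its first argument

rng = np.random.default_rng(0)
a = rng.standard_normal(3) + 1j * rng.standard_normal(3)
b = rng.standard_normal(3) + 1j * rng.standard_normal(3)

# Self-adjointness: <Ta, b> == <a, Tb> for these (random) a and b.
print(np.isclose(inner(T @ a, b), inner(a, T @ b)))  # True
```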

As we're working over \Bbb{C}, the characteristic polynomial of T has at least one root, so T has at least one eigenvalue, and consequently an eigenvector. Let Tv=\lambda v, where \lambda is the eigenvalue and v is the eigenvector. Then \langle Tv,v\rangle=\langle \lambda v,v\rangle=\lambda\langle v,v\rangle, while by self-adjointness \langle Tv,v\rangle=\langle v,Tv\rangle=\langle v,\lambda v\rangle=\overline{\lambda}\langle v,v\rangle. We know that \langle v,v\rangle\in\Bbb{R} (it is in fact greater than $0$, since v\neq 0). Hence \lambda=\overline{\lambda}, which shows that \lambda is real valued.
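This is also easy to see numerically: the eigenvalues of any Hermitian matrix come out real up to floating-point noise. A small sketch, assuming numpy and a randomly generated Hermitian matrix:

```python
import numpy as np

# Build a random Hermitian matrix: A + A^H is always Hermitian.
rng = np.random.default_rng(1)
A = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
T = A + A.conj().T

# Even though T has complex entries, its eigenvalues are real
# (the imaginary parts are only floating-point noise).
eigenvalues = np.linalg.eigvals(T)
print(np.max(np.abs(eigenvalues.imag)) < 1e-12)  # True
print(np.sort(eigenvalues.real))
```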

How do we construct the basis of orthogonal eigenvectors, though? We start with one eigenvector v. Now consider the orthogonal complement of v; let this be A. We claim that T(A)\subset A. This is because for a\in A, \langle Ta,v\rangle=\langle a,Tv\rangle=\langle a,\lambda v\rangle=\overline{\lambda}\langle a,v\rangle=\lambda\langle a,v\rangle=0 (remember that \lambda=\overline{\lambda}). Hence, if we write T in terms of a new orthonormal basis consisting of v and a basis of its orthogonal complement, then the first row and column will be all 0's except for the top-left position, which will hold \lambda.
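Here is a small numerical illustration of that block structure, again assuming numpy; the random Hermitian matrix and the QR-based completion of v to an orthonormal basis are just one convenient way to set it up:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 4
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
T = A + A.conj().T                      # Hermitian, hence self-adjoint

# Pick one eigenpair (lam, v).
eigvals, eigvecs = np.linalg.eigh(T)
lam, v = eigvals[0], eigvecs[:, 0]

# Complete v to an orthonormal basis: QR of a matrix whose first column is v.
M = np.column_stack([v, rng.standard_normal((n, n - 1)) + 1j * rng.standard_normal((n, n - 1))])
Q, _ = np.linalg.qr(M)                  # columns of Q are orthonormal, Q[:, 0] spans the same line as v

# Matrix of T in the new basis: first row and column are (numerically) zero,
# except the (0, 0) entry, which equals lam.
T_new = Q.conj().T @ T @ Q
print(np.round(T_new, 6))
print(np.isclose(T_new[0, 0], lam))     # True
```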

Now the action of T on the orthogonal complement A is captured by the (n-1)\times(n-1) matrix obtained by deleting the first row and first column. This smaller matrix is still Hermitian, so the restriction of T to A is again self-adjoint, and its characteristic polynomial again has at least one root in \Bbb{C}, which gives us an eigenvalue, and an eigenvector lying in A and hence orthogonal to v, to work with. In this way, through an iterative process which ends after n steps, we generate n mutually orthogonal eigenvectors.
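The whole iteration can be mirrored in code. The sketch below is illustrative rather than numerically robust, and the helper name orthonormal_eigenbasis is made up; it peels off one eigenpair, passes to the orthogonal complement, and recurses:

```python
import numpy as np

def orthonormal_eigenbasis(T):
    """Follow the proof: find one eigenpair, change to a basis whose first
    vector is that eigenvector, and recurse on the (n-1)x(n-1) block acting
    on the orthogonal complement.  Returns (eigenvalues, eigenvectors-as-columns).
    A teaching sketch, not a production routine."""
    n = T.shape[0]
    # One eigenpair of T; np.linalg.eig just supplies a single root of the
    # characteristic polynomial, as in the proof.
    w, V = np.linalg.eig(T)
    lam = w[0].real                      # real, since T is self-adjoint
    v = V[:, 0] / np.linalg.norm(V[:, 0])
    if n == 1:
        return np.array([lam]), v.reshape(1, 1)
    # Unitary Q whose first column spans the same line as v;
    # the remaining columns span the orthogonal complement A.
    rng = np.random.default_rng(0)
    M = np.column_stack([v, rng.standard_normal((n, n - 1)) + 1j * rng.standard_normal((n, n - 1))])
    Q, _ = np.linalg.qr(M)
    # In this basis T is block diagonal: [[lam, 0], [0, T_A]], with T_A self-adjoint.
    T_A = (Q.conj().T @ T @ Q)[1:, 1:]
    sub_vals, sub_vecs = orthonormal_eigenbasis(T_A)
    # Lift the eigenvectors of T_A back into V: they live in A = span(Q[:, 1:]).
    vecs = np.column_stack([v, Q[:, 1:] @ sub_vecs])
    vals = np.concatenate([[lam], sub_vals])
    return vals, vecs

# Usage: random Hermitian T; check T V = V diag(vals) and that V has orthonormal columns.
rng = np.random.default_rng(3)
A = rng.standard_normal((5, 5)) + 1j * rng.standard_normal((5, 5))
T = A + A.conj().T
vals, V = orthonormal_eigenbasis(T)
print(np.allclose(T @ V, V @ np.diag(vals)))   # eigenvector equation
print(np.allclose(V.conj().T @ V, np.eye(5)))  # orthonormal columns
```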

