STAT 350: Lecture 17
Quadratic forms, Diagonalization and Eigenvalues
The function
$$ f(x) = x^T Q x = \sum_i \sum_j q_{ij} x_i x_j $$
is a quadratic form. The coefficient of a cross product term like $x_i x_j$ is $q_{ij} + q_{ji}$, so the function is unchanged if each of $q_{ij}$ and $q_{ji}$ is replaced by their average $(q_{ij}+q_{ji})/2$. In other words, we might as well assume that the matrix $Q$ is symmetric. Consider for example the function $f(x_1,x_2) = 3x_1^2 + 4x_1x_2 + 6x_2^2$. The matrix $Q$ is
$$ Q = \begin{pmatrix} 3 & 2 \\ 2 & 6 \end{pmatrix}. $$
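As a quick numerical sanity check, here is a NumPy sketch of the symmetrization step; the particular matrix and test point below are illustrative choices, not part of the lecture:

```python
import numpy as np

# An illustrative non-symmetric coefficient matrix: the cross term
# is carried entirely in the (1,2) entry.
Q_raw = np.array([[3.0, 4.0],
                  [0.0, 6.0]])
# Replace q_ij and q_ji by their average to get a symmetric matrix.
Q_sym = (Q_raw + Q_raw.T) / 2

x = np.array([1.0, 2.0])

# The value of the quadratic form x^T Q x is unchanged by symmetrization.
f_raw = x @ Q_raw @ x
f_sym = x @ Q_sym @ x
print(f_raw, f_sym)   # both equal 3*1 + 4*1*2 + 6*4 = 35.0
```

Symmetrizing never changes the function's values, only the matrix representing it, which is why we may assume $Q$ symmetric without loss of generality.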
What I did in class is the $n$-dimensional version of the following: find new variables $y_1$ and $y_2$ and constants $\lambda_1$ and $\lambda_2$ such that
$$ f(x_1,x_2) = \lambda_1 y_1^2 + \lambda_2 y_2^2 . $$
Put in the expressions for the $y_i$ in terms of the $x_j$, namely $y_i = \sum_j a_{ij} x_j$, and you get
$$ f = \sum_i \lambda_i \Big( \sum_j a_{ij} x_j \Big)^2 = \sum_j \sum_k \Big( \sum_i \lambda_i a_{ij} a_{ik} \Big) x_j x_k . $$
Comparing coefficients we can check that
$$ q_{jk} = \sum_i \lambda_i a_{ij} a_{ik}, \qquad \text{that is,} \qquad Q = A^T \Lambda A, $$
where $A$ is the matrix with entries $a_{ij}$ and $\Lambda$ is a diagonal matrix with $\lambda_1$ and $\lambda_2$ on the diagonal. In other words, we have to diagonalize $Q$.
To find the eigenvalues of $Q$ we solve $\det(Q - \lambda I) = 0$. The characteristic polynomial is
$$ \det \begin{pmatrix} 3-\lambda & 2 \\ 2 & 6-\lambda \end{pmatrix} = \lambda^2 - 9\lambda + 14 = (\lambda - 2)(\lambda - 7), $$
whose two roots are 2 and 7. To find the corresponding eigenvectors you ``solve'' $(Q - \lambda I)v = 0$. For $\lambda = 7$ you get the equations
$$ -4v_1 + 2v_2 = 0, \qquad 2v_1 - v_2 = 0 . $$
These equations are linearly dependent (otherwise the only solution would be $v = 0$ and $\lambda$ would not be an eigenvalue). Solving either one gives $v_2 = 2v_1$, so that $(1,2)^T$ is an eigenvector, as is any non-zero multiple of that vector. To get a normalized eigenvector you divide through by the length of the vector, that is, by $\sqrt{1^2 + 2^2} = \sqrt 5$. The second eigenvector may be found similarly. For $\lambda = 2$ we get the equation $v_1 + 2v_2 = 0$, so that $(2,-1)^T$ is an eigenvector for the eigenvalue 2. After normalizing, we stick these two eigenvectors in the matrix I called $P$, obtaining
$$ P = \frac{1}{\sqrt 5} \begin{pmatrix} 1 & 2 \\ 2 & -1 \end{pmatrix}. $$
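These hand computations are easy to check numerically. The sketch below uses NumPy's `eigh` routine for symmetric matrices, assuming the $2\times 2$ matrix $Q$ with entries 3, 2, 2, 6 whose characteristic roots are 2 and 7:

```python
import numpy as np

Q = np.array([[3.0, 2.0],
              [2.0, 6.0]])

# eigh is specialized for symmetric matrices; it returns the eigenvalues
# in ascending order, and the columns of vecs are the corresponding
# eigenvectors, already normalized to length 1.
vals, vecs = np.linalg.eigh(Q)
print(vals)                            # [2. 7.]
print(np.linalg.norm(vecs, axis=0))   # each column has length 1
```

Note that `eigh` returns each eigenvector only up to sign, which is harmless: any non-zero multiple of an eigenvector is again an eigenvector.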
Now check that
$$ P^T Q P = \begin{pmatrix} 7 & 0 \\ 0 & 2 \end{pmatrix} = \Lambda . $$
This makes the matrix $A$ above be $P^T$, with $\lambda_1 = 7$ and $\lambda_2 = 2$. You can check that
$$ 3x_1^2 + 4x_1x_2 + 6x_2^2 = 7y_1^2 + 2y_2^2, \qquad \text{where } y_1 = \frac{x_1 + 2x_2}{\sqrt 5}, \quad y_2 = \frac{2x_1 - x_2}{\sqrt 5}, $$
as desired.
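Both checks can be carried out numerically. The sketch below assumes, for illustration, the matrix $Q$ with entries 3, 2, 2, 6 and the eigenvectors $(1,2)/\sqrt5$ and $(2,-1)/\sqrt5$ worked out by hand:

```python
import numpy as np

Q = np.array([[3.0, 2.0],
              [2.0, 6.0]])
# Columns are the normalized eigenvectors for eigenvalues 7 and 2.
P = np.array([[1.0,  2.0],
              [2.0, -1.0]]) / np.sqrt(5)

# P^T Q P should be diagonal with 7 and 2 on the diagonal.
Lam = P.T @ Q @ P
print(np.round(Lam, 10))

# Check the change of variables: f(x) = 7*y1^2 + 2*y2^2 with y = P^T x,
# at an arbitrary point x.
rng = np.random.default_rng(0)
x = rng.standard_normal(2)
y = P.T @ x
f_x = x @ Q @ x
f_y = 7 * y[0]**2 + 2 * y[1]**2
print(np.isclose(f_x, f_y))   # True
```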
As a second example consider a sample of size 3 from the standard normal distribution, say $X_1$, $X_2$ and $X_3$. Then you know that $\sum_{i=1}^3 (X_i - \bar X)^2$ is supposed to have a $\chi^2$ distribution on $n-1$ degrees of freedom, where now $n = 3$. Expanding out
$$ \sum_{i=1}^3 (X_i - \bar X)^2 = \sum_{i=1}^3 X_i^2 - \frac{1}{3} \Big( \sum_{i=1}^3 X_i \Big)^2 $$
we get the quadratic form
$$ \frac{2}{3}\big(X_1^2 + X_2^2 + X_3^2\big) - \frac{2}{3}\big(X_1X_2 + X_1X_3 + X_2X_3\big) $$
for which the matrix $Q$ is
$$ Q = \begin{pmatrix} 2/3 & -1/3 & -1/3 \\ -1/3 & 2/3 & -1/3 \\ -1/3 & -1/3 & 2/3 \end{pmatrix}. $$
The determinant of $Q - \lambda I$ may be found to be $-\lambda(1-\lambda)^2$. This factors the characteristic polynomial, so that the eigenvalues are 1, 1, and 0. An eigenvector corresponding to 0 is $(1,1,1)^T/\sqrt 3$. Corresponding to the other two eigenvalues there are actually many possibilities. The equations $(Q - I)v = 0$ all reduce to
$$ v_1 + v_2 + v_3 = 0, $$
which is 1 equation in 3 unknowns and so has a two dimensional solution space. For instance, the vector $(1,-1,0)^T/\sqrt 2$ is a solution. The third eigenvector would then be perpendicular to this, making the first two entries equal. Thus $(1,1,-2)^T/\sqrt 6$ is a third eigenvector.
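The eigenvalue claim is easy to verify numerically. The sketch below builds the matrix $I - J/n$ (where $J$ is the all-ones matrix), which is the matrix $Q$ of the quadratic form $\sum (X_i - \bar X)^2$ for a sample of size $n = 3$:

```python
import numpy as np

n = 3
# The "centering" matrix I - J/n for the quadratic form sum((X_i - Xbar)^2).
Q = np.eye(n) - np.ones((n, n)) / n

# Eigenvalues in ascending order: 0 once and 1 with multiplicity n-1 = 2.
vals, vecs = np.linalg.eigh(Q)
print(np.round(vals, 10))   # [0. 1. 1.]
```

The same pattern holds for any sample size: $I - J/n$ has eigenvalue 0 once (eigenvector proportional to the all-ones vector) and eigenvalue 1 with multiplicity $n-1$.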
The key point in the lecture, however, is that the distribution of the quadratic form depends only on the eigenvalues of $Q$ and not on the eigenvectors. We can rewrite $\sum_i (X_i - \bar X)^2 = X^T Q X$ in the form $\sum_i \lambda_i W_i^2$. To find the $W_i$ we fill up a matrix $P$ with columns which are our eigenvectors, scaled to have length 1. This makes
$$ P = \begin{pmatrix} 1/\sqrt 2 & 1/\sqrt 6 & 1/\sqrt 3 \\ -1/\sqrt 2 & 1/\sqrt 6 & 1/\sqrt 3 \\ 0 & -2/\sqrt 6 & 1/\sqrt 3 \end{pmatrix} $$
and we find $W = P^T X$ to have components
$$ W_1 = \frac{X_1 - X_2}{\sqrt 2}, \qquad W_2 = \frac{X_1 + X_2 - 2X_3}{\sqrt 6} $$
and
$$ W_3 = \frac{X_1 + X_2 + X_3}{\sqrt 3} = \sqrt 3\, \bar X . $$
You should check that these new variables all have variance 1 and all covariances equal to 0. In other words, they are independent standard normals. Also check that
$$ \sum_{i=1}^3 (X_i - \bar X)^2 = W_1^2 + W_2^2 . $$
Since we have written $\sum_i (X_i - \bar X)^2$ as a sum of squares of two of these independent standard normals, we can conclude that it has a $\chi^2_2$ distribution.
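The conclusion can also be checked by simulation. The Monte Carlo sketch below (the number of replications and the seed are arbitrary choices) draws many samples of size 3 and compares the sample mean and variance of $\sum (X_i - \bar X)^2$ with the mean 2 and variance 4 of a $\chi^2_2$ distribution:

```python
import numpy as np

rng = np.random.default_rng(1)
reps = 100_000

# Each row is one sample of size 3 from the standard normal.
X = rng.standard_normal((reps, 3))
S = ((X - X.mean(axis=1, keepdims=True)) ** 2).sum(axis=1)

# A chi-squared variable on k degrees of freedom has mean k and variance 2k,
# so for k = 2 we expect the values below to be close to 2 and 4.
print(S.mean(), S.var())
```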