Eigenvectors by Inspection
William A. McWorter Jr.
Many years ago I mistakenly assigned for linear algebra homework the problem
Find the characteristic polynomial of the matrix
One student found a way to avoid the tedious calculation of the determinant of
He literally 'saw' four eigenvectors! An eigenvector of a square matrix M is a nonzero vector v such that Mv = λv, for some scalar λ.
The scalar λ is called the eigenvalue associated with the eigenvector v. Note the above equation forces M to be square. Nonsquare matrices cannot have eigenvectors.
For normal human beings, only the most special of square matrices yield up their eigenvectors. The scalar matrix aI, for example, where I is the identity matrix, has every nonzero vector v as eigenvector with a as associated eigenvalue, because aIv = av. Upper triangular matrices give up only one eigenvector easily. For example,
has the eigenvector v = (1,0)T because
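The triangular example can be checked numerically. The entries below are my own illustration (the article's displayed matrix is not reproduced here); the point is that for any 2 by 2 upper triangular matrix, A applied to (1,0)^T returns the first column, which is a multiple of (1,0)^T:

```python
import numpy as np

# Illustrative 2x2 upper triangular matrix; entries chosen arbitrarily.
A = np.array([[3.0, 5.0],
              [0.0, 7.0]])
v = np.array([1.0, 0.0])

# A @ v picks out the first column, (3, 0)^T = 3 * v,
# so v is an eigenvector with eigenvalue A[0, 0].
print(A @ v)   # [3. 0.]
```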
Square matrices with obviously linearly dependent columns permit one to easily construct some eigenvectors. For example,
has the eigenvector v = (1, -1, 0)T with associated eigenvalue 0 because
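The same trick is easy to verify in code. The matrix below is my own example, built so that its first two columns are equal; then B applied to (1, -1, 0)^T subtracts equal columns and gives the zero vector:

```python
import numpy as np

# Illustrative matrix whose first two columns are equal (values arbitrary).
B = np.array([[2.0, 2.0, 5.0],
              [1.0, 1.0, 4.0],
              [3.0, 3.0, 6.0]])
v = np.array([1.0, -1.0, 0.0])

# B @ v = (column 1) - (column 2) = 0 = 0 * v,
# so v is an eigenvector with eigenvalue 0.
print(B @ v)   # [0. 0. 0.]
```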
Many special matrices, including stochastic matrices, have constant row sums. This
implies that the all-ones vector (1, 1, ..., 1)^T is an eigenvector, with the common row sum as its eigenvalue.
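Since multiplying a matrix by the all-ones vector computes its row sums, constant row sums make that vector an eigenvector. A quick check, with an arbitrary matrix of my own whose rows each sum to 10:

```python
import numpy as np

# Illustrative matrix with constant row sums (each row sums to 10).
S = np.array([[1.0, 2.0, 7.0],
              [4.0, 4.0, 2.0],
              [3.0, 3.0, 4.0]])
ones = np.ones(3)

# S @ ones computes the row sums, so the all-ones vector is an
# eigenvector with the common row sum, 10, as its eigenvalue.
print(S @ ones)   # [10. 10. 10.]
```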
In the old days, before the student with the eigenvector eyes was born, combinatorialists computed the determinant of the special n by n matrix
with a on the main diagonal and b everywhere else, by doing some clever row and column
operations to triangularize the matrix before computing its determinant. But now with
new eyes given me by the clever student, I can see a complete set of eigenvectors for D!
No doubt you see the first one, namely, the all-ones vector (1, 1, ..., 1)^T, since every row of D has the same sum, a + (n-1)b.
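The full set of displayed eigenvectors is not reproduced above, but a standard choice (stated here as an assumption, not as the article's display) is the all-ones vector with eigenvalue a + (n-1)b, together with the vectors e_1 - e_i for i = 2, ..., n: columns 1 and i of D agree except in rows 1 and i, so D(e_1 - e_i) = (a - b)(e_1 - e_i). A numerical check with arbitrary a, b, n:

```python
import numpy as np

a, b, n = 5.0, 2.0, 4   # arbitrary illustrative values
# D has a on the diagonal and b everywhere else.
D = b * np.ones((n, n)) + (a - b) * np.eye(n)

# All-ones vector: every row of D sums to a + (n-1)*b.
ones = np.ones(n)
assert np.allclose(D @ ones, (a + (n - 1) * b) * ones)

# D(e_1 - e_i) = (column 1) - (column i) = (a - b)(e_1 - e_i).
for i in range(1, n):
    v = np.zeros(n)
    v[0], v[i] = 1.0, -1.0
    assert np.allclose(D @ v, (a - b) * v)
print("all", n, "eigenvectors verified")
```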
As a final example consider the matrix
A complete set of eigenvectors for E is
The last example points up the fact that, although it is possible to see eigenvectors directly, it is almost never a routine matter. However, constructing square matrices with non-obvious eigenvectors is much easier. Suppose you want to construct a 4 by 4 matrix with eigenvalues 2, 2, and -1. For reasons that will become apparent, we'll let the fourth eigenvalue be whatever it turns out to be. Let the first column of a matrix F be whatever, say
Choose as an eigenvector for F with eigenvalue 2 a vector with a nonzero
first entry, a 1 in its third entry, and zeros elsewhere, say
and Fu = 2u. Now as a second eigenvector for F with eigenvalue 2, choose a
vector with a nonzero third entry, a 1 in its second entry, and zeros elsewhere, say
with Fv = 2v. Finally, for a third eigenvector for F with eigenvalue -1, choose a vector with a nonzero second entry, a 1 in its fourth entry, and zeros elsewhere, say
with Fw = -w. Since the trace of F is -4, and the trace is the sum of the eigenvalues, the fourth eigenvalue of F is -7. Hence the characteristic polynomial of F is (x - 2)^2 (x + 1)(x + 7).
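The construction can be sketched in code. The first column below is my own arbitrary choice (not the article's), so the resulting trace and fourth eigenvalue differ from the -4 and -7 above; the pattern, though, is the same: each chosen eigenvector pins down one new column of F.

```python
import numpy as np

F = np.zeros((4, 4))
F[:, 0] = [1.0, 2.0, 3.0, 4.0]   # first column: whatever you like

# u = (1, 0, 1, 0): F u = col_1 + col_3 must equal 2u, forcing col_3.
u = np.array([1.0, 0.0, 1.0, 0.0])
F[:, 2] = 2 * u - F[:, 0]

# v = (0, 1, 1, 0): F v = col_2 + col_3 must equal 2v, forcing col_2.
v = np.array([0.0, 1.0, 1.0, 0.0])
F[:, 1] = 2 * v - F[:, 2]

# w = (0, 1, 0, 1): F w = col_2 + col_4 must equal -w, forcing col_4.
w = np.array([0.0, 1.0, 0.0, 1.0])
F[:, 3] = -w - F[:, 1]

# Three eigenvalues are now 2, 2, and -1; the fourth is trace(F) - 3.
assert np.allclose(F @ u, 2 * u) and np.allclose(F @ v, 2 * v)
assert np.allclose(F @ w, -w)
eigs = np.sort(np.linalg.eigvals(F).real)
print(eigs)   # includes 2, 2, -1, and trace(F) - 3
```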
For those who want examples a computer can be taught to spew out, here are some infinite families of matrices with given eigenvectors and eigenvalues.
Let a, b, c, d, e, f, and s be any real numbers. Define
[ a      b      c       ]      [ a      c-a    a-c     ]
[ b      a     -c       ]      [ b-a    a      b-a     ]
[ c     -c      a-b-c   ]      [ b-a    a-c   -a+b+c   ]
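The layout of these matrices was garbled in this copy, so treat the following as an assumption about the intended families. Reading each line as the rows of one matrix, each has an eigenvector visible by inspection: the first sends (1, 1, 0)^T to the sum of its first two columns, (a+b, a+b, 0)^T, and the second sends (0, 1, 1)^T to the sum of its last two columns, (0, b, b)^T. A check with arbitrary values:

```python
import numpy as np

a, b, c = 2.0, 3.0, 5.0   # arbitrary illustrative values

M1 = np.array([[a,      b,      c        ],
               [b,      a,     -c        ],
               [c,     -c,      a - b - c]])
# M1 @ (1, 1, 0)^T adds columns 1 and 2: (a+b, a+b, 0)^T,
# so (1, 1, 0)^T is an eigenvector with eigenvalue a + b.
v1 = np.array([1.0, 1.0, 0.0])
assert np.allclose(M1 @ v1, (a + b) * v1)

M2 = np.array([[a,      c - a,  a - c     ],
               [b - a,  a,      b - a     ],
               [b - a,  a - c, -a + b + c ]])
# M2 @ (0, 1, 1)^T adds columns 2 and 3: (0, b, b)^T,
# so (0, 1, 1)^T is an eigenvector with eigenvalue b.
v2 = np.array([0.0, 1.0, 1.0])
assert np.allclose(M2 @ v2, b * v2)
print("both inspection eigenvectors verified")
```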