Eigenvectors by Inspection

William A. McWorter Jr.

Many years ago I mistakenly assigned for linear algebra homework the problem

  Find the characteristic polynomial of the matrix

A =

One student found a way to avoid the tedious calculation of the determinant of xI - A and instead found all eigenvalues of A with only a little mental arithmetic. He wrote down the correct answer, completely factored! The characteristic polynomial of A is C_A(x) = (x - 5)(x + 5)(x + 1)(x + 3). How did he do it?

He literally 'saw' four eigenvectors! An eigenvector of a square matrix M is a nonzero vector v such that

Mv = λv, for some scalar λ.

The scalar λ is called the eigenvalue associated with the eigenvector v. Note that the above equation forces M to be square; nonsquare matrices cannot have eigenvectors.
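The defining equation Mv = λv is easy to check numerically. Here is a small sketch in NumPy; the matrix is an arbitrary illustration, not one from the text (each of its rows sums to 3, so (1, 1)^T is an eigenvector with eigenvalue 3):

```python
import numpy as np

# Arbitrary 2x2 example: each row sums to 3, so v = (1, 1)^T
# satisfies M v = 3 v.
M = np.array([[1.0, 2.0],
              [4.0, -1.0]])
v = np.array([1.0, 1.0])

lam = 3.0
assert np.allclose(M @ v, lam * v)  # M v = 3 v
```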

For normal human beings, only the most special of square matrices yield up their eigenvectors. The scalar matrix aI, for example, where I is the identity matrix, has every nonzero vector v as an eigenvector with a as the associated eigenvalue, because aIv = av. Upper triangular matrices give up only one eigenvector easily. For example,

B =

has the eigenvector v = (1, 0)^T because Bv = 2v. Being upper triangular, B displays all of its eigenvalues, namely 2 and 4, as its diagonal entries; but B does not make it easy to see which eigenvector goes with the eigenvalue 4.

Square matrices with obviously linearly dependent columns permit one to easily construct some eigenvectors. For example,

C =
 |1  1  2|
 |2  2  4|
 |3  3  6|

has the eigenvector v = (1, -1, 0)^T with associated eigenvalue 0 because Cv = 0v = 0, and the eigenvector w = (1, 1, -1)^T also with associated eigenvalue 0 because Cw = 0w = 0. There is a third eigenvector with associated eigenvalue 9 (3 by 3 matrices have 3 eigenvalues, counting repeats, whose sum equals the trace of the matrix), but who knows what that third eigenvector is. Could you have 'seen' that it is (1, 2, 3)^T? (Hence the characteristic polynomial of C is C_C(x) = x^2(x - 9).)
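A sketch in NumPy, assuming C is the rank-one matrix with rows (1, 1, 2), (2, 2, 4), (3, 3, 6) — the unique matrix consistent with the stated data (obviously dependent columns, trace 9, and the three eigenvectors above):

```python
import numpy as np

# Assumed reconstruction of C: rank one, trace 9, dependent columns.
C = np.array([[1, 1, 2],
              [2, 2, 4],
              [3, 3, 6]], dtype=float)

v = np.array([1, -1, 0], dtype=float)   # eigenvalue 0
w = np.array([1, 1, -1], dtype=float)   # eigenvalue 0
t = np.array([1, 2, 3], dtype=float)    # eigenvalue 9 = trace(C)

assert np.allclose(C @ v, 0 * v)
assert np.allclose(C @ w, 0 * w)
assert np.allclose(C @ t, 9 * t)
```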

Many special matrices, including stochastic matrices, have constant row sums. This implies that the vector v = (1, 1, ..., 1)^T is an eigenvector of such matrices with associated eigenvalue the constant row sum. By now you have probably observed that the matrix A that began this page has the eigenvector (1, 1, 1, 1)^T with associated eigenvalue 5, because A has the constant row sum 5.
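The row-sum observation is easy to verify mechanically. A small sketch with an arbitrary matrix S whose rows each sum to 5 (not the matrix A of the text, whose entries are not reproduced here):

```python
import numpy as np

# Any matrix with constant row sums s has (1, ..., 1)^T as an
# eigenvector with eigenvalue s.  An arbitrary example with s = 5:
S = np.array([[2.0, 1.0, 2.0],
              [0.0, 5.0, 0.0],
              [3.0, 3.0, -1.0]])

ones = np.ones(3)
assert np.allclose(S @ ones, 5 * ones)  # S (1,1,1)^T = 5 (1,1,1)^T
```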

In the old days, before the student with the eigenvector eyes was born, combinatorialists computed the determinant of the special n by n matrix

D =
 |a  b  ...  b|
 |b  a  ...  b|
 |:  :       :|
 |b  b  ...  a|

with a on the main diagonal and b everywhere else, by doing some clever row and column operations to triangularize the matrix before computing its determinant. But now, with new eyes given me by the clever student, I can see a complete set of eigenvectors for D! No doubt you see the first one, namely (1, 1, ..., 1)^T, with eigenvalue a + (n-1)b. But there are n-1 more, namely (1, -1, 0, ..., 0)^T, (1, 0, -1, 0, ..., 0)^T, etc., all with the eigenvalue a - b! Hence the determinant of D is the product, (a + (n-1)b)(a - b)^(n-1), of the n eigenvalues of D (once you have a basis consisting of eigenvectors, their eigenvalues, counting repeats, comprise all the eigenvalues). Can you see now a complete set of eigenvectors for the matrix A that began this story?
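The whole claim about D — both eigenvalue families and the determinant formula — can be checked numerically. A sketch, with sample values n = 5, a = 3, b = 2:

```python
import numpy as np

def D(n, a, b):
    """The n-by-n matrix with a on the diagonal and b everywhere else."""
    return b * np.ones((n, n)) + (a - b) * np.eye(n)

n, a, b = 5, 3.0, 2.0
M = D(n, a, b)

# (1, ..., 1)^T has eigenvalue a + (n-1)b.
ones = np.ones(n)
assert np.allclose(M @ ones, (a + (n - 1) * b) * ones)

# (1, -1, 0, ..., 0)^T-style vectors have eigenvalue a - b.
e = np.zeros(n); e[0], e[1] = 1, -1
assert np.allclose(M @ e, (a - b) * e)

# det(D) is the product of the n eigenvalues.
assert np.isclose(np.linalg.det(M),
                  (a + (n - 1) * b) * (a - b) ** (n - 1))
```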

As a final example consider the matrix

E =
 |a  b  c  d|
 |b  a  d  c|
 |c  d  a  b|
 |d  c  b  a|

A complete set of eigenvectors for E is (1, 1, 1, 1)^T with eigenvalue a + b + c + d, (1, -1, 1, -1)^T with eigenvalue a - b + c - d, (1, -1, -1, 1)^T with eigenvalue a - b - c + d, and (1, 1, -1, -1)^T with eigenvalue a + b - c - d.
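A sketch verifying all four eigenpairs, assuming E is the 4-by-4 matrix with rows (a, b, c, d), (b, a, d, c), (c, d, a, b), (d, c, b, a) — the unique matrix determined by the four eigenpairs listed above; the sample values of a, b, c, d are arbitrary:

```python
import numpy as np

a, b, c, d = 1.0, 2.0, 3.0, 5.0   # arbitrary sample values
E = np.array([[a, b, c, d],
              [b, a, d, c],
              [c, d, a, b],
              [d, c, b, a]])

pairs = [([1,  1,  1,  1], a + b + c + d),
         ([1, -1,  1, -1], a - b + c - d),
         ([1, -1, -1,  1], a - b - c + d),
         ([1,  1, -1, -1], a + b - c - d)]

for vec, lam in pairs:
    v = np.array(vec, dtype=float)
    assert np.allclose(E @ v, lam * v)
```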

The last example points up the fact that, although it is possible to see eigenvectors directly, it is almost never a routine matter. However, constructing square matrices with non-obvious eigenvectors is much easier. Suppose you want to construct a 4 by 4 matrix with eigenvalues 2, 2, and -1. For reasons that will become apparent, we'll let the fourth eigenvalue be whatever it turns out to be. Let the first column of a matrix F be whatever, say

F =
 | 1  .  .  . |
 | 0  .  .  . |
 | 3  .  .  . |
 |-1  .  .  . |,

where the dots mark entries not yet determined.

Choose as an eigenvector for F with eigenvalue 2 a vector with a nonzero first entry, a 1 in its third entry, and zeros elsewhere, say u = (2, 0, 1, 0)^T. Then Fu = 2u forces the third column of F without affecting any other column: the third column must be 2u minus twice the first column, namely (2, 0, -4, 2)^T. So far

F =
 | 1  .   2  . |
 | 0  .   0  . |
 | 3  .  -4  . |
 |-1  .   2  . |,

and Fu = 2u. Now as a second eigenvector for F with eigenvalue 2 choose a vector with a nonzero third entry, a 1 in its second entry, and zeros elsewhere, say v = (0, 1, -1, 0)^T. This forces the second column of F without affecting any other column: Fv = 2v gives the second column as the third column plus 2v, namely (2, 2, -6, 2)^T. We have

F =
 | 1   2   2  . |
 | 0   2   0  . |
 | 3  -6  -4  . |
 |-1   2   2  . |,

with Fv = 2v. Finally, for a third eigenvector for F with eigenvalue -1 choose a vector with a nonzero second entry, a 1 in its fourth entry, and zeros elsewhere, say w = (0, 1, 0, 1)^T. This now determines F:

F =
 | 1   2   2  -2 |
 | 0   2   0  -3 |
 | 3  -6  -4   6 |
 |-1   2   2  -3 |,

with Fw = -w. Since the trace of F is -4 and the eigenvalues sum to the trace, the fourth eigenvalue of F is -7. Hence the characteristic polynomial of F is C_F(x) = (x - 2)^2(x + 1)(x + 7). So, without calculating a determinant, the teacher can quickly construct nontrivial examples for lectures or exams.
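The whole construction can be replayed mechanically. A sketch: each chosen eigenvector has a 1 in exactly one position whose column of F is still unknown, so Fu = λu solves for that column; the helper name force_column is my own, not from the text:

```python
import numpy as np

F = np.zeros((4, 4))
F[:, 0] = [1, 0, 3, -1]               # first column chosen freely

def force_column(F, u, lam, j):
    """Solve F u = lam*u for column j, assuming u[j] == 1 and
    column j of F is currently all zeros."""
    u = np.array(u, dtype=float)
    known = F @ u - F[:, j] * u[j]    # contribution of fixed columns
    F[:, j] = lam * u - known

force_column(F, [2, 0, 1, 0], 2, 2)   # u: forces column 3
force_column(F, [0, 1, -1, 0], 2, 1)  # v: forces column 2
force_column(F, [0, 1, 0, 1], -1, 3)  # w: forces column 4

# Eigenvalues should be 2, 2, -1, and the trace-determined -7.
assert np.isclose(F.trace(), -4)
assert np.allclose(sorted(np.linalg.eigvals(F).real), [-7, -1, 2, 2])
```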

For those who want examples a computer can be taught to spew out, here are some infinite families of matrices with given eigenvectors and eigenvalues.

Let a, b, c, d, e, f, and s be any real numbers. Define u = (a, b, c)^T and v = (d, e, f)^T. If u^Tv ≠ 0, then the matrix G = sI + uv^T has the eigenvectors e1 = (e, -d, 0)^T and e2 = (f, 0, -d)^T with eigenvalue s, and e3 = u with eigenvalue s + u^Tv, provided, of course, that the vectors ei are each nonzero.
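A sketch checking this family for one arbitrary choice of the parameters (e1 and e2 kill the rank-one part because they are orthogonal to v, while u picks up the extra u^Tv):

```python
import numpy as np

s = 0.5
u = np.array([1.0, 2.0, 3.0])   # (a, b, c)
v = np.array([4.0, 5.0, 6.0])   # (d, e, f); here u.v = 32 != 0
G = s * np.eye(3) + np.outer(u, v)

d, e, f = v
e1 = np.array([e, -d, 0.0])     # orthogonal to v
e2 = np.array([f, 0.0, -d])     # orthogonal to v

assert np.allclose(G @ e1, s * e1)
assert np.allclose(G @ e2, s * e2)
assert np.allclose(G @ u, (s + u @ v) * u)
```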

In addition,

 |a   b    c  | 
 |b   a   -c  |  
 |c  -c  a-b-c| 

has eigenvector (1, 1, 0)^T with eigenvalue a + b, eigenvector (1, -1, 1)^T with eigenvalue a - b + c, and eigenvector (1, -1, -2)^T with eigenvalue a - b - 2c. And, finally,

 | a  c-a   a-c | 
 |b-a  a    b-a | 
 |b-a a-c -a+b+c| 

has eigenvector (1, -1, -1)^T with eigenvalue a, eigenvector (0, 1, 1)^T with eigenvalue b, and eigenvector (-1, 0, 1)^T with eigenvalue c.
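Both three-parameter families above can be verified in one pass; a sketch with arbitrary sample values of a, b, c:

```python
import numpy as np

a, b, c = 2.0, 3.0, 5.0   # arbitrary sample values

# First family: eigenvalues a+b, a-b+c, a-b-2c.
M1 = np.array([[a,  b,  c],
               [b,  a, -c],
               [c, -c, a - b - c]])
for vec, lam in [([1, 1, 0], a + b),
                 ([1, -1, 1], a - b + c),
                 ([1, -1, -2], a - b - 2 * c)]:
    w = np.array(vec, dtype=float)
    assert np.allclose(M1 @ w, lam * w)

# Second family: eigenvalues a, b, c.
M2 = np.array([[a,     c - a, a - c],
               [b - a, a,     b - a],
               [b - a, a - c, -a + b + c]])
for vec, lam in [([1, -1, -1], a),
                 ([0, 1, 1], b),
                 ([-1, 0, 1], c)]:
    w = np.array(vec, dtype=float)
    assert np.allclose(M2 @ w, lam * w)
```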

Related material

  • Matrices Help Relationships
  • Eigenvalues of an incidence matrix
  • Addition of Vectors and Matrices
  • Multiplication of a Vector by a Matrix
  • Multiplication of Matrices
  • Matrix Groups
  • Vandermonde matrix and determinant
  • When the Counting Gets Tough, the Tough Count on Mathematics
  • Merlin's Magic Squares

    Copyright © 1996-2018 Alexander Bogomolny