Matrix Groups

The problem below was proposed in Mathematics Magazine (September 1959) and discussed by R. A. Rosenbaum in The American Mathematical Monthly (vol. 70, no. 4, April 1963, p. 427):

(I) The set of nonsingular n×n matrices such that the sum of the elements in each row of each matrix is 1 forms a group under multiplication.
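As a quick numerical sanity check of (I), here is a short numpy sketch; the helper random_row_sum_one is an illustrative construction, not part of the original problem, and a random matrix of this kind is nonsingular with probability 1, so the inversion below succeeds in practice:

```python
import numpy as np

rng = np.random.default_rng(0)

def random_row_sum_one(n):
    # Illustrative helper: random n x n matrix whose rows each sum to 1
    # (adjust the last column to make every row-sum exactly 1).
    M = rng.standard_normal((n, n))
    M[:, -1] += 1 - M.sum(axis=1)
    return M

A = random_row_sum_one(4)
B = random_row_sum_one(4)

# Closure: the product again has unit row-sums.
print(np.allclose((A @ B).sum(axis=1), 1))            # True
# Inverses: the inverse of a nonsingular member has unit row-sums too.
print(np.allclose(np.linalg.inv(A).sum(axis=1), 1))   # True
```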

R. A. Rosenbaum points out that a generalization of that statement gets closer "to the heart of the matter" than the statement itself, by stripping away nonessentials and exposing the significant relationships. As he notes, the significant hypothesis is that the row-sums be constant, not necessarily 1, and he suggests considering another problem instead.

(II)

Let $A = [a_{ij}]$ be an $m\times n$ matrix, $B = [b_{ij}]$ an $n\times p$ matrix, and $C = A\times B$.

  1. If the row-sums of A are all equal to a and the row-sums of B are all equal to b, then the row-sums of C are also constant and are equal to ab.

  2. If the row-sums of B are all equal to b ≠ 0, and the row-sums of C are all equal to c, then the row-sums of A are also constant and are equal to c/b.
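Both parts of (II) are easy to check numerically. In the sketch below, with_row_sums is an assumed helper that pins the row-sums of a random rectangular matrix:

```python
import numpy as np

rng = np.random.default_rng(1)

def with_row_sums(m, n, s):
    # Assumed helper: random m x n matrix whose rows each sum to s.
    M = rng.standard_normal((m, n))
    M[:, -1] += s - M.sum(axis=1)
    return M

a, b = 3.0, -2.0
A = with_row_sums(2, 4, a)       # m x n, row-sums a
B = with_row_sums(4, 5, b)       # n x p, row-sums b

# Part 1: the row-sums of C = AB all equal ab.
print(np.allclose((A @ B).sum(axis=1), a * b))                # True

# Part 2, in essence: for ANY A the row-sums of AB are b times the
# row-sums of A, so constant row-sums of C force constant row-sums
# of A whenever b != 0.
A2 = rng.standard_normal((2, 4))                              # unconstrained
print(np.allclose((A2 @ B).sum(axis=1), b * A2.sum(axis=1)))  # True
```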

The reformulation has a

Corollary

The set of all nonsingular n×n matrices (over any field) such that, for any one matrix, the sum of the elements of each row is constant (but perhaps not the same constant for different matrices) forms a group under multiplication.
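Note that a nonsingular matrix A with constant row-sums a automatically has a ≠ 0: if a = 0, the columns of A would add up to the zero column, making A singular. Applying part 2 of (II) to A⁻¹·A = I (whose row-sums are all 1) then shows that A⁻¹ has constant row-sums 1/a. A small numpy sketch of this consequence, using the same illustrative row-sum construction as before:

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((4, 4))
A[:, -1] += 5.0 - A.sum(axis=1)     # force row-sums a = 5

# The inverse has constant row-sums 1/a, as part 2 predicts with C = I.
print(np.allclose(np.linalg.inv(A).sum(axis=1), 1 / 5.0))   # True
```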

If in the generalization a, b, c are all taken to be 1, then, with m = n = p, the original problem comes out as another corollary.

The proof of the generalization is indeed straightforward and, were a, b, c replaced by 1, would not differ from that for the original statement. For the row-sums of C:

$$\begin{aligned}
\sum_j c_{ij} &= \sum_j \sum_k a_{ik} b_{kj}\\
&= \sum_k \sum_j a_{ik} b_{kj}\\
&= \sum_k a_{ik} \sum_j b_{kj}\\
&= \sum_k a_{ik}\, b\\
&= b \sum_k a_{ik}\\
&= b \times a = a \times b.
\end{aligned}$$

Professor W. McWorter has remarked that, for square $n\times n$ matrices, having all row-sums equal to $a$ means exactly having $\mathbf{1} = (1\; 1\; \ldots\; 1)^T$ as an eigenvector with eigenvalue $a$:

$$\begin{pmatrix}
a_{11} & \cdots & a_{1n}\\
a_{21} & \cdots & a_{2n}\\
\vdots & & \vdots\\
a_{n1} & \cdots & a_{nn}
\end{pmatrix}
\begin{pmatrix} 1\\ 1\\ \vdots\\ 1 \end{pmatrix}
=
\begin{pmatrix} \sum_j a_{1j}\\ \sum_j a_{2j}\\ \vdots\\ \sum_j a_{nj} \end{pmatrix}
= a \begin{pmatrix} 1\\ 1\\ \vdots\\ 1 \end{pmatrix}.$$
Thus we can write $A\mathbf{1} = a\mathbf{1}$, $B\mathbf{1} = b\mathbf{1}$, and $C\mathbf{1} = c\mathbf{1}$, implying

$$\begin{aligned}
c\,\mathbf{1} &= C\mathbf{1}\\
&= AB\mathbf{1}\\
&= A(b\mathbf{1})\\
&= b\,A\mathbf{1}\\
&= b(a\mathbf{1})\\
&= ab\,\mathbf{1}.
\end{aligned}$$

So, again, c = ab, as expected.
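The eigenvector formulation is just as easy to test numerically; with_row_sums below is the same illustrative construction as before:

```python
import numpy as np

rng = np.random.default_rng(3)
ones = np.ones(4)

def with_row_sums(n, s):
    # Illustrative helper: random n x n matrix with row-sums s.
    M = rng.standard_normal((n, n))
    M[:, -1] += s - M.sum(axis=1)
    return M

A, B = with_row_sums(4, 3.0), with_row_sums(4, 0.5)

print(np.allclose(A @ ones, 3.0 * ones))          # A·1 = a·1
print(np.allclose((A @ B) @ ones, 1.5 * ones))    # C·1 = ab·1
```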

As Professor McWorter has observed, the latter derivation works just as well when the vector $\mathbf{1}$ is replaced by any other vector $v$. More precisely, the following statement holds:

(III)

Let $A = [a_{ij}]$ and $B = [b_{ij}]$ be $n\times n$ matrices, and let $C = A\times B$.

  1. If A and B have a common eigenvector v with eigenvalues a and b, respectively, then v is also an eigenvector of C with eigenvalue c = ab.

  2. If B and C have a common eigenvector v with eigenvalues b and c, respectively, and b ≠ 0, then v is also an eigenvector of A with eigenvalue a = c/b.

Furthermore, analogous claims hold for matrix addition and multiplication by a scalar: if Av = av and Bv = bv, then (A + B)v = (a + b)v, and (kA)v = (ka)v for any scalar k.
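A numerical illustration of (III) together with the addition and scalar variants; with_eigenpair is an assumed construction that manufactures a matrix with a prescribed eigenpair (λ, v) as λI plus a rank-one update that annihilates v:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 4
v = rng.standard_normal(n)

def with_eigenpair(lam):
    # Assumed construction: lam * I plus a rank-one term u w^T with w ⟂ v,
    # so that M v = lam v while M is otherwise "random".
    w = rng.standard_normal(n)
    w -= (w @ v) / (v @ v) * v               # project w orthogonal to v
    return lam * np.eye(n) + np.outer(rng.standard_normal(n), w)

a, b = 2.0, -3.0
A, B = with_eigenpair(a), with_eigenpair(b)

print(np.allclose((A @ B) @ v, a * b * v))     # product: eigenvalue ab
print(np.allclose((A + B) @ v, (a + b) * v))   # sum: eigenvalue a + b
print(np.allclose((5 * A) @ v, 5 * a * v))     # scalar: eigenvalue 5a
```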

Related material

  • Matrices Help Relationships
  • Eigenvalues of an incidence matrix
  • Addition of Vectors and Matrices
  • Multiplication of a Vector by a Matrix
  • Multiplication of Matrices
  • Eigenvectors by Inspection
  • Vandermonde matrix and determinant
  • When the Counting Gets Tough, the Tough Count on Mathematics
  • Merlin's Magic Squares

    Copyright © 1996-2018 Alexander Bogomolny
