Combinatorial Proofs

Combinatorial proof is an elegant way of establishing certain algebraic identities without resorting to any kind of algebra. For example, let's consider the simplest property of the binomial coefficients:

(1)

C(n, k) = C(n, n - k).

To prove this identity we do not need the actual algebraic formula that involves factorials, although that, too, would be simple enough. All that is needed to prove (1) is the definition: C(n, k) denotes the number of ways to select k out of n objects without regard for the order in which they are selected. To prove (1), observe that whenever k items are selected, n - k items are left over, unselected as it were. Proving (1) thus becomes a matter of phrasing. (In this example, another simple proof is obtained by introducing m = n - k, so that k = n - m and (1) translates into the equivalent form C(n, n - m) = C(n, m).)
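The complement bijection behind (1) can be checked mechanically. Here is a short Python sketch (the parameters n = 7, k = 3 are just an illustrative choice): it maps every k-selection to its leftover (n - k)-selection and confirms that the counts agree.

```python
from itertools import combinations
from math import comb

n, k = 7, 3
items = range(n)

# The bijection behind (1): map each k-selection to its complement,
# an (n - k)-selection.  The map is its own inverse, so both counts agree.
complements = {frozenset(items) - frozenset(c) for c in combinations(items, k)}

assert len(complements) == comb(n, n - k) == comb(n, k)
```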

As another example, the identity

(2)

C(n, 0) + C(n, 1) + C(n, 2) + ... + C(n, n-1) + C(n, n) = 2^n,

which is a consequence of the binomial theorem

(x + y)^n = Σ C(n, k) x^k y^(n-k), 0 ≤ k ≤ n,

admits a combinatorial interpretation. The left hand side in (2) represents the number of ways to select a group -- empty or not -- of items out of a set of n distinct elements. The first term gives the number of ways not to make any selection, which is 1. The second term gives the number of ways to select one item (which is n), etc. What does the right hand side represent? Exactly the same thing. Indeed, with every selection of items from a given set we can associate a function that takes values 0 or 1. A selected element is assigned the value 1, while an unselected element is assigned the value 0. If, for the sake of counting convenience, the elements of the set are ordered with indices 1, ..., n, then every selection from the set is represented by a string of 0's and 1's; the total number of such strings is clearly the right hand side in (2): 2^n.

Thus a combinatorial proof consists in providing two answers to the same question. It should not be forgotten, though, that finding a question that can be answered in two ways is conceivably the most important part of the proof. As a matter of convention, it is often convenient to think of sets and their elements as groups of students, and of selections of elements as endowing them with membership on a committee.

For a third example, consider the popular identity underlying the Pascal triangle:

(3)

C(n, k) = C(n - 1, k) + C(n - 1, k - 1).

By definition, the left hand side is the number of ways to compose a k-member committee out of a group of n students. To grasp the significance of the right hand side, pick arbitrarily one of the students. Then the first term on the right gives the number of k-member committees that do not include the student, whereas the second term gives the number of committees in which the student is a member. Since every k-member committee falls into exactly one of these two categories, the two terms add up to C(n, k).
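The case split behind (3) is easy to act out in code. The following Python sketch (with illustrative values n = 6, k = 3 and student 0 playing the role of the distinguished student) partitions the committees by membership and verifies both counts:

```python
from itertools import combinations
from math import comb

n, k = 6, 3
students = range(n)
special = 0  # the arbitrarily chosen student

committees = list(combinations(students, k))
without = [c for c in committees if special not in c]
with_ = [c for c in committees if special in c]

assert len(without) == comb(n - 1, k)    # the student is excluded
assert len(with_) == comb(n - 1, k - 1)  # the student plus k - 1 others
assert len(committees) == comb(n, k)
```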

Here is an additional example. Prove that

(4)

C(n, r)C(r, k) = C(n, k)C(n-k, r-k),

where k ≤ r ≤ n.

C(n, r) is the number of ways to form an r-member committee from a group of n students. C(r, k) is the number of ways to form a k-member committee out of a group of r students. As r is the same in both cases, it is sensible to assume that the r students selected from the initial n are exactly those among whom we seek the smaller, k-member committee. So the left hand side in (4) counts the number of ways to choose an r-member committee from an n-member student body and then a k-member subcommittee out of the selected r. The question, then, is: in how many ways is it possible to choose an r-member committee from an n-member student body together with a k-member subcommittee of that committee? The left hand side gives one answer. The right hand side answers the same question in a different way: first select a k-member subcommittee out of the n-member student population, and then complete it to an r-member committee by selecting r - k members out of the remaining n - k students.
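Although the two-way count is the proof, it does no harm to confirm (4) numerically. This Python sketch checks the identity over a small range of n, r, k (the range is an arbitrary choice):

```python
from math import comb

# Both sides count "an r-member committee with a k-member subcommittee":
# left: pick the committee, then the subcommittee inside it;
# right: pick the subcommittee first, then fill out the committee.
for n in range(10):
    for r in range(n + 1):
        for k in range(r + 1):
            assert comb(n, r) * comb(r, k) == comb(n, k) * comb(n - k, r - k)
```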

I'll provide more examples plucked mostly from an outstanding and readily available book by A. T. Benjamin and J. J. Quinn.


References

  1. V. K. Balakrishnan, Theory and Problems of Combinatorics, Schaum's Outline Series, McGraw-Hill, 1995
  2. A. T. Benjamin, J. J. Quinn, Proofs That Really Count: The Art of Combinatorial Proof, MAA, 2003
  3. P. Zeitz, The Art and Craft of Problem Solving, John Wiley & Sons, 1999

    Copyright © 1996-2018 Alexander Bogomolny
