Cut The Knot!

An interactive column using Java applets
by Alex Bogomolny

Mazes

October 2003

A long time ago, at a carpenter's shop in Iowa City, I purchased an unfinished desk for my first computer -- a grant from a local IBM office. I wondered aloud at the price, $35: "Cost of the material aside, the labor alone should have cost more than that." The carpenter replied with pride, "Not if you know how to go about cutting and assembling the pieces, and you do not." I always remember this episode when a problem I am trying to resolve has a simple solution I keep missing.

For a long time I wanted to write a computer program to draw mazes, and I even made a few attempts. But none was either satisfactory or satisfying. As it finally turned out, there exist well-established techniques for creating and solving mazes, probably known to every undergraduate computer science major. Indeed, the web is inundated with well-documented examples by students who have made their homework available on the Internet. But of course... Graphs and mazes are one and the same thing -- in some sense at least. The caveat is that the apparent complexity of a maze should be judged not with the naked eye, but with the mind's eye. A good example is furnished by what one may call a Hilbert maze. (To see the example in all its glory, check "Maze" and keep clicking on the applet below.)


This applet requires Sun's Java VM 2, which your browser may perceive as a popup (it is not). If you want to see the applet work, visit Sun's website at https://www.java.com/en/download/index.jsp, download and install the Java VM, and enjoy the applet.


As a graph, Hilbert's maze is trivial: it is just a two-vertex graph connected by a single edge, the latter being the canonical approximation to the famous plane-filling curve. A similar observation applies to Peano's mazes, although there are 272 of them. Curiously, many of the historic mazes [Dudeney, Gardner, Rouse Ball] are exactly of this kind. The path meanders in a small area, making the journey from the entrance to the exit long, but have no fear -- there is no way to get lost in the hedges. (However, concerning Peano's and Hilbert's mazes, one might contemplate the properties of the limiting maze. The path is still there, but the passageway is too narrow to squeeze through. If one could, it would make a long and fearful journey indeed. A situation worthy of Zeno's attention.)

Next in complexity to the trivial ones are the mazes represented by trees. Trees are the graphs with a single path from every vertex to any other vertex. Any connected graph has a spanning tree, i.e., a tree which is a subgraph with exactly the same vertices but only some of the edges.

There are several algorithms that extract a spanning tree from a given graph. The applet below implements two of them. Both Prim's and Kruskal's algorithms were published in the 1950s. Both are iterative and exemplify the greedy strategy, though in different ways. Prim's algorithm grows a tree starting with a single vertex, the simplest tree possible. On every iteration it adds a branch to the existing tree: a vertex and an edge connecting that vertex to the tree, such that the resulting graph remains a tree. To this end, the vertices are selected from the set of vertices not on the tree but directly connected to it by at least one edge. Naturally, the set of candidate vertices is updated with every iteration [Levitin, pp. 305-309].
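For readers who would like to experiment on their own, here is a minimal sketch of Prim's algorithm in Java (the language of the applets). The class and method names, and the representation of the graph as adjacency lists of {neighbor, weight} pairs, are my own illustrative choices, not the applet's.

    import java.util.*;

    // A minimal sketch of Prim's algorithm on a weighted graph given by adjacency
    // lists; vertices are numbered 0..n-1 and edges[v] holds {neighbor, weight}
    // pairs.  Names and representation are illustrative only.
    public class PrimSketch {

        public static List<int[]> minimumSpanningTree(List<int[]>[] edges) {
            int n = edges.length;
            boolean[] inTree = new boolean[n];
            List<int[]> tree = new ArrayList<>();           // {u, v, w} edges of the growing tree
            // candidate edges leaving the current tree, lightest first
            PriorityQueue<int[]> candidates =
                new PriorityQueue<>(Comparator.comparingInt((int[] e) -> e[2]));
            inTree[0] = true;                               // start with the simplest tree: a single vertex
            for (int[] e : edges[0]) candidates.add(new int[]{0, e[0], e[1]});
            while (tree.size() < n - 1 && !candidates.isEmpty()) {
                int[] e = candidates.poll();                // greedy choice: the lightest candidate edge
                int v = e[1];
                if (inTree[v]) continue;                    // both ends already on the tree: would close a circuit
                inTree[v] = true;
                tree.add(e);                                // the new branch: vertex v and the edge leading to it
                for (int[] next : edges[v])
                    if (!inTree[next[0]]) candidates.add(new int[]{v, next[0], next[1]});
            }
            return tree;
        }
    }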

Kruskal's algorithm starts with the set of all vertices and an empty set of edges. On each iteration an edge is added to the collection. The only restriction in this process is that the addition of an edge must not create a circuit, a path that starts and ends at the same vertex. (Simple structures and algorithms have been devised to speed up this verification; see [Levitin, pp. 314-317].)
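A sketch of Kruskal's algorithm along the same lines. A standard way to perform the circuit test is the union-find ("disjoint set") structure; again, the names and the {u, v, w} edge representation are assumptions of mine.

    import java.util.*;

    // A minimal sketch of Kruskal's algorithm; edges are {u, v, w} triples and
    // vertices are numbered 0..n-1.  Names and representation are illustrative only.
    public class KruskalSketch {

        public static List<int[]> minimumSpanningTree(int n, List<int[]> edges) {
            int[] parent = new int[n];                      // union-find forest for the circuit test
            for (int v = 0; v < n; v++) parent[v] = v;
            List<int[]> sorted = new ArrayList<>(edges);
            sorted.sort(Comparator.comparingInt((int[] e) -> e[2]));  // greedy: lightest edges first
            List<int[]> tree = new ArrayList<>();
            for (int[] e : sorted) {
                int ru = find(parent, e[0]), rv = find(parent, e[1]);
                if (ru == rv) continue;                     // endpoints already connected: the edge would close a circuit
                parent[ru] = rv;                            // merge the two components
                tree.add(e);
                if (tree.size() == n - 1) break;            // a spanning tree has |V| - 1 edges
            }
            return tree;
        }

        // Root of v's component, with path compression.
        private static int find(int[] parent, int v) {
            while (parent[v] != v) { parent[v] = parent[parent[v]]; v = parent[v]; }
            return v;
        }
    }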

As a matter of fact, the situation is a little more complex than just described. Both algorithms work on weighted graphs, where each edge has been assigned a weight. The algorithms are greedy in that, on every iteration, they select an edge of minimum weight from the available set. The resulting tree is known as the minimum spanning tree. (Without the weights, the resulting mazes are unappealing; the applet assigns the weights almost randomly.) Simple proofs by induction, given below, show that the algorithms really work.
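For illustration only, here is a short snippet, under the same assumptions as above, that builds the edge list of a rows x cols grid with pseudo-random weights. Feeding its output to either sketch produces a spanning tree whose edges are the open passages of a maze.

    import java.util.*;

    // Assigns (pseudo)random weights to the edges of a rows x cols grid, with
    // vertices numbered row*cols + col.  The minimum spanning tree of this
    // weighted grid is then a maze.  A sketch; names are illustrative only.
    public class GridWeights {
        public static List<int[]> randomGridEdges(int rows, int cols, long seed) {
            Random rnd = new Random(seed);
            List<int[]> edges = new ArrayList<>();          // {u, v, w} triples
            for (int r = 0; r < rows; r++)
                for (int c = 0; c < cols; c++) {
                    int v = r * cols + c;
                    if (c + 1 < cols) edges.add(new int[]{v, v + 1, rnd.nextInt(1000)});     // edge to the right
                    if (r + 1 < rows) edges.add(new int[]{v, v + cols, rnd.nextInt(1000)});  // edge downward
                }
            return edges;
        }
    }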

One of the standard ways to solve a maze is the depth-first search technique [Levitin, pp. 163-165]: follow a path from a node until a dead end is reached. Retreat up the path to the first opportunity to check another node. Continue from there. Repeat the process until all the nodes have been visited.
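A minimal sketch of a depth-first solver along these lines. The maze is assumed to be given by adjacency lists of open passages; vertex numbering and names are again my own.

    import java.util.*;

    // Depth-first search through a maze given as adjacency lists of its passages;
    // returns the path from start to end, or an empty list if none exists.
    public class DepthFirstSolver {
        public static List<Integer> solve(List<Integer>[] passages, int start, int end) {
            boolean[] visited = new boolean[passages.length];
            Deque<Integer> path = new ArrayDeque<>();       // the current path, used for backtracking
            path.push(start);
            visited[start] = true;
            while (!path.isEmpty()) {
                int v = path.peek();
                if (v == end) break;                        // reached the exit
                Integer next = null;
                for (int w : passages[v])                   // look for an unvisited neighbor
                    if (!visited[w]) { next = w; break; }
                if (next == null) path.pop();               // dead end: retreat up the path
                else { visited[next] = true; path.push(next); }
            }
            List<Integer> result = new ArrayList<>(path);
            Collections.reverse(result);                    // the stack holds the path end-to-start
            return result;
        }
    }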

The applet below has four tabs: Define grid, Define shape, Create maze, and Solve maze. The starting point is a rectangular grid whose dimensions range from 2 to 50. The shape of the grid can be changed by removing some of the vertices, by putting some of the removed ones back, and by selecting the Start (blue) and End (red) vertices; in every case, the vertices must be clicked on. To create a maze, you are given a choice of Prim's and Kruskal's algorithms. You can watch either at work by clicking the "Step By Step" button, or perform one step at a time with the "One Step" button. You can also watch a maze being solved, or proceed one step (of the depth-first algorithm) at a time.


This applet requires Sun's Java VM 2, which your browser may perceive as a popup (it is not). If you want to see the applet work, visit Sun's website at https://www.java.com/en/download/index.jsp, download and install the Java VM, and enjoy the applet.


The English word "maze", which only in the 14th century became associated with a network of paths [Schwartzman], is related to the longer form "amaze." In old times, it is said [Dudeney, p. 127], to be amazed meant to be "lost in thought," which may happen naturally once one is in a maze. Nowadays, the word "maze" still comes to mind sometimes when we are confronted with a quandary, while the word "amaze" has acquired a different connotation. With reference to this later, now customary, meaning of the word, it is surely amazing how simple things become once "you know how."

References

  1. H. E. Dudeney, Amusements in Mathematics, Dover, 1970
  2. M. Gardner, The Second Scientific American Book of Mathematical Puzzles and Diversions, The University of Chicago Press, 1987
  3. A. Levitin, The Design & Analysis of Algorithms, Addison Wesley, 2003
  4. W. W. Rouse Ball and H. S. M. Coxeter, Mathematical Recreations and Essays, Dover, 1987
  5. S. Schwartzman, The Words of Mathematics, MAA, 1994

Prim's and Kruskal's algorithms do work.

The basis for the proofs is the existence of minimum spanning trees in the first place. Any connected graph has subgraphs which are spanning trees. Indeed, if a given connected graph is not itself a tree, it has a circuit. Removal of any edge of this circuit leaves the graph connected, with the same vertices but fewer edges. If at this stage the remaining graph is a tree, we may stop; otherwise the process of edge removal continues. But, since the number of edges is finite, it cannot continue forever. If |S| denotes the number of elements in a set S, then the process of edge removal stops when |V| = |E| + 1, since, as is easily seen, this property characterizes trees: a connected graph with |V| = |E| + 1 becomes disconnected upon removal of any of its edges.
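The argument is constructive and can be turned into a (rather inefficient) procedure: keep removing edges that lie on circuits, i.e., edges whose removal does not disconnect the graph, until |E| = |V| - 1. A sketch under the same illustrative conventions as before, assuming the input graph is connected:

    import java.util.*;

    // Illustration of the argument above: repeatedly remove an edge of a circuit
    // (an edge whose removal leaves the graph connected) until |E| = |V| - 1.
    // Edges are {u, v} pairs; the input graph is assumed connected.
    public class SpanningTreeByDeletion {
        public static List<int[]> spanningTree(int n, List<int[]> edges) {
            List<int[]> remaining = new ArrayList<>(edges);
            while (remaining.size() > n - 1) {
                for (int i = 0; i < remaining.size(); i++) {
                    int[] e = remaining.remove(i);
                    if (connected(n, remaining)) break;     // e lay on a circuit: drop it for good
                    remaining.add(i, e);                    // removal disconnected the graph: put e back
                }
            }
            return remaining;                               // connected with |V| - 1 edges, hence a tree
        }

        // Depth-first check that all n vertices are reachable from vertex 0.
        private static boolean connected(int n, List<int[]> edges) {
            List<List<Integer>> adj = new ArrayList<>();
            for (int v = 0; v < n; v++) adj.add(new ArrayList<>());
            for (int[] e : edges) { adj.get(e[0]).add(e[1]); adj.get(e[1]).add(e[0]); }
            boolean[] seen = new boolean[n];
            Deque<Integer> stack = new ArrayDeque<>();
            stack.push(0); seen[0] = true;
            int count = 1;
            while (!stack.isEmpty())
                for (int w : adj.get(stack.pop()))
                    if (!seen[w]) { seen[w] = true; count++; stack.push(w); }
            return count == n;
        }
    }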

For weighted graphs, the number of spanning trees, however large, is finite. For this reason, among all the spanning trees of a given graph, there exist trees of the minimum possible weight.

Now let's turn to the demonstration that the two algorithms deliver on the promise, i.e., that each leads to a minimum spanning tree of a given graph. Denote the set of vertices of the given graph by V and the set of its edges by E. Let V_i and E_i denote the sets of vertices and, respectively, edges gathered on the ith step of either algorithm. We'll call the pair <V_i, E_i> acceptable if it can be included in a minimum spanning tree. We have to show that on every step the algorithms produce an acceptable pair. When |E_n| + 1 = |V|, the graph is bound to be a tree that contains all the vertices of the given graph and, therefore, serves as its spanning tree.

Since the proofs for the two algorithms are very similar, I'll present them one after the other, phrased in parallel. The proofs are by induction.

Prim's algorithm. On every step of the algorithm we have a set of vertices connected by edges such that the graph they form is a tree. Let V_0 consist of a single vertex (Start), and let E_0 be empty. Since minimum spanning trees exist, <V_0, E_0> is an acceptable pair. Assume that the pair <V_k, E_k> produced on the kth step is acceptable. To complete the proof, we'll show that the pair <V_{k+1}, E_{k+1}> is also acceptable. Assume to the contrary that <V_{k+1}, E_{k+1}> is not acceptable. By assumption, there exists a minimum spanning tree <V, F> with E_k a subset of F. Let e_{k+1} be the edge selected by Prim's algorithm on the (k+1)st step. Adjoining e_{k+1} to F is bound to produce a circuit. This means that there exists another edge e, with weight at least that of e_{k+1}, whose removal from F leaves a tree <V, F - {e} + {e_{k+1}}>. The weight of the latter can't exceed that of the minimum spanning tree <V, F>. So it, too, is a minimum spanning tree, but, in contradiction with our assumption, its set of edges contains E_{k+1}.

Kruskal's algorithm. On every step of the algorithm we have a set of edges such that the graph they form is circuit-free. Let V_0 = V, and let E_0 be empty. Since minimum spanning trees exist, <V, E_0> is an acceptable pair. Assume that the pair <V, E_k> produced on the kth step is acceptable. To complete the proof, we'll show that the pair <V, E_{k+1}> is also acceptable. Assume to the contrary that <V, E_{k+1}> is not acceptable. By assumption, there exists a minimum spanning tree <V, F> with E_k a subset of F. Let e_{k+1} be the edge selected by Kruskal's algorithm on the (k+1)st step. Adjoining e_{k+1} to F is bound to produce a circuit. This means that there exists another edge e, with weight at least that of e_{k+1}, whose removal from F leaves a tree <V, F - {e} + {e_{k+1}}>. The weight of the latter can't exceed that of the minimum spanning tree <V, F>. So it, too, is a minimum spanning tree, but, in contradiction with our assumption, its set of edges contains E_{k+1}.
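In symbols, the concluding step of either proof is just a restatement of the last observation: since the weight of e is at least that of e_{k+1},

  weight(F - {e} + {e_{k+1}}) = weight(F) - weight(e) + weight(e_{k+1}) <= weight(F),

so the exchanged tree <V, F - {e} + {e_{k+1}}> is again a minimum spanning tree.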


Copyright © 1996-2018 Alexander Bogomolny
