Minimum Spanning Trees and Kruskal's Algorithm


Kruskal's algorithm is so simple that many a student wonders why it really produces what it claims to produce: a minimum spanning tree. Intuitively, it collects the cheapest eligible edges, which supports the belief that the "minimum" part of the name (Minimum Spanning Tree) is indeed justified. The algorithm avoids loops, maintaining at every stage a forest of finitely many trees. The number of trees never grows: every accepted edge merges two of them into one, so one may expect that, in time, the trees will be bridged together until only a single tree remains. A sketch of the procedure in code is given below.
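
Here is a minimal sketch of Kruskal's algorithm in Java, using a simple union-find (disjoint-set) structure to decide whether a candidate edge would close a circuit. The class and method names are illustrative only; they are not taken from the applet or from any particular library.

import java.util.*;

public class Kruskal {

    /** An undirected edge between vertices u and v with a given weight. */
    record Edge(int u, int v, double weight) {}

    /** Union-find: returns the representative of the tree containing x. */
    static int find(int[] parent, int x) {
        while (parent[x] != x) {
            parent[x] = parent[parent[x]];   // path halving
            x = parent[x];
        }
        return x;
    }

    /** Returns the edges of a minimum spanning tree of a connected graph on n vertices. */
    static List<Edge> minimumSpanningTree(int n, List<Edge> edges) {
        List<Edge> sorted = new ArrayList<>(edges);
        sorted.sort(Comparator.comparingDouble(Edge::weight));   // cheapest edges first

        int[] parent = new int[n];
        for (int i = 0; i < n; i++) parent[i] = i;   // every vertex starts as its own tree

        List<Edge> tree = new ArrayList<>();
        for (Edge e : sorted) {
            int ru = find(parent, e.u()), rv = find(parent, e.v());
            if (ru != rv) {                   // the edge bridges two different trees of the forest
                parent[ru] = rv;              // merge the two trees
                tree.add(e);
                if (tree.size() == n - 1) break;   // a spanning tree has n - 1 edges
            }                                 // otherwise the edge would close a circuit: skip it
        }
        return tree;
    }

    public static void main(String[] args) {
        // A small example: 4 vertices and 5 weighted edges.
        List<Edge> edges = List.of(
            new Edge(0, 1, 1.0), new Edge(1, 2, 2.0), new Edge(0, 2, 3.0),
            new Edge(2, 3, 1.5), new Edge(1, 3, 4.0));
        for (Edge e : minimumSpanningTree(4, edges))
            System.out.println(e);
    }
}

Sorting the edges once and rejecting every edge that would close a circuit is the whole of the algorithm; the union-find structure only keeps track of which vertices already lie in the same tree of the current forest.

More formally we have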

Theorem

  Kruskal's algorithm generates a minimum spanning tree for every connected graph.

Proof

Let G be a connected graph and T the result of applying Kruskal's algorithm to G.

First off, T is a tree. Indeed, it contains no loops. Thus if it were not a tree, it would be disconnected, implying the existence of edges of G that could be added to T without creating a loop. Picking the cheapest such bridge, we could simply continue Kruskal's algorithm, in contradiction with the assumption that T was the final result of the algorithm. T is a spanning tree: if there were a vertex of G not included in T, it would again be possible to continue with Kruskal's steps by choosing the cheapest edge incident to that vertex, or a cheaper eligible edge if one existed.

What remains is to show that T is a minimum spanning tree. Since the number of subgraphs of G is finite, so is the number of its spanning trees, and among them there is at least one of minimum cost; call it T'. If T' = T, we are done. Otherwise, i.e. assuming that T' ≠ T, we are going to show that the two have the same cost, the sum of the weights of their edges.

Let E(T) and E(T') be the sets of edges of T and T', respectively. Both T and T' have n-1 edges, where n is the number of vertices in G (and, thus, also in T and T'). The assumption that E(T) ≠ E(T') lets us choose an edge e ∈ E(T) with e ∉ E(T'). Adding e to T' would create a circuit (and only one circuit at that). At least one of the edges in this circuit does not belong to E(T), for T is a tree. Let f be such an edge, and let w(e) and w(f) be the weights of the edges e and f. It must be that w(f) ≥ w(e). If it were not, Kruskal's algorithm would have considered f before e and selected it, either in place of e or even earlier. But since f ∉ E(T), it was never selected by the algorithm, and therefore w(f) ≥ w(e).

By replacing f with e in T' we obtain another spanning tree whose cost does not exceed that of T'. But T' is a minimum spanning tree, so the cost cannot decrease either; in particular, it follows that w(f) = w(e). The operation thus replaces T' with another minimum spanning tree, one that is "one edge closer to T." If there is another edge in T but not in T', the process may be repeated, and it continues until E(T') becomes equal to E(T). On every step T' retains its minimality, which it eventually bestows on T, as required.
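
The cost bookkeeping behind the exchange step may be worth writing out. Denoting by c(·) the total cost of a tree (a notation not used above), we have

\[ c(T' - f + e) \;=\; c(T') - w(f) + w(e) \;\le\; c(T'), \qquad \text{since } w(e) \le w(f), \]
\[ c(T' - f + e) \;\ge\; c(T'), \qquad \text{since } T' \text{ is a minimum spanning tree,} \]

and the two inequalities together force w(e) = w(f).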


