What Is an Algorithm?

Simple as the definition of the notion of algorithm may appear, the concept it attempts to convey is a matter of debate and scientific research. In most textbooks (see, e.g., Discrete Algorithmic Mathematics by S. B. Maurer and A. Ralston) algorithms are required to possess several properties, notably finiteness and definiteness. In prose:

 An algorithm is a sequence of unambiguous instructions for solving a problem, i.e., for obtaining a required output for any legitimate input in a finite amount of time [Levitin, p. 3].

If the problem is one of computation, being "unambiguous" usually means that a human of average intelligence must be able (if only in principle) to follow the instructions with pencil and paper. In theory, a discerning robot must be able to perform the job as well. This inescapably links the idea of algorithm to programming. Indeed, algorithms are commonly expressed either in a programming language like Pascal or C++, or in pseudocode. Some go the extra mile by inventing a custom programming language, like AL by Maurer and Ralston or the notorious assembly language for the hypothetical computer MIX by D. E. Knuth. This is all for the purpose of making the description of the algorithms as unambiguous as possible.

However, quite often, the concept of algorithm is thought to be distinct from that of program. The distinction is similar to that made in mathematics between the Platonic notion of number and a magnitude describing the size of a naturally available set (as in "three cows," say).

Often, too, a program, unlike an algorithm, is not required to terminate in finite time. Indeed, the Halting Problem, that is, the problem of the existence of an algorithm that could predict whether a given program will terminate (in finite time, of course) on a given input, is a cornerstone of computability theory.
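
To see why no such predictor can exist, here is the gist of Turing's diagonal argument in a C++-flavored sketch. The decider halts below is hypothetical and its body a mere placeholder; that is the whole point of the argument:

  #include <string>

  // Hypothetical decider: would return true exactly when the program
  // with source text `prog` halts on input `input`. The body is only a
  // placeholder; the argument below shows no correct body can exist.
  bool halts(const std::string& prog, const std::string& input)
  {
      return true;  // placeholder, not a correct implementation
  }

  // Diagonal program: given its own source text, do the opposite of
  // whatever halts() predicts.
  void diagonal(const std::string& self)
  {
      if (halts(self, self))
          for (;;) { }   // predicted to halt, so loop forever
      // predicted to loop forever, so return (halt) at once
  }

  // Feeding diagonal its own source is contradictory either way: if
  // halts() answers "halts," diagonal loops forever; if it answers
  // "loops," diagonal halts. Hence no such decider can exist.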

In my view, the finiteness requirement is a red herring. Consider the example of typing out consecutive prime numbers. For any integer, there is an unambiguous (and finite) way to determine whether or not it is prime. We'll do this one integer at a time, starting with, say, 2. We may set up a terminating condition (say, check the first million integers or print out the first million primes) or not. Compare:

  N = 2
  while (N < 1000000)
  {
    if (N is prime)
      print N
    N = N + 1
  }

with

  N = 2
  while (true)
  {
    if (N is prime)
      print N
    N = N + 1
  }

It's hard for me to imagine why one of these would be honored with the term algorithm while the other would be denied that honor.
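
For concreteness, here is a minimal runnable rendering of the two loops in C++. The trial-division test isPrime is my own filling-in, since the pseudocode leaves "N is prime" unspecified; any finite, unambiguous primality test would do:

  #include <iostream>

  // Trial division: an unambiguous procedure that decides primality
  // in a finite number of steps for any given integer.
  bool isPrime(long n)
  {
      if (n < 2) return false;
      for (long d = 2; d * d <= n; ++d)
          if (n % d == 0) return false;
      return true;
  }

  int main()
  {
      long N = 2;
      while (N < 1000000)   // replace the condition with `true`
      {                     // to get the non-terminating version
          if (isPrime(N))
              std::cout << N << '\n';
          N = N + 1;
      }
  }

Either way, each individual step remains perfectly definite; the two versions differ only in the stopping rule.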

There are several algorithms discussed and illustrated at this site:

  1. Algorithm for Computing LCM
  2. Ambassadors at a Round Table
  3. Calculation of the Digits of pi by the Spigot Algorithm
  4. Calculation of the Digits of pi (Faster Version)
  5. Cheapest Link & Kruskal's Algorithms
  6. Distance Between Strings
  7. Egyptian Multiplication
  8. Euclid's Algorithm
  9. Genetic Algorithm Solves the Toads and Frogs Puzzle
  10. Horner's Method
  11. Make It All Zeros
  12. Marriage Problem, simulation
  13. Mazes
  14. Peasant Multiplication
  15. Project Scheduling
  16. Sorting Algorithms

Geometric construction problems offer additional examples.

The word "algorithm" has an interesting history. Here's a quote from S. Schwartzman's The Words of Mathematics:

  algorism (noun), algorithm (noun), algorist (noun): these words come from the now-quite-distorted name of a person, Ja'far Mohammed Ben Musa, who was known as al-Khowarazmi, meaning "the man from Khwarazm." (In a similar way, Leonardo da Vinci was actually Leonard, a man from the town of Vinci). Around the year 825 al-Khowarazmi wrote an arithmetic book explaining how to use the Hindu-Arabic numerals. That book was later translated for Europeans and appeared with the Latin title Liber Algorismi, meaning "Book of al-Khowarazmi." As a consequence, the term algorism came to refer to the decimal system of numeration. Any use or manipulation of Arabic numerals - especially a pattern used to add, subtract, multiply, etc. - was known as an algorism. Arithmetic itself was sometimes called algorism, and in a similar fashion Europeans who advocated the adoption of Hindu-Arabic numerals were known as algorists. Over the centuries the word algorism underwent many changes in form. In Old French it became augorisme, which then developed into the now obsolete English augrim, agrim, and agrum. The current form algorithm exhibits what the Oxford English Dictionary calls a "pseudo-etymological perversion": it got confused with the word arithmetic (which was one of its meanings, and which has several letters in common with it); the result was the current algorithm. Current dictionaries still list the older form algorism in the sense of "the decimal or Arabic system of numeration."

(I believe that the name is more often transcribed as al-Khowarizmi. He was a prolific and important Persian writer. The title of one of his books, al-jebr w'al-muqabalah, metamorphosed into the current term Algebra. It gives me great pleasure to mention a fact of which I was unaware until very recently: al-Khowarizmi also wrote a treatise on the Hebrew calendar, Risala fi istikhraj ta'rikh al-yahud ("Extraction of the Jewish Era"). A calendar is nothing but an algorithmic way of assigning dates, in this case Hebrew ones.)

References

  1. D. E. Knuth, The Art of Computer Programming, Volumes 1-3 (boxed set), 2nd edition, Addison-Wesley, 1998
  2. A. Levitin, Introduction to The Design & Analysis of Algorithms, Addison-Wesley, 2003
  3. S. B. Maurer, A. Ralston, Discrete Algorithmic Mathematics, 3rd edition, A K Peters, 2004
  4. R. Sedgewick, Algorithms, Addison-Wesley, 1983
  5. R. Sedgewick, Algorithms in C++, Addison-Wesley, 1992
  6. S. Schwartzman, The Words of Mathematics, MAA, 1994


Copyright © 1996-2018 Alexander Bogomolny
