Fundamental Theorem of Algebra
Complex numbers are in a sense perfect, while there is little doubt that perfect numbers are complex.
Starting from the tail, perfect numbers were studied already by the Ancients (Elements, IX.36). Euler (1707-1783) established the form of all even perfect numbers. [Conway and Guy, p. 137] say this:
Are there any other perfect numbers? ... All we know about the odd ones is that they must have at least 300 decimal digits and many factors. There probably aren't any!
Everyone would agree it's rather a complex matter to write down a number in excess of 300 digits. Allowing for a pun, if there are odd perfect numbers, they may legitimately be called complex. What about complex numbers in the customary sense? There is at least one good reason to judge them perfect: every polynomial equation with complex coefficients is solvable among the complex numbers, so no further extension of the number system is ever forced upon us. The Fundamental Theorem of Algebra establishes this reason and is the topic of the discussion below.
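Euler's form is concrete enough to compute with: an even number is perfect exactly when it equals 2^(p-1) * (2^p - 1) with 2^p - 1 prime (the Euclid-Euler theorem). A minimal Python sketch, offered purely as an illustration and not drawn from any of the sources cited here, lists the first few even perfect numbers:

```python
# Euclid-Euler: an even number is perfect iff it has the form
# 2**(p-1) * (2**p - 1) where 2**p - 1 is prime (a Mersenne prime).

def is_prime(n):
    """Trial division; adequate for the small exponents used here."""
    if n < 2:
        return False
    d = 2
    while d * d <= n:
        if n % d == 0:
            return False
        d += 1
    return True

for p in range(2, 14):
    m = 2**p - 1
    if is_prime(m):
        # Prints 6, 28, 496, 8128, 33550336 (p = 11 is skipped
        # because 2047 = 23 * 89 is not prime).
        print(p, 2**(p - 1) * m)
```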
In the beginning there was counting, which gave rise to the natural, or counting, numbers: 1, 2, 3, and so on. In the space of a few thousand years, the number system kept getting expanded to include fractions, irrational numbers, negative numbers and zero, and eventually complex numbers. Even a cursory glance at the terminology (fractions excepted) suggests the reluctance with which the new numbers were admitted into the family.
The oldest known record of mathematical development comes from the Rhind Papyrus, dated at about 1700 B.C. The scroll appears to be a practical handbook of Egyptian mathematics, with solutions to some 85 problems mostly involving fractions. Except for 2/3, all fractions had 1 in the numerator, so that, for example, 2/61 was written as 1/40 + 1/244 + 1/488 + 1/610. No wonder the document was headed "Directions for knowing all dark things."
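How the scribes produced such unit-fraction expansions is still debated. A modern stand-in is the greedy algorithm usually credited to Fibonacci; the sketch below illustrates the idea but is emphatically not the Egyptian procedure, and indeed it finds 1/31 + 1/1891 for 2/61 rather than the papyrus's four terms:

```python
from fractions import Fraction
from math import ceil

def egyptian(frac):
    """Greedy (Fibonacci-Sylvester) unit-fraction expansion:
    repeatedly subtract the largest unit fraction that fits."""
    terms = []
    while frac > 0:
        d = ceil(1 / frac)          # smallest d with 1/d <= frac
        terms.append(Fraction(1, d))
        frac -= Fraction(1, d)
    return terms

# Greedy gives [1/31, 1/1891], unlike the papyrus table entry.
print(egyptian(Fraction(2, 61)))

# The papyrus decomposition does check out arithmetically:
table = [Fraction(1, 40), Fraction(1, 244), Fraction(1, 488), Fraction(1, 610)]
assert sum(table) == Fraction(2, 61)
```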
Irrational numbers were discovered by the Pythagoreans (c. 500 B.C.), who were so shocked that the fellow who divulged the secret to the broad world is reported to have been drowned. (The American Heritage Dictionary lists the following synonyms for irrational: mentally ill, psychotic, paranoid, deranged, invalid, defective, indefensible, and more in the same vein.)
Negative numbers were not fully accepted until the 18th century. Gerolamo Cardano (1501-1576), who scorned them as numeri ficti, devoted two separate chapters of his Ars Magna, one to equations of the form x³ + mx = p and another to equations x³ = mx + p. René Descartes (1596-1650), the father of Analytic Geometry, referred to negative roots as racines fausses, or false roots.
The number zero was invented by the Hindus sometime before 825 A.D., when it was described (along with their positional system in base 10) by the Persian mathematician al-Khowarizmi. The Hindu term for zero was sunya, or void. [Ore] mentions that, translated into Arabic, this became as-sifr, which is the common root of the words zero and cipher. I would assume that the latter rather came from the Hebrew sfira (counting) or sifra (digit). [Ore]'s version has the virtue of suggesting that even in small things there may be found fundamental secrets.
During the 11th and 12th centuries, a number of European scholars went to Spain to study Arab learning. I would speculate that many of them met with or used books by the great Jewish poet, mathematician, and Torah commentator Abraham ben Meir ibn Ezra (1092-1167), who wrote a book on arithmetic in which he explained the Arab system of numeration and the zero [Ore, p. 166]. The most influential work in the spread of Arab numerals and the use of zero in Europe was Liber abaci (1202 A.D.) by Leonardo Fibonacci of Pisa (c. 1175-1250).
Imaginary numbers were discovered by Cardano when solving cubic equations. He ran into what would amount, in present-day notation, to the square root of a negative number. For somebody who looked askance at negative quantities, their square roots were bound to appear quite illusory. In the true mathematical spirit, he went on and used them nonetheless in formulas like (5 + √-15)(5 - √-15) = 25 - (-15) = 40.
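In modern terms, Cardano's numbers are the roots of x² - 10x + 40 = 0, arising from his Ars Magna problem of splitting 10 into two parts whose product is 40. A few lines of Python, using the standard cmath module, verify the arithmetic he pushed through:

```python
import cmath

# Cardano's problem: two numbers summing to 10 with product 40.
# They solve x**2 - 10*x + 40 = 0, whose roots are 5 +/- sqrt(-15).
r = cmath.sqrt(-15)      # purely imaginary, about 3.873j
a, b = 5 + r, 5 - r
print(a + b)             # (10+0j): the parts do sum to 10
print(a * b)             # (40+0j): and their product is 40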
Leonhard Euler (1707-1783) made complex numbers commonplace, and the first proof of the Fundamental Theorem of Algebra was given by Carl Friedrich Gauss (1777-1855) in his Ph.D. thesis (1799). He considered the result so important that he gave four different proofs of the theorem during his lifetime.
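The theorem guarantees that a degree-n polynomial has exactly n complex roots, counted with multiplicity. A small numerical illustration, assuming NumPy is available and of course no substitute for a proof:

```python
import numpy as np

# x**5 - x + 1 has no rational roots, yet, as the theorem
# promises, all five of its roots live in the complex plane.
coeffs = [1, 0, 0, 0, -1, 1]         # coefficients, highest degree first
roots = np.roots(coeffs)
print(roots)                         # five complex numbers
print(np.polyval(coeffs, roots))     # all approximately 0
```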
References
- J. H. Conway and R. K. Guy, The Book of Numbers, Springer-Verlag, NY, 1996.
- W. Dunham, The Mathematical Universe, John Wiley & Sons, NY, 1994.
- H. Eves, Great Moments in Mathematics Before 1650, MAA, 1983.
- J. R. Newman, The Rhind Papyrus, in The World of Mathematics, v. 1, Simon and Schuster, NY, 1956.
- O. Ore, Number Theory and Its History, Dover Publications, 1976.
- J. A. Paulos, Beyond Numeracy, Vintage Books, 1992.
Related material

- Perfect numbers are complex, complex numbers might be perfect
- Fundamental Theorem of Algebra: Statement and Significance
- What's in a proof?
- More about proofs
- Axiomatics
- Intuition and Rigor
- How to Prove Bolzano's Theorem
- Early attempts
- Proofs of the Fundamental Theorem of Algebra
- Remarks on Proving The Fundamental Theorem of Algebra
- A Proof of the Fundamental Theorem of Algebra: Standing on the shoulders of giants
- Yet Another Proof of the Fundamental Theorem of Algebra
- Fundamental Theorem of Algebra - Yet Another Proof
- A topological proof, going in circles and counting
- A Simple Complex Analysis Proof
- An Advanced Calculus Proof
Copyright © 1996-2018 Alexander Bogomolny