What Is Calculus?
Calculus is a branch of mathematics invented in the 17th century by I. Newton and G. W. Leibniz amid controversies of continental proportions. First, there was an acrimonious question of precedence, which took a long time to settle. It is now accepted that the two founding fathers made their discoveries independently: Newton was the first to come up with some of the basic ideas, while Leibniz was the first to publish his results. To Leibniz we also owe the notations now commonly used in Calculus, and the term Calculus itself. But the story is not a pretty one [Dunham, p. 21].
Second, the fundamental notion of an infinitely, or indefinitely, small quantity, the infinitesimal in other words, was assumed to have such strange properties, like sometimes being zero and sometimes being non-zero, that the best contemporary minds refused to accept the emergent methods. The forefathers were driven by their intuition, and the final results of their methods were indisputably correct. However, the methods themselves were suspect. The revulsion at the new ideas seeps through in George Berkeley's 1734 essay, The Analyst. Here is an overused quote:
And what are these fluxions? The velocities of evanescent increments? And what are these same evanescent increments? They are neither finite quantities nor quantities infinitely small, nor yet nothing. May we not call them the ghosts of departed quantities?
As an example, let's find the derivative y' of y = x². Assuming o to be a small but non-zero quantity,
[(x + o)² - x²] / o = (x² + 2ox + o² - x²) / o
                    = (2ox + o²) / o
                    = 2x + o.
By setting o = 0 we conclude that y' = 2x, which is a meaningful and, in fact, correct result. But, according to Berkeley, it has been derived by letting o be both zero and non-zero in the same breath. This he justly found utterly objectionable.
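The suspicious step can be examined numerically. For y = x² the difference quotient works out to exactly 2x + o, so shrinking o drives it toward 2x without ever setting o equal to zero. A minimal Python sketch of this check (the function name is mine, not from the text):

```python
def difference_quotient(x, o):
    """Compute [(x + o)**2 - x**2] / o for a non-zero increment o."""
    return ((x + o) ** 2 - x ** 2) / o

x = 3.0
for o in (0.1, 0.01, 0.001):
    # Each quotient equals 2x + o, so the values approach 2x = 6 as o shrinks.
    print(o, difference_quotient(x, o))
```

The printed values creep toward 6 as o decreases, which is precisely the behavior that the modern notion of limit, rather than Berkeley's "ghosts of departed quantities", was invented to describe.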
Both founding fathers made mistakes. For example [Grattan, p. 246], Leibniz believed (albeit briefly and before the result had been published) that d(xy) = dx dy, rather than the correct product rule d(xy) = x dy + y dx.
So what is Calculus? Calculus is the branch of mathematics that defines and deals with limits, derivatives and integrals of functions. It is often divided into two parts: Differential Calculus (dealing with derivatives, i.e., rates of change and tangents) and Integral Calculus (dealing with integrals, i.e., areas and volumes). Differential Equations that use methods from both grew into a separate discipline (actually more than just one), as did Calculus of Variations.
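To make the two halves of the subject concrete, here is a hedged Python sketch (my own illustration, not taken from the text) that approximates both for y = x²: a difference quotient for the derivative, and a Riemann sum for the integral over [0, 1].

```python
def f(x):
    return x ** 2

def derivative(f, x, h=1e-6):
    # Differential Calculus: slope of the tangent, via a small difference quotient.
    return (f(x + h) - f(x)) / h

def integral(f, a, b, n=100_000):
    # Integral Calculus: area under the curve, via a left Riemann sum.
    dx = (b - a) / n
    return sum(f(a + i * dx) for i in range(n)) * dx

print(derivative(f, 3.0))     # close to 2x = 6
print(integral(f, 0.0, 1.0))  # close to 1/3
```

The Fundamental Theorem of Calculus ties the two operations together: differentiation and integration are, in a precise sense, inverse to one another.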
As a way to illustrate the evolution of Calculus, I shall list several quotes plucked from Dunham's book that demonstrate the attitude of various authors to the notion of infinitely small and the related notion of limit.
- Leibniz: by infinitely small, we understand something ... indefinitely small, so that each conducts itself as a sort of class, and not merely as the last thing of a class. If anyone wishes to understand these [the infinitely small] as the ultimate things ..., it can be done, and that too without falling back upon a controversy about the reality of extensions, or of infinite continuums in general, or of the infinitely small, ay even though he think that such things are utterly impossible.
- Euler: There is no doubt that any quantity can be diminished until it all but vanishes and then goes to nothing. But an infinitely small quantity is nothing but a vanishing quantity, and so it is really equal to 0. ... There is really not such a great mystery lurking in this idea as some commonly think and thus have rendered the calculus of the infinitely small suspect to so many.
- D'Alembert: ... the quantity to which the ratio z/u approaches more and more closely if we suppose z and u to be real and decreasing. Nothing is clearer than that.
- Cauchy: When the values successively attributed to a variable approach indefinitely a fixed value, in a manner so as to end by differing from it by as little as one wishes, this last is called the limit of all the others.
The latter is a verbal ancestor of the notorious ε-δ definition of limit, which completely avoids any reference to the conduct of a variable or to any change it may undergo with time. It is surely a far cry from the uncritical use of the infinitely small that characterized the early years of Calculus. Weierstrass' ε-δ definition appeared to finally nail shut the coffin of the departed quantities and led to a complete abandonment of the original idea of infinitesimals. However, in the 1960s the ghosts were resurrected by Abraham Robinson and placed on the sound foundation of non-standard analysis, thus vindicating the intuition of the founding fathers. To give the flavor of what Robinson eventually came up with, here is his definition of limit:
limx→a f(x) = L, iff f(x) is infinitely close to L whenever x ≠ a is infinitely close to a.
References

- W. Dunham, The Calculus Gallery: Masterpieces from Newton to Lebesgue, Princeton University Press, 2008
- I. Grattan-Guinness, The Rainbow of Mathematics: A History of the Mathematical Sciences, W. W. Norton, 1997