Date: Friday, January 24, 1997 9:45 PM

From: Alex Bogomolny

Dear Mary:

Please stop tearing your hair - this is extremely unhelpful. Calm down and tell yourself that the topic that is so frustrating to you now has been taken up, digested, passed over and used by probably millions of students not necessarily smarter than you. Apply some self-therapy. Tell yourself you can do it, because no one will be able to do it for you. If you are in a fighting mood, let's continue.

But before going any further let me make two remarks which you may find interesting to ponder:

- Do you realize that, say, the number 0 is a more or less recent invention? The ancient Babylonians, Greeks, Romans, Chinese and the rest of the world had no notion of that "simplest" of all numbers. 0 was introduced by Hindu mathematicians toward the end of the first millennium AD. Since numbers originated from counting, and there is no good reason to count a single object, the number 1 appeared later than other numbers. Negative numbers were still considered an absurdity as recently as the beginning of the 19th century.
- Human brain capacity has not changed for probably the last 2000-3000 years. We know more, but we are not smarter. Greek poetry, sculpture, math, and drama became our classics. Still, we do see farther. This is because, as Newton put it, we stand on the shoulders of giants. Ideas that would have been a shock to the smartest of the Ancients come to you naturally just because you happened to be born 2500 years later in the most industrialized country in the world. The Greeks would not have understood Algebra at all, for they never had any symbolism in their employ, but you should understand it. You are supposed to, for this is how things stand now. You are neither the first nor the last.

If you wish to continue, make sure that you understand or, at least, are familiar with the concept of a negative number. Let me know if you have difficulty with any of the following:

- 5 + (-5) = 0 and 0 - 5 = -5.
- a^{b}*a^{c} = a^{(b+c)}
- a^{0} = 1
- If b = c then a^{b} = a^{c}
- If a^{b} = a^{c} then b = c.
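If any of these rules feels shaky, you can spot-check them with concrete numbers. Here is a small sketch in Python (my own illustration, not part of the letter), using the arbitrary base a = 2:

```python
# Spot-check the exponent rules with a concrete base, a = 2.
a = 2

# 5 + (-5) = 0 and 0 - 5 = -5
assert 5 + (-5) == 0
assert 0 - 5 == -5

# a^b * a^c = a^(b+c)
b, c = 3, 4
assert a**b * a**c == a**(b + c)

# a^0 = 1
assert a**0 == 1

# If b = c then a^b = a^c (trivially); and for a base a > 1,
# equal powers force equal exponents, since a**x is strictly increasing.
assert a**b == a**b

print("all exponent rules check out for a =", a)
```

Of course a handful of numerical checks is not a proof, but running a few of your own is a good way to build trust in the rules.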

There is nothing difficult or profound in the *negative powers*.
The reason for introducing negative powers is a matter of convenience.
Using 1/a as the inverse of a is quite OK. But note that writing 1/a
is as much a matter of convention as writing a^{-1}. Mathematics
is very adept at generating new symbols and terminology. There is a
good historical lesson here. Newton and Leibniz invented Calculus more
than 300 years ago. However, the Calculus notations are entirely due to
Leibniz, who believed that the right language and symbolism help
and further right thinking and understanding.

Look at 1/a = a^{-1} as an axiom that comes in the company of a few
others. For example, a^{b}*a^{c} = a^{(b+c)}
is true not only for positive b and c but for any real b and c -
positive, negative, and 0. The introduction of a^{-1} = 1/a makes
this axiom meaningful.
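As a quick numerical sanity check (a Python sketch of my own; the base a = 2.0 is arbitrary, chosen because powers of 2 are exact in floating point), the addition rule really does keep working once negative exponents enter:

```python
a = 2.0

# a^{-1} is, by definition, 1/a:
assert a**-1 == 1 / a

# The rule a^b * a^c = a^(b+c) now holds with a negative exponent mixed in:
b, c = 5, -3
assert a**b * a**c == a**(b + c)   # 32.0 * 0.125 == 4.0

# ...and with both exponents negative:
assert a**-2 * a**-1 == a**-3

print("a^b * a^c = a^(b+c) holds for negative exponents too")
```

This is exactly what the letter means by the definition making the axiom meaningful: nothing about the rule has to change when b or c dips below zero.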

Try some simple manipulations to convince yourself that the definition will cause no contradictions. What follows is not a proof but a justification for the definition:

(b/a)(d/c)=(bd)/(ac), right? This is true for any a,b,c,d with a and c
non-zero. Now, let b=d=1 and a=c. Then (1/a)(1/a)=1/a^{2}. On the
left, by definition, we have (1/a)(1/a)=a^{-1}*a^{-1}. If
you accepted that a^{b}*a^{c} = a^{(b+c)} holds
for any b and c then you should accept also (1/a)(1/a)=a^{-1}*a^{-1}=a^{-2}.
Look at the nice outcome. On the one hand, (1/a)(1/a)=1/a^{2}. On the other,
(1/a)(1/a)=a^{-2}. Therefore, a^{-2}=1/a^{2} which makes sense
too because, by definition, (a^{2})^{-1}=1/a^{2}, and
one other axiom you should accept is (a^{b})^{c}=a^{bc}.
(This can be proven for positive numbers and taken as an axiom for negative ones.)
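The whole chain of equalities above can be replayed numerically. Here is a Python sketch of my own, again with the arbitrary base a = 2.0 (exact in floating point, so == comparisons are safe):

```python
a = 2.0

# (b/a)(d/c) = (bd)/(ac) with b = d = 1 and c = a:
left = (1 / a) * (1 / a)
assert left == 1 / a**2

# By the addition rule, a^{-1} * a^{-1} = a^{-2}:
assert a**-1 * a**-1 == a**-2

# Therefore a^{-2} = 1/a^{2}:
assert a**-2 == 1 / a**2

# And (a^2)^{-1} = 1/a^{2} illustrates the power rule (a^b)^c = a^(bc):
assert (a**2)**-1 == a**(2 * -1)

print("the chain (1/a)(1/a) = a^{-2} = 1/a^2 checks out")
```

Each assertion mirrors one link in the letter's chain, so you can see the definitions agreeing with one another on an actual number.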

Mathematicians chose to use a certain language (a^{-1}) because this
made their rules more universal and greatly simplified formulation of their
theorems. But this is what any language is about. Imagine a society in which
a fellow wakes up in the morning, looks out a window and says: "Wooooo waaaaa whew."
meaning something like "What a wonderful morning. I am extremely hungry. Why
shouldn't I have my continental breakfast under the apple tree that my dear father
planted down yonder when I was born?"

The lesson is that the development of the right language is quite important. Using
a^{-1} to denote 1/a appears to be the right language because of the
convenience that comes along. Nothing to it. Just take it a step at a time until
you get used to the *negative powers.*

Sincerely,

Alexander Bogomolny