The Nobel Prize winner Richard Feynman said in 1948 that his research in quantum electrodynamics all hinged on what he called a "great identity":

\frac{1}{ab} = \int^{1}_{0}\frac{dx}{[ax+b(1-x)]^2}.

This is puzzling. The RHS of the equation is always positive wherever it is defined (the integrand is a square), while the LHS is negative when a and b have different signs. How is this possible?
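
Before resolving the puzzle, it is easy to see it numerically. Below is a minimal sketch using SciPy's quad routine; the helper name feynman_rhs is mine:

from scipy.integrate import quad

def feynman_rhs(a, b):
    # Numerically integrate the RHS: 1 / [a*x + b*(1-x)]^2 over [0, 1].
    value, abserr = quad(lambda x: 1.0 / (a * x + b * (1 - x)) ** 2, 0.0, 1.0)
    return value

# Same signs: the identity checks out.
print(feynman_rhs(2.0, 3.0), 1.0 / (2.0 * 3.0))   # both ~0.16667

# Different signs: the LHS is -1, yet the integrand is a square.
# quad will typically emit an IntegrationWarning here -- the integrand
# blows up at x = 1/2, a first hint of where the trouble lies.
print(feynman_rhs(1.0, -1.0), 1.0 / (1.0 * -1.0))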

Let us first confirm the identity by computing the integral on the RHS. Substitute u=ax+b(1-x), so that du=(a-b)dx; the limits x=0 and x=1 map to u=b and u=a, respectively, and

\int^{1}_{0}\frac{dx}{[ax+b(1-x)]^2}
=\int^{a}_{b}\frac{du/(a-b)}{u^2}
=\frac{1}{a-b}\int^{a}_{b}\frac{du}{u^2}
=\frac{1}{a-b}\left[-\frac{1}{u}\right]^{a}_{b}
=\frac{1}{a-b}\left(\frac{1}{b}-\frac{1}{a}\right)
=\frac{1}{a-b}\cdot\frac{a-b}{ab}=\frac{1}{ab}.
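
As a machine cross-check of this computation, here is a short symbolic verification, a sketch using SymPy, with a and b declared positive so that the denominator cannot vanish on [0,1]:

import sympy as sp

x = sp.symbols('x')
a, b = sp.symbols('a b', positive=True)  # same sign: no zero in [0, 1]

rhs = sp.integrate(1 / (a * x + b * (1 - x)) ** 2, (x, 0, 1))
print(sp.simplify(rhs - 1 / (a * b)))  # prints 0: the identity holds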

It all looks solid. Where, then, is the problem?

For some pairs of a and b the denominator ax+b(1-x) may vanish on the interval [0,1], so the integrand blows up and the integral is improper, making the formal result suspect. When can this happen? If ax+b(1-x)=0 then (a-b)x+b=0, implying x=\frac{b}{b-a}. For example, with a=1 and b=-1 the denominator equals 2x-1, which vanishes at x=\frac{1}{2}.

Now verify that this fraction can fall in the interval [0,1] only when a and b have different signs. If a and b have the same sign, the critical value x=\frac{b}{b-a} lies outside the interval [0,1], with no harm to the integral. When the signs differ, the integrand behaves like 1/u^2 near the zero of the denominator, a non-integrable singularity, so the integral diverges and the identity genuinely fails there.
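
The sign condition is easy to test as well; a minimal sketch (the helper name critical_x is mine):

def critical_x(a, b):
    # Zero of the denominator a*x + b*(1-x), assuming a != b.
    return b / (b - a)

for a, b in [(2, 3), (-2, -3), (1, -1), (-2, 5)]:
    x0 = critical_x(a, b)
    print(f"a={a:>2}, b={b:>2}: x0 = {x0:.3f}, in [0,1]? {0 <= x0 <= 1}")

# Same-sign pairs give x0 = 3.000 (outside [0,1]); the opposite-sign
# pairs give x0 = 0.500 and 0.714, squarely inside the interval.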
