The Nobel Prize winner Richard Feynman said in 1948 that his research in quantum electrodynamics all hinged on what he called a "great identity":
\frac{1}{ab} = \int^{1}_{0}\frac{dx}{[ax+b(1-x)]^2}.

This is puzzling. The RHS of the equation is always positive, since the integrand is a square, while the LHS is negative when a and b are of different signs. How is this possible?
Let's first confirm the identity by computing the integral on the RHS. Define u=ax+b(1-x), so that du=(a-b)dx and
\int^{1}_{0}\frac{dx}{[ax+b(1-x)]^2}

=\int^{a}_{b}\frac{du/(a-b)}{u^2}

=\frac{1}{a-b}\int^{a}_{b}\frac{du}{u^2}

=\frac{1}{a-b}\Big[-\frac{1}{u}\Big]^{a}_{b}

=\frac{1}{a-b}\cdot\frac{a-b}{ab}=\frac{1}{ab}.
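The derivation above can be checked numerically. Here is a minimal sketch using a composite Simpson's rule (the function names and the choice a=3, b=5 are mine, for illustration only):

```python
# Numerical check of Feynman's identity 1/(ab) = integral from 0 to 1
# of dx / [ax + b(1-x)]^2, for same-sign a and b, using a simple
# composite Simpson's rule (no external libraries needed).

def simpson(f, lo, hi, n=10_000):
    """Composite Simpson's rule on [lo, hi] with n subintervals (n even)."""
    h = (hi - lo) / n
    total = f(lo) + f(hi)
    for i in range(1, n):
        total += (4 if i % 2 else 2) * f(lo + i * h)
    return total * h / 3

def feynman_rhs(a, b):
    """The RHS of the identity: integral of 1/[ax + b(1-x)]^2 over [0, 1]."""
    return simpson(lambda x: 1.0 / (a * x + b * (1 - x)) ** 2, 0.0, 1.0)

a, b = 3.0, 5.0
print(feynman_rhs(a, b))  # close to 1/(a*b) = 1/15
print(1 / (a * b))
```

For same-sign a and b the two printed values agree to many decimal places, as the derivation predicts.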

It all looks pretty solid. Where, then, is the problem?
For some pairs of a and b the denominator ax+b(1-x) may vanish on the interval [0,1], forcing an integration through a singularity and making the result suspect. When can this happen? If ax+b(1-x)=0, then (a-b)x+b=0, implying x=\frac{b}{b-a}.
Now verify that this fraction can fall in the interval [0,1] only when a and b are of different signs. If a and b are of the same sign, the critical value x=\frac{b}{b-a} falls outside the interval [0,1], and the integral is harmless.
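The verification is easy to carry out by computer for a few sample pairs (the specific values of a and b below are arbitrary illustrations):

```python
# Where the denominator ax + b(1-x) vanishes: x* = b/(b-a), the root
# of (a-b)x + b = 0.  Check, for each sign combination of a and b,
# whether x* lands inside the integration interval [0, 1].

def critical_x(a, b):
    """Root of the denominator ax + b(1-x), assuming a != b."""
    return b / (b - a)

for a, b in [(3, 5), (-3, -5), (3, -5), (-3, 5)]:
    x = critical_x(a, b)
    inside = 0 <= x <= 1
    print(f"a={a:+d}, b={b:+d}: x* = {x:+.3f}, in [0,1]? {inside}")
```

Running this shows x* = 2.5 (outside [0,1]) for both same-sign pairs, and x* = 0.625 (inside) for both mixed-sign pairs, matching the claim above.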
References
 P. J. Nahin, Number-Crunching, Princeton University Press, 2011