>This is a problem I've spent many hours vainly on. If a and
>d are relatively prime, is 1/(a) +1/(a+d)
>+1/(a+2d)...+1/(a+nd) ever an integer for some n? I know
>erdos proved this in the 30's. Can anyone tell me how I can
>get hold of the proof? I've looked all over the net and
I'm not familiar with this problem, and I don't know where to find a proof for it, but here are some ideas I have thought of. Perhaps they will help.
First, let's name the terms x_k = a+kd, for k = 0, 1, ..., n. Putting the terms over a common denominator gives a numerator that is a sum of products, each of which contains all but one of the x_k's. For example
1/x_0 + 1/x_1 + 1/x_2 + 1/x_3 =
(x_1 x_2 x_3 + x_0 x_2 x_3 + x_0 x_1 x_3 + x_0 x_1 x_2)/
x_0 x_1 x_2 x_3.
Now it is clear that all but the last term contain a factor of x_3, or in general x_n. For this fraction to represent an integer, every factor of the denominator must divide the numerator, and since x_3 already occurs in all but the last term of the numerator, it must divide the last term also. Therefore, x_n must divide the product of all the other x_k's. The situation is symmetrical for all the other x_k's for 0 <= k <= n. Hence each x_k must divide the product of all the others.
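To make that bookkeeping concrete, here is a small Python sketch (the function names are mine) that forms the partial sum exactly and tests this necessary condition, that each x_k divides the product of the others:

```python
from fractions import Fraction
from math import prod

def partial_sum(a, d, n):
    """Exact value of 1/x_0 + ... + 1/x_n, where x_k = a + k*d."""
    return sum(Fraction(1, a + k * d) for k in range(n + 1))

def divisibility_condition(a, d, n):
    """Necessary condition for the sum to be an integer:
    each x_k must divide the product of all the other x_j's."""
    xs = [a + k * d for k in range(n + 1)]
    total = prod(xs)
    # total // x is the product of the other terms; x must divide it.
    return all((total // x) % x == 0 for x in xs)
```

For instance, with a = 1, d = 1, n = 3 the sum is 25/12, and the condition already fails at x_2 = 3, since 3 does not divide 1*2*4 = 8.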
Now suppose p is a prime factor of x_k. Then p must also occur as a factor of some other x_j, and so p|(x_k - x_j). But x_k - x_j = (k-j)d, so either p|d or p|(k-j). However, p|d is impossible, because p|x_k implies that p|x_k - kd = a, so gcd(a,d)>=p, contradicting the assumption that a and d are relatively prime. Therefore, if p|x_k, p must also divide the distance k-j to some other term of the series.
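A quick way to watch this congruence at work: the helper below (names are mine) factors each x_k and, for each prime, lists the other indices sharing it. By the argument above, every such index j satisfies p|(k-j).

```python
def prime_factors(m):
    """Set of distinct prime factors of m, by trial division."""
    fs, p = set(), 2
    while p * p <= m:
        while m % p == 0:
            fs.add(p)
            m //= p
        p += 1
    if m > 1:
        fs.add(m)
    return fs

def shared_prime_indices(a, d, n):
    """Map (k, p) -> other indices j with p | x_j, for each prime p | x_k."""
    xs = [a + k * d for k in range(n + 1)]
    return {(k, p): [j for j in range(n + 1) if j != k and xs[j] % p == 0]
            for k, x in enumerate(xs) for p in prime_factors(x)}
```

With a = 3, d = 2, n = 6 (terms 3, 5, 7, 9, 11, 13, 15), the prime 3 in x_0 recurs at indices 3 and 6, and 3 does divide both differences.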
This implies that if some x_k has a prime factor p > k, then the series must extend to at least the n=k+p term. If new primes are encountered when the series is extended, it may need to be extended further. If the number of terms ever "catches up" to the size of the prime factors, this forcing process terminates.
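That observation can be turned into a crude lower bound (a sketch; the function name is mine): whenever a prime p|x_k has no other multiple among x_0..x_n, the series must run at least to index k+p, the next index congruent to k mod p.

```python
def required_extension(a, d, n):
    """Smallest index the series must reach, given terms x_0..x_n:
    a prime p | x_k with no partner in range forces index >= k + p."""
    def prime_factors(m):
        fs, p = set(), 2
        while p * p <= m:
            while m % p == 0:
                fs.add(p)
                m //= p
            p += 1
        if m > 1:
            fs.add(m)
        return fs

    xs = [a + k * d for k in range(n + 1)]
    need = n
    for k, x in enumerate(xs):
        for p in prime_factors(x):
            if not any(j != k and xs[j] % p == 0 for j in range(n + 1)):
                need = max(need, k + p)
    return need
```

For a = 1, d = 1, n = 4, the lone prime 5 at x_4 already pushes the requirement out to index 9, and re-running on the extended range pushes it further still (the prime 7 at x_6 forces index 13) -- the "catching up" described above.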
However, this does not prove that the series sums to an integer: the condition is necessary, not sufficient. If you could show that new prime factors show up often enough to keep the series from ever terminating, that would prove the impossibility of the sum of the first n terms being an integer.
At this point, you need something like Bertrand's Postulate, also known as Chebyshev's Theorem. But that result concerns the occurrence of primes in stretches of consecutive integers, and you would need a version of it for arithmetic sequences. I do not know of such a result, although you could perhaps derive it by copying Chebyshev's proof but using some of the theory of primes in arithmetic progressions (perhaps Dirichlet already proved it -- I'm not too familiar with that area). By the way, the link above has a reference to a paper by Erdos in 1934, where he proves a fact that looks relevant, about the occurrence of prime factors in a block of integers. This makes me suspect that that 1934 paper may be the one you want. He wrote so MANY papers, though, that saying "Erdos" doesn't narrow the search much.
Well, this is not very conclusive, but perhaps there are ideas here you could use.
Hope this helps.