# Two Envelopes Paradox

Some time ago, Zvi Mendlowitz from Israel posted a question on one of the CTK Exchange forums:

In a box there are two envelopes. It is known that with probability 1/2, one envelope contains $1 and the other one $10; with probability 1/4, one envelope contains $10 and the other one $100; with probability 1/8, one envelope contains $100 and the other one $1000; and so on.

You open one envelope and find x dollars in it. Now you can keep the money or take instead the other envelope. What do you do?

Calculation shows that whatever the value of x, the expected amount of money in the other envelope is more than in the envelope you took, so it seems you should always switch; but if this is true, why not take the other envelope in the first place, and not switch?
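The calculation alluded to can be carried out exactly. The sketch below (function name is illustrative, not from the original post) uses exact rational arithmetic: if the opened envelope holds x = 10^k with k ≥ 1, then x belongs either to the pair (x/10, x), which has probability 1/2^k, or to the pair (x, 10x), which has probability 1/2^(k+1); each member of a pair is opened with probability 1/2, so the conditional odds are 2:1 in favor of the smaller pair.

```python
from fractions import Fraction

def other_envelope_expectation(k):
    """Expected amount in the other envelope, given the opened one holds 10**k.

    The pair (10**(n-1), 10**n) occurs with probability 2**-n, and either
    member of a pair is opened with probability 1/2.
    """
    x = Fraction(10) ** k
    if k == 0:
        return Fraction(10)              # $1 occurs only in the pair (1, 10)
    p_low = Fraction(1, 2) ** k          # weight of the pair (x/10, x)
    p_high = Fraction(1, 2) ** (k + 1)   # weight of the pair (x, 10x)
    total = p_low + p_high
    # P(other = x/10 | x) = 2/3 and P(other = 10x | x) = 1/3 for every k >= 1
    return (p_low / total) * (x / 10) + (p_high / total) * (10 * x)

for k in range(4):
    print(10 ** k, float(other_envelope_expectation(k)))
```

For every x ≥ 10 the result is 3.4x, and for x = 1 it is 10, so the expected amount in the other envelope always exceeds x, which is the paradoxical conclusion discussed below.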

Mark Huber replied:

This is a great paradox, here's one way of thinking about what is going on. Suppose that initially there is one envelope in the box. This envelope has $1 in it with probability 1/4, $10 with probability 1/4 + 1/8 = 3/8, $100 with probability 1/8 + 1/16 = 3/16, and so on. (Picking an envelope according to this distribution is the same as choosing which two envelopes go in the box according to your distribution and then picking one of the envelopes at random.)
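Mark's redistribution of the pair probabilities is easy to verify exactly. The following sketch (names are illustrative) sums, for each amount, the contributions of the two pairs that can contain it; truncating the sum only affects amounts near 10^max_n, so the small amounts checked here are exact.

```python
from fractions import Fraction

def single_envelope_marginal(max_n=60):
    """Marginal distribution of the amount in a randomly opened envelope.

    Pair n holds (10**(n-1), 10**n) with probability 2**-n; each member
    of the pair is opened with probability 1/2.  Truncation at max_n only
    affects amounts of order 10**max_n.
    """
    marginal = {}
    for n in range(1, max_n + 1):
        p = Fraction(1, 2) ** n
        for amount in (10 ** (n - 1), 10 ** n):
            marginal[amount] = marginal.get(amount, Fraction(0)) + p / 2
    return marginal

m = single_envelope_marginal()
print(m[1], m[10], m[100])   # 1/4 3/8 3/16
```

The printed values match Mark's 1/4, 3/8 = 1/4 + 1/8, and 3/16 = 1/8 + 1/16.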

Now, as soon as you have chosen a single envelope according to this distribution, a leprechaun who has been watching the proceedings (and who knows the value of the envelope is x) runs out, and with probability 2/3 puts x/10 dollars in an envelope and throws it in the box, and with probability 1/3 puts 10x dollars in an envelope and throws it in the box.

Would you switch? I know that I would! There is a 1/3 chance of gaining 9x dollars and a 2/3 chance of losing 0.9x dollars, so there is an expected gain of 2.4x dollars by switching.

Now, why does this paradox occur: because it is *impossible* to set up this experiment! It appears to be real, but that is an illusion. In order to create such a distribution, a player would need to have an infinite amount of dollars on hand to set up the box. If you only have a fixed amount of dollars M, no matter how large M is, there is a chance that you will need more than M to set up the box, and so the experiment cannot be performed as stated.

This is similar to the Martingale system of betting: bet $1 on red on Roulette. If you lose, bet $2 then $4, then $8 and so on until you win. Each time you win, you gain $1. So why can't you break the house with this system? Because you need an infinite amount of money on hand to play the Martingale system, and no one actually has an infinite amount of money.
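The Martingale comparison can be made quantitative. The sketch below (an illustrative computation, not from the original discussion) works out the exact expected profit of the doubling system when the bankroll only covers k bets: for a fair game the certain small win and the rare catastrophic loss cancel exactly, and for real roulette (18/38 chance of red) the expectation is strictly negative.

```python
from fractions import Fraction

def martingale_expected_profit(k, p_win=Fraction(1, 2)):
    """Expected profit of the doubling system with bankroll for k bets.

    Bets of 1, 2, 4, ..., 2**(k-1) require a bankroll of 2**k - 1.
    A win anywhere in the run nets +1; losing all k bets costs 2**k - 1.
    """
    p_bust = (1 - p_win) ** k
    return (1 - p_bust) * 1 - p_bust * (2 ** k - 1)

# Fair game: zero expectation no matter how deep the doubling goes.
for k in (1, 5, 20):
    print(k, martingale_expected_profit(k))           # always 0
# American roulette (18/38 win on red): strictly negative.
print(martingale_expected_profit(10, Fraction(18, 38)))
```

No finite bankroll changes the expectation; only an infinite one would turn the frequent +1 outcomes into a sure profit.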

This made Zvi wonder:

The question was not a practical problem, but a theoretical one.

Mark then clarified:

Of course it was not a practical problem, but a theoretical construct. (If you know someone running this experiment in real life, I want in!) I used a leprechaun to plant the extra money envelope to emphasize that this is purely an "outside the real world" problem.

In a paradox like this one, our intuition fails to tell us the correct answer. In this case, our intuition says that the always-switching strategy is ridiculous: why not stick with the original envelope? Why should it matter if we switch every time? I framed the question using the leprechaun with the extra envelope partially to convince myself that intuition is in fact wrong here: you should always switch envelopes in the problem. A direct calculation of the expected return conditioned on the value in the first envelope leads to the same conclusion.

But the leprechaun point of view illustrates why our intuition fails: the two-envelope problem setup is inherently unphysical. You cannot perform the experiment in the real world, and so you should not be surprised that the intuition we get from performing experiments in the real world does not apply.

To summarize my three points: the correct answer to your problem is that you should always switch. The intuitive answer to the problem is switching should not matter. The intuitive answer is wrong because this experiment cannot be conducted in the real world, and so real world rules do not apply.

A comment came from Nathan Bowler:

Mark Huber said: To summarize my three points: the correct answer to your problem is that you should always switch.

This is because:

(a)

Given the value of the contents of either envelope, the expected value of the contents of the other is always greater.

Mark Huber said: The intuitive answer to the problem is switching should not matter.

This is because:

(b)

Exchanging the envelopes before opening does not affect the probability distribution for the amounts of money in the envelopes.

Mark Huber said: The intuitive answer is wrong because this experiment cannot be conducted in the real world, and so real world rules do not apply.

More specifically, we have:

(c)

The expected amount of money in either envelope is infinite.

First note that if money is put at random into two envelopes in such a way that both (a) and (b) hold, then (c) must hold as well.

For any particular value for the amount of money in X, the expected value of the amount in Y is greater than the expected value of the amount in X (by (a)). Averaging over all possible values for the amount of money in X, E(Y) > E(X) (if both are finite). But from (b) we know that E(X) = E(Y). Hence E(X) > E(X), which is a contradiction if E(X) is finite. So E(X) is infinite, which is (c).
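For the distribution of this problem, (c) can also be checked directly. A short sketch (the variable names are illustrative): using Mark's marginal, P(10^n) = 3/2^(n+2) for n ≥ 1, so the n-th term of E(X) is 10^n · 3/2^(n+2), which grows like 5^n, and the partial sums of E(X) increase without bound.

```python
from fractions import Fraction

# Partial sums of E(X) = 1*(1/4) + 10*(3/8) + 100*(3/16) + ...
# The n-th term (n >= 1) is 10**n * 3 / 2**(n+2); consecutive terms
# grow by a factor of 5, so the series diverges and E(X) is infinite.
partial = Fraction(1, 4)          # contribution of the $1 outcome
terms = []
for n in range(1, 15):
    term = Fraction(10 ** n * 3, 2 ** (n + 2))
    partial += term
    terms.append(term)
print(float(partial))             # already in the billions after 14 terms
```

Each successive term dominates everything before it, which is exactly the behavior Nathan's argument requires.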

Now (c), on the face of it, does not seem to be a paradoxical condition (however, there is a closely linked paradox). But it can be seen to easily produce both (a) and (b). The following thought experiment constructs a situation in which both (a) and (b) hold, and in which the way that both arise from (c) is far more obvious:

Let two envelopes X and Y be independently filled with money with the same probability distribution as one another, and in such a way that the expected amount in either envelope is infinite.

When envelope X is opened, the amount of money it contains is found to be finite. But the expected value of the amount in envelope Y is infinite. So, given the chance, you should switch. But by symmetry the same would be true if envelope Y were to be opened first.

Stuart Anderson clarified further:

This reminds me of the radio program "A Prairie Home Companion" in which one of the standing jokes is that in a certain town in Minnesota "all the children are above average." In the present case, because the sum diverges, the expected value (which is after all a sort of weighted average) exceeds the value of each individual envelope. Hence, "all the envelopes are below average." This is of course counterintuitive, if one's intuition was formed by thinking about finite cases (as is usual). Really, this is just a rephrasing of Nathan's point. Since each envelope is below average, and you must always expect the other envelope to BE AVERAGE, it seems one must always switch.

Looking at the problem "locally" in terms of conditional probabilities, the initial symmetry of the game is broken as soon as you choose a specific envelope, because the rules exclude the possibility that the other envelope contains the same amount of money as the one you chose. So if you had chosen the other envelope first, you would have been presented with a DIFFERENT set of possible outcomes. If you think about the set of possible events conditioned upon the value of the envelope, you are looking at a different space of probability events.

Ib Jørgensen shifted the viewpoint:

It is clear that the situation gives rise to an experiment in which the expected return for switching envelopes (or indeed the expected amount of money in a picked envelope) is infinite.

Now the argument for switching envelopes says the expected value would increase if you switch. But here our intuition plays a trick on us. We are used to expected values that converge. In these "normal" situations, when you repeat an experiment a number of times, the average gain approaches the expected value.

When the expected value is infinite, on the other hand, we cannot expect the average gain to converge to anything.

Even though the expected value does not converge, it is still possible to carry out a long series of the experiment in reality, so we have to understand what it means for the average gain not to converge.

If you were to run a series of experiments with a random variable whose expected value is infinite (easily simulated in a spreadsheet), you would see that the average gain keeps jumping around. This happens because the distribution is such that the largest gain in the series is comparable to the sum of all the other gains. As a result, it does not matter what happened in all the experiments that did not produce the largest gain: just one experiment decides what you have gained after the long series.

Since a single experiment decides the outcome, the argument for switching envelopes based on expected values disappears. There is a 50% chance that, in the dominating experiment, you picked the envelope with the larger amount of money in the first place, in which case keeping the envelope gives you the larger gain for the series. This is the reason why arguments based on expected values fail when the expected value does not converge.
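The spreadsheet experiment Jørgensen describes can be reproduced in a few lines of Python (a sketch; the sampler and names are illustrative). The sampler draws from the envelope marginal of this problem: first the pair index n with probability 1/2^n, then one member of the pair. With a large number of draws, the single largest draw typically accounts for a substantial fraction of the entire sum, which is why the running average never settles down.

```python
import random

def sample_amount(rng):
    """Draw one envelope amount: pick pair n with probability 2**-n,
    then pick either member of the pair (10**(n-1), 10**n) fairly."""
    n = 1
    while rng.random() < 0.5:
        n += 1
    return 10 ** (n - 1) if rng.random() < 0.5 else 10 ** n

rng = random.Random(1)
draws = [sample_amount(rng) for _ in range(100_000)]
ratio = max(draws) / sum(draws)
print(f"largest single draw / total = {ratio:.3f}")
# The running average jumps every time a new record draw arrives,
# exactly as described above; it never converges.
```

Re-running with different seeds shows the final average itself varying wildly from run to run, driven almost entirely by the one dominating draw.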


Copyright © 1996-2018 Alexander Bogomolny
