Food for thought:
http://consc.net/papers/envelope.html
This paper states that problems like this with an infinite expected value, even with a proper distribution function to draw values from, lead to an apparent paradox that is merely counter-intuitive due to the "fuzzy arithmetic of infinity". In this respect it is similar to the St. Petersburg paradox.
http://en.wikipedia.org/wiki/St._Petersburg_paradox
Originally posted by PBE6: As for the second part of the question, if we never see the value of the money inside the envelope then we really haven't made a selection yet. No information flows from simply selecting an envelope; we can only make a proper decision once we open it.

But what are you (in your strategy) doing with the information? Nothing.
You always switch.
Why do you need to see the contents?
Having a strategy implies you get more than one go.
If you only ever get 1 shot, then in real life it does become more about utility and personal circumstances. And this is where the tension is in game shows like Millionaire and Deal or No Deal.
Comparing this puzzle with the St. Petersburg paradox: if we (notionally) mark the envelopes with A for the lower value and B for the higher value:
Is the expected return from always choosing A infinite?
Is the expected return from always choosing B infinite?
If both answers are yes, then it makes no difference whether you switch or not... does it?
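Both questions can be made concrete with a quick sketch. As an illustrative assumption (taken from the coin-flip setup discussed later in the thread, not stated in this post), suppose pair k holds $10^(k-1) and $10^k and occurs with probability 1/2^k; the function name below is mine:

```python
# Partial sums of the expected value for "always take the lower envelope" (A)
# and "always take the higher envelope" (B). Assumed model (hypothetical):
# pair k holds $10^(k-1) and $10^k and occurs with probability 1/2^k.
def partial_expectations(terms):
    ea = eb = 0.0
    for k in range(1, terms + 1):
        p = 0.5 ** k              # probability the coin flips stop at pair k
        ea += p * 10 ** (k - 1)   # lower envelope of pair k
        eb += p * 10 ** k         # higher envelope of pair k
    return ea, eb

for terms in (5, 10, 20):
    print(terms, partial_expectations(terms))
# Each successive term is 5 times the previous one, so neither sum converges.
```

On this model the answer to both questions is yes: E[A] and E[B] are both infinite, so "switch vs. stay" cannot be settled by comparing them, which is the sense in which the "fuzzy arithmetic of infinity" applies.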
Originally posted by aging blitzer: But what are you (in your strategy) doing with the information? Nothing. You always switch. ...[text shortened]...

That is exactly what I would do with the information - use it to determine the utility of the money to me given my circumstances, as stated previously.
The game shows you mentioned do not operate in the same way as this problem. Here the possible sum can be infinite; on both game shows the maximum values of the prizes are known (among other details). Once the maximum is known, there is no longer a perceived paradox.
Originally posted by PBE6: That is exactly what I would do with the information - use it to determine the utility of the money to me given my circumstances, as stated previously. ...[text shortened]...

Yes, in real life, fine.
But back in the problem where the expectation is infinite:
Is the expected return from always choosing A infinite?
Is the expected return from always choosing B infinite?
Does switching make any difference... in the long run?
Originally posted by aging blitzer: Yes, in real life, fine. But back in the problem where the expectation is infinite, is the expected return from always choosing A infinite? Is the expected return from always choosing B infinite? Does switching make any difference in the long run?

The problem is that the sum for the expected amount of money is infinite. This means that working out expected returns is going to give nonsense. Let's just find the expected number of envelope pairs. The probability that there is exactly one pair is 1/2, that there are exactly 2 is 1/4, and so on, so the expected number of pairs is given by:
pairs = 1/2 + 2/4 + 3/8 + ... + n/2^n + ... = S(1/2)
where S(x) = sum {n=0 ... infinity} nx^n
Using the identity x*d/dx x^n = n*x^n we can work out S(x):
S(x) = x d/dx sum {n=0 ... infinity} x^n
The sum was worked out centuries ago and comes to 1/(1-x), doing the algebra gives:
S(x) = x d/dx 1/(1 - x) = x / (1-x)^2
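As a quick sanity check on the algebra (the function name below is mine, purely for illustration), the partial sums of n*x^n at x = 1/2 do settle on the closed form x/(1-x)^2 = 2:

```python
# Compare partial sums of S(x) = sum n*x^n with the closed form x/(1-x)^2.
def s_partial(x, terms):
    return sum(n * x ** n for n in range(terms + 1))

x = 0.5
closed_form = x / (1 - x) ** 2   # = 2 when x = 1/2
print(s_partial(x, 60), closed_form)  # partial sum converges to the closed form
```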
And we can put in x = 1/2 to find that the expected number of envelope pairs is 2. Let's just assume that that is the actual number of envelopes in the box and use the following strategy:
If we see $1 we swap, as the other envelope is known to have $10 in it.
If we see $10 then, as we expect 2 pairs of envelopes, there is a 50% chance the other envelope holds $100, so we may as well go for it, as losing $9 isn't a disaster.
If we see $100 or more then we expect that that is the largest amount of money present and keep it.
The problem posted does have a finite answer, despite the tie-in with the St Petersburg principle.
I play this game and pull out a pair, and open an envelope. Depending on the amount, I may have eliminated some possibilities for the number of heads (or I may not).
Anyways, the expected payoffs (compared to the original amount):
If I stay, then my payoff is 1.
If I swap the envelope with $1, then my payoff is 10, as $1 is the low amount (100% chance of improving).
If I swap the envelope with $10, then what are my odds of improving?
Well, I have a 50% chance that that was the only pair, in which case the other envelope holds the low amount.
The other 50% of the time, I have equal odds of high or low, which means I have a 25% chance overall of improving my sum.
That makes the expected payoff 10*25% + 0.1*75% or 2.575.
Oddly enough, this same payoff holds for any greater value, with the key difference being that I know more and more heads were tossed.
The determining factor in whether to swap or not seems to be the multiplier, which breaks even at 3.
If it's less than 3, then you only swap with the low amount.
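Under this 1-in-4-up / 3-in-4-down model, swapping a non-minimum envelope returns an expected multiple of m/4 + 3/(4m) of what you hold, where m is the multiplier. A short sketch (the helper name is mine) confirms the 2.575 figure and the break-even point at 3:

```python
# Expected payoff of swapping a non-minimum envelope, as a multiple of the
# amount held, under the 1-in-4-up / 3-in-4-down model described above.
def swap_multiple(m):
    return 0.25 * m + 0.75 / m

print(swap_multiple(10))  # ~2.575, the figure quoted for the $10 case
print(swap_multiple(3))   # ~1.0: break-even, since m^2 - 4m + 3 = 0 at m = 3
print(swap_multiple(2))   # ~0.875: below 3, swapping a non-minimum loses
```

Setting m/4 + 3/(4m) = 1 gives m^2 - 4m + 3 = 0, whose roots are 1 and 3, which is where the break-even multiplier of 3 comes from.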
Originally posted by geepamoogle: The problem posted does have a finite answer, despite the tie-in with the St Petersburg principle. ...[text shortened]... If it's less than 3, then you only swap with the low amount.

"If it's less than 3, then you only swap with the low amount."
Not only didn't I understand the rest of your post, but surely this conclusion makes no sense. Which is 'the low amount'?
Originally posted by Mephisto2: "If it's less than 3, then you only swap with the low amount." Not only didn't I understand the rest of your post, but surely this conclusion makes no sense. Which is 'the low amount'?

The problem as given works in this general way. For each pair of envelopes there is a multiplier at work. In this case, it's 10.
So once you are aware of the amount in one envelope, you know it's either 10 times the amount in its mate, or else its mate is 10 times as much.
If I find that the amount is the low amount ($1 in this case), then I know the other has $10.
If the amount I find is anything greater, then it ends up I only have a 1-in-4 chance that the other envelope is greater, but the payoff is 10 times higher, which makes the rational choice to swap, especially in a repeated game. This is also true in the event you picked the other one in the first place; although it is counterintuitive that that rule would maximize average payoff, the logic is sound nonetheless.
But let's assume for the moment that the amount only triples each time, so that the first pair of envelopes has $1 and $3 respectively, the second has $3 and $9, the third pair $9 and $27, etc.
And suppose you find $9 in the envelope. You still have 1-in-4 odds of improving your lot by picking the other, and will win $27 in this case. The other 3-in-4 times you'll wind up with $3 instead.
So your expected winnings for swapping are ($27 + 3*$3)/4, or $9. So either choice is equally acceptable rationally. Of course, if you get the $1, then you'll always swap for $3.
Now what if it only doubles each time? ($1/$2 - $2/$4 - $4/$8 - etc.)
And suppose you get a $4 envelope. By swapping you can expect to get $8 a quarter of the time, and $2 the rest of the time, for an expected payout of ($8 + 3*$2)/4 = $3.50, which means you'll get less on average than what you can get now. Of course, if you get the $1 envelope here, swapping will always net you $2, because $1 is always the 'low amount'.
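Both dollar examples drop out of the same calculation. A tiny sketch (helper name mine) under the same 1-in-4-up / 3-in-4-down model:

```python
# Expected dollar value of swapping, given the amount held and the per-pair
# multiplier, under the 1-in-4-up / 3-in-4-down model from earlier posts.
def swap_value(amount, m):
    return 0.25 * (amount * m) + 0.75 * (amount / m)

print(swap_value(9, 3))  # 9.0: tripling game, swapping a $9 envelope breaks even
print(swap_value(4, 2))  # 3.5: doubling game, swapping a $4 envelope loses on average
```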