Yet Another Monty Hall Paradox

Just when I thought I had seen every wrinkle on the Monty Hall problem, Raymond Smullyan has to go and come up with another one. Here's an excerpt from his book The Riddle of Scheherazade and Other Amazing Puzzles:

“And now,” said Scheherazade, “I have a paradox for you. There are three boxes labeled A, B, and C. One and only one of the three boxes contains a gold coin; the other two are empty. I will prove to you that regardless of which of the three boxes you pick, the probability that it contains the gold coin is one in two.”

“That's ridiculous!” said the king. “Since there are three boxes the probability is clearly one in three.”

“Of course it's ridiculous,” said Scheherazade, “and that's what makes it a paradox. I will give you a proof that the probability is one in two, and your problem is to find the error in the proof - since the proof must obviously contain an error.”

“All right,” said the king.

“Let's suppose you pick Box A. Now, the coin is with equal probability in any of the boxes, so if Box B should be empty, then the chances are fifty-fifty that the coin is in Box A.”

“Right,” said the king.

“Also, if Box C is empty, then again the chances are fifty-fifty that the coin is in Box A.”

“That's right,” said the king.

“But at least one of the boxes, B or C, must be empty, and whichever one is empty, the chances are fifty-fifty that the coin is in Box A. Therefore, the chances are fifty-fifty, period!”

“Oh my!” said the king.

What is the solution to the paradox?

Feel free to hash that out in the comments.

My intuition tells me that it might have something to do with overlapping probabilities, since the possibility that box B is empty and the possibility that box C is empty overlap (i.e., they could both be empty). Haven't thought it out in detail, though.

By looking in one of the boxes, one is effectively "decreasing the number of boxes." If you know with absolute certainty that a specific box is empty, you need not consider it in your probabilities as to where the coin might be.

This is an interesting variation, being couched in contingencies rather than in revealed information. But just as the details of why some piece of information was revealed can matter in calculating probabilities, the details of how each contingency might come about also matter when calculating probabilities.

"Let's suppose you pick Box A. Now, the coin is with equal probability in any of the boxes, so if Box B should be empty, then the chances are fifty-fifty that the coin is in Box A."

... and if Box B should not be empty, then the chances are nil that the coin is in Box A. It is, of course, not true that the probability that Box A contains the coin is (1/2) just because Box B might be empty. ...

"But at least one of the boxes, B or C, must be empty, and whichever one is empty, the chances are fifty-fifty that the coin is in Box A."

Which one is "whichever one" depends in part on the location of the coin, and that must be taken into account in the probability calculations. In the previous example, the probabilities were 1/2, 0, 1/2 contingent on Box B being empty, but which box was Box B was not contingent on the location of the coin. The box that is "whichever one" is contingent on the location of the coin, and the probabilities need to be analyzed with that in mind. A box can be "whichever one" because the "other" box has the coin, or it can be "whichever one" because of a combination of Box A having the coin and of some unspecified selection process, S. And crucially, the relative likelihoods of the two cases (coin in "other" box, coin in A and selected by S) depend on the details of S; they are not necessarily 1/2, 1/2.
An example should illuminate:
Suppose S is that Box B should be "whichever one" whenever possible.
Now if Box B is "whichever one", then the probabilities of the coin being in Box A and in Box C are both indeed 1/2.
If, however, Box C is "whichever one", then it is not at all true that the probabilities of the coin being in Box A and in Box B are both 1/2. The probability of the coin being in Box A is 0, and the probability of the coin being in Box B is 1; Box C will be "whichever one" only when the coin is in Box B.
The chances of Box B being "whichever one" are fairly obviously 2/3, from which it is a fairly simple calculation to show that the probabilities of the coin being in each box are 1/3, as one would expect.

These probabilities must, of course, result no matter what S is, otherwise we have an inconsistent mathematical system.

This is probably clear as mud, and no-one sane would calculate probabilities this way.

By Andrew Wade (not verified) on 16 Jan 2007 #permalink
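Andrew's selection-process argument can be checked exactly by enumeration. Here is a minimal Python sketch (the process S, the box labels, and the helper function are illustrative, following his example of always reporting Box B as the empty one when possible):

```python
from fractions import Fraction

# You have picked Box A; the coin is equally likely to be in A, B, or C.
# Selection process S (hypothetical): Box B is reported as "whichever one"
# (an empty box) whenever B is in fact empty; otherwise Box C is reported.
p = Fraction(1, 3)
outcomes = []                      # (coin_location, reported_empty_box)
for coin in "ABC":
    reported = "B" if coin != "B" else "C"
    outcomes.append((coin, reported))

def prob(pred):
    return sum(p for o in outcomes if pred(o))

p_report_B = prob(lambda o: o[1] == "B")                            # 2/3
p_A_given_report_B = prob(lambda o: o == ("A", "B")) / p_report_B   # 1/2
p_report_C = prob(lambda o: o[1] == "C")                            # 1/3
p_A_given_report_C = prob(lambda o: o == ("A", "C")) / p_report_C   # 0
# Total probability recovers 1/3, as the comment argues.
p_A = p_A_given_report_B * p_report_B + p_A_given_report_C * p_report_C
```

Note that under this S the two conditionals are 1/2 and 0, not 1/2 and 1/2, yet the marginal comes back to 1/3 regardless.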

I might make an ass of myself here...

I choose box A.

Now consider box C:
If box C is empty, there's a 1/2 probability that box A does not have the coin. But there's also a 2/3 probability that box C is empty in the first place, so the overall probability for this part is (1/2)(2/3) = 1/3.

Now consider box B:
If box B is empty, there's a 1/2 probability that box A does not have the coin. Again, there's a 2/3 probability that box C is empty, so the overall probability for this part is (1/2)(2/3)= 1/3 (probability that box B is empty and A is empty)

Summing: (1/3)+(1/3)=(2/3), which is the probability that we would expect for A to be empty.

I have a feeling I'm wrong...
For one thing, if I said "there's a 1/2 probability that box A has the coin" instead of "there's a 1/2 probability that box A does not have the coin" then I'd get the wrong answer... what the fuck??

"Again, there's a 2/3 probability that box C is empty"

That should be box B. That's what I get for cutting, pasting and altering.

For one thing, if I said "there's a 1/2 probability that box A has the coin" instead of "there's a 1/2 probability that box A does not have the coin" then I'd get the wrong answer... what the fuck??

Ah. P(Box A has coin and (Box B OR Box C is empty)) = P(Box A has the coin and Box B is empty) + P(Box A has the coin and Box C is empty) - P(Box A has the coin and Box B is empty AND Box C is empty). You need to subtract a P(Box A has the coin and Box B is empty AND Box C is empty) "correction term", otherwise you're counting this situation twice. With it:
P(Box A has coin AND (Box B OR Box C is empty)) = (1/3) + (1/3) - (1/3) = (1/3)
The "correction term" didn't matter before because P(Box A is empty AND Box B is empty and Box C is empty) = 0.

By Andrew Wade (not verified) on 16 Jan 2007 #permalink
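Andrew's inclusion-exclusion correction can be verified exactly by enumerating the three equally likely coin locations (the helper function and labels here are mine, not from the comment):

```python
from fractions import Fraction

locs = "ABC"                      # equally likely coin locations; you hold A
p = Fraction(1, 3)

def P(pred):
    return sum(p for c in locs if pred(c))

# Inclusion-exclusion for P(A has coin AND (B empty OR C empty)):
direct = P(lambda c: c == "A" and (c != "B" or c != "C"))
by_parts = (P(lambda c: c == "A" and c != "B")
            + P(lambda c: c == "A" and c != "C")
            - P(lambda c: c == "A" and c != "B" and c != "C"))
# Both equal 1/3: the subtracted term removes the double-counted case
# where A has the coin and B and C are therefore both empty.
```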

What is wrong with you guys? This is just one more example of misdirection. The contents (or lack thereof) of the unselected boxes are utterly irrelevant after the selection is made. There are two empty boxes and one gold coin box; you have a 1/3 chance of selecting the gold box. THE END

For those of you playing at home....

You have 3 quarters:
1. two-headed;
2. two-tailed;
3. one head, one tail.

You put them all in a bag, then you pull one out looking only at the top. It's a head. What's on the other side?

A. A tail (60/40) because there are 3 heads and 3 tails and you have eliminated one head.

B. A wash (50/50) it can't be the two-tailed coin so it must be either H/H or H/T.

C. A head ?????????????

since this generic problem constantly recurs on the internet, the real paradox is that no one learns about conditional probabilities.

let X be the event {the coin is in box X}, notX the complementary event. then

P{A} = P{A|notB} x P{notB} + P{A|B} x P{B} = 1/2 x 2/3 + 0 x 1/3 = 1/3.

all the verbiage - assuming it's logically correct - is just longhand for this trivial computation.

-charles
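charles's one-line total-probability computation can be written out directly (variable names are mine; the numbers are his):

```python
from fractions import Fraction

# P{A} = P{A|notB} * P{notB} + P{A|B} * P{B}
P_B = Fraction(1, 3)               # coin is in box B
P_notB = 1 - P_B                   # box B is empty: 2/3
P_A_given_notB = Fraction(1, 2)    # coin then equally likely in A or C
P_A_given_B = Fraction(0)          # A can't have the coin if B does
P_A = P_A_given_notB * P_notB + P_A_given_B * P_B   # 1/3
```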

Okay, here's a way of looking at it that satisfies me:

1. Let's take the case that it's not in B. This is a 2/3 probability (that it's not in B) times the 1/2 probability that it's in A. Multiplying those two together gives a 1/3 probability.

2. Now let's go to the case that it's not in C. First off, though, we have to rule out the overlap with the "not in B" case, i.e., if it's not in B, it's already covered by step 1. However, that only leaves the case that it is in B, which means the probability that it's in A is zero.

3. So the overall probability is 1/3 + 0 = 1/3, just as expected.

I have a feeling someone expressed this above in more technical terms, but this is how I look at it.

Charles, some of us do not speak "trivial computation" and have to make do with "verbiage" to express ourselves.

~David D.G.

By David D.G. (not verified) on 17 Jan 2007 #permalink

david dg:

understood, but anyone clever enough to figure it out logically (eg, kevin parker who reproduced in words exactly what I wrote in symbols) could get the idea of simple conditional probabilities ala venn diagrams in a blink and then wouldn't have to end their try at the next MH problem with "altho I may be wrong", "this is how I look at it", etc. (eg, knowing that P{A|B}=P{AB}/P{B} and what a venn diagram is is about all you really need.)

the last (and, as it happens, also first) time I encountered the MH problem (at volokh conspiracy - a legal blog, strangely enough), there were 200+ stabs at it of which most were wrong, a few were right with varying degrees of confidence, and exactly one was formal, right, and presented with confidence. I bet most of those 200 people are smarter than I am and could learn all they need to know to do these in less time than they spent groping around. plus it is useful to have the concept of conditional probabilities even if you don't spend all day solving MH problems.

just a suggestion. makes no nevermind to me.

-charles

DTis wants to know
"You have 3 quarters:
1. two-headed;
2. two-tailed;
3. one head, one tail.

You put them all in a bag, then you pull one out looking only at the top. It's a head. What's on the other side?"

There are six sides and each counts once. The probability of getting heads is 2/6 for coin 1 and 1/6 for coin 3. So having selected heads we have just 3 counts remaining out of the original 6: the count is 2 for coin 1 and 1 for coin 3. Therefore, for coin 1 the other side is heads with probability 2/3, and for coin 3 the other side is tails with probability 1/3.

For people who are not trained to think in terms of probability and who haven't given the mathematics of probability much thought it can seem baffling at first but with some practice it becomes easy enough to figure out.

By Explicit Atheist (not verified) on 17 Jan 2007 #permalink
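Explicit Atheist's side-counting argument is easy to check with a seeded simulation of DTis's three quarters (the setup below is my own sketch of the game, not code from the thread):

```python
import random

random.seed(1)
# DTis's three quarters: two-headed, two-tailed, one head / one tail
coins = [("H", "H"), ("T", "T"), ("H", "T")]

heads_up = heads_underneath = 0
for _ in range(100_000):
    coin = random.choice(coins)
    if random.random() < 0.5:        # a random side lands face up
        top, bottom = coin
    else:
        bottom, top = coin
    if top == "H":
        heads_up += 1
        heads_underneath += (bottom == "H")

# Given a head on top, the other side is a head about 2/3 of the time.
ratio = heads_underneath / heads_up
```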

Hold on, didn't Jason say this was a variation of the Monty Hall problem?

First, we are given two conditionals:
1. If B=0 then A=1/2 and C=1/2
2. If C=0 then A=1/2 and B=1/2

In MH, we are not given conditionals; we are given knowledge of the contents of one of the unchosen doors.

Thus, since all we have is a conditional, and no *new* knowledge, the solution to the MH problem via probabilities does not apply.

Thus, mere logic gives the two conditionals, but the same *could* be said for A, giving a third conditional:
3. If A=0 then B=1/2 and C=1/2

Adding that conditional to the list, according to the "proof," A does not equal 1/2; it equals 0.

Therefore, we are left with the original hypothesis (of the king) that A=1/3

I forgot to add to my post above:

...because all three conditionals have equal weight and equal bias according to the logic of Scheherazade.

A few posts ago, JR suggested (very gently) that those of us who don't "get" this kind of problem just haven't tried hard enough. I've tried, attentively, for over 40 years, with virtually no success. I've read books and articles, gutted through equations, and been tutored by top-notch probability professors and knowledgeable amateurs. I've not experienced this level of incomprehension, or frustration, in almost any other intellectual endeavor I've attempted. Just sayin'....

By Jeff Chamberlain (not verified) on 17 Jan 2007 #permalink

I agree that there is some misdirection going on here. The problem is first posed as which of the three boxes holds the coin? Take your pick. But at the end, the question is: you have picked box A, what are the odds it holds the coin? The answer must depend on lucky numbers and the stars.

jeff C:

have you tried graphic aids? for example, the coin in the box problem is exactly equivalent to throwing a dart at a circular dart board divided into three equal pie slices colored red, blue, green (an example of a simple Venn diagram). assuming that you throw randomly but always hit the board, clearly the likelihood (ie, probability) of hitting any specific color is 1/3. now if you throw and are told you didn't hit red, then you must have hit either blue or green. the probability that you hit either is equal, ie 1/2.

the first probability (1/3) is the "unconditional probability" of hitting, say, blue, the second (1/2) is the "conditional probability" of hitting, say, blue where "conditional" refers to the fact that you were told additional information, namely that the dart didn't hit red.

the answer to the puzzle actually isn't a number, it's the fact that Scheherazade isn't distinguishing between unconditional and conditional probabilities.

-charles
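charles's dartboard picture translates directly into a two-line exact computation (slice names and variables are mine, following his red/blue/green example):

```python
from fractions import Fraction

slices = ["red", "blue", "green"]           # three equal pie slices
p = Fraction(1, 3)

p_blue = p                                  # unconditional: 1/3
p_not_red = sum(p for s in slices if s != "red")    # 2/3
p_blue_given_not_red = p_blue / p_not_red   # conditional on "not red": 1/2
```

The jump from 1/3 to 1/2 is exactly the unconditional-versus-conditional distinction he describes.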

DTis's problem:

the unconditional event space of the other (lower) side of the chosen coin is: H1,H2,T1,T2,H,T (same as the upper side).

the event space conditioned on the chosen coin showing a head reduces to H if it's coin 1, T if it's coin 3, each equally likely.

or one can use a venn diagram comprising six slices with the slices colored as:

1,2-red (coin 1)
3,4-green (coin 2)
5-green, 6-red (coin 3)

given that your dart throw hits red, it's either in (1,2) or (5,6), each equally likely, ie, the unhit sixth is equally likely to be red (H) or green (T).

or you can just note that if the drawn coin has a head it has to be either coin 1 or coin 3, and each is equally likely.

-c

Sorry guys;

The event is the selection. There are three coins AA AB BB, so there is a 2/3 chance that one has selected a double (AA or BB). Therefore, there is a 2/1 likelihood that the bottom and top of the coin are the same.

The essence of the Monty Hall problem (and all variations) is to focus the victim's attention on spurious statistics derived using the subset of objects/choices/events remaining after the "true" selection has been made.

Charles: Thanks. Yes, I have employed visual aids. And I probably should have noted that the "Riddle of Scheherazade" did not baffle me (or at least not as much as the traditional "Monty Hall" formulation, although it took me a bit to figure out what was wrong with the oh-so-reasonable sounding statement of the "paradox"). I also probably could have reiterated something I've said before, that I'm not a "denier" or "resister." When Monty asks me if I want to switch, I will, I promise. But I won't really understand why, and that annoys me.

By Jeff Chamberlain (not verified) on 18 Jan 2007 #permalink

For the coin selection problem we have been told the following...

"You have 3 quarters:
1. two-headed;
2. two-tailed;
3. one head, one tail.

You put them all in a bag, then you pull one out looking only at the top. It's a head. What's on the other side?"

It is a 50-50 split whether the other side is "heads" or "tails"

Proof:

Coin 1 is eliminated. It is Tail-Tail and therefore the selected coin may not possibly be coin 1.

The remaining coin is then simply a choice between coin 2 or coin 3. You will note that the question is not which side of which specific coin you are looking at, simply whether it is heads or tails, and only one of the two coins will produce a "heads" result (coin 2).

Err, reversed coin 1 and 2. Sorry.

"there is a 2/1 likelihood that the bottom and top of the coin are the same."

true, but you didn't ask for the probability of the event (which BTW is a set of outcomes of a random action, not the action itself)

{"bottom and top of the coin are the same"} = {bottom=top=H or bottom=top=T}

you asked (implicitly) for the conditional probabilities

P{bottom=H|top=H} and P{bottom=T|top=H}

which is 1/2 for each.

-c

But at least one of the boxes, B or C, must be empty, and whichever one is empty, the chances are fifty-fifty that the coin is in Box A. Therefore, the chances are fifty-fifty, period!

The first sentence is true, but the second is not.

It seems to me that the error in the proof is that Scheherazade has changed the problem from a choice between three boxes to a choice between two boxes, having eliminated one box as being empty.

in trying to argue (for jeff C's benefit) that the dart board approach can help in understanding the MH problem, I stumbled upon what seems to be a very simple "proof". for his benefit, I'm reframing it as a dart game.

suppose you guess red. if the dart hit red, which it will with probability 1/3, you win by not switching to another color. so, your probability of winning by not switching is 1/3.

if the dart doesn't hit red, which it won't with probability 2/3, and you switch, you win with probability 1 (the dart not only didn't hit red by hypothesis but also didn't hit the color monty tells you it didn't hit). so the probability of winning by switching is 2/3 x 1 = 2/3.

also, I noticed an interesting point. IIRC, the MH problem states that monte will always choose one of the doors not guessed. but suppose that after you guess but before he chooses a door, he asks if you know the correct strategy for the MH problem. if you say yes and guessed wrong, he should open the door you guessed because that forces you to switch, and your probability of winning goes down to 1/2.

since I'm doing what I argued that others shouldn't do by offering "proof by verbiage" rather than formal analysis (been there, done that), I have to finish with "I think", (referring to the "proofs", not the answers, which I know is correct for the MH problem and I'm pretty sure for the augmented MH problem). ie, do as I say, not as I do (I'm pretty lazy).

-charles

also, I noticed an interesting point. IIRC, the MH problem states that monte will always choose one of the doors not guessed. but suppose that after you guess but before he chooses a door, he asks if you know the correct strategy for the MH problem. if you say yes and guessed wrong, he should open the door you guessed because that forces you to switch, and your probability of winning goes down to 1/2.

Not so fast. If you cotton on to this and he doesn't open the door you guessed, then you know where the MacGuffin is. What Monty needs to do is open the door you guessed when you guessed wrong on average 1/4 of the time, at random.

By Andrew Wade (not verified) on 18 Jan 2007 #permalink

ctw wrote "or you can just note that if the drawn coin has a head it has to be either coin 1 or coin 3, and each is equally likely."

and Peter wrote "The remaining coin is then simply a choice between coin 1 or coin 3. You will note that the question is not which side of which specific coin you are looking at, simply whether it is heads or tails, and only one of the two coins will produce a "heads" result (coin 1)."

You are both ignoring the relative probability of the coin that shows heads being coin 1 versus coin 3. They are not equally likely and therefore which coin it is is important. It is twice as likely that a randomly selected coin that shows heads is coin 1. So 2/3 of time the other side will be heads (coin 1)and 1/3 of the time the other side will be tails (coin 3).

By Explicit Atheist (not verified) on 18 Jan 2007 #permalink

To find the source of the paradox let A1 be the event "the gold coin is in Box A", and let B0 be the event "Box B is empty", and C0 the event "Box C is empty". Then the conversation can be rewritten as:

Scheherazade: "The conditional probability of event A1 given that event B0 has occurred is: P(A1|B0) = 0.5."
King: "Right."
Scheherazade: "Also, P(A1|C0) = 0.5."
King: "That's right."
Scheherazade: "But at least one of events B0 or C0 must occur, and both P(A1|B0) = P(A1|C0) = 0.5. Therefore P(A1) = 0.5!"
King: "Oh my!"

The paradox comes about because Scheherazade cunningly suggests that P(A1) = P(A1|B0), but this is only true if event A1 is independent of event B0, which is clearly not the case since event A1 is a subset of event B0.
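The events A1 and B0 defined above can be enumerated exactly, confirming both conditional values and the lack of independence (the helper function is my own sketch):

```python
from fractions import Fraction

locs = "ABC"                      # equally likely locations of the gold coin
p = Fraction(1, 3)

def P(pred):
    return sum(p for c in locs if pred(c))

P_A1 = P(lambda c: c == "A")                        # coin in Box A: 1/3
P_B0 = P(lambda c: c != "B")                        # Box B empty: 2/3
P_A1_and_B0 = P(lambda c: c == "A" and c != "B")    # A1 is a subset of B0
P_A1_given_B0 = P_A1_and_B0 / P_B0                  # 1/2
# P(A1) != P(A1|B0), so A1 and B0 are not independent,
# which is exactly where Scheherazade's "proof" cheats.
```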

Dear Peter, CTW, et al;

Let's play the coin game 30 times, assuming the outcomes come up in their expected proportions. The results are:

10 H/H;
5 H/T;
5 T/H;
10 T/T.

So, experimentally, we can demonstrate that for every 3 occurrences of an obverse head, we will have 2 reverse Heads and 1 reverse Tail. Odds equalling 2/1, I do believe.

Knowledge deduced after the selection cannot affect the results of the selection.

same problem as before. you MUST define clearly what events you are computing the probabilities of. here you are essentially computing (in a most awkward way) the a priori (unconditional) probabilities

P{top=H and bottom=H}=P{coin 1}=1/3
P{top=H and bottom=T}=P{coin 2}P{top=H|coin 2}=1/3 x 1/2 = 1/6

and the ratio is in fact 2/1. however, the probabilities you first asked for were the a posteriori (conditional) probabilities

P{bottom=H|top=H} and P{bottom=T|top=H} each of which is 1/2.

"Knowledge deduced after the selection cannot affect the results of the selection"

correct but irrelevant. your original question wasn't "what are the a priori probabilities for various selection outcomes" but "what are the a posteriori probabilities of various outcomes given some information about the results of the toss" which are by definition affected by the selection. these are different questions and therefore can (and in this case do) result in different answers.

-charles

As dtis says, after the first round it is twice as likely that the randomly selected coin showing heads is coin 1 versus coin 3, and coin 2 is impossible. So the other side will be heads (coin 1) 2/3 of the time and tails (coin 3) 1/3 of the time. There's no ambiguity at all in the wording of the question or the correct answer. I don't understand what ctw and Peter are talking about when they say the odds are one half for the other side being heads and one half for the other side being tails. That isn't true at all.

By Explicit Atheist (not verified) on 19 Jan 2007 #permalink

This is a three box question, not a two box question.

You/we started with Box A as 1/3 chance, and Boxes B + C as 2/3 chances.

Even after seeing that one of the non-A boxes is empty. It remains a three box question.

The chances of the coin being in box B after seeing that box C is empty is 2/3 not 1/2.

Exposing box C does not lessen the chances of the coin being in Boxes B + C, and if C is empty, B + C is still 2/3...so B now has a 2/3 chance of having the coin. Box A remains a 1/3 chance, even though we are down to two remaining boxes. Box A's chances do not rise to 1/2 just because we now know that C is empty, and Box B's chances of having the coin do not drop to 1/2. The Box B + Box C chance remains 2/3.

The chance of the coin being in box C after seeing that box B is empty is also 2/3.

1/2 never enters into the calculation if you start with 3 boxes.

John Allen Paulos has written on this one over and over again. http://www.math.temple.edu/~paulos/

well, this is a classic example of why I don't like "verbiage" solutions instead of formal solutions: people who do the former are most often wrong; even when people do the former and are right, I can't understand their solutions; and when I try verbiage solutions I always screw them up. dtis and EA are correct and I am wrong.

which was really lazy on my part since the formal solution is a one liner:

the event space is:

coin top
1......H1
1......H2
2......T1
2......T2
3......H
3......T

let Ht be head on top, Hb be head on bottom, etc. then

P{Hb|Ht} = P{HbHt}/P{Ht} = (1/3)/(1/2) = 2/3
P{Tb|Ht} = P{TbHt}/P{Ht} = (1/6)/(1/2) = 1/3

as they said.

interesting note on human psychology. my "proof" via Venn diagram in my 1/18 11:14AM comment bothered me at the time but since I had committed to the 1/2 probabilities, I fudged it to confirm that presumption. but the argument should be that given red, there are two segments for which the complementary segment is red, only one for which it is green, ie, 2 to 1.

mea culpa x 2.

-charles
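The one-line formal solution above amounts to counting rows in the six-row event space, which a few lines of Python can do exactly (the face list below encodes DTis's three quarters; the variable names are mine):

```python
from fractions import Fraction

# The six equally likely (top, bottom) faces of the three quarters.
faces = [("H", "H"), ("H", "H"),   # two-headed coin, either side up
         ("T", "T"), ("T", "T"),   # two-tailed coin
         ("H", "T"), ("T", "H")]   # the mixed coin
p = Fraction(1, 6)

P_Ht = sum(p for top, bot in faces if top == "H")                    # 1/2
P_HbHt = sum(p for top, bot in faces if top == "H" and bot == "H")   # 1/3
P_Hb_given_Ht = P_HbHt / P_Ht                                        # 2/3
P_Tb_given_Ht = 1 - P_Hb_given_Ht                                    # 1/3
```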

"Even after seeing that one of the non-A boxes is empty. It remains a three box question." (BMKMD.) This is one of the (many) things I don't "get" about the MH problem.

By Jeff Chamberlain (not verified) on 20 Jan 2007 #permalink

"...I don't get it."

I know. I know. It took me several times to hear and read this one until I got the idea.

If you try it with 100 boxes vs. 1 box, and gradually, step by step take one away, it makes more sense.

The odds remain 1/100 for the single box, 99/100 for the remaining group. Remember it's a group setup not a bet on the last 2 items. In this example it remains a test of 100 boxes.

As you expose one box at a time, the remaining 98 or 97, or 50 or 10 or 2 or 1 unopened boxes remain at 99 to 1 odds for the coin to reside in a non-A box.

The problem is that we forget that this is not a 2 box question. In this example it's a 99 to 1 question.

You'd be most unwise to bet on the A box with 1/99 odds, but as with all "large" number situations, it occasionally would turn up to be box A: 1 in 99 times on average.

The bigger the total number of Non-A boxes, the surer you should be in betting on the Non-A box.

It's a 100 box question.
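The 100-box intuition pump above is easy to simulate; in the sketch below (my own setup, with the host opening 98 empty non-A boxes), the box left closed is the prize box unless you happened to hold the prize all along:

```python
import random

random.seed(3)
N, trials = 100, 100_000
stay_wins = switch_wins = 0
for _ in range(trials):
    prize = random.randrange(N)
    guess = 0                            # you pick box A
    # 98 empty non-A boxes are opened; one non-A box stays closed.
    # That closed box is the prize box unless you already hold the prize.
    left_closed = prize if prize != 0 else random.randrange(1, N)
    stay_wins += (prize == guess)
    switch_wins += (prize == left_closed)

p_stay = stay_wins / trials              # about 1/100
p_switch = switch_wins / trials          # about 99/100
```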

jeff C:

here's the simplest "verbiage" solution I can imagine.

assume you guess box 1. the coin is in box 1 with p=1/3.

after monty shows you that box 2 is empty, you know the coin is either in box 1 or box 3. if you don't switch to box 3, you'll win if the coin is in box 1, ie, with p=1/3.

if you do switch to box 3, you'll win if the coin isn't in box 1, ie, with p=2/3. so you should switch.

not much insight, but certainly simple.

-c
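The stay/switch argument above can also be run as a seeded simulation of the standard game (my own sketch; the host here always opens an empty box the player didn't guess):

```python
import random

random.seed(2)

def monty(trials, switch):
    """Simulate the standard game; return the empirical win rate."""
    wins = 0
    for _ in range(trials):
        prize = random.randrange(3)
        guess = 0                                  # you guess box 1
        # the host opens an empty box you didn't guess
        opened = next(d for d in (1, 2) if d != prize)
        if switch:
            guess = next(d for d in (1, 2) if d != opened)
        wins += (guess == prize)
    return wins / trials

p_stay = monty(100_000, switch=False)     # about 1/3
p_switch = monty(100_000, switch=True)    # about 2/3
```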

See, here's the thing. I know I should switch. I understand the math, at least to the extent that I can follow the manipulations. I do believe that I'll have a better chance if I switch. I'll switch. I promise. But it's a matter of acceding to authority -- mostly of the people who are sure about this (notwithstanding that every time I've seen a discussion of the MH problem there seems a good bit of dispute even among folks who seemingly should know better).

The rest is mysterious, and one of the mysteries is why, when one of the boxes is opened and shown to be empty, it remains a 3-box problem and doesn't become a 2-box problem. If it remains a 3-box problem, then I understand the rest. But I don't understand why it remains a 3-box problem, and thus "the rest" is based on a premise I don't understand and is therefore not responsive to my inquiry. (BTW, the mystery is the same in the 100-box problem. Show me that first empty box and I don't understand why it has not now become a 99-box problem, and so on.)

Suppose, after I choose box A and Monty opens box B and it's empty (and, yes, I know that Monty knows where the prize is), a stranger appears on the scene. She sees 3 boxes, one of which is open and empty. If she chooses box A or box C are her odds 1/3 - 2/3 for box C? It seems to me that in this situation, box B is irrelevant and her odds are 50-50. She is, for all practical purposes, really choosing between only 2 boxes. If her odds change to 1/3 - 2/3 if she's advised of the history (me choosing box A and Monty opening box B) I don't understand why.

By Jeff Chamberlain (not verified) on 20 Jan 2007 #permalink

before monty opens box 2, the probability that the coin is in each box is 1/3. so the probability that the coin is in box 1 is 1/3 and the probability that it's in either box 2 or in box 3 is 2/3. after monty opens box 2 and shows it to be empty, the probability that the coin is in box 1 is still 1/3 and the probability that it's in box 2 or 3 is still 2/3, only you know it's not in box 2. so the probability it's in box 3 has to jump from 1/3 to 2/3 after monty opens box 2.

as for the stranger, look at it this way. suppose after monty opens box 2 you want time to think over whether to switch or not so you go to dinner. when you come back hours later, you are in the same position as a stranger who has been told the history of the game; you are confronting two closed boxes and one open box and know that the coin was originally put randomly in one of the three boxes. ie, it's the same 3-box problem as before only you have some additional information because one box has been opened. and clearly the probabilities haven't changed while you were at dinner.

on the other hand, if the stranger confronted the situation but was told that a coin had been put randomly in one of the two closed boxes, then the open box would in fact be irrelevant and the probabilities would be 1/2 for each closed box.

-c
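The difference between the two situations charles describes, a host who knows where the coin is versus a blind reveal that merely happened to show an empty box, can be simulated directly (the function and labels below are my own sketch, not from the comment):

```python
import random

random.seed(4)

def p_coin_in_A_given_B_shown_empty(host_knows, n=300_000):
    """Estimate P(coin in your Box A | Box B opened and seen empty)."""
    shown = hits = 0
    for _ in range(n):
        coin = random.choice("ABC")        # you always hold Box A
        if host_knows:
            # an informed host never opens the coin's box
            if coin == "A":
                opened = random.choice("BC")
            else:
                opened = "C" if coin == "B" else "B"
        else:
            # a blind reveal: B or C opened at random, maybe showing the coin
            opened = random.choice("BC")
            if opened == coin:
                continue                   # coin revealed; no "empty box" case
        if opened == "B":
            shown += 1
            hits += (coin == "A")
    return hits / shown

p_informed = p_coin_in_A_given_B_shown_empty(host_knows=True)   # about 1/3
p_blind = p_coin_in_A_given_B_shown_empty(host_knows=False)     # about 1/2
```

The informed host leaves your box at 1/3 (so switch); the blind reveal really does leave a fifty-fifty split, which is exactly the stranger's situation.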

a little elaboration. it's important to recognize that probabilities are a measure of uncertainty and that when monty opens box 2, he is providing information, ie, reducing uncertainty. that's why the probability that the coin is in box 3 "jumps" to 2/3. and that's the essence of a conditional probability - it's the probability of a situation (in this case, that the coin is in box 3) "conditioned" on additional information, in this case that the coin is not in box 2.

as a simple example of unconditional or a priori probabilities versus conditional or a posteriori probabilities, consider flipping a coin. the a priori probability that a random toss will result in H is 1/2. but the conditional or a posteriori probability that the result of the toss was H given the additional information that the result wasn't T is 1.

-c

There's another way to look at the Monty Hall problem that is a little backwards and complicated, but an alternate view can help people who can't get their head around the normal method.

You pick a door. Monty eliminates one. You're left with two doors, so that's 1/2 chance of a win for each door, right?

But consider the moment he eliminated one. Imagine if, every week, the contestants always picked door 1 and Monty eliminated door 2 if it had no prize, leaving the contestant to decide between door 1 and door 3. If, one week, he suddenly eliminated door 3, you'd know for sure there was a prize behind door 2 (because he always eliminated door 2 if he could). So you can see that he could give away critical information when he makes his choice. In this case he'd be giving the whole game away one night in three.

But Monty doesn't want to be too obvious, so he doesn't follow a regular system. He chooses a random one of the two doors to eliminate. But of course, if his random selection would have made him eliminate a prize door, he'd have to change his mind and pick the other one.

So whenever Monty chooses a door, you, the contestant, have to consider the possibility that he *would have* picked the other door, but couldn't since it had the prize. That is, there's a chance that the door left over was considered and rejected, because it has the prize!

The chance that this is what happened is, first, a 1/2 chance of picking either door 2 or door 3, and, second, a 1/3 chance of the door he picked having a prize. So, all up, a 1/6 chance that the door left over was considered for opening, but rejected because it has a prize.

So starting with your 1/2 chance for each door, you have to add 1/6 to the door that might have been rejected. And that makes 2/3.

By SmellyTerror (not verified) on 20 Jan 2007 #permalink

CTW: If I understand you correctly, "the odds" are 1/3 - 2/3 for someone who knows the history, but 1/2 - 1/2 for someone who doesn't know the history. That baffles me. For the former, it apparently remains a 3-door problem, but for the latter it's a 2-door problem.

If I'm told that a prize has been placed in one of 3 boxes, but not in box B, how (why?) is that different from telling me that a prize has been placed in one of 2 boxes, A or C?

By Jeff Chamberlain (not verified) on 21 Jan 2007 #permalink

" "the odds" are 1/3 - 2/3 for someone who knows the history, but 1/2 - 1/2 for someone who doesn't know the history."

no. I said that if you tell the stranger the history, she knows everything you do and can make the same decision you can make and will win or lose with the same probabilities you will. the fact that MH has opened a box doesn't change the game; it's still a 3-box game.

I just added the (probably confusing) observation that if you start a new game by putting the coin randomly into one of the two closed boxes, then you have created a 2-box game because the open box still lying around is irrelevant.

a game retains its character from beginning to end. nothing can convert a 3-box game into a 2-box game. they are separate entities. and that's the essence of the MH paradox. the instinctive approach is to convert the game from 3-box to 2-box after MH opens the box or door, or draws the coin (I know - even knowing better, I do it with every new variation. see multiple demonstrations of that foolishness in my comments above re the DTis problem). but that's wrong. it's always a 3-box problem.

the way to really understand what's going on is to define the event space (ie, the set of possible outcomes) for the game as in my 2:17AM comment on 1/20. for the MH game, since you guess randomly at first, we can assume you guess box 1. then the event space is:

box 1 | box 2 | box 3 | p
------+-------+-------+-----------------
  c   |   M   |   -   | 1/3 x 1/2 = 1/6
  c   |   -   |   M   | 1/3 x 1/2 = 1/6
  -   |   c   |   M   | 1/3 x 1   = 1/3
  -   |   M   |   c   | 1/3 x 1   = 1/3

where each row is a possible outcome. c means the coin is in that box (eg, for the first outcome it's in box 1), M means MH opens that box, and p is the probability of that outcome (which must all add up to 1). for example, row 1 is the case where the coin is in box 1 and MH opens box 2. since the probability that the coin is in any box is 1/3 and if it is in box 1, MH opens either box 2 or box 3 randomly, the probability of that outcome is 1/3 x 1/2 = 1/6. similarly for row 2. row 3 is the case where the coin is in box 2 and MH opens box 3. he won't open the box you guess and he won't open the box the coin is in, so he is certain to open box 3. the probability of that outcome is then 1/3 x 1 = 1/3, similarly with row 4.

the total probability that you win if you don't switch is the probability of the first two outcomes, ie, 1/3. the probability that you win if you do switch is the total probability of the last two, ie, 2/3.

everything there is to know about the MH problem is laid out before you when you look at it this way, so I'd suggest poring over it until you understand each piece. then you can work any problem of this sort.

-c
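[Editor's note: the event space described in the preceding comment is easy to check empirically. Here is a short Monte Carlo sketch, not from the original comments, assuming the standard rules: Monty never opens your box and never opens the box with the coin.]

```python
import random

def play(switch, trials=100_000):
    """Simulate the 3-box game; return the observed win frequency."""
    wins = 0
    for _ in range(trials):
        coin = random.randrange(3)      # box holding the coin
        guess = 0                       # as in the table, always guess box 1
        # Monty opens a box that is neither your guess nor the coin box
        opened = random.choice([b for b in range(3) if b != guess and b != coin])
        if switch:
            guess = next(b for b in range(3) if b != guess and b != opened)
        wins += guess == coin
    return wins / trials

print(play(switch=False))   # ≈ 1/3
print(play(switch=True))    # ≈ 2/3
```

The two frequencies match the totals of the first two and last two rows of the table.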

A stab at stating the fallacy succinctly:

The proof describes two conditional probabilities but those two c.p.'s don't fully describe the range of pre-selection probabilities. He left one box out. I could go through the math, but there isn't room in this comment.

Does that do it?

To all, and Charles especially, thank you. I very much appreciate your generous efforts to try to help me understand something which for whatever reason seems just not to compute for me.

By Jeff Chamberlain (not verified) on 21 Jan 2007 #permalink

taking my 1/20 3:19PM MH solution one step further.

having chosen a door, if you don't switch you win only if your choice was right, ie, with p=1/3. thus, you should switch NO MATTER WHAT monty does unless, of course, you were right and he opens your door.

ie, other than opening your door if you were right, he can at worst do something that makes your door less likely than one of the others: 1/3 vs 2/3 for the classic MH game, or 0 vs 1/2 for my modified game in which he opens your door if you've committed to switch and guessed wrong. and even if he does nothing, you're no worse off by switching.

so consider this MH game:

you choose a door and monty asks if you will commit to switching after he opens one of the three doors

you should commit to switching. even though you'll lose if you chose the right door, that will happen with p=1/3. so no matter what monty does, you're never worse off than a random guess.

and monty should open your door if you guess wrong because you'll then win with p=1/2 rather than p=1 (you have to switch to one of the other doors, chosen randomly).

-charles
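[Editor's note: the modified game above can also be simulated. A rough sketch, not from the original comment, assuming Monty plays his best response: he opens *your* door whenever you guessed wrong, forcing you to choose randomly between the other two.]

```python
import random

def committed_switcher(trials=100_000):
    """You commit to switching; Monty opens your door whenever you guessed wrong."""
    wins = 0
    for _ in range(trials):
        coin = random.randrange(3)
        guess = 0                   # your initial pick
        if guess == coin:
            continue                # Monty opens another door; committed to switching, you lose
        # you guessed wrong, so Monty opens your (empty) door;
        # you pick randomly between the two remaining doors
        wins += random.choice([1, 2]) == coin
    return wins / trials

print(committed_switcher())   # ≈ 1/3
```

The result, about 1/3, confirms that even against this adversarial Monty the committed switcher does no worse than a random guess.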

Hey guys;

It's a two box problem.

1. You pick a box.
2. Monty shows you an empty box.
3. You pick again from the remaining 2 boxes.

A. SIMPLE LOGIC
Assume that, after step two, Monty disqualifies you and brings in another contestant to choose from the remaining two boxes. You and she have exactly the same knowledge, which is that the remaining two boxes are PRIZE and notPRIZE. Logically, your odds cannot be 2/1 while hers are 50/50.

B. BRUTE FORCE
Take 3 poker chips:
1. WHITE (W) = PRIZE,
2. RED (R) = EMPTY1,
3. BLUE (B) = EMPTY2.

The selection possibilities are:
WRB,
WBR,
RBW,
RWB,
BWR,
BRW.

Pick any column to be Monty's choice; we'll use column two for this demonstration. We know that Monty did not select the prize (given). So we eliminate any possibility in which column two is W.

New Selection Universe;
WBR,
WRB,
RBW,
BRW.

So, there are 4 possibilities, W occurs with the same frequency in both columns, and, no matter which column you assign to the contestant and which to unselected there is no advantage to either keeping or changing one's pick.

You get the same results no matter which column of the original selection universe you assign to Monty.

Sorry;

Had my head totally up my ass. Combinations right, frequencies of occurrence wrong. CTW et al. correct in all aspects.

Quick explanation.
1. Because Monty always shows an empty box (problem definition), the unselected box must always be empty if you have selected the prize box, or must be the prize box if you have selected an empty box (A/notA, where A = your choice).

2. So, if A = Empty (2/3), the unselected box is the prize.
If A = prize (1/3), the unselected box is empty.

Oh, and my second contestant? She has exactly the same odds (2/1) as contestant one if she knows which box contestant one selected, and 50/50 if she doesn't.

Again humble apologies for the screw up.

It may be helpful to demonstrate why the second contestant would have a 50-50 shot.

First, we note that she has a 50-50 shot of choosing the same box the first contestant did. If so, there is a 1/3 chance that that is the correct box, and a 2/3 chance that it is the incorrect box (as proven earlier). If not, then there is a 2/3 chance that it is the correct box, and only a 1/3 chance that it is incorrect. Now let's figure out our probability for each of the four possibilities so we can calculate the probability of being correct (P(s) is the probability of choosing the same box as the first contestant, while P(1c) is the probability that the first contestant was correct):

P(s)*P(1c) = 1/2 * 1/3 = 1/6 (win)
P(s)*P(1c!) = 1/2 * 2/3 = 1/3
P(s!)*P(1c) = 1/2 * 1/3 = 1/6
P(s!)*P(1c!) = 1/2 * 2/3 = 1/3 (win)

The two probability calculations marked "(win)" are the components of the probability of the second contestant being correct. If we add them up, we get:

P(2c)=1/6+1/3=1/2

In essence, while the second contestant knows that one of the boxes is more likely to be correct, she has no way of knowing which of the boxes is more likely to be correct. So she has to make a coinflip to decide which is more likely.
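[Editor's note: the 50-50 result for the second contestant is easy to check by simulation. A short sketch, not from the original comment, assuming she cannot tell which closed box the first contestant picked and so chooses between them at random.]

```python
import random

def second_contestant(trials=100_000):
    """A new contestant, ignorant of the first pick, chooses between the two closed boxes."""
    wins = 0
    for _ in range(trials):
        coin = random.randrange(3)
        first = 0                    # first contestant's (hidden) guess
        # Monty opens a box that is neither the first pick nor the coin box
        opened = random.choice([b for b in range(3) if b != first and b != coin])
        # she flips a coin between the two boxes still closed
        pick = random.choice([b for b in range(3) if b != opened])
        wins += pick == coin
    return wins / trials

print(second_contestant())   # ≈ 1/2
```

Even though one closed box holds the coin with probability 2/3, her inability to identify it averages the two cases out to 1/2.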

This is a request on the side, so to speak. Can anyone explain to me how the game "The Liar Has The Coin" works, either mathematically or logically? A few years ago, I played this on a group of college students, keeping them baffled and up till 4AM. Though the "trick" works every time, I'm unable to explain it even to myself.

First, you have to assume that the liar always lies and the other person always tells the truth, and that one of them has always taken the coin.

LIAR = L,
OTHER = T,
COIN = C,
NO COIN = X.

There are four possibilities:
1. LC,
2. LX,
3. TC,
4. TX.

You ask "Did the liar take the coin?" for each possibility:
1. LC = NO,
2. LX = YES,
3. TC = NO,
4. TX = YES.

So, "NO" means that the asked person must have the coin, and "YES" means the other person must have the coin.
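[Editor's note: the four cases can be checked mechanically. A small sketch, not from the original comment, that encodes the liar's negation and verifies the yes/no rule in every case.]

```python
from itertools import product

# asked_is_liar: whether the person you ask is the liar
# asked_has_coin: whether that same person holds the coin
for asked_is_liar, asked_has_coin in product([True, False], repeat=2):
    # truth value of "the liar took the coin": true exactly when
    # the liar is asked and has it, or the truth-teller is asked without it
    liar_has_coin = (asked_is_liar == asked_has_coin)
    # the truth-teller reports the truth; the liar negates it
    answer = (not liar_has_coin) if asked_is_liar else liar_has_coin
    # the rule: "no" means the asked person has the coin, "yes" means the other does
    assert answer == (not asked_has_coin)

print("rule holds in all four cases")
```

The assertion passing in all four cases is exactly the table above: NO identifies the asked person as the holder, YES identifies the other.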

DTIS,

Thanks so much for the explanation. What I was missing all this time was the equivalency of L and T, wrongly suspecting there must be some numerological hoo-doo about "No" and "Liar" being in hijinks together.