Another Way to Be Confused About the Monty Hall Problem

In the Monty Hall problem, you are confronted with three identical doors, one of which conceals a car while the other two conceal goats. You choose a door at random, number one say, but do not open it. Monty now opens a door he knows to conceal a goat. He then gives you the option of sticking or switching. What should you do to maximize your chances of winning the car?

As we are all by now aware, the correct answer is that you double your chances of winning by switching doors. Most people find this counterintuitive on the grounds that after Monty opens a door, only two equally likely options remain. Thus, there is no advantage to be gained from switching.

Consider the following argument in defense of that view:

I claim that you have erred in initially assigning a probability of 1/3 to each of the three doors. This assignment was premised on the idea that, for all you knew to the contrary, each of the three doors was equally likely to conceal the car. But it did not adequately consider all of the information you had at your disposal. In particular, you knew that (1) you were going to choose door one and that (2) Monty would then open a goat-concealing door. These facts alter the way you should assign probabilities.

You see, at the moment you choose door one you know that either door two or door three will be eliminated. Regardless of which it is, you know that there will then be two equally likely doors remaining, one of which will be door one. It follows that door one never had a 1/3 probability of being correct. Rather, it had a 1/2 probability of being correct all along.

What do you think?

There are a number of ways to expose the flaw in the argument, but I especially like the following: If we accept this sort of reasoning, then regardless of the situation it will always be impossible to assign probabilities in a non-contradictory way.

Imagine that we have n equally probable hypotheses, H1, H2, ...,Hn. Each of these hypotheses gets a probability of 1/n. Imagine now that we are talking to an omniscient being. We ask Him, “Please point to a true hypothesis out of the collection H1 or H2, H1 or H3, H1 or H4, ..., H1 or Hn.” After he does so, we could follow our previous reasoning to conclude that H1 now has a probability of 1/2.

What if we took the hypotheses three at a time? We ask the omniscient one to name a true hypothesis out of H1 or (H2 or H3), H1 or (H2 or H4), and on we go allowing the hypotheses in the parentheses to cycle through all possible pairs of subscripts. Once a true triple is pointed out, we will conclude that H1 has probability 1/3.

By continuing in this way, we could justify a probability of 1/n for H1, for any integer n we choose (this might require altering our partition of the sample space into possible hypotheses). And it works the other way too! Just replace H1 with its negation and use the same trick. Partition the hypothesis “Not H1” into some suitable number of equiprobable possibilities and use our previous argument to drive its probability down as low as you wish. This has the effect of driving the probability of H1 as high as you wish.
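The pairwise version of this trick is easy to check numerically. Here is a sketch (my own illustration, not from Weintraub's paper) that models the oracle as pointing to a uniformly chosen true disjunction containing H1. Conditioning on the oracle's answer leaves H1 at probability 1/n, not 1/2:

```python
import random

def trial(n):
    """One round: a true hypothesis is drawn uniformly from 1..n, and the
    oracle points to a true disjunction 'H1 or Hk' (k in 2..n)."""
    true_h = random.randint(1, n)
    if true_h == 1:
        k = random.randint(2, n)   # every disjunction containing H1 is true
    else:
        k = true_h                 # only 'H1 or H_true' is true
    return true_h, k

def cond_prob_h1(n, rounds=200_000):
    """Estimate P(H1 is true | oracle pointed to 'H1 or H2')."""
    pointed = h1_true = 0
    for _ in range(rounds):
        true_h, k = trial(n)
        if k == 2:
            pointed += 1
            h1_true += (true_h == 1)
    return h1_true / pointed

print(cond_prob_h1(5))  # stays near 1/5, not 1/2
```

The point is that when H1 is true, any of the n-1 disjunctions could have been named, and the conditional probability must account for that dilution, which is exactly the step the flawed argument skips.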

Thus, accepting the argument presented at the beginning of this post would force us to conclude that the whole idea of a meaningful calculus for probabilities is a pipe dream, and that the whole subject is a bottomless pit of contradictions and black holes. Then again, after pondering the Monty Hall problem for a while that seems like an all too reasonable conclusion!

Incidentally, I encountered this argument in a paper by Ruth Weintraub, entitled “A Paradox of Confirmation.” It was published in the philosophy journal Erkenntnis in 1988. It is part of the considerable professional philosophical literature on the Monty Hall problem (technically, Weintraub was discussing the three prisoners problem, which is isomorphic to the MHP). Wading through this literature has lately been a major project of mine.

You can exhaust the possibilities quite easily.

WLOG, you choose door number 1.

Cases A B C = prize behind door 1 2 3 respectively.

Case A, Monty chooses 2 or 3 to reveal. You switch, you lose. Chance 1 in 3.

Case B, Monty chooses door 3 to reveal. You switch to 2 and win. Chance 1 in 3.

Case C, Monty chooses door 2 to reveal. You switch to 3 and win. Chance 1 in 3.

Assumes of course that Monty must always reveal a door. Computer simulation shows that this analysis is correct given the assumptions (for the empiricists among us).
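Such a simulation might look like this (a sketch assuming the prize is placed uniformly and Monty picks randomly when both unchosen doors hide goats):

```python
import random

def play(switch):
    """One game of standard Monty Hall; returns True if the player wins."""
    car = random.randrange(3)
    pick = 0  # WLOG the player picks door 0
    # Monty opens a door that is neither the pick nor the car
    opened = random.choice([d for d in range(3) if d != pick and d != car])
    if switch:
        # move to the one door that is neither picked nor opened
        pick = next(d for d in range(3) if d != pick and d != opened)
    return pick == car

n = 100_000
stick = sum(play(False) for _ in range(n)) / n
swap = sum(play(True) for _ in range(n)) / n
print(stick, swap)  # roughly 1/3 and 2/3
```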

By Craig Pennington (not verified) on 15 Nov 2007 #permalink

Eventually all this mathematical theory has to get back to the real world. Although this alternative has a patina of plausibility, it fails a simple consideration of all the possible ways the game can turn out, and that should be sufficient to dismiss it. (OTOH, it is essentially the argument made by those math experts who tried to take Marilyn vos Savant to task; a few rounds of the game played for real money would quickly convert any proponent.)

"Assumes of course that Monty must always reveal a door."

True, sometimes Monty offers you $500 instead of the prize.

I think the argument confuses how randomness works. Specifically, the quote

... you knew that (1) You were going to choose door one and that (2) Monty would then open a goat-concealing door. These facts alter the way you should assign probabilities.

implies that probabilities can be assigned after you make a choice.

There are actually two random events: the random placement of the prize followed by a random choice by the player. The player's choice is independent of the placement, so one cannot assign probabilities based on his choice.

I think this is exactly the same kind of fallacy as in the Two-envelopes paradox in Wikipedia. In that problem you're offered a choice between two envelopes that contain $10 and $20.

After you make your choice, you're asked whether you'd like to switch. The argument goes that the other envelope has 2X with 0.5 and X/2 with 0.5 where X is the amount in your choice, and since the expected value is 5X/4 > X, you should switch. The same reasoning applies after that switch, too, so you switch indefinitely.

Again there are two events: the random placement of $20 and your choice. There are no more random events. Therefore, after having made the choice no probabilities can be assigned to the other envelope possibly containing either amount.
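A simulation makes the point concrete (a sketch with the $10/$20 envelopes, comparing always-stick against always-switch head to head):

```python
import random

def expected_returns(rounds=100_000):
    """Envelopes hold $10 and $20; you pick one at random.
    Returns the average payoff of always sticking vs always switching."""
    stick = switch = 0
    for _ in range(rounds):
        envelopes = [10, 20]
        random.shuffle(envelopes)
        stick += envelopes[0]   # keep your random pick
        switch += envelopes[1]  # take the other envelope
    return stick / rounds, switch / rounds

print(expected_returns())  # both near $15; the 5X/4 argument never materializes
```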

You are all missing the obvious solution.

The underlying objective is to maximize U (utility). Once fundamental needs are satisfied, utility quickly switches from an energy obtaining and conservation function and a reproductive function to a happiness function (Burnham 1996). It is not specified in your outline of the problem, but we can assume the "decider" is well fed, lives in a house, and is getting it on a semi-regular basis.

Therefore, there is a way to always win, every time. All you have to do is choose the goat. You want the goat. Go for the goat. The goat will make you happy (increases U more than would winning the car).

This way, you have a 2/3 chance of getting the goat. Why? because you are going to randomly pick one of three equally likely outcomes, two of which are "goat" one of which is "not goat."

Why do I say you will always win even though there is a smaller but non-trivial chance of getting stuck with the car? Because you can always, always trade the car for a goat. That should never be a problem.

Does the "Unexpected Hanging" have anything to do with this?

Re: Trading the car for the goat: You will still have to pay taxes on the car. A friend of mine won a Miata that way and hated his tax bill.

The original argument is right because when you make a choice, Monte is going to open a door. No matter what, HE CANNOT OPEN THE DOOR WITH THE CAR, and he cannot open the door you chose. He always eliminates one of the two goat doors.

So by eliminating the door he opens, the probability increases that the one he did not open is the door with the car. He gives you no extra information about the door you chose, only about the one he did not open. Therefore, you should switch.

Suppose there were 100 doors (99 with goats, one with a car), and after you choose one, he opens 98 to show 98 goats. Would you switch to the single door he did not open?

It really isn't that tough. There are two simple and surprisingly intuitive solutions.

(1) When he offers you the other door, you are getting two doors for one (the door he opened plus the remaining door), so double your chances and take him up on his offer.

(2) As my son put it after we actually tried it out with cups and a pea, you are only WRONG to change when your first choice was right (which would be 1/3 of the time), making it correct to change 2/3 of the time.

I think doug b's son's explanation together with the one quoted in the post confirm a rule that I observed in a previous 200+ comment resurrection of the seemingly immortal MH problem:

The probability that a verbal explanation is correct is a rapidly decreasing function of the explanation's length.

- Charles

Mark Haddon's book The Curious Incident of the Dog in the Night-Time has a really good (and funny) section where the narrator, a math- and science-obsessed kid with Asperger's Syndrome, explains in both diagrams and equations how to solve this problem. I wish Science Blogs had a book club; that book would definitely make it onto the list, as a funny and original positive portrayal of a staunch atheist and rational thinker.

Here's yet another way to be confused: a slight variant on the original rules.

The new rules of the game are as follows. You choose a door; Hall flips a coin to choose one of the other two doors, and opens it. If it's a car, you lose right away. If it's a goat, he asks you whether you'd like to stick with your current choice or switch to the other one.

You play. You choose. He opens a door. It's a goat.

Should you switch? Does it make a difference?

Paul Crowley, let's try the brute force method. I'm pretty sure all the cases are equally likely.

WLOG, choose door 1
Case 1: Car is behind 1. Host reveals 2. You switch, you lose.
Case 2: Car is behind 1. Host reveals 3. You switch, you lose.
Case 3: Car is behind 2. Host reveals 2. You lose.
Case 4: Car is behind 2. Host reveals 3. You switch, you win.
Case 5: Car is behind 3. Host reveals 2. You switch, you win.
Case 6: Car is behind 3. Host reveals 3. You lose.

So, you have a 1/3 chance of being screwed. Barring that, you are equally well off switching or not switching.

The only difference between your situation and the Monty Hall problem is that in cases 3 and 6, the host will say, "Screw the coin flip, I'm not that evil," and choose the other door. So in those two cases, you'd win by switching, making a total of 2/3 chance of winning by switching.
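A quick simulation of Paul's variant bears this out (a sketch assuming the host's coin flip is fair):

```python
import random

def variant_trial():
    """Monty flips a coin to open one of the two unchosen doors.
    Returns None if the car is revealed (immediate loss),
    otherwise True iff switching would win."""
    car = random.randrange(3)
    pick = 0                    # WLOG the player picks door 0
    opened = random.choice([1, 2])
    if opened == car:
        return None             # car revealed: game over
    other = 3 - opened          # the remaining unopened, unchosen door
    return other == car

rounds = 100_000
results = [variant_trial() for _ in range(rounds)]
shown_goat = [r for r in results if r is not None]
print(len(shown_goat) / rounds)           # ~2/3 of games survive the reveal
print(sum(shown_goat) / len(shown_goat))  # ~1/2: switching is indifferent
```

Conditioned on seeing a goat, switching wins only half the time, as the case analysis above predicts.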

This was always one of my favorite probability problems/late night discussion topics. But my all time favorite, just because I had so much fun "solving" it, is the following.

"There are two drawers, each of which contains cash. You don't know how much money is in the drawers, but you know that one drawer contains twice as much as the other. You are asked to pick one drawer, but not open it. Once you've picked your drawer, you are asked if you want to keep the money in that drawer or switch drawers and take the money in the other drawer. Are you better off keeping the money in the drawer you picked, switching, or doesn't it matter?"

be warned; when I say I had fun solving it, I mean I spend days being angry and frustrated, but I refused to give up until I knew why I couldn't decide which was the correct answer.

I am always confused by this thought: If the contestant would randomly pick one of the two remaining doors, that would be the same as flipping a coin. Heads would mean door number one, tails would mean door number two. And the coin would certainly not 'consider all of the information', before making a 'decision'. Now the car is behind exactly _one_ of the two doors and one of the two equally possible outcomes of the cointoss will result in opening exactly that door. Suppose the contestant randomly chose the same door she had already picked in round one. Why would her chances then only be 1/3rd? How could anybody ever know (suppose she wouldn't tell) if the contestant picked randomly or not? Can she really improve her chances by disregarding information?

John: It doesn't matter.

There's a strong symmetry argument; no matter what reasoning you may apply to argue for/against switching, it could be applied just as well if you had picked the other drawer initially.

Therefore, both drawers must be equivalent for decision-making purposes.

Credel: No, she can't, because her first choice fixed the probabilities. Switching gives 2/3, sticking gives 1/3. The randomness results in a probability, pre-flip, of 1/2 - but that's because it's 1/2 of 2/3 plus 1/2 of 1/3. Multiply and add, and you get 1/3 and 1/6, which of course add to 1/2. However, once the flip is made and the decision resulting from it is likewise made, the probability goes back to what it would be were the coin not involved.

By Michael Ralston (not verified) on 17 Nov 2007 #permalink

I guess my question just is: what are the overall odds of winning the prize when you know the solution to the Monty Hall problem? The moment you change your understanding of the problem, the odds change. Just as a good card player has higher odds of winning the jackpot than a bad card player: they both deal with the same deck and the same odds of draw, but one is able to manipulate the situation better. After all, a good player would always change doors, making the initial door choice irrelevant.

Of course, I accept the solution to Monty Hall problem as it is explained, and I accept that sticking with your initial door only leaves you with 1/3 chance of winning.

Michael: That was the first argument I made when I initially heard the problem and I agree that it's obvious that it can't matter. What made me interested in the problem was my second reaction: It's clear that a double-or-nothing bet (with equal chances of either option) is an even return. In this case, it's a double-or-half bet. 50% of the time you'll get twice as much money as is in the drawer, and 50% of the time you'll get half as much. So from that point of view it is clear that you're better off changing.

Actually, I think that when I heard the problem initially, you were supposed to open the drawer and look inside to see how much money is there before deciding if you want to switch. But I think that both arguments work equally well if you look or if you don't look.

John: well, seeing the money allows for the possibility that the "double or half" bet would affect your utility so as to make switching potentially good or bad.

But aside from that ... It does seem paradoxical. Because, yes, a double-or-half bet is one you nearly always want to take.

Hmm. What happens if we break it down.

Case 1) You pick the better drawer. 50% chance - and you should stick.
Case 2) You pick the worse drawer. 50% chance - and you should switch.

Indifferent, clearly.

Now, if we break it down...
Case 1a) You picked the better drawer and stuck. You get y.
Case 1b) you picked the better drawer and switched. You get y/2.
Case 2a) You picked the worse drawer and stuck. You get x.
Case 2b) You picked the worse drawer and switched. You get 2x.

The expected utility of deciding to stick is (y+x)/2, and the expected utility of deciding to switch is (y/2 + 2x)/2.

But of course y=2x.
So the expected utility is 75% of the total, either way.

The key is that the amount in the drawer you initially picked determined if you were looking at y or x.

Hmm. This convinces me, yet it doesn't. I'd like to hear others' opinions.

Daenku: If you pick via the coinflip, your odds are in fact 1/2. This is, of course, a decrease in probability from if you switch every time.

By Michael Ralston (not verified) on 18 Nov 2007 #permalink

"50% of the time you'll get twice as much money as is in the drawer, and 50% of the time you'll get half as much. So from that point of view it is clear that you're better off changing."

I infer that the conclusion follows from the following argument:

Let x be the amount of money in the chosen drawer. Then the other drawer contains either 2x or x/2 with equal probability and the expected return by switching is:

1/2 (x/2+2x) = 1.25x

But this is wrong because the first sentence is an incorrect statement of the sample space as can be seen from the fact that there are three possible values (x/2, x, and 2x) for the contents of only two drawers.

The sample space is actually (x,2x) and (2x,x), where x is (re)defined as the lesser value in the two drawers. Switching or not is essentially randomly choosing between the values of the first or second of the tuples respectively. The average return is the same in each case - the intuitive result.

Also in keeping with intuition, seeing that one drawer contains D doesn't change the statistics since it doesn't help in determining whether the contents is x or 2x. However, it does determine the possible values of x, viz, D or D/2.
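Spelling that sample space out in code makes the symmetry plain (a sketch using exact rational arithmetic, with the lesser amount x normalized to 1):

```python
from fractions import Fraction

# Sample space as defined above: the drawer pair is (x, 2x) or (2x, x),
# each with probability 1/2. "Stick" takes the first drawer of the tuple,
# "switch" takes the second.
x = Fraction(1)
outcomes = [(x, 2 * x), (2 * x, x)]
half = Fraction(1, 2)

stick = sum(half * first for first, _ in outcomes)
switch = sum(half * second for _, second in outcomes)
print(stick, switch)  # both 3x/2: no advantage either way
```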

- Charles

Variant of the MH problem:

After the initial choice of a door, Monty asks if the contestant will commit to switching after he opens a door. Should the contestant commit to switching, and if so what should Monty do if the commitment is made?

- Charles

This was probably mentioned a thousand times--but I always thought the best intuitive, gut-instinct explanation for the correctness of the solution to the standard MH problem is to suppose there were a billion and two doors. After making your choice, Monty (who knows all) reveals a billion losing doors, leaving you with an easy, obvious decision: switch to the one remaining door.

Funny, to me the "best intuitive gut-instinct explanation" has always been the argument that after Monty opens the door, there are two choices, each equally likely, so it doesn't matter whether you switch. From which I conclude that intuitive gut-instinct explanations should be avoided since they can be dead wrong - as is this one.

Since there is a very simple quasi-analytical argument available (doug b comment of 11/16, 4:30PM, explanation 2 with a few words added for more insight into what's going on), there seems to be no need for a non-analytical explanation that may well be wrong.

And in any event, I don't see what is added by going from three to a billion-plus-two doors. At the last step, there are still two doors left. Why is it more intuitive that you should switch having gotten there in a billion steps rather than one step? The logical argument leading to p=2/3 that the prize is behind the last unopened door (other than the one initially selected) is exactly the same in either case.

Besides, think of the clean-up necessary with a billion and one goats on that TV production set.

- Charles

ctw,

OK, I retract my implied claim that the billion door example makes it obvious--clearly it doesn't for you.

By the way, I much prefer non-analytical arguments. While I could always agree with the math of the MH solution, I never really "got it" until I heard the billion door example.

Charles: I don't see why you say that the double-or-half statement of the choice is incorrect. Once you have opened the drawer, it has an amount Z. The other drawer has either Z/2 or 2*Z, with equal probability. So if you keep the drawer you opened, your return is Z. If you switch, it's (Z/2+2*Z)/2=1.25*Z, a 25% increase.

It can be confusing to redefine x as the smaller of the two, given that you don't know if yours is the bigger or smaller to begin with. Similarly, Michael's analysis of your expected return doesn't take into account that once you've opened the drawer, your return if you keep is exactly Z (which is NOT the average of the amounts in the two drawers), while the average return for switching is 1.25*Z.

[To be fair - I changed the problem to say that you look in the drawer before deciding, which is not initially the problem Michael had been responding to. I think that the argument is the same, whether or not you know what Z is, but looking in the drawer first makes it more concrete, and easier to show why changing must be better].

Right...

2/3 of the time you've chosen a goat;
1/3 of the time you've chosen the car.

If you don't switch: you will win the car 1/3 of the time.

If you do switch: 1/3 of the time you will switch from the car to a goat, but 2/3 of the time you will switch from a goat to the car (because you've eliminated the possibility of switching from one goat to the other goat.)

By WoodyTanaka (not verified) on 19 Nov 2007 #permalink

Jason,

Let's Make A Deal went away a long time ago now. Let it go, man. Let it go.

If you want to dress up in funny costumes, there's still Halloween.

~David D.G.

By David D.G. (not verified) on 19 Nov 2007 #permalink

I thought that I understood the Monty Hall problem and accepted the "correct" answer, i.e. better to change. But then I read Brandon's comment on the 17th in which he listed all the possibilities of a slightly modified situation.
When I list all the possibilities for the classic Monty Hall, I get the following:

1. Car behind door 1, you choose door 1, Monty Hall opens door 2. Change and you lose
2. Car behind door 1, you choose door 1, Monty Hall opens door 3. Change and you lose
3. Car behind door 1, you choose door 2, Monty Hall opens door 3. Change and you win
4. Car behind door 1, you choose door 3, Monty Hall opens door 2. Change and you win
5. Car behind door 2, you choose door 1, Monty Hall opens door 3. Change and you win
6. Car behind door 2, you choose door 2, Monty Hall opens door 1. Change and you lose
7. Car behind door 2, you choose door 2, Monty Hall opens door 3. Change and you lose
8. Car behind door 2, you choose door 3, Monty Hall opens door 1. Change and you win
9. Car behind door 3, you choose door 1, Monty Hall opens door 2. Change and you win
10. Car behind door 3, you choose door 2, Monty Hall opens door 1. Change and you win
11. Car behind door 3, you choose door 3, Monty Hall opens door 1. Change and you lose
12. Car behind door 3, you choose door 3, Monty Hall opens door 2. Change and you lose

That makes 12 possible scenarios, in six of which you would win by changing and in the other six, you would lose by changing. Therefore, no advantage in changing!!!

What have I done wrong? Jason, help me!

Macker,
You're double counting the possibilities where you got it right the first time. If you pick door one and the car is behind door one, then you lose if you change, no matter what Monty Hall shows you. So there are just 9 scenarios:

Car  You pick  Result of changing
1    1         lose
1    2         win
1    3         win
2    1         win
2    2         lose
2    3         win
3    1         win
3    2         lose
3    3         win

The real question is how to PROVE that you shouldn't count your scenarios 1&2, 6&7, and 11&12 as independent options, and that's what makes the problem so confusing. I can't give you a textbook answer to say why they are not independent, and it's easy to SOUND convincing either way so I'm not going to bother with a "it makes sense because" argument.

There are clearly 3 options to begin with: car behind 1, 2, or 3. Nine options once you've picked a door (3 car options x 3 picked-door options). If Monty picks a door at random to reveal, you have 3x3x2 options (which is what Brandon showed, after defining the car to be behind door 1).
But when Monty knows something, it gets messy, and it all comes down to why you should say "Monty reveals door 2 or 3" is one option, rather than two different options. I always 'proved' it a different way: since he can always select a door with a goat no matter what you picked, nothing affects the original 1/3 chance of your door being correct, and therefore the outcome comes down to your original choice being right (1/3) or wrong (2/3). I'd love to be able to prove it with the truth-table approach, but I don't know how to show that options 1&2 in your list are not really independent options.

Macker,

Because if you choose correctly, all-knowing Monty has two choices of which door to expose, but if you choose incorrectly, he has only one. Thus in a Monte (Monty?) Carlo simulation, the losing cases 1, 2, 6, 7, 11, and 12 only occur with half the frequency of the other, winning cases.

The Monte Carlo would generate, with equal frequency, these scenarios:

Car is behind 1, you choose 1 (L)
Car is behind 2, you choose 2 (L)
Car is behind 3, you choose 3 (L)

Car is behind 1, you choose 2 (W)
Car is behind 1, you choose 3 (W)
Car is behind 2, you choose 1 (W)
Car is behind 2, you choose 3 (W)
Car is behind 3, you choose 1 (W)
Car is behind 3, you choose 2 (W)

(you win twice as often.) You have bifurcated the three losing scenarios into six, by using the irrelevant fact of which of the doors Monty opens, but each of those six extended scenarios occurs only half as often as each of the six winning scenarios.

That is, if you did a computer program and your #3 happened N times (where N is large), then #4 would also happen about N times, while #1 and #2 would each happen only (roughly) N/2 times. Thus each "loser" would happen with half the frequency, and consequently the overall losing chance (after a switch) would be half the winning chance, or, again, 1/3 as opposed to 2/3.
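A sketch of that computer program, tallying Macker's scenario triples (car, pick, door opened) under fully random play:

```python
import random
from collections import Counter

counts = Counter()
rounds = 120_000
for _ in range(rounds):
    car = random.randrange(1, 4)
    pick = random.randrange(1, 4)
    # Monty opens a legal door (not the pick, not the car), at random if two qualify
    opened = random.choice([d for d in range(1, 4) if d != pick and d != car])
    counts[(car, pick, opened)] += 1

# Macker's scenarios 1 and 2 (losing) vs scenario 3 (winning):
print(counts[(1, 1, 2)], counts[(1, 1, 3)], counts[(1, 2, 3)])
# the first two counts are each roughly half of the third
```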

On the drawer scenario:

Right, when we think about it, we get a counterintuitive result implying that switching is good, which is inconsistent with the "obvious" correct answer - and the one that's been falling out of our analyses.

I mean, I think we all agree that if someone says "You can take this envelope of X dollars, or you can take this envelope which has a 50% chance of 2X dollars, and a 50% chance of X/2 dollars", you should take the second envelope. There's no paradox there - it's a double-or-half bet and those are good.

So what's the difference between the envelopes and the drawers? Well ... the fact you got to choose which was revealed.
Well ... let's break it down into the two cases.
If we picked the "good" one first, we're going to calculate the expected value from switching as 1.25*G.
If we picked the "bad" one first, we're going to calculate the expected value from switching as 1.25*(G/2).
The expected value before anything is revealed is .75G, of course ...

So ... a 50% chance we'll think the expected value of switching is 2.5 times what the actual value is, and a 50% chance we'll think the expected value of switching is .625 what it really is.
Since those don't work out to 1, that seems - to me, although my stats background is limited - to indicate we're not calculating the expected value correctly... which makes sense, since the expected value for the problem as a whole SHOULD be .75G, so any calculation that doesn't give that (or, well, doesn't give that if we assign a 50% chance of switching) is wrong.

Hmm.
I think I'm just saying things for the sake of saying things at this point.

Onto the "commit to switching/sticking" thing: Assuming that if you don't commit to switching, you can't switch, the clear choice is TO switch. Especially since, unless MH can reveal the car, he can't engage in strategy. (And in a problem where n > 3, you should stick until the last one, /then/ switch.)

By Michael Ralston (not verified) on 19 Nov 2007 #permalink

I think the intuitive solution is intuitive for a reason. It's right.
I think the "sophisticated" mathematical solution fools us by the formulae and the joy of proving that feeling is wrong and thinking and reason are right.

It's a magician's trick. Distract the audience while the real stuff is going on. And the real stuff in this question is the "Monty Hall picks the item which is a goat" which gets lost in the shuffle.

The reason that the odds are 1/2, not the counter-intuitive but mathematical-sounding 1/3, is that the 1/3 odds are for a random pick of three choices.

But there is only one step at which there are three equal random choices.

Monty Hall is not picking the number 2 or 3 choice randomly. He knows which of the 2 or 3 spots has the goat. His choice changes the odds, because his choice IS NOT RANDOM, and the odds of 1/3 relate to the original RANDOM situation. As the situation changes, the odds change, but people gloss over the non-randomness of the Monty Hall step, just as they do in any good magician's show.

After his exposure of the known truth, there are ONLY TWO RANDOM CHOICES LEFT.

It does not matter how high the n, every step after the first one is a NON-RANDOM step. The eliminated item is already known before it is picked. It is no longer part of the pool of random choices. So the Random odds keep changing.

Tell me where I am wrong (without using an equation or a formula.)

And in any event, I don't see what is added by going from three to a billion-plus-two doors. At the last step, there are still two doors left.

Yeah, but for this explanation it's the first step that matters. At the first step, the contestant has probability 1/n of choosing the winning door, where n is the number of doors. Most people understand 1/3 probability as 'meh, not bad,' whereas a success probability of 1e-9 in a single trial is almost universally interpreted as 'no sodding chance.' So at the last step, you're left with two doors, one of which you know has no sodding chance of winning. Not a difficult decision.

Of course, this line of reasoning ought to work even when working with three doors. Unfortunately, it frequently doesn't, as the last commenter amply demonstrates. In my experience, the billion-door version does help some people to understand the general case.
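For those who want to check the many-door intuition directly, here is a sketch of a simulation (the door count is scaled down from a billion to keep the run time sane; the host opens all but one of the unchosen goat doors):

```python
import random

def many_doors(n, switch, rounds=20_000):
    """n-door Monty Hall: the host opens n-2 goat doors, leaving the
    player's pick and one other door. Returns the observed win rate."""
    wins = 0
    for _ in range(rounds):
        car = random.randrange(n)
        pick = 0  # WLOG
        # The one door left unopened besides the pick: it must be the car's
        # door if you missed; otherwise it is an arbitrary goat door.
        other = car if car != pick else random.randrange(1, n)
        wins += (other if switch else pick) == car
    return wins / rounds

print(many_doors(1000, switch=True))   # ~0.999
print(many_doors(1000, switch=False))  # ~0.001
```

With n doors, switching wins with probability (n-1)/n, which is why the billion-door version makes the stick strategy look so hopeless.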

Tell me where I am wrong (without using an equation or a formula.)

If the odds are even, then sticking with your original choice every time will let you win half of the time. That can only happen if your original choice was correct half of the time. If you can pick a winner from three indistinguishable options half the time, there's a million dollar cheque with your name on it.

bmkmd,

"After his exposure of the known truth, there are ONLY TWO RANDOM CHOICES LEFT."

But they are not EQUAL random choices. In other words, after Monty has eliminated one of the choices, there are three possibilities:

You originally chose the car and the other door is either Goat "A" or Goat "B";
You originally chose Goat "A" and the other door is the car;
You originally chose Goat "B" and the other door is the car.

"Tell me where I am wrong (without using an equation or a formula.)"

Here's where I think you are wrong: the decision whether to switch or not is not the same as choosing between two, equally possible outcomes.

I think you assume that, starting out, half the time you chose the car and half the time you chose a goat, so, after Monty eliminates one of the goats it doesn't matter whether you switch or stay; it's fifty-fifty either way.

But that's wrong. You don't choose the car 1/2 the time. You only choose the car 1/3 of the time. Two-thirds of the time you choose a goat. And since that is the case, after Monty eliminates one of the goats, switching every time means that two-thirds of the time you are switching from a goat to the car and only 1/3 of the time from the car to a goat.

By WoodyTanaka (not verified) on 20 Nov 2007 #permalink

Well, here might be a better way to think about it:

You get to pick either one door, or the other two doors. AND you get to pick which doors go into which group.

Obviously you should pick two doors! It's silly not to.

And that's precisely what happens when you switch.

By Michael Ralston (not verified) on 20 Nov 2007 #permalink

john:

In probability problems, it is critical to define the "sample space" - ie, the set of possible outcomes - carefully. The statement of the problem "there is twice as much money in one drawer as the other" is leading you astray in defining the sample space. To repeat, in your description of the problem there are three possible values in the drawers; in reality, there clearly are only two.

An alternative statement of the problem that might help in avoiding that error:

Assume one of two drawers contains D dollars, the other contains 2D dollars.

Now, when you open one drawer and there are x dollars in the drawer, there are two possibilities: x=D or x=2D, each with equal probability, and the other drawer contains 2D or D dollars respectively, again with equal probability. So the expected return from either switching or not switching is (1/2)(D + 2D) = 3D/2. QED.
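For what it's worth, a quick simulation agrees that sticking and switching have the same expected return (the amount D, trial count, and seed are arbitrary choices, not part of the problem):

```python
import random

def drawer_game(trials=100_000, D=100, seed=0):
    """One drawer holds D dollars, the other 2*D; you open one at random.
    Returns the average payoff of sticking and of switching."""
    rng = random.Random(seed)
    stick = switch = 0
    for _ in range(trials):
        drawers = [D, 2 * D]
        rng.shuffle(drawers)
        stick += drawers[0]    # keep the drawer you opened
        switch += drawers[1]   # take the other drawer instead
    return stick / trials, switch / trials

print(drawer_game())   # both averages close to (1/2)(D + 2D) = 150
```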

General aside:

Despite the mixing of pop culture and math in the MH problem, it and its variants such as the drawer problem remain at base probability problems, and as such need to be solved using the established techniques of the field. The various intuitive explanations, gut feels, verbal arguments, etc, may or may not be useful in helping to resolve the paradoxical aspects of the problems once you know the answer, but a formal solution involves either defining the sample space and counting the outcomes or writing equations involving conditional probabilities. If one doesn't want to do either, one shouldn't claim to be solving the problems.

And one certainly shouldn't make sophomoric statements suggesting that knowing how to do so suggests a preference for reason over feeling. It's appropriate to tear up at the end of soppy movies and appropriate to work probability problems dry-eyed. That's called emotional maturity.

- Charles

Martin:

OK, now I see it. To the extent that I was thinking at all (for some reason, I have a strong aversion to that particular heuristic), I had 1/3 in mind even when it was really 1/billion. Thanks.

-c

"Obviously you should pick two doors! It's silly not to."

Which is the essence of my "variation" above (on which no one has bitten). It turns out that no matter what Monty does after you've chosen initially (excepting, of course, foolishly opening your door when it has the prize), you should switch since the best you can do if you don't switch is 1/3 and the worst you can do if you switch (excepting, of course, foolishly switching to an empty opened door) is also 1/3 (in the case where Monty opts to do nothing, ie, open no doors - analogous to the drawer problem).

- Charles

Isn't this just another version of the old coin flip confusion? Three heads have come up, so the student says 'I bet on tails, cause it's 1/16 to throw four heads in a row'. It's 1/16 at the beginning of time, before any flips. Facing the last flip it's 1/2 because a 1/8 event (three consecutive heads) has already occurred. Similarly, at the beginning it's 1/3 to pick the door with the car. After a door is eliminated, it's a 1/2 choice. All these complicated arguments are just attempts to combine two distinct time frames, which you can't do... Am I a simpleton?

By Paul Hrich (not verified) on 20 Nov 2007 #permalink

Charles: I did make a comment addressing your variation.

Paul: You can and must combine time frames when the probabilities of the various events are not independent.
In the coin flip problem, the probability of another throw of heads is 1/2 no matter what has happened so far - but that doesn't hold in the MH problem.

In the MH problem, when Monty reveals a door, the odds of him revealing each door are NOT the same. Consider, for a moment, a three-person variant.

We have The Player, Monty, and The Producer in this variant.
The Player and Monty work the same as in the normal variant - The Player picks a door, Monty reveals that some other door has a goat, then The Player decides if he wants to stay with the door he picked initially, or switch to the third door.

However! The Producer has to guess, after The Player has picked his door, which door Monty will reveal. He knows where the car is, and what door The Player picked, but that's it - he now has to decide where to have the camera zoom so as to watch Monty reveal a goat.

What are the odds he'll pick the right door, knowing that Monty is going to reveal a goat, and which door has the car?

By Michael Ralston (not verified) on 20 Nov 2007 #permalink

As an addition to my previous post, assume that if Monty gets to make choices, he makes each choice with the same probability, just to rule out any collusion or the like.

By Michael Ralston (not verified) on 20 Nov 2007 #permalink

"Similarly, at the beginning it's 1/3 to pick the door with the car. After a door is eliminated, it's a 1/2 choice."

That is true, if you randomly chose to either switch or stay at the end.

But you get 2/3 odds if you don't randomly choose but, instead, always switch. Your random choice in the beginning is going to be wrong 2/3 of the time and Monty removes one of the wrong choices. So 2/3 of the time you'll switch from the wrong door to the right door.

By WoodyTanaka (not verified) on 20 Nov 2007 #permalink

"Am I a simpleton?"

Yes.

Just kidding, Paul! Your observation about combining time frames was perceptive, but you reached a wrong conclusion. As Michael observed, when there are multiple random variables (eg, choosing one of three doors can be represented by a random variable X which takes values 1, 2, or 3) involved in a problem, you must address the question of dependence. One tool for doing that is the concept of conditional probabilities, eg the probability that the value of one random variable (eg, X) is x given that the value of another random variable (eg, Y) is y. Symbolically, this is written P{X=x|Y=y}, defined as:

P{X=x|Y=y} = P{X=x and Y=y}/P{Y=y}

[As an aside, note that if X and Y are independent, by definition

P{X=x and Y=y} = P{X=x} P{Y=y}

Then, P{X=x|Y=y} = P{X=x}, which agrees with one's sense that "independent" should mean that the value of Y doesn't affect the statistics of X. ]

As Michael also observed, in the MH problem the random variables

W = {door prize is behind}, C = {door contestant chooses}, and O = {door Monty opens}

are not independent. Hence, the problem becomes to determine:

P{W=i|C=j,O=k}

To simplify the analysis a bit, we can assume that the contestant chooses door 1. For other choices, the specific door numbers will change, but the calculations will be the same. We can also assume that Monty opens door 3 since the calculations will again be the same if he opens door 2. These assumptions allow us to eliminate C and reduce the sought quantity to:

P{W=2|O=3}

Using the definitions above and manipulating (multiply numerator and denominator by P{W=2}),

P{W=2|O=3} = P{W=2 and O=3}/P{O=3} = P{O=3|W=2}P{W=2}/P{O=3} .

Since we are assuming that the contestant chooses door 1, Monty can't open door 1 (otherwise, the contestant would know whether or not to switch). So, if the prize is behind door 2, Monty must open door 3, ie, P{O=3|W=2}=1. The unconditional probability P{W=2} is 1/3. Finally, a priori (ie, unconditionally) Monty is equally likely to open door 2 or door 3; which one he opens depends on where the prize actually is. So, the unconditional probability P{O=3}=1/2. Thus,

P{W=2|O=3} = 1(1/3)/(1/2) = 2/3.
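The same answer falls out of exact enumeration. Here is a small sketch using Python's `fractions` module, under the same assumptions as above (the contestant picks door 1, and Monty chooses uniformly when he has a choice):

```python
from fractions import Fraction

# Contestant picks door 1. Build the joint distribution P{W=w and O=o},
# where W is the prize door and O is the door Monty opens.
joint = {}
for car in (1, 2, 3):
    legal = [d for d in (2, 3) if d != car]   # Monty avoids door 1 and the car
    for opened in legal:
        joint[(car, opened)] = Fraction(1, 3) / len(legal)

p_o3 = sum(p for (w, o), p in joint.items() if o == 3)
print(p_o3)                    # 1/2, as claimed
print(joint[(2, 3)] / p_o3)    # P{W=2|O=3} = 2/3
```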

- Charles

Michael:

Sorry, I missed your response since it was at the end of a comment I didn't read in full.

"unless MH can reveal the car, he can't engage in strategy"

If I understand what you mean here, I believe it's wrong. If you commit to switching and the car isn't behind the door you choose initially, Monty should open that door. Then your choice between doors 2 and 3 is random, so the probability of winning becomes 2/3 x 1/2 = 1/3.
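This counter-strategy checks out numerically. A hedged sketch (trial count and seed arbitrary, and assuming that when his own door is opened the contestant picks between the other two uniformly):

```python
import random

def committed_switcher_vs_adversary(trials=100_000, seed=0):
    """Contestant commits to switching; Monty counters by opening the
    contestant's own door whenever it hides a goat."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(trials):
        car = rng.randrange(3)
        pick = rng.randrange(3)
        if pick == car:
            # Monty opens a goat door; the committed switcher abandons the car
            continue
        # Monty opens the picked (goat) door; the contestant must now
        # choose between the two remaining doors at random
        wins += rng.choice([d for d in range(3) if d != pick]) == car
    return wins / trials

print(committed_switcher_vs_adversary())   # close to 2/3 x 1/2 = 1/3
```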

So, you should always commit to switching (just in case Monty is a dummy or is having an off day), but in principle Monty can counter that. In which case, of course, the game becomes of no interest.

- Charles

To understand this without maths:
The initial non-chosen set (of 2) has had its 'GOATiness' reduced, so its 'CARiness' is increased. Information has been gained.
The initial chosen set (of 1) hasn't had its GOATiness reduced. So it's more GOATy. So switch.

Ok, maybe I'm just crazy, stupid, or obtuse. But it seems pretty obvious to me. Let's pick up where Monty has opened a door. There is now a 1/2 chance the car is behind a given door. Don't you now have a 50% chance of getting the right door, regardless of whether or not you switch and regardless of whether or not you decided that ahead of time?

For those calling for a switch, it seems like you're saying that if you flip a coin 10 times and get heads each time that you're more likely to get tails on the 11th toss because the odds of getting heads 11 times in a row are so slight. Sure getting 10 heads in a row was pretty unlikely, but once you do, it has no effect on the next toss.

As to the two drawers with money, assuming you're playing in the US, you should stick with your first choice because it ends the game that much sooner, allowing you to claim your money and spend it before the dollar depreciates any farther. The longer you prolong the game the less whatever money you get will be worth.

By Abby Normal (not verified) on 21 Nov 2007 #permalink

Oops, I missed that Paul Hrich and Michael Ralston already discussed the coin flip analogy.

Michael, I submit that the events are independent. MH's revelation doesn't bridge the first round of door selection with the second, which I think is what you're implying.

By Abby Normal (not verified) on 21 Nov 2007 #permalink

"Ok, maybe Im just crazy, stupid, or obtuse. But it seems pretty obvious to me. Lets pick up where Monty has opened a door. There is now a 1/2 chance the car is behind a given door."

If you chose at random between the two doors left, you'd have a 1/2 chance of the car, but there is only a 1/3 chance that the door you chose in the first place had the car. Thus, there's a 2/3 chance that the door you didn't choose has the car behind it.

"Don't you now have a 50% chance of getting the right door, regardless of whether or not you switch and regardless of whether or not you decided that ahead of time?"

You CAN have a 50% chance if you choose at random, but that doesn't mean that 50% is the BEST odds you could have.
There are three potential strategies (technically there are more, but...)
1) Don't switch, ever. That will win you the car 1/3 of the time because in the initial choice, you'll only have picked the car 1/3 of the time.

2) Switch or stick at random. This will give you success 1/2 of the time because one of the two doors remaining will have the car and the other will have the goat.

3) Always switch. Because you'll only have picked the car initially 1/3 of the time, and because Monty eliminates one of the goats, if you pick this option, 2/3 of the time you switch from goat to car.
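Those three win rates can be computed exactly rather than estimated. A minimal sketch (the strategy names are mine):

```python
from fractions import Fraction

p_first_pick_right = Fraction(1, 3)

p_stay   = p_first_pick_right       # win only if the first pick was the car
p_switch = 1 - p_first_pick_right   # win exactly when the first pick was a goat
p_random = Fraction(1, 2) * (p_stay + p_switch)   # coin-flip between the two doors

print(p_stay, p_random, p_switch)   # 1/3 1/2 2/3
```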

----

"Michael, I submit that the events are independent. MHs revelation doesnt bridge the first round of door selection with the second, which I think is what youre implying."

No, the events are connected, because you will be coming out of the first round having chosen the goat 2/3 of the time. MH's revelation merely prevents you from switching from one goat to the other. Thus, after the revelation, every time you initially chose a goat (2/3 of the time) you will switch to the car.

By Woody Tanaka (not verified) on 21 Nov 2007 #permalink

Coming in after the goat is revealed DOES give you a 1/2 chance, but that's because you've lost information.

You see, MH's revelation *does* bridge the first round of door selection with the second, because what door he reveals is predicated on what door you chose in the beginning.

If you pick a goat door initially - which you will with 2/3rds probability (agreed on this, I hope?) - then he has no choice.
The door he doesn't reveal MUST be the car.

If you pick the car door initially, then he has a choice - but it doesn't matter, because either way he reveals a goat.

Now, we pick it up in phase two, and there are three doors. Here's the state of each door, as we know it:
The door you picked initially: Unopened, *could not have been opened*, no matter what was behind it. Can only be right if you were right initially.
The door MH opened: Opened, has a goat. Clearly not the right choice.
The third door: Unopened, *could have been opened*, if it doesn't have a goat. Must be right if you were wrong initially, because the other choice has already been eliminated.

The two remaining doors are not symmetric - and that's why it's not a 1/2 chance when you switch.

By Michael Ralston (not verified) on 22 Nov 2007 #permalink

I think what may throw people off is essentially a linguistic confusion. Let's revisit the problem statement:

After Monty opens a door, should you switch or not?

It seems that we typically translate this (possibly subconsciously) into:

After Monty opens a door, there are two options: switch to the other unopened door or don't switch. Which should you choose?

Given this description, it appears that you are faced with a random choice. But the translation is misleading. While it is true that the prize can be behind either door, the choice is not random if the objective is to maximize the probability of winning. This can be seen by noting that although the problem statement has Monty opening the door and then asking if you want to switch, the decision whether or not to switch has nothing at all to do with what Monty does. It is known a priori (by a savvy contestant) that to maximize your chance of winning you must switch. So, after the initial choice, the process is deterministic; a savvy contestant is never faced with randomly choosing between two doors, ie, no p=1/2 choice ever arises.

Note that this is not a proof, only a possibly helpful heuristic. It assumes the answer, which is unequivocally known to be that you should switch (almost every conceivable "proof" is in some comment above). A careful reading of Michael's last comment may help in seeing that to maximize the probability of winning, you must switch.

- Charles

Has it ever occurred to anyone that there are never actually 3 choices, and it only appears this way because of how it is explained?

Look at it this way. There are three doors (one prize and 2 non-prize). You don't choose one, but at some point Monty simply removes one of the non-prizes. You are then left with two doors. Behind one is the prize and the other the non-prize.

The initial step of choosing is just a kind of dramatic effect, because without it the show would have been even more boring than it was.

By Benjamin RE Shepard (not verified) on 29 Nov 2007 #permalink

Let's say I run a game show where the prize fund varies from week to week. I tell my beautiful assistant, Vaner W., to put 1/3 of the prize fund for this week in one envelope and 2/3 in the other envelope. She gives me the two envelopes without me knowing which is which. I pull Jason out of the audience and give him one of the two envelopes at random. I tell him to look in the envelope without showing me the amount of money inside. After he looks, I tell him that there is a 50% chance the other envelope has twice the money and a 50% chance it has half the money and offer to trade envelopes.

Now Jason concludes that by trading he should receive (2 + 1/2)/2 = 5/4 as much money, and so he trades. On the other hand I conclude that he has the same expected return regardless of whether he trades or not. Which of us is right? If I am, did I give Jason incorrect information on the odds? If I am right and the odds are right, then what's wrong with Jason's reasoning?
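The host's expected values can be checked directly. A small sketch (taking this week's fund to be a fixed F, say F = 300, which is an arbitrary choice of mine):

```python
from fractions import Fraction

F = Fraction(300)                 # this week's prize fund (arbitrary)
small, large = F / 3, 2 * F / 3   # the two envelopes

# Jason is handed each envelope with probability 1/2.
keep  = Fraction(1, 2) * small + Fraction(1, 2) * large
trade = Fraction(1, 2) * large + Fraction(1, 2) * small

print(keep, trade)                # both equal F/2 = 150
```

As with the drawer version above, the sample space contains only the two amounts F/3 and 2F/3, so "double or half with equal probability" is never literally on offer from any single observed amount.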

By Clif Davis (not verified) on 18 Nov 2008 #permalink