Monty Hall Problem (statistics/choice game)

pau17

Here's the game:

Monty shows you three closed doors, with a car behind one and a goat behind each of the others. If you open the door with the car, you win it. You start by picking a door, but before it's opened Monty will always open another door to reveal a goat. Then he'll let you open either remaining door.

Suppose you start by picking Door 1, and Monty opens Door 3 to reveal a goat. Now what should you do? Stick with Door 1 or switch to Door 2?

Now, what I think is that it doesn't matter at all what you do. Ok, so one of the other choices got knocked out, but that has nothing to do with the one you picked. Your odds just went from 1/3 to 1/2, and since you don't know which of the remaining two is the winning door, it's completely arbitrary to switch now. You could switch or stay and still have 1/2 odds.

The article's answer:

Spoiler:

You should switch doors.

This answer goes against our intuition that, with two unopened doors left, the odds are 50-50 that the car is behind one of them. But when you stick with Door 1, you’ll win only if your original choice was correct, which happens only 1 in 3 times on average. If you switch, you’ll win whenever your original choice was wrong, which happens 2 out of 3 times.


I don't agree...like I said above, your odds have increased for both of the remaining doors. The other door being exposed only tells you that one of the two remaining doors is the winner. You could pick either one, but it's completely arbitrary.

Or am I missing something here?

Source: http://www.nytimes.com/2008/04/08/science/08tier.html?_r=1&8dpc&oref=slogin
 
I'd switch to 2.
Just to clarify: 1/3 chance of being right on door one.
2/3 chance of being right on door two.
 
Imagine that, instead of 3 doors, there were 1,000 doors. You pick door number 276. Monty opens all doors except for door 276 and door 869. They're all empty. Which one do you think will have the prize -- the one that you randomly picked, or the one that Monty specifically didn't open because he knew that all the other ones had goats behind them?

The point is, Monty knows which ones don't have prizes, and he'll always open one that doesn't. He has more information than you do, and so the choice that he leaves unopened is always more likely to have a prize behind it than the choice that you pick at random.
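
If you want to check this for yourself, here's a quick simulation sketch (Python; the function name and door counts are my own, and it assumes Monty always opens every door except your pick and one other, never revealing the car):

```python
import random

def play(n_doors=3, switch=True, trials=100_000):
    """Simulate the Monty Hall game; return the fraction of games won."""
    wins = 0
    for _ in range(trials):
        car = random.randrange(n_doors)    # door hiding the car
        pick = random.randrange(n_doors)   # your blind first pick
        # Monty opens every door except your pick and one other, never the car.
        # The door he leaves closed is the car door whenever you missed it;
        # if you happened to pick the car, he leaves a random goat door closed.
        if pick == car:
            left_closed = random.choice([d for d in range(n_doors) if d != pick])
        else:
            left_closed = car
        final = left_closed if switch else pick
        wins += (final == car)
    return wins / trials

print("3 doors,    switch:", play(3, switch=True))      # ~0.667
print("3 doors,    stay:  ", play(3, switch=False))     # ~0.333
print("1000 doors, switch:", play(1000, switch=True))   # ~0.999
print("1000 doors, stay:  ", play(1000, switch=False))  # ~0.001
```

Run it with 3 doors and switching wins roughly 2/3 of the time; run it with 1,000 doors and switching wins almost every time, which is exactly the intuition above.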
 

I don't see that as part of the game. He will randomly open one of the wrong doors for you; there's nothing to deduce from that.

The problem I see is this: when it says that the first door only has a 1/3 chance of being right...well, so did the other door, had you picked that one instead. There is nothing different about the doors, and you are not given any information to suggest that there is. Once one of the doors is eliminated, you are left with two identical doors; either one could be right, thus it's 50/50.

I get the reasoning: your first choice is only right 1/3 of the time on average, so you get to lump your second choice into the remaining 2/3 probability, and since one of the other doors is eliminated, you basically get that whole 2/3 chance in one shot. That just doesn't make sense to me, because it compares two different situations...the two doors are identical, and the fact that he opens one door gives you further information about either choice you might make. Originally it was 1/3, yes, but now it's 1/2 for either one.

The idea of having a "chance" at guessing right only goes as far as the information you have. You have more information in the second round, thus your belief that the door you picked is one of the two remaining doors should give you the 50% chance.
 
Pau, the statistical reasoning is valid. If you don't believe it, try it for yourself. Do it 100 times -- it'll be the "other" door about 66 times.

EDIT: Also, when you say he'll "randomly open one of the wrong doors for you", you're not really correct. On the 2-in-3 chance that you've picked a wrong door, he has to open the other wrong door. On the 1-in-3 chance that you've picked the right door, he'll "randomly open one of the wrong doors for you". The point is, he always opens a wrong door.
 

Ah, that last point, that he always opens a wrong door, was the missing piece; I get it now. That's how the doors are differentiated. I was all hung up on the assumption that there was no informational difference, but there in fact is. Nice explanation.
 
It's easiest to think of it this way: after you pick a door, the car is behind one of the other two 2/3 of the time. So it's still behind one of the other two 2/3 of the time after Monty opens one of them.
 
Make a probability tree if you're having such difficulties with this.

Let's assume that we always switch.

You have a 2/3 chance that you picked a goat door. When he opens (the other) goat door, you have a 100% chance that you'll pick the car if you switch. Subtotal is 2/3 chance of getting the car.

You have a 1/3 chance that you picked the car door. When he opens a goat door, you have a 0% chance that you'll pick the car if you switch.
Total is 2/3 chance of getting the car.

Let's assume that we never switch.

You have a 2/3 chance that you picked a goat door. When he opens (the other) goat door, you won't switch, so you have a 100% chance that you get a goat. Subtotal is 0% chance of getting the car.

You have a 1/3 chance that you picked the car door. When he opens a goat door, you won't switch, so you have a 100% chance that you get the car.
Total is 1/3 chance of getting the car.

QED
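
Here's that same tree written out as an exhaustive enumeration instead of a simulation, for anyone who prefers exact fractions (a sketch in Python; the function name and the rule that Monty splits his tie cases evenly are my own assumptions):

```python
from fractions import Fraction
from itertools import product

def win_probability(switch: bool) -> Fraction:
    """Enumerate every branch of the probability tree exactly."""
    doors = range(3)
    wins = Fraction(0)
    for car, pick in product(doors, doors):             # 9 equally likely cases
        goat_doors = [d for d in doors if d not in (pick, car)]
        for opened in goat_doors:                        # Monty's possible moves
            weight = Fraction(1, 9) / len(goat_doors)    # split his ties evenly
            if switch:
                final = next(d for d in doors if d not in (pick, opened))
            else:
                final = pick
            if final == car:
                wins += weight
    return wins

print("always switch:", win_probability(True))   # 2/3
print("never switch: ", win_probability(False))  # 1/3
```

The enumeration lands on exactly 2/3 for always switching and 1/3 for never switching, matching the tree above.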
 
QED=quod erat demonstrandum, or that which was to be demonstrated. It's used at the end of a mathematical proof like this one.
 
Thanks, Kraznaya, but the meaning QED is not what stumped me here. And for that matter, the poster was hardly stating a mathematical proof :p

The use of QED seems irrelevant in this case, as there does not appear to be a statement that has been proven. All that's stated is the results of two (opposing) strategies. Therefore I don't understand why the poster is claiming that a proposition has been proven. :confused:

My question still stands: Is the best strategy to always switch?
 
My basic statistics intuition agrees with you every time I hear about this problem and then forget about it again :). I guess what we all miss is that solving the statistics of this problem is really about counting the possible permutations of the game.
http://en.wikipedia.org/wiki/Monty_Hall_problem#Solution

The fallacy we all make is thinking that holding to your original blind choice, after seeing one of the other choices nixed, suddenly improves the odds of your original choice to 50%, even though it appears that way. It does not: your chance of your original choice being right was 1/3 before you received the extra info, and it is still 1/3 after you receive it, because regardless, it was a BLIND choice among 3 possibilities.

Your original choice only had a 1/3 chance of being right, and a 2/3 chance of being wrong. The other two choices, taken together, had a 2/3 chance of being right. Monty revealing part of that 2/3 means you can no longer pick the wrong half of it: whichever door is left from that pair carries the whole 2/3 chance of being right. (So what really happened is not that there are two random doors with a 50-50 chance each; what happened is that there was a 1/3 choice and a 2/3 choice, and your chance of screwing up the 2/3 choice, turning it into a 1/3 choice, has gone to nil. So now the 2/3 choice is the better choice.)

2/3 > 1/3, therefore those other 2 doors (now only 1, thanks to Monty) are the better choice: roughly 67% vs. 33%. You might still lose by switching, but out of three trials you're likely to win twice by switching, whereas you'd win only about once by always holding onto your original choice.

Seems crazy, but it's not.
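
For anyone who wants the conditional-probability version of the same argument, here is a short Bayes calculation (my notation; it assumes you picked Door 1, Monty opened Door 3, and Monty chooses at random when both of the other doors hide goats):

$$
P(\text{opens }3 \mid \text{car at }1) = \tfrac{1}{2}, \qquad
P(\text{opens }3 \mid \text{car at }2) = 1, \qquad
P(\text{opens }3 \mid \text{car at }3) = 0
$$

$$
P(\text{car at }2 \mid \text{opens }3)
= \frac{1 \cdot \tfrac{1}{3}}{\tfrac{1}{2}\cdot\tfrac{1}{3} + 1\cdot\tfrac{1}{3} + 0\cdot\tfrac{1}{3}}
= \frac{\tfrac{1}{3}}{\tfrac{1}{2}}
= \tfrac{2}{3}.
$$

Sticking with Door 1 therefore wins with probability 1/3, and switching wins with probability 2/3.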



 
The assumption that the two doors have an equal chance is where most people's intuition fails them. Generally, if you ask them to justify that assumption, they see the correct solution.
 
Thanks, Kraznaya, but the meaning QED is not what stumped me here. And for that matter, the poster was hardly stating a mathematical proof :p

My use of "QED" is very liberal and often facetious. In this case it was more a combination of both, mostly of the former. This is because...

The use of QED seems irrelevant in this case, as there does not appear to be a statement that has been proven. All that's stated is the results of two (opposing) strategies. Therefore I don't understand why the poster is claiming that a proposition has been proven. :confused:

My question still stands: Is the best strategy to always switch?

This is because I did not provide any conclusion. I left the conclusion up to the reader. I have shown that if you always switch, you have a 2/3 chance of winning the car, whereas if you never switch, you have a 1/3 chance of winning the car.

It's up to you to figure out which is the favourable strategy :p
 
The original choice has nothing to do with the odds of getting a car since after it a second choice is given, and new odds are created with that choice.

The first choice has no outcome to create odds for, since the outcome will always be the same: Monty creates a new 50/50 odds scenario.

The actual decision and odds are that 50/50. "Keeping" your original choice is the same as making a new choice of going with that door, which has a 50% chance of being right. The point is that you make a new decision with the new odds based on the new information, not that the first decision has any effective bearing on your outcome.

The issue is at what point the "odds" are being chosen, and this article confuses the choice by taking odds from the first choice and equating them with odds from the second choice, which is like comparing test scores from before a curve with those after a curve.

Well, I got an 85% originally, and Mary got a 93% after the curve, so Mary is the better student...
 
DNK, you're wrong, for the reasons already stated. There are many ways you can think about the problem; several have been posted already.
 
The original choice has nothing to do with the odds of getting a car since after it a second choice is given, and new odds are created with that choice.

That is true, but the new odds aren't 50/50. That's where you go wrong.
 
If you guys want to argue about probability, try this one:

A random guy on the street tells you: "I have two kids. One of them is a daughter. What are the odds that the other one is a boy?"

The correct answer is 2/3. Yet we still end up in a huge argument every time I discuss this with my wife, who thinks the correct answer is 1/2 (the foolish girl!)

What do you say?
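
For what it's worth, here is a quick enumeration sketch of why 2/3 comes out, under the reading "at least one of the two children is a daughter" (Python; the letters and variable names are mine, and it assumes boys and girls are equally likely and independent):

```python
from itertools import product

# All equally likely (first child, second child) combinations.
all_families = list(product("BG", repeat=2))             # BB, BG, GB, GG

# Keep only the families consistent with "at least one is a daughter".
has_daughter = [f for f in all_families if "G" in f]     # BG, GB, GG
other_is_boy = [f for f in has_daughter if "B" in f]     # BG, GB

print(len(other_is_boy), "out of", len(has_daughter))    # 2 out of 3
```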
 

50%, independent odds
 