Talk:Monty Hall problem/Archive 3
{{Automatic archive navigator}}
{{featured}}
{{Mainpage date|July 23|2005}}

{{oldpeerreview}}

I've moved the existing talk page to ], so the edit history is now with the archive page. I've copied back a few recent threads. Older discussions are in ]. Hope this helps, ] 15:28, 28 July 2005 (UTC)


==Actual rules for the gameshow==
:I was initially quite dubious about this edit, but became less so when I looked up ] and realized that it does rather fit the Monty Hall problem after all, at least by the definition in the article. How does a player maximize his chances of walking off with the prize? Of course, this may mean that it's ] that has to be tweaked to clarify the matter. -- ] 23:08, 10 August 2005 (UTC)


::Game theory is used to analyze strategic situations, where "strategic" means that there are interacting interests between two players. For instance, Robert Gibbon's book on game theory starts with this sentence: "Game theory is the study of multiperson decision problems." This is echoed (somewhat less clearly) in our article: "A definitive feature of game theory that distinguishes it from decision theory whose main subject is also studying formalized incentive structures is that game theory encompasses decisions that are made in an environment or states of the world in which strategic interaction between various players occurs." (Actually this sentence is really confusing, I will fix it.) The Monty Hall problem is clearly not a multiperson decision problem, since there is only one interested actor, the player. I think ] or ] would be best descriptions of the problem. --best, kevin <span style="color:#BBBBBB;">·</span><span style="color:#666666;">·</span>·<small>] | ]</small>·<span style="color:#666666;">·</span><span style="color:#BBBBBB;">·</span> 23:55, August 10, 2005 (UTC)


:::Better still, ]?--best, kevin <span style="color:#BBBBBB;">·</span><span style="color:#666666;">·</span>·<small>] | ]</small>·<span style="color:#666666;">·</span><span style="color:#BBBBBB;">·</span> 23:59, August 10, 2005 (UTC)


Ahem, don't we have an encyclopedia to write here? It seems that you are arguing about the distance between two points on a beach ... jeesh. Well, if it is ''really'' important for one of you to prevail here then fine, go at it. Every once in a while however, look around and notice what you are spending your valuable time doing. ;-( ] ] 02:15, 11 August 2005 (UTC)
:If you have read the entire article then you missed the point. It's a scam, "''the game host (who knows what's behind the doors) must open another door, revealing a goat''". If you believe otherwise let me explain one more time (I'm the scammer and you're the scamee or mark). To make the example more obvious we are going to start with ten doors. You choose one of them and by default you un-choose nine of them. Would you agree that your chances at this point are one in ten or 10%? OK then, I then slowly and dramatically -''ta dahh''- open eight doors. '''It is critical at this point that you understand that I know where the prize is so the eight doors that I (the scammer) open are known to me to be non-winners (goats)'''. So, we're down to you with a 10% door and me with a 90% door. Do you ''still'' think it's a 50/50 choice? So, bring it back to three doors and '''so long as you realize that the host (scammer) knows which door is a non-winner which he proceeds to open''' then you should obviously choose '''his''' remaining door whether you start with three or ten or one hundred doors. --] ] 21:33, 6 November 2005 (UTC)


:Also note that it doesn't matter if you think it's wrong. It's objectively right. Probability is nothing but a measure of the fraction of attempts that produce a given outcome. Empirical studies have demonstrated that the answer given in this article is right. No amount of argument or logic can change the fact that this article's conclusion matches reality. --] 22:10, 6 November 2005 (UTC)
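A minimal Monte Carlo sketch in Python (assuming the standard rules as stated in the article: the host knows where the car is, always opens a goat door other than the player's pick, and always offers the switch) reproduces those empirical results:
<pre>
import random

def play(switch, trials=100_000):
    wins = 0
    for _ in range(trials):
        car = random.randrange(3)    # door hiding the car
        pick = random.randrange(3)   # player's initial pick
        # host opens a door that is neither the player's pick nor the car
        opened = random.choice([d for d in range(3) if d != pick and d != car])
        if switch:
            # switch to the one remaining closed door
            pick = next(d for d in range(3) if d != pick and d != opened)
        wins += (pick == car)
    return wins / trials

print("stay:  ", play(switch=False))   # ~0.33
print("switch:", play(switch=True))    # ~0.67
</pre>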


::It does matter to me. I'd like to think that we have explained this in a way that ] comes away with an understanding rather than a dogmatic. But then... --] ] 00:36, 7 November 2005 (UTC)


:::That's a noble goal, but with all due respect, I wasn't talking to you. :-) --] 14:58, 7 November 2005 (UTC)


:I think I see where you're getting confused, so I hope you'll let me try to explain. Let's generalize the problem as given to a new class of problems:
::OK, how's this for a paraphrase: we do away with the numbers and use the layman's term "chances are." You pick a door. Chances are, it's a goat. Then, Monty opens a door that he knows is a goat door. Now, assuming you picked a goat, and of course Monty showed you a goat, the only door left is the car. You'd be a fool not to switch. Of course, it's much less likely that you'd pick the car at first, in which case you'd lose by switching. ] 17:44, 14 December 2005 (UTC)


I think this reasoning is too simplistic, and would give the wrong answer in some cases. For instance, would this reasoning apply to ]? I think it would lead to the wrong answer. (The right answer is that your odds are 50-50 in that case.) --] 21:31, 18 December 2005 (UTC)
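The difference is easy to see in a simulation: if the revealed door is opened blindly (as in the variant alluded to above) and merely happens to hold a goat, the two remaining doors really are 50-50, whereas a host who knowingly avoids the car leaves the switcher with 2/3. A sketch assuming those two reveal rules:
<pre>
import random

def switch_win_rate(host_knows, trials=200_000):
    """P(win by switching), counting only rounds in which the revealed door held a goat."""
    switch_wins = rounds = 0
    for _ in range(trials):
        car = random.randrange(3)
        pick = random.randrange(3)
        others = [d for d in range(3) if d != pick]
        if host_knows:
            opened = random.choice([d for d in others if d != car])
        else:
            opened = random.choice(others)   # opened blindly; may reveal the car
            if opened == car:
                continue                     # discard rounds that reveal the car
        rounds += 1
        switch_wins += (car != pick)         # switching wins iff the first pick was a goat
    return switch_wins / rounds

print(switch_win_rate(host_knows=True))    # ~0.67
print(switch_win_rate(host_knows=False))   # ~0.50
</pre>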


==Actual rules for the gameshow==
*Note: this was posted at the top of this talk page. <small>] ] 15:54, 22 November 2005 (UTC)</small>
I wonder if somewhere in the article it should be pointed out that under the actual rules for the “Let's Make a Deal” game show that this problem seems to be named after, switching doors didn't actually increase your chances of winning. On the game show Monty would only offer the chance to switch ½ of the time if the player initially picked incorrectly, but would always offer the choice if the player initially picked correctly. This throws off the normal analysis in which the choice is always offered, since simply being offered the choice to switch increases the chances that your initial door pick was correct. {{unsigned|128.227.7.193}}
#How do you know this to be true?
#New stuff goes at the '''bottom''' of this page.
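For what it's worth, the rules described in the unsigned post above are easy to check: if the switch is always offered when the first pick is right but only half the time when it is wrong, then merely being offered the switch is evidence that the first pick was right, and switching no longer helps. A quick sketch under that assumption:
<pre>
import random

def switch_win_rate_given_offer(trials=300_000):
    """Offer always made when the first pick is right, half the time when it is wrong."""
    offers = switch_wins = 0
    for _ in range(trials):
        car = random.randrange(3)
        pick = random.randrange(3)
        correct = (pick == car)
        if correct or random.random() < 0.5:   # is the switch offered this round?
            offers += 1
            switch_wins += (not correct)       # switching wins iff the first pick was wrong
    return switch_wins / offers

print(switch_win_rate_given_offer())   # ~0.50 -- switching no longer helps under these rules
</pre>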
On the millionaire show, the contestant does not get to pick an answer and then have two wrong answers removed. Without picking, two are removed and if you then pick randomly there's a 50/50 chance. Even if you mentally (randomly) "pick" and your pick is one of the two remaining answers, the result is a 50/50 chance because your pick is not related to the process by which the other answers are removed. -- ] <small>(])</small> 16:28, 16 December 2005 (UTC)


Agreed. No amount of meditation before the removal of two choices will affect the probability of the final outcomes. You need to tell Monty your pick and have that affect his actions. --] 21:29, 18 December 2005 (UTC)


Note that (in the English version at least) the player can tell the presenter which of the 4 answers they think it is before the 50:50 takes one away, but it is still possible that the one they picked could be taken away, leaving two, and a true 50:50 chance of picking the correct answer at random. --] 21:10, 23 January 2006 (UTC)


50% in all cases!

:So how do you explain the observed fact that the contestant tends to win twice as often when he switches? --] 16:40, 2 March 2006 (UTC)


Why?

==Monty Hall is a Markov Chain==


Well, I may be thick, but I certainly don't get it. After the door has been opened, I am left with two doors. The story so far has been very entertaining, but in fact it has given me no information which will indicate if my first choice was right or not.
:However, <i>if</i> you were wrong, you now know <i>which other door would have been right!</i> And that <i>is</i> new information. -Scarblac 20060223
That means it is now 1:2. How we got to this situation is irrelevant, unless the process of getting there gives me information which is relevant, but I don't see how it does. <br>
The article correctly says that for some statistical calculations the past can be ignored, while for others it cannot. The major fault of the article is that it does not go on to say why in this case the past is relevant. The article cites card counting as an example where the past cannot be ignored - if I know that some cards have already gone (and I know WHICH) then I have information which affects the probability of the next card being an ace, so the past is relevant for future probability. But if we have a sequence of events which are separate events, then we have a ], and previous events in the chain do not affect the next one: for example a series of coin tosses. <br>
The point about Markov is that there is a difference in perspective before and after any event in the sequence. If the chances of tossing a coin and getting heads is 1:2, obviously the chances of doing it twice is 1:4, but if I toss a coin and get heads, the chances of now doing it a second time are 1:2, because the perspective after the first toss no longer takes the past probability into account. This applies to all sequences of probability, unless there is a CAUSAL link between the past event and the current probability (e.g. the last toss dented the coin so that it now falls differently). The Monty Hall problem looks to me like a Markov chain. If I am wrong, then the article has to show that. It does not address this problem, and I seriously doubt any of you can. <br>
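Whether the opened door carries information can be checked by straight enumeration rather than argument. A sketch, assuming the player picked door 1 and the host (who never opens the player's door or the car's door, and chooses at random when two goat doors are available) then opened door 3:
<pre>
from fractions import Fraction

half, third = Fraction(1, 2), Fraction(1, 3)

# For each possible car location, the chance the host opens door 3:
p_host_opens_3 = {1: half,          # car behind 1: host picks door 2 or 3 at random
                  2: Fraction(1),   # car behind 2: host is forced to open door 3
                  3: Fraction(0)}   # car behind 3: host may never open it

prior = {door: third for door in (1, 2, 3)}
joint = {door: prior[door] * p_host_opens_3[door] for door in (1, 2, 3)}
total = sum(joint.values())
posterior = {door: joint[door] / total for door in (1, 2, 3)}
print(posterior)   # door 1: 1/3, door 2: 2/3, door 3: 0 -- the opened door did change the odds
</pre>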


:If you still don't see how absurd it is, then let me pose this: when the door is opened, a goat ''could'' be clean or it could be dirty. So that means there are ''four'' goat possibilities: the host could pick a clean goat 1, a dirty goat 1, a clean goat 2, or a dirty goat 2! Just by considering whether the goat is clean or dirty, we've elevated the chance that the player initially picks the car to four out of six! Now you're saying to yourself "that's ridiculous -- whether the goat is clean or dirty can't change the probabilities!" Yes, exactly right -- and neither can considering ''which'' of two identical goats Monty shows change the probabilities. -- ] 16:04, 25 January 2006 (UTC)

::Since you brought it up in my Discussion Post, I just wanted to point out that the 2 goats actually are physically separate. Solving a problem which involves some abstraction where the 2 goats are just 1 "not car" with a 2/3 chance of being chosen is not solving the original problem. It is solving a parallel problem which only gives the same answer when specific conditions are met: namely that both only winning the car matters and the goats are revealed with the same probability and are both identical in being not goats. All the wiki explanations depend on this abstraction and give no mention of the dependencies which need to be met in order for this approach to work. Considering the goats separate (as they actually are) leads you to the 2/3 probability to get the car by switching. <small>&mdash;''The preceding ] comment was added by'' ] (]&nbsp;&bull;&nbsp;]).</small>

:::No, I'm sorry. You're really just wasting our time, here, because you keep introducing factors which are ''not in the problem'' and claiming that the problem hasn't been fully discussed if we aren't addressing your introduced variations. When we talk about a probability puzzle and we say "X rolls a six-sided die" we do not need to explicitly spell out that the die is not loaded. If the puzzle states that the die ''is'' loaded, then we address that factor, but it is just useless complication to insist that in all cases it be explored what happens if the die is loaded or if Monty really loves one goat a lot more than the other or whatever else it is. -- ] 18:18, 21 February 2006 (UTC)


::I'm not sure anymore whether the argument is that switching improves your odds, or whether the whole business is a scam. It reminds me of the card game scam that Wednesday uses to cheat the waitress in ]. One thing I didn't like is that the sums of the probabilities at the bottom of the probability diagram add up to 200%, when they are only allowed to go to 100%. i.e. the odds when the player picks goat x and does whatever should only be a portion of the original slice. By the time you get to the end, you are pretending that you have 2 pies.


:::Frankly, I think the diagram is not too good, and should be replaced with two trees -- one showing what happens if you use a "switching" strategy each time, one showing what happens if you use a "staying" strategy each time. The current diagram seems to confuse a lot of people into thinking that staying or switching is going to be picked ''for'' them, which is of course not the point. -- ] 02:51, 2 February 2006 (UTC)

:::: I've tried to draw the full set of scenarios for this, and my finding is that both switching and not switching have the same odds of 50%. In total, I have 24 scenarios: half of them result in a loss, the other half are winning scenarios. Here they are:
:::::
:::::Car is put at Door (D)1 - Player picks D1 - Host picks D2 - P is not switching - WIN
:::::Car is put at Door (D)1 - Player picks D1 - Host picks D2 - P is switching - LOSE
:::::Car is put at Door (D)1 - Player picks D1 - Host picks D3 - P is not switching - WIN
:::::Car is put at Door (D)1 - Player picks D1 - Host picks D3 - P is switching - LOSE
:::::Car is put at Door (D)1 - Player picks D2 - Host picks D3 - P is not switching - LOSE
:::::Car is put at Door (D)1 - Player picks D2 - Host picks D3 - P is switching - WIN
:::::Car is put at Door (D)1 - Player picks D3 - Host picks D2 - P is not switching - LOSE
:::::Car is put at Door (D)1 - Player picks D3 - Host picks D2 - P is switching - WIN
:::::(repeat the same scenario for Car is put at Door 2 and 3).

::::So, in total, we will have 24 scenarios with :
::::6 scenarios of player switching and win
::::6 scenarios of player not switching and win
::::6 scenarios of player switching and lose
::::6 scenarios of player not switching and lose

::::Therefore, the player's strategies of switching and not switching stand an equal chance of winning (50%).
] 10:12, 13 February 2006 (UTC) Hartono Zhuang
:::::The events are not of equal probability. For example, according to the above, when the car is at D1, the player will pick D2 2/8 times (1/4), but D1 4/8 times (1/2). The player has no reason to pick D1 more often than D2. This is a common fallacy: distinct events feel like they should all have the same probability. Clearly, here, they don't. --] 10:32, 13 February 2006 (UTC)
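Attaching the correct weights to that list makes the point concrete: the car placement and the player's pick each carry probability 1/3, and when the host has a free choice between two goat doors his choice splits that branch in half rather than creating two new full-weight scenarios. A sketch of the weighted count:
<pre>
from fractions import Fraction
from itertools import product

third = Fraction(1, 3)
win = {"switch": Fraction(0), "stay": Fraction(0)}

for car, pick in product((1, 2, 3), repeat=2):
    weight = third * third                 # car placement and player pick are independent
    goat_doors = [d for d in (1, 2, 3) if d != pick and d != car]
    for opened in goat_doors:
        w = weight / len(goat_doors)       # host splits his choice when he has one
        other = next(d for d in (1, 2, 3) if d not in (pick, opened))
        win["stay"] += w * (pick == car)
        win["switch"] += w * (other == car)

print(win)   # switch: 2/3, stay: 1/3
</pre>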


==Markov again==
Somebody please help - I think I understand the correct answer, but my mind is still struggling to get around the following:
Imagine contestant A chooses door 1. Monty Hall then opens door 3 to reveal a goat. At this point you introduce contestant B. Contestant B has no prior knowledge of the game. He is told he has been "allocated" door 1, he does not know why door 3 is open. He is effectively in the same position as contestant A, but he does not know that the game is fixed. This time both contestant A and B are offered the choice to switch or stick. Surely the percentage chance for contestant B is 50/50. If so, how can the same two doors have different probabilities of a prize at the same time for two people standing in front of them? If contestant B does not have 50/50, why not?

: If I may interject out of sequence... Imagine a third player C who has been told which door conceals the car. The probabilities for him are clearly 100% and 0% for the two doors. Does that mean it's also 100% and 0% for A and B? It depends on your point of view. Probability is nothing but a quantitative measure of uncertainty, and all three players A, B, and C have different levels of uncertainty because they have different information. --] 02:54, 1 March 2006 (UTC)

: Yes, the chances for B are 50/50; B is playing a different game. For A, there were initially three choices, all equally likely. For B, he sees one open door, with a goat in it. Note also how one of the rules of the game comes into play here: there is always ONE car and TWO goats. But for B, since door 3 is not available as a choice, there are only two choices, one of which has a car, the other a goat. The fact that B has been "allocated" the first door doesn't matter. Summary: A has three choices initially, all equally likely to have the car. B has two choices, both equally likely to have the car. After the host action, A has two choices, NOT equally likely (stay with door 1, or swap to door 2). B arrives after the host action, so there is no before and after for him. Look at it another way: the game is vastly different for B because the whole 1/3 chance that door three might have the car has been taken away, and distributed randomly into the two remaining doors. Lucky B! --] 21:52, 6 February 2006 (UTC)

:*No!!! It doesn't change anything - make B the Siamese twin of A - it makes no difference whose lips are moving. ] ] 11:35, 9 February 2006 (UTC)

::Oops! I agree I was wrong. Sorry for the misinformation. It's so easy to let your intuition lead you astray with this one. B's chances are the same as A's, because the allocated door is not chosen randomly, nor is the door which is shown to be open. --] 22:18, 7 February 2006 (UTC)
::If I may interject -- there is a ''very big difference'', which I feel we're muddying here, between what one's chances ''actually'' are and what one ''perceives'' one's chances to be. B's chances ''actually'' are 2/3, the same as A. B may ''perceive'' his chances as 50/50, based on the information available to him, but we know in this case that there is important information that B does not have which changes the odds. If B, on some random whim, decided to adopt a strategy of "I'll always pick a door other than the one allocated to me", he would win 2/3rds of the time, even though his incomplete knowledge suggests incorrectly that he should only be winning 1/2 of the time. -- ] 23:53, 6 February 2006 (UTC)

No, B's chances of correctly choosing the winning door, when fully utilizing the information available to him, are 50/50 because he cannot differentiate one door from the other—that is, he doesn't know which of the two doors you picked. What is <i>Monty's</i> chance of picking the correct door, when fully utilizing the information available to him? It's certain, of course. So you cannot make the case that the odds can't change, because they are distinct between differing amounts of knowledge. Of course this is the case. A's chances are 2/3 because he is fully utilizing the information available to him. In his case, the extra bit of information is that he knows that of the two doors you didn't pick, Monty will never open the winning door for you, thus his actions are constrained by the fact that <i>he</i> knows something and his actions tell A <i>partly</i> what that information is. The one thing that A doesn't know that Monty knows, is anything more about the door he initially chose than he ever did. The odds of that particular door being correct are 1/3, which they have always been, and will continue to be.

I wrote the first specific web page on the MHP ten years ago, it has been referenced widely, and I have corresponded with countless people about this. No one way of presenting this problem is always effective at realizing comprehension. But one approach has had good success, and that's the million door version of the problem. If you pick a door from among a million, obviously your odds of it being the winning door are a million-to-one. However, if Monty opens all remaining 999,999 doors <b>excepting</b> the winning door, and presents you with two doors, your choice and the remaining door, is it better to switch or to stay? And if someone else walks along at that moment, not knowing anything about which door you picked and which doors Monty opened, their best odds for correctly picking the right door are 50-50. But the other guy is nearly assured of choosing the correct door—it's the one in almost a million doors that Monty very specifically, knowingly, did not open. There's a very small possibility that he didn't avoid any particular door because <i>your</i> door was the correct door. But the odds against your having correctly chosen the winning door from a million are 999,999 to one. It's extremely unlikely. (kmellis@kmellis.com) ] 06:27, 7 February 2006 (UTC)
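The million-door version is also trivial to simulate; in the sketch below the switching player loses only when the one-in-a-million first pick happened to be the car:
<pre>
import random

def million_door_switcher(n=1_000_000, trials=1_000):
    """Host opens every door except the player's pick and one other, never revealing the car."""
    switch_wins = 0
    for _ in range(trials):
        car = random.randrange(n)
        pick = random.randrange(n)
        # The one door left closed besides the pick is the car's door,
        # unless the pick already is the car, in which case it is some other door.
        remaining = car if car != pick else (pick + 1) % n
        switch_wins += (remaining == car)
    return switch_wins / trials

print(million_door_switcher())   # almost always 1.0; the true value is 999,999/1,000,000
</pre>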

:We need to remain clear about this. B's chances of winning if he follows the same strategy as A are 2/3, just as A's chances are. The fact that he has no way of knowing that there ''is'' an optimal strategy or what it is doesn't alter the chances he would have if he followed that strategy. We might as well flip a double-headed coin and say that B has a 50/50 chance of being right if he guesses it'll come up "tails".

:Now if you ask a ''different'' question, which is "can an optimal strategy be deduced from the information ''which B has''?" then the answer is "no". But two people in a row stated that B's chances are actually 50/50; they are not. The question is "is there an optimal strategy?" not "would someone deprived of certain vital pieces of information about the game ''know'' that there is an optimal strategy?" -- ] 15:41, 7 February 2006 (UTC)

::Very true. In this scenario, nothing has changed; B is still better off switching. But for all B knows, the game is designed such that he's always allocated the ''winning'' door, another door is opened, and the host tries to trick him into switching. Or perhaps (in B's mind, anyway) the game is that the host always picks the correct door, then he's "allocated" one of two doors, neither of which will ever win. He simply doesn't know.

::Still, the laws of probability don't change simply based on what you know. But as far as B can tell from the available information, switching or staying might be a better option, or it might not make a difference. This means his chances balance out to 50/50 (I think), but only if he has a 50/50 chance of switching or staying (1/2*1/3 + 1/2*2/3 = 1/2). And all it takes is telling B "you should always switch doors" to bump his odds back up to 2/3. &ndash; ] 16:07, 7 February 2006 (UTC)

:::<i>"...laws of probability don't change simply based on what you know"</i>. In this sense, they certainly do. By the statement of the problem with regard to A and B, B enters the room after both A and Monty have made decisions. B has no information with which to differentiate A and B and has exactly a 50-50 chance of picking the winning door. I don't understand why people would try to determine some Platonic ideal probability value for the door and the prize independent of someone making a choice. It makes no sense. Or, if it does, the probability is null. If I flip a coin that is perfectly constructed, catch it in my hand and look at it, and you try to guess which side it is, based upon what you know, you have exactly a 50-50 chance of being correct. If I "guess", based upon what I know, I have a 100% chance of getting it right. There is no independent value for the probability of guessing <i>outside the context of someone guessing</i>.

:::In the Monty Hall Problem, you can understand the varying probabilities by evaluating what each person knows: Monty, A, and B. Monty knows everything there is to know, and B knows nothing (other than that there actually is a prize behind only one door). A, however, has a 66% probability of having been told by Monty's actions everything that Monty knows, specifically where the prize is. A third of the time Monty tells A nothing. 2/3 of the time, he tells A everything. If A bases his decision on the assumption that Monty has told him everything, he will win 2/3 of the time. He will only lose when his first choice of door was correct.

:::There's a couple of ambiguities in how you guys are talking about this. The first is the implicit assumption that we're talking about whether or not A or B knows there is a winning strategy. That's a very confusing diversion. It's sort of a meta-MHP. In discussing the problem, we're taking a God's eye view of the matter and are evaluating the odds of picking the winning door if using various strategies. In that context, all evaluation assumes that A assumes that a particular strategy is the winning strategy and acts upon it in each respective tree. That you want to talk about where the winning door is likely to be depending upon whether or not A knows the optimal strategy is a mixing of levels of analysis. And even though you know and understand, at least to some degree, the correctness of this answer to the MHP, your desire to mix levels is indicative of exactly why people's intuition about the problem is misleading. They don't know what perspective to take. Or, alternatively, they implicitly take B's perspective even though the problem statement allows them A's perspective. Or, alternatively, they attempt some sort of ideal perspective, which they assume is essentially B's. Here's why that's misleading: they could also incorrectly choose to take <i>Monty's</i> perspective (that is, they know where the prize is). What do the concepts of "staying" and "switching" and "winning" mean in that context? There's still "winning", but the concept of staying/switching seems absurd because it's deliberately ignoring the fact that from this perspective you already know where the prize is. Similarly, assuming B's perspective <i>also</i> renders the concept of staying/switching meaningless.

:::The reason we don't say that probabilities vary by "how much someone knows" is because when we evaluate probabilities we're assuming that everything that is possible to be known in a given problem statement is known, excepting the outcome. (We even do this with regard to past events, which is dubious, but that's a different discussion.) The MHP quite clearly states the problem as an evaluation of probabilities from A's perspective. Thus we assume that A knows everything that A can know and we proxy for A in evaluating various strategies. In doing so, we learn that when we're (or A is) in that exact situation, switching will on average allow us to win 2/3 of the time. If you want to talk about something that is very like the MHP but takes a different perspective, then you must very deliberately state the problem from that perspective so that it's clear how one should evaluate it to discover the answer.

::::''I don't understand why people would try to determine some Platonic ideal probability value for the door and the prize independent of someone making a choice.'' Because that is the only way to compare apples to apples, instead of apples to oranges. If you look at the suggestion by 193.129.187.183 which started us down this whole road of discussing B, you'll see that he/she asked "how can the same two doors have different probabilities of a prize at the same time for two people standing in front of them?" It was therefore important to clarify that in terms of what ''actual'' probabilities the doors had, they ''did not'' have different probabilities. It's only when you ''change'' the question from "what is the ''actual'' probability of the two doors" to "what is the probability that ''this person'' will find the winning door" that you actually see the probabilities being different for different people. -- ] 01:42, 9 February 2006 (UTC)


:::::I think you should carefully consider what, if anything, your statement about "the 'actual' probability of the two doors" could possibly mean. To short-circuit the Socratic method, I'll just claim that if it means anything, it means that there is a probability of 100% that the prize is behind the door that it is behind. This system taken in isolation isn't probabilistic, it's already determined. The only probabilistic perspectives are those where an observer has less than complete information. And each one of those perspectives constitutes an independent problem. A, having picked a door and then watching Monty opening a door and knowing what that implies, should switch, and he'll win 2/3 of the time. B, not having picked a door, nor knowing which door Monty opened, sees only two doors that he cannot differentiate from each other in any way. There can be no "switching" or "staying" in a problem statement for this B fellow; the best he—and we—can do is say that if he randomly chooses a door, he has a 50% chance of choosing the right door. Finally, Monty, who knows where the prize actually is, would pick the winning door because he knows which door the winning door is. A problem statement about Monty would be something like "should Monty pick the door he knows hides the prize, or the door he knows hides the goat?" And of course the answer to that is that he'll always win if he chooses the winning door and he'll always lose if he chooses the losing door. These are three different problems. Only Monty's perspective is arguably the supposedly privileged perspective; but I suspect the most rigorous analysis would show that the only thing you could possibly mean by thinking of a privileged view, an inherent probability between the two doors, is a determined system that is completely known. You always "win" (''Keith M Ellis, kmellis@kmellis.com, www.montyhallproblem.com'').
::::::Your analysis is correct but is only apt to confuse people by leading them away from the actual crux of the problem. Yes, if we have ''only'' as much information as B does, we only have a 50/50 chance of picking the right door. Yes, if we have the same information that A does (and apply it optimally, of course) then we have a 2/3 chance of picking the right door. And yes, if we have the same information that Monty does, then we have a 100% chance of picking the right door. What those who are not grasping the problem yet struggle with, however, is not why B's chances are 50/50 but why A's ''aren't''. There are two ways to clarify this: One is to say "Monty gave A information about which door ''doesn't'' have the car; B actually has this same information too, but he is missing information about what happened before the number of doors was reduced to two, so he doesn't know that there were three different ways that they could have arrived at two doors and two of those three ways result in the car being behind Monty's door." The other is to go straight to the heart of the matter and say "There were three different ways that they could have arrived at two doors and two of those three ways result in the car being behind Monty's door." Obviously I think the latter is preferable. -- ] 18:28, 9 February 2006 (UTC)

(unwrapping back to left:)
You state the following:
<blockquote>B, '''not having picked a door''', nor knowing which door Monty opened sees only two doors that he cannot differentiate from each other in any way. There can be no "switching" or "staying" in a problem statement for this B fellow, &hellip;</blockquote>
But the scenario we were discussing states the following:
<blockquote>Contestant B has no prior knowledge of the game. He is told he has been "allocated" door 1, he does not know why door 3 is open. He is effectively in the same position as contestant A, but he does not know that the game is fixed.</blockquote>
Hence, there is indeed the concept of switching, due to the "allocation" of a door. If B picks to switch or stay randomly, he has 50-50 chances. If you tell B to switch, he has a two thirds chance. This is why I say the probability (of winning via switching versus via staying) does not change based on what you know. If you want to say that B doesn't even know what door has been allocated, you may, but that's a whole different problem. &ndash; ] 04:26, 10 February 2006 (UTC)
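Both of those figures come straight out of a simulation of the B scenario: a B who cannot tell the doors apart and flips a coin wins about half the time, while a B who is simply told to take the non-allocated door wins about 2/3 of the time. A sketch, assuming the allocated door is A's original, uninformed pick:
<pre>
import random

def contestant_b(strategy, trials=200_000):
    """B is 'allocated' A's door after the host has already opened a goat door."""
    wins = 0
    for _ in range(trials):
        car = random.randrange(3)
        allocated = random.randrange(3)   # A's original, uninformed pick
        opened = random.choice([d for d in range(3) if d != allocated and d != car])
        other = next(d for d in range(3) if d not in (allocated, opened))
        if strategy == "random":
            final = random.choice([allocated, other])   # B can't tell the two doors apart
        else:                                           # "always switch"
            final = other
        wins += (final == car)
    return wins / trials

print(contestant_b("random"))          # ~0.50
print(contestant_b("always switch"))   # ~0.67
</pre>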

==This page is monstrous==

I edited this page some time ago to clearly state the problem and the answer. Since then it has been edited back into a state. The page is a mess because there is no single, clear problem and answer statement. It is long and rambling. There are several statements of the gameshow, alternative versions, multiple versions of the answer, anecdotes, and all manner of irrelevant nonsense, and the important information is simply lost. This is an example of an article where the lack of an authority and a team of expert researchers, as is found in a "real" encyclopaedia, leads to a polarisation of opinion and hence more and more verbose explanation in order to convince those who fail to understand the right answer to prevent them from editing incorrectly.

By the way, the correct answer is yes, you should switch. If you don't believe it then set up the game yourself with an accomplice, try it 100 times and see what answer you get. Alternatively, use a computer simulation, or even do the probability theory from first principles (but do it properly). Any mathematical explanation that reaches a different answer has a flaw in it. Ignore your intuition. PK

:I agree, but aside from deleting most of the page and then locking it, how can you solve this? Besides, although '''we''' know the correct answer, it can take a lot to convince someone else that their intuition is wrong &mdash; typically by explaining it to him or her in exactly the right terms. By debating it on the talk page, explaining it different ways until the opposing party "gets it", and then putting that method on the page, we have organically come up with a system that (judging from the reduced debate on this page) convinces most people of the correctness of the answer.

:I would rather a long and overly verbose page, that expresses the truth in a way that almost anyone can understand, than a short and concise page that most people would look at and think "that's bogus" &mdash; and either edit it, or just walk away convinced that Misplaced Pages is full of lies. &ndash; ] 17:13, 10 February 2006 (UTC)

== Dual data sets being ignored ==

On these types of problems, there are dual data sets at work. Set one is the information which allows us to determine the "odds" of finding the item being sought. Set two is the information which allows us to determine what the underlying statistical distribution of winning choices is. Originally, the "odds" of both data sets are the same, but they do deviate when additional information is acquired. The additional information is provided by the certainty that the removed choice is a loser. Because of this, the choosing party is no longer making a guess, but instead is making an informed calculation. Please look at the definition for ''"To predict (a result or an event) without sufficient information"''. Please take note that when one has sufficient information, one is no longer guessing. The removal of one choice provides us with more information and we move from the position of a mere guess to that of an educated guess, which is not the same thing. Now as to the "statistical distribution" angle: If you have 10 shoe boxes on your desk and one of them contains an egg, the distribution percentage is 1/10 or 10%. Those numbers never change. However, when we gain more information about the contents - say by opening a few - our odds of finding the egg increase. People argue about these problems because of the tricky idea that there is true "guessing" involved when there is not. And also because they forget that the numbers regarding the original distribution of winning choices are fixed. Only the odds of finding the item improve, not the statistical likelihood that it actually existed. ] 05:48, 11 February 2006 (UTC)

:It seems that no matter how this is explained (including your logical choice of words) the argument will not end. There are some who will follow the theatrics rather than the logic no matter what. Give it a go in the article if you think it will help. ] ] 01:11, 12 February 2006 (UTC)

== Another way to explain it ==

If you can accept the following fact, it may be easier to see that switching makes the probability of getting the car 2/3.

Fact: If you choose a door with a goat behind it, switching will get you the car; if you choose the door with the car behind it, switching will get you a goat.

Let's take it from the start. You choose a door, let's say door C. There is now a 1/3 chance that you chose the door with the car behind it. The host now reveals one of the doors with a goat behind it, let's say he reveals door A with a goat behind it. There is now ONE goat behind a door which you can't see, and ONE car behind a door you can't see. You now have 2 doors which you don't know what's behind; the only thing you know is that one car and one goat are left and they are hiding behind these two doors. (Just to make it perfectly clear, it COULD be that behind door C there is a goat, and therefore behind door B there is a car; it also COULD be that behind door C there is a car and therefore there is a goat behind door B; it CANNOT be that behind the two doors left there is one goat each, because the host has already revealed a door with a goat behind it.) Now comes the fact that will make it easier to understand: If you switch doors, it is now GUARANTEED that IF there is a goat behind door C (the door you first chose) you are switching to the door with the car behind it (door B); IF there is a car behind door C then it is GUARANTEED that you are switching to a goat (door B).
As you first chose door C, there was a 2/3 chance of selecting a door with a goat behind it; when you switch, it is GUARANTEED that you are switching to a door whose contents are not the same as those of the initial door you chose (read that sentence twice). And therefore if you switch there is a 2/3 chance of getting the car.

:That seems like a pretty convoluted way of saying "picking a goat and switching will get you a car, and you have 2/3rds chance of picking a goat". :) Really, that's the absolute simplest explanation. &ndash; ] 00:10, 14 February 2006 (UTC)

OK, I agree; what I meant was, here is another way to explain it.
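That one-sentence version can even be written out as an enumeration over the three equally likely first picks:
<pre>
# What always-switching ends with, for each equally likely first pick:
#   first pick goat 1 -> host must show goat 2 -> switch -> car
#   first pick goat 2 -> host must show goat 1 -> switch -> car
#   first pick car    -> host shows either goat -> switch -> a goat
first_picks = ("goat 1", "goat 2", "car")
switch_wins = sum(pick != "car" for pick in first_picks)
print(f"switching wins in {switch_wins} of {len(first_picks)} equally likely cases")   # 2 of 3
</pre>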

== (deleted) Disagreement with explanations ==

This whole section has become an attack and counter-attack on each other's arguing techniques. If you want to continue this, you can do it on your user talk pages. --] 17:21, 27 February 2006 (UTC)

==One more time==
Start with the usual. You pick ONE door and the host gets the remaining TWO doors. Stop. Now would you think it a good idea to swap your ONE door for the host's TWO doors? If you say NO, then I have nothing more for you, go away.

But if you say YES, I want to swap my ONE door for the host's TWO doors, then you're almost there. That's it, except for the host's theatrics of opening a losing (and he knows it) door. What you're doing is swapping your ONE door for his TWO doors, even if he tried to mess with your head by opening one of them (oh, sure, like he's going to open a car door) and the diversions like the audience screaming "swap" - "don't swap" and the crew saying "cue the flashing lights" and the "host's silly grin". It's all about messing with your head, remember: you're swapping your ONE door for his TWO doors.

To dramatize the situation, start with ten doors. You get ONE and the host gets NINE (they would never do this as it would expose the whole thing). Now, the host (between commercials) opens EIGHT of his NINE doors (never ever opening a car door and he knows it). Wacha think now? ] ] 02:25, 18 February 2006 (UTC)

==Completeness of the explanations==
The lesson to learn from the Monty Hall problem isn't one of mathematics, but of psychology -- that your instincts might be incorrect. With that in mind, the explanations given don't have to be complete. They have these two goals: 1) to be correct, 2) to be easy to understand. I tried wading into the Bayes theorem article and just couldn't make my way, so I don't think it meets goal #2 as a means of explanation.

I find it difficult to see in what way T.Z.K. disagrees with the article. I didn't like the chart that showed probabilities adding up to 200%, so I changed it. If T.Z.K. would like an article that actually is directed to mathematics, then I suggest ]. I edited the grammar there, but the phrasing of the math is still weak. If he wants to revise wording then he can give that a shot too.

I like the extension he proposed to the problem. If one goat is blue and the other red, and you know that Monty always picks the red goat when he has a choice, then if he opens a door with a blue goat you must switch for it is 100% that your original pick was the red goat; if he shows a red goat it is now 50-50 whether or not to switch.
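Those two figures check out by enumeration (a sketch, assuming as above that Monty shows the red goat whenever he has a choice):
<pre>
from fractions import Fraction

third = Fraction(1, 3)
totals = {}   # goat shown -> (P(shown and switch wins), P(shown))
for pick in ("car", "red goat", "blue goat"):                   # equally likely first picks
    shown = "blue goat" if pick == "red goat" else "red goat"   # Monty prefers the red goat
    wins, seen = totals.get(shown, (Fraction(0), Fraction(0)))
    totals[shown] = (wins + third * (pick != "car"), seen + third)

for shown, (wins, seen) in totals.items():
    print(f"{shown} shown: P(switch wins) = {wins / seen}")
# red goat shown:  P(switch wins) = 1/2
# blue goat shown: P(switch wins) = 1
</pre>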

Such extensions are better off left out of the article lest they add to confusion.

Lastly, I like Antaeus. He is remarkably patient with those who come here for help. ] 18:35, 22 February 2006 (UTC)

:Thank you, Jethro -- you don't know how good that is to hear. Sometimes, even though I work hard at it, I feel like patience is what I'm worst at! -- ] 20:49, 22 February 2006 (UTC)

::Well, you're certainly more patient than I &mdash; hence my current silence. ;) &ndash; ] 04:18, 23 February 2006 (UTC)

:::Antaeus, "''The superior man is modest in his speech, but exceeds in his actions.''" <small>-Confucius</small> ] ] 13:01, 23 February 2006 (UTC)

:Actually it has nothing to do with mathematics. One thing that many self-centered pseudo-intellectuals such as feldspar fail to recognize is that one person's intuition isn't everyone's intuition. Many people have read the current explanation and found it utterly useless because of obvious problems. The most glaring is that it ignores all information given when a goat is revealed as well as ignoring the fact that you are given a chance to change your choice with the new information and changed probabilities. This alone causes many people to outright reject the explanation because to them it is "obviously incorrect" regardless of whether or not the answer it arrives at is correct. In other words, their intuition is different from yours.

:To see what I am talking about just make a real Venn diagram of the actual problem, with 3 circles: Car, Goat 1, and Goat 2. Then draw 2 ovals, one overlapping car and goat 1 (goat 2 revealed), and the other overlapping car and goat 2 (goat 1 revealed). Each circle has a total of 1/3 probability and since a goat is always revealed all probability is in a circle and an oval. The car's circle has 1/6 in each oval inside it (make sure the ovals aren't overlapping). Now when a goat is revealed simply cover up the oval that represents the other goat being revealed. You should be left with a 1/6 car chance and a 1/3 chance that you chose the other goat. This is the real solution in a form that is easiest to see.
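Normalizing the two regions that remain after the cover-up gives the same answer as the rest of the page: the 1/6 and the 1/3 are the only probability left in play, so the conditional chances are 1/3 for staying and 2/3 for switching, as a quick check shows:
<pre>
from fractions import Fraction

p_picked_car   = Fraction(1, 6)   # picked the car AND this particular goat was revealed
p_picked_other = Fraction(1, 3)   # picked the other goat, whose reveal was then forced
total = p_picked_car + p_picked_other

print("stay wins:  ", p_picked_car / total)     # 1/3
print("switch wins:", p_picked_other / total)   # 2/3
</pre>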

:On one hand the solution is better because it is universal. Feldspar's argument is that to him it is obvious that the real problem is just like this other problem where the goats are one abstract "not car" with a 2/3 probability of being chosen. He would fail in convincing many people of that. However no one can doubt a solution which looks at the problem this way, because it's the actual problem, i.e. the goats are physically separate and have separate chances of being chosen etc.

:As far as whether or not it is correct and is easy to understand: because there is no omnipotent creature to evaluate whether or not something is correct and tell us, all we can do is make sure our methods are sound. If O.J.'s lawyer tells someone "If the glove don't fit, you must acquit!" there are people who would call this an easy to understand explanation that gives the correct answer. And if OJ just happened to be not guilty, it would be. Are you comfortable with such a "simple and correct explanation" even in that case? I hope not, because no one wants people setting murderers free in other cases because some random glove didn't fit. Simple and correct often = sounds like it makes sense but often gives the wrong answer.

:As a simple warning, this feldspar character is not at all what he tries to appear as. He resorts to very devious tactics in arguments, such as editing his opponents' writings to make them appear wrong about something and then pretending to be objective. I am glad that people such as he only have limited power over something like wikipedia. There are many forums run by opinion nazis like feldspar that simply boot those who disagree and then come to the conclusion that since everyone on their forum agrees with them they must be right, therefore they are justified in booting people who disagree. The sad thing is that type of system actually seems to work. Rather than fight about something most people are apathetic enough to just agree with whoever is in charge wherever they are. Keep an open mind and never allow yourself to reason like "someone is correct because they are always correct" or "I agree because everyone agrees" etc as they are circular arguments. Decide things for yourself carefully. Of course many people would not be convinced. <small><span class="autosigned">—Preceding ] comment added by ] (] • ]) </span></small>

::Two things, 69.180.7.137. First, you are badly violating the policy of ]. Secondly, you have been repeatedly doing exactly what you accuse me of, "editing his opponents writings to make them appear wrong about something", and several times before that. Please stop doing this; not only is it very rude and against policy, you can hardly imagine that, having been caught at it several times, you can get away with trying it again. -- ] 17:16, 27 February 2006 (UTC)

== Number of doors vs. number of ways of getting there ==

The answer to the problem is NO. The logical fallacy of the reasoning which leads to the positive conclusion lies in considering the second goat when there is no longer a second goat.

Think of this: two doors remain. One has a goat, the other has a car. Chances are 50/50. It does not matter whether my door conceals the car or the goat. My switching or not is like a new decision, a new choice. In this new choice, I choose to keep my door (which is the same as saying that I re-pick that door), or to pick the other door. The open door no longer counts. The chance to switch creates a new choice with new alternatives, erasing the old ones.

:The answer really is YES. Even though there are only two doors left, there are ''three'' ways of getting there: by picking goat #1, by picking goat #2, or by picking the car. Two out of those three ways result in switching being the right answer. -- ] 21:27, 24 February 2006 (UTC)

== Does it matter how Monty picks which goat to show? ==

Recently an editor brought up an interesting point for discussion: namely, the possibility that the answer to the problem might change depending on whether Monty picks a goat to show (when he has a choice) with equal probability for each goat, or with unequal probability. I'd like to discuss why the answer is "no, the answer to the problem does not change."

To see why not, let's start with a discussion of how to calculate probabilities in a case where one probability determines what situation you are in, and the situation determines your chance of "winning" from that situation. For instance, consider a game where you and an opponent each pick a card from separate decks; if your cards are both face cards (J, Q, K, A) or both non-face cards (2-10) you win; otherwise you lose. The way to calculate the ''total'' probability is to multiply your chance of getting into each situation by the chance of winning in that situation, and add those products together. So, in our example game, you have a 4 in 13 chance of picking a face card, and in that situation a 4 in 13 chance that your opponent will also pick a face card so that you win; similarly, you have a 9 in 13 chance of getting a non-face card, and in that situation a 9 in 13 chance of winning. The total probability of winning is thus ((4/13)*(4/13)) + ((9/13)*(9/13)) = 16/169 + 81/169 = 97/169.
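
For anyone who wants to check the arithmetic, here is a small Python sketch of the same calculation; the use of the fractions module is only to keep the result exact, and the variable names are illustrative.

<pre>
from fractions import Fraction

# Chance of drawing a face card (J, Q, K, A) versus a non-face card (2-10)
# from a 13-rank deck, as in the example game above.
face = Fraction(4, 13)
nonface = Fraction(9, 13)

# Total probability of winning: both players draw face cards, or both draw non-face cards.
p_win = face * face + nonface * nonface
print(p_win)   # 97/169
</pre>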

So let us say that Monty picks a goat to show (''when'' he has a choice) with possibly-unequal probability: for every ''x'' times that he chooses to show Goat 1, he chooses to show Goat 2 ''y'' times. Thus, when he has a choice of which goat to show, he shows Goat 1 ''x/(x+y)'' of the time, and Goat 2 ''y/(x+y)'' of the time.

How often does Monty show Goat 1 in total, then? Well, Monty only ''has'' a choice about which goat to show when the player chooses the car initially. By the conditions of the problem, the player's initial choice is equally likely to be the car, Goat 1 or Goat 2. Therefore, if Monty ''has'' a choice of goats ''x+y'' times, then ''x+y'' times he must ''have'' to show Goat 1 because the player picked Goat 2. Therefore, Monty shows Goat 1 ''2x+y'' times in all -- ''x'' times because he chooses to, ''x+y'' times because he has to. By similar logic we can determine that Monty shows Goat 2 ''x+2y'' times.

The chance that the player sees Goat 1 is therefore ''(2x+y)/(3x+3y)'', and the chance that the player sees Goat 2 is likewise ''(x+2y)/(3x+3y)''. We can now look at what the chances are of winning if we employ a switching strategy in these situations. We already saw that Monty shows Goat 1 ''2x+y'' times in total. ''x+y'' of those times, he had to show Goat 1 because the player had chosen Goat 2; these are cases where the player must switch to get the car. The remaining ''x'' times, Monty had a choice of which goat to show; having a choice means that the player picked the car initially. Therefore, when the player is looking at Goat 1, his chance of winning with a switching strategy is ''(x+y)/(2x+y)''.

Before we do the same for Goat 2, let's stay with Goat 1 a little further. We already know we're going to multiply the probability of seeing Goat 1, ''(2x+y)/(3x+3y)'', by the probability of winning by switching when looking at Goat 1, ''(x+y)/(2x+y)''. Since the numerator of the first number is the same as the denominator of the second, they cancel out, and the product of the two probabilities is ''(x+y)/(3x+3y)'' -- which we can see always works out to 1/3, no matter what values ''x'' and ''y'' have! The same logic must apply to Goat 2, as well. This accords with what we already know about the problem: 1/3 of the time, the player has chosen Goat 1, and must switch to win; 1/3 of the time, the player has chosen Goat 2, and must switch to win. The remaining 1/3 of the time, the player chooses the car initially; no matter how Monty makes his choice about which goat to reveal, it does not change the fact that in this case, switching will lose and staying will win.
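
If it helps, the algebra above can be checked mechanically. The short Python sketch below plugs the counts derived above into exact fractions and evaluates the total chance of winning by switching for a few arbitrary values of ''x'' and ''y''; the function name and the sample values are just for illustration.

<pre>
from fractions import Fraction

def switching_win_chance(x, y):
    # x, y: how often Monty shows Goat 1 vs. Goat 2 when he has a free choice
    x, y = Fraction(x), Fraction(y)
    total = 3 * (x + y)              # total number of games, counted as above
    see_g1 = (2*x + y) / total       # player is shown Goat 1
    see_g2 = (x + 2*y) / total       # player is shown Goat 2
    win_g1 = (x + y) / (2*x + y)     # switching wins, given Goat 1 is shown
    win_g2 = (x + y) / (x + 2*y)     # switching wins, given Goat 2 is shown
    return see_g1 * win_g1 + see_g2 * win_g2

for x, y in [(1, 1), (1, 3), (7, 2), (99, 1)]:
    print((x, y), switching_win_chance(x, y))   # prints 2/3 every time
</pre>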

(I want to address one more minor point on this matter, but I'm out of time at this computer; I'll be at another computer in about ninety minutes.) -- ] 21:27, 1 March 2006 (UTC)

All right...

There's one point that isn't addressed by the above, and that is the question of whether ''x'' and ''y'' could take on values such that the optimal strategy when looking at Goat 1 is ''different'' from the strategy that is optimal when looking at Goat 2. If this was the case, then one could devise an overall strategy superior to any strategy that didn't take the identity of the goat into account.

It can be shown that the answer is "no". We will start by assuming the contrary: that ''x'' and ''y'' have values which make staying the best strategy for Goat 1 and switching the best strategy when looking at Goat 2; our overall strategy then would be "stay when looking at Goat 1; switch when looking at Goat 2."

In order for staying to be the best strategy, the following would have to be true: out of all of the times that the player ends up looking at Goat 1, more of them are due to the player having initially picked the car, and Monty making the choice to show Goat 1, than are due to the player having picked Goat 2. In terms of ''x'' and ''y'', this works out to "''x'' (the number of times that the player picked the car, and Monty picked Goat 1 to show) is greater than ''x+y'' (the number of times that Monty had to show Goat 1 because the player picked Goat 2)" -- in other words, ''x > x+y'', which simplifies to ''0 > y'' -- ''y'' is less than 0.

However, 0 is ''the lowest'' value possible for ''y''; you could say "1 of every 5 times, Monty picks Goat 2" or "0 of every 5 times, Monty picks Goat 2" but "-1 of every 5 times, Monty picks Goat 2" has no meaning. The closest we can get to the situation we were trying to arrange is where ''y'' equals 0. This is the situation when Monty ''always'' picks Goat 1 if given a choice. Under these conditions, seeing Goat 2 means switching ''always'' wins; Monty only shows Goat 2 when he has no choice, i.e. when the player picked Goat 1 and the car is behind the remaining door. However, the player only sees Goat 2 1/3rd of the time; the other 2/3rds of the time, the player sees Goat 1, and the chances are exactly even that: a) Monty had to show Goat 1 because the player picked Goat 2, b) Monty had a choice of Goat 1 or Goat 2 (because the player picked the car) and showed Goat 1. There is no optimal strategy when the player is looking at Goat 1; no matter ''what'' strategy the player adopts, the chances of winning from this situation are 1 in 2. And as with every other set of values for ''x'' and ''y'', the end result is 2/3: ((1/3)*(1/1)) + ((2/3)*(1/2)) = 1/3 + 1/3 = 2/3.
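
For the sceptical, the ''y'' = 0 case is also easy to simulate. The sketch below (door labels and the number of trials are arbitrary) plays many games against a Monty who always shows Goat 1 when he has the choice, and reports the conditional figures as well as the overall result.

<pre>
import random
from collections import Counter

trials = 200_000
shown = Counter()          # how often each goat is the one Monty shows
switch_wins = Counter()    # wins by switching, keyed by which goat was shown

for _ in range(trials):
    pick = random.choice(["car", "goat1", "goat2"])
    if pick == "goat1":
        monty_shows = "goat2"      # forced
    elif pick == "goat2":
        monty_shows = "goat1"      # forced
    else:
        monty_shows = "goat1"      # free choice: this Monty always prefers Goat 1 (y = 0)
    shown[monty_shows] += 1
    if pick != "car":              # switching wins exactly when the first pick was a goat
        switch_wins[monty_shows] += 1

for goat in ("goat1", "goat2"):
    print(goat, "shown", round(shown[goat] / trials, 3),
          "switch wins when shown:", round(switch_wins[goat] / shown[goat], 3))
print("overall win rate by switching:", round(sum(switch_wins.values()) / trials, 3))
</pre>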

(Now, I ask others -- is there any point to addressing any part of this in the article itself?) -- ] 23:34, 1 March 2006 (UTC)

:No! ] ] 23:39, 1 March 2006 (UTC)

== More "catchy"?... sexy etc. ? ==

When I first heard about the Monty Hall Problem, I was very drawn into it by its simplicity ... this article, while certainly fantastic, has an introduction so full of disclaimers and strings that the problem is convoluted and no longer interesting or catchy, IMHO.... is it possible that we can rework the article to state the problem simply at the beginning and then put all the assumptions and stuff a bit later? ..... it just seems to me that over time people have inserted little disclaimer words to the point where the problem itself is no longer fascinating. -] 05:36, 4 March 2006 (UTC)

:The problem is that it tends to go in a cycle: someone comes along, thinks the description is too verbose and complicated, and pares it down. Someone else comes along, thinks that it fails to account for this possibility or that one, and expands it again. -- ] 22:42, 4 March 2006 (UTC)

::What I am suggesting is a very simple statement of the problem at the beginning which, IMHO, the article currently lacks. -- THEN a detailed description of the constraints, etc. ? ... - ] 04:44, 6 March 2006 (UTC)

:::That's actually what I'm talking about: at times when we have ''had'' a very simple statement of the problem, people have asserted that it needs changing because it doesn't specify one constraint or another. Which constraints are you thinking should be moved or removed to make the initial description simple? -- ] 17:54, 6 March 2006 (UTC)

== Game theory redux ==

I know there was some conflict about this some time ago, so I thought I would send some feelers out. Would anyone mind if I took this article out of ] and put it in ]? Thanks! --best, kevin ]]] 05:12, 8 March 2006 (UTC)

:It is my opinion that it belongs in neither category. The problem is specifically one of probability. One could just as easily say it belongs in the category, "English Language Problems" or "Game Show Related Trivia" etc. etc. It is a probability problem, plain and simple, not a game. - ] 17:34, 8 March 2006 (UTC)

::I agree that it's not a ] problem, but can you explain why you think it's not a problem in ]? --best, kevin ]]] 20:01, 8 March 2006 (UTC)

:::This problem is designed to illustrate a principle in probability, not one in decision theory. It's not a "real" problem or game. Decision theory/game theory have a branch of "theory" because their problems are related to real-life examples. In the same way, I would say the three cards problem is not related to game theory or decision theory. I am glad you agree that it is not part of game theory... the MHP *definitely* has nothing to do with game theory. I 100% support removing it from the Game Theory category. If you think it should be moved to ] I will abstain from any objection... but as it stands I strongly, strongly support anyone who wants to remove all references to game theory from the article. In fact, the article, after re-reading it, is really a huge, huge mess IMHO. -] 05:02, 9 March 2006 (UTC)

::::And in only eight months since its Mainpage-FA exposure! ] ] 14:31, 9 March 2006 (UTC)

:::::Don't know if that is sarcasm or not -- but this article in its current state is nothing remotely like the way it was when it was an FA. - ] 14:58, 9 March 2006 (UTC)

::::::My point exactly. The hundreds of edits since July 23 2005 have ''not'' made this article better and I suspect that it would not withstand FA scrutiny in its current state. It may be time to bring it through the process once again, it is a good subject. ] ] 04:45, 10 March 2006 (UTC)
::::::::Not sure, but perhaps this is related to ]. -- ] <small>(])</small> 04:55, 10 March 2006 (UTC)

:::::::Time to vote for a complete rewrite? I'd do it, but someone would revert it I'm sure... -] 04:51, 10 March 2006 (UTC)

::::::::Rick, I think that the reason for this (besides the Second law) is that as the non-believers raise their arguments, someone goes in and makes a patch to address that particular nit or pick, and so the article no longer flows in a coherent way, death by a thousand cuts or something. And Absicissa has a point about a rewrite, which is why bringing it to FAC status may be the way to go. Just think Rick, you get to do it all over again. :-) I'd like to hear from Antaeus about this, he has been putting a great deal of effort into this "project". ] ] 05:12, 10 March 2006 (UTC)

== Article Rewrite ==

There is some discussion above about rewriting the article, from scratch, incorporating most of what is currently in the article but editing it heavily... (see above) but there are some strange things about the article right now that someone needs to look at. I propose the new article look something like (or at least incorporate these elements in some order, if someone can rework it to be slightly better):

* A brief statement of the problem (and why it might be considered significant) (FOUR SENTENCES MAX!)
* The origin of the problem, the history of the problem. But NOT variants of the problem. -- I would start with Marilyn. And probably finish with her.
* A statement of the problem with the express constraints.
* The solution to the problem, with subsections on the various ways of understanding the problem, perhaps starting with the simplest and ending with the most complex. Like Bayes's Theorem? WTF? Is someone who does not understand the solution at this point seriously going to understand that?
* Similar problems with:
:* Further history of the problem (e.g. Gardner, &c.)
:* Links to similar problems (three card problem, boy girl problem, &c.)

In summary, I think this article could be 50% of the size it is now and 200% of the quality. Are there some people who would be willing to work on this with me? Or others who think it is a bad idea and that the article is best in its current form? - ] 06:27, 10 March 2006 (UTC)

:I have no argument against the need but I am somewhat troubled by the suggested process. The proper way of dealing with a seriously deteriorated FA is to list it at ] (see some of the candidates there) with the reasons that it no longer meets ]. Only after a consensus is achieved should the article be rewritten. After that, if we have the desire to do so, the article can be submitted for ] and/or ]. --] ] 12:34, 10 March 2006 (UTC)

::Sure, sounds good to me. - ] 12:52, 10 March 2006 (UTC)

Latest revision as of 10:10, 2 February 2023

This is an archive of past discussions about Monty Hall problem. Do not edit the contents of this page. If you wish to start a new discussion or revive an old one, please do so on the current talk page.

Actual rules for the gameshow

Analysis of the errors in the intuitive answer

(as opposed to the correctness of the mathematical answer)

It is not enough to describe why the mathematically derived solution is correct. To resolve the paradox to the satisfaction of all, one must also describe why the intuitive solution is wrong. I think this requires three steps.

1) an understanding that the "intuitive solution" is just a dismissive term for a first analysis that turned out to have a flaw

2) a description of the logical steps that were employed in coming to the intuitive solution

3) an analysis of that logic, to find its flaw(s)

(1) I will not argue the point, but rather just hope that people agree with it.

(2) I think that the intuitive solution takes the following steps:

  • transform the game into a simpler yet completely isomorphic game
  • calculate (or intuit) the probabilities for that game
  • extrapolate the answer back to the original Monty Hall game

Here is the game that I believe is used intuitively (I’ll call it the Silly game):

  • There are three doors, A, B, and C.
  • Door A is open and has a goat behind it.
  • Doors B and C are closed, and there is a goat behind one and a car behind the other.
  • The contestant just guessed that door B has the car behind it, and is now being given a chance to change his mind.

Should he change it or not?

Clearly the answer to this is that the chance is 50% either way, and it does not matter whether the contestant changes his mind.

Now compare the Silly game to an original Monty Hall game at a point half way through, in which the contestant has guessed door B and Monty has opened door A.

  • There are three doors, A, B, and C
  • Door A is open and has a goat behind it.
  • Doors B and C are closed and there is a goat behind one and a car behind the other.
  • The contestant just guessed that door B has the car behind it, and is now being given a chance to change his mind

It seems that the Silly game is exactly the same as the original Monty Hall game at this point. To a viewer who has just ‘tuned in’ and does not know what has previously happened, the games look identical. Therefore, logic would dictate that the answer to the original question is that it doesn't matter whether the contestant changes their mind; the probability is 50% either way.

(3) It turns out, of course, that the two games are not completely isomorphic. There is a crucial piece of information missing from the Silly game that is needed to make it isomorphic with the Monty Hall game, as follows: Monty (who knows where the car is)

  • was required to open a door to reveal a goat,
  • was given an opportunity to open door C,
  • chose not to open door C.

Monty has thus said something concrete about door C (“I didn’t choose it, perhaps because I couldn’t choose it”), but nothing about door B. This is the source of the asymmetry between doors B and C, and the reason that door C is more likely to not have a goat behind it. Happyharris 20:57, 25 July 2005 (UTC)

Perhaps what we need in the main project page is an explanation of "why the intuitive answer is NOT the Monty Hall problem". Something like: extrapolating the probabilities of a supposedly isomorphic game back to the Monty Hall game is the cause of much of the controversy, since giving the probability of the game under a randomized strategy (1/2) is NOT giving the probability of each strategy, switching (2/3) and not switching (1/3). aCute 08:50, 27 July 2005 (UTC)
I believe I've given a start at the top of the Aids to understanding section. Most people I've seen argue the incorrect position assume you can forget past events and look at it as a fifty-fifty chance (as they can with, say, coin flipping). The more-tenacious ones cannot be persuaded from their little optimization, when you ignore that they are using it. I plant the seed of doubt by showing that their premise fails in a case, card counting, in which they will almost undoubtedly acknowledge its failure.
Their premises always trump yours in their reasoning. If you don't try to correct improper premises, no argument you make will matter. This topic inspires debates to no end on Usenet because people ignore that. They end up, effectively, arguing over definitions, which is the prime example of a useless and stupid debate.
Logically, every single one of the article's diagrams can be redrawn and all the article's explanations can be rewritten while simply ignoring the past. Logically, any isomorphism you use can be discounted as an incorrect choice because it contradicts their assumptions, so you must be using hocus pocus. — 131.230.133.185 04:52, 10 August 2005 (UTC)

Breaking it down into steps

For the people who still find it hard to see why the probability is not 50/50 when there are two doors left, I hope the following illustration may help. I'm going to show that the Monty Hall problem is a specific case of a more general game; I'll call all the games that differ in their parameters "Hall games" for ease of use.

The basic idea behind a Hall game is this: We start with one large set of secret-hiding items -- they can be doors to be opened, or cards to be turned over, it doesn't actually matter. What matters is that there are n cards, but only one of them is the Prize card. The rest are Null cards.

Step One: The cards are divided into two hands. Each hand must have at least one card. For simplicity's sake, we call the number of cards in the first and second hands h1 and h2.

Step Two: Someone who can see which cards are Nulls can discard some number of Nulls from one hand, the other, or both. We call the number of cards remaining in each hand after the discarding of Nulls r1 and r2 (like h1 and h2, they must be at least one.)

Step Three: The player makes a guess at which of the two hands contains the Prize.

Step Four: The player makes a guess at which card out of the hand he selected is the Prize.

It's clear that to win the game, the player has to make correct guesses in both Step Three and Step Four. What are the chances of picking the correct hand? The first hand is correct h1/n of the time; the second hand is correct h2/n of the time. If we want, we could enumerate the cases: if h1=2 and h2=5, then the Prize could be the first card of the first hand, the second card of the first hand, the first card of the second hand ... et cetera, et cetera.

Now this is the part that many people find counter-intuitive. If we enumerate the cases, and then we reduce Nulls to meet any legal value of r1 and r2, we find that in no case can the removal of Nulls switch the Prize from one hand to the other. This means that the chance of the card being in the first hand or the second always stays at h1/n and h2/n -- even if the sizes of the hands do not stay at h1 and h2. If it's dealt to that hand, it stays in that hand; therefore the chance of it being in a particular hand is always equal to the chance that it was dealt to that hand.

What about the removal of Nulls? Does it affect anything after all? Yes, it does -- it affects the player's chances in Step Four. The chance that the Prize is in a particular hand is dependent upon h1 and h2 -- how many cards each hand started with. But the chance of finding the Prize in a hand (assuming it's the right hand) is based on how many cards that hand contains after Nulls have been removed -- since there's only one Prize, the chance of finding it in a hand of r1 cards is 1/r1.

Now, what are the chances of making both guesses correctly? If no Nulls get removed from either hand, then the chances of picking the Prize are either h1/n x 1/h1 (since r1 is equal to h1 when no Nulls have been removed) or by similar logic h2/n x 1/h2, which also multiplies out to 1/n. If, however, one of the hands -- say, hand 1 -- has been reduced down to one card (r1 = 1), then the chances of finding the Prize in that hand are h1/n x 1/1 -- if you've correctly guessed that the Prize is in that hand, you have a 100% chance of finding it in that hand when it's the only card in the hand.

With this being the general structure of the Hall game, we can see that the Monty Hall problem is really just the case where n=3, h1=1 and h2=2, r1=1 and r2=1. The chance that the Prize is in the player's hand is h1/n -- 1/3. The chance that it's in Monty's hand is 2/3. Before one Null is removed from Monty's hand, the chance of finding the Prize in his hand is 2/3 x 1/2 -- i.e., 1/3. But when the Null is removed, the chance is now 2/3 x 1/1 -- i.e., 2/3! -- Antaeus Feldspar 03:44, 26 July 2005 (UTC)
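
A compact way to check the general claim is to compute the formula directly. The Python sketch below is only an illustration; the function and parameter names are mine, not part of the discussion above.

from fractions import Fraction

def hall_game_win_chance(n, h, r):
    # Chance of winning by guessing the hand that was dealt h of the n cards
    # and has r cards left after the Nulls are discarded: (h/n) * (1/r).
    return Fraction(h, n) * Fraction(1, r)

# Monty Hall: n = 3; the player's hand has h1 = 1 card (r1 = 1),
# Monty's hand has h2 = 2 cards reduced to r2 = 1.
print("stay  :", hall_game_win_chance(3, 1, 1))   # 1/3
print("switch:", hall_game_win_chance(3, 2, 1))   # 2/3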

three prisoners problem

kudos to those contributors to this article. i have been thinking about this for days (despite the fact that i "got it" ... after about 20 minutes or so). i find the "two sets" explanation the most clear, though i'm sure some people will love the bayes' theorem explanation. i also found the talk page very entertaining. some contributors are clearly manifesting what kahneman and tversky (writers on cognitive biases - the former won a nobel prize) refer to as "belief perseverance." geez, sit down with a friend and three cards for 10 minutes and you'd realize you're wrong. anyway... i thought you might find this page interesting:

http://econwpa.wustl.edu:8089/eps/exp/papers/9906/9906001.html

besides elaborating on many of the key assumptions (often unstated) underlying the MHP, the paper presents an interesting (and supposedly earlier) problem of the same form, the "three prisoners problem." here's the upshot...

There are three prisoners: you and Prisoners A and B. Two of you are to be executed, while one will be pardoned. The prison warden knows who will be executed and who will be pardoned (just as Monty must know where the goats and car are). According to policy, the warden is NOT allowed to tell any prisoner whether he/she is to be pardoned. You point out to the warden that if he tells you whether A or B will be executed, he will not be violating any rule. The warden says A is to be executed. What is the chance that you will be pardoned?

the answer presented, like that to the MHP, is that your chances of being pardoned remain at 1/3, while the chances of B being pardoned, GIVEN WHAT THE WARDEN HAS TOLD YOU, are 2/3.
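
A quick way to check the stated answer is Bayes' theorem. The Python sketch below assumes, as is usual for this puzzle, that when both A and B are to be executed the warden names either one with equal chance; the dictionary and variable names are only for illustration.

from fractions import Fraction

# Prior: each of the three prisoners is equally likely to be the one pardoned.
prior = {"you": Fraction(1, 3), "A": Fraction(1, 3), "B": Fraction(1, 3)}

# Probability the warden answers "A will be executed", given who is pardoned.
# He may never name you, and never the pardoned prisoner.
says_A = {"you": Fraction(1, 2),   # both A and B die; assume he names either with equal chance
          "A":   Fraction(0),      # A is pardoned, so he must name B
          "B":   Fraction(1)}      # B is pardoned, so he must name A

p_statement = sum(prior[w] * says_A[w] for w in prior)    # 1/2
for w in ("you", "A", "B"):
    print(w, prior[w] * says_A[w] / p_statement)          # you: 1/3, A: 0, B: 2/3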

I found looking at this problem reinforced how *$&%^@# hard it is to get one's head around the MHP, because despite having read about the latter for days, i still had to think about the three prisoners problem for several minutes to get my head around IT, despite knowing that it was essentially the same as the MHP. K-razy.

someone mentioned the idea of blocking edits to featured articles for some period of time. while apparently there is a policy against such things, i agree with the suggestion, at least for articles that are sources of dispute. the featuring of an article doubtless attracts many potential editors, some of whom may simply not know enough about the subject to edit appropriately. for example, the first time i read the monty hall page (when it was featured), someone had altered the solution section to express the misguided (given certain assumptions, which too often are unstated) 50/50 approach. needless to say, i was confused by the fact that all other sections of the article contradicted the solution.

good job. Contributed by 24.89.202.141.

The Mueser and Granberg paper is already one of the references, and the three prisoners problem is the Gardner problem referred to in both the "The problem" section and the "Origins" section (although Gardner's version is not described). I suspect the point of allowing featured articles to be edited is to reinforce the notion that wikipedia is really, no kidding, editable by anyone. I find this to be a very principled, and quite admirable, stance even though it does require some effort (and I suspect protecting the main page was only done with the greatest reluctance). I cannot take credit for much more than nominating the article as a WP:FAC, but thanks for the compliment. -- Rick Block (talk) 01:01, July 27, 2005 (UTC)

Game Theory

I copied this back in from the archive:

I would argue it's a problem in probability. Rich Farmbrough 12:44, 23 July 2005 (UTC)
I have switched the statement back because it is certainly a problem of probability. Apparently MathWorld categorizes the problem under game theory, but in my opinion the connection is tenuous. In the standard statement of the problem there are no "conflicting interests" because the host is not an active player in the sense that he cannot make choices that affect the outcome. Because the problem clearly could generalize into a game theoretical topic, I would have no objections to a Wiki categorization as such. In the text, however, this would require some explanation, so in my opinion it isn't appropriate for the first line. Certainly it does not supersede probability. Davilla 16:21, 23 July 2005 (UTC)

Hecatonchires originally changed 'probability' into 'game theory', and then did it again two times. I've changed it back and posted a notice on his/her Talk page, pointing him/her to this discussion. I've invited him/her to start a discussion here if s/he wants it changed to 'game theory' again. Phaunt 10:54, 10 August 2005 (UTC)

I was initially quite dubious about this edit, but became less so when I looked up game theory and realized that it does rather fit the Monty Hall problem after all, at least by the definition in the article. How does a player maximize his chances of walking off with the prize? Of course, this may mean that it's game theory that has to be tweaked to clarify the matter. -- Antaeus Feldspar 23:08, 10 August 2005 (UTC)
Game theory is used to analyze strategic situations, where "strategic" means that there are interacting interests between two players. For instance, Robert Gibbon's book on game theory starts with this sentence: "Game theory is the study of multiperson decision problems." This is echoed (somewhat less clearly) in our article: "A definitive feature of game theory that distinguishes it from decision theory whose main subject is also studying formalized incentive structures is that game theory encompasses decisions that are made in an environment or states of the world in which strategic interaction between various players occurs." (Actually this sentence is really confusing, I will fix it.) The monty hall problem is clearly not a multiperson decision problem, since there is only one interested actor, the player. I think probability theory or utility theory would be best descriptions of the problem. --best, kevin ···Kzollman | Talk··· 23:55, August 10, 2005 (UTC)
Better still, decision theory?--best, kevin ···Kzollman | Talk··· 23:59, August 10, 2005 (UTC)

Ahem, don't we have an encyclopedia to write here? It seems that you are arguing about the distance between two points on a beach ... jeesh. Well, if it is really important for one of you to prevail here then fine, go at it. Every once in a while, however, look around and notice what you are spending your valuable time doing.  ;-( hydnjo talk 02:15, 11 August 2005 (UTC)

I can't believe this, they're still at it! Jeesh. hydnjo talk 02:31, 16 August 2005 (UTC)

Back and forth, the edits revert again and again. Can we get some consensus on this? It's two words... I don't mind the current compromise of mentioning both terms in the same sentence, but would prefer only mentioning probability. But is there any way we can get people to stop changing it every week or so? Fieari 05:22, August 26, 2005 (UTC)

I, for one, think this qualifies as one of the lamest edit wars ever.
If this is an edit war, it seems to be a rather slow one. Note that the last two reverts were on 10 and 15 August. The 23 August revert had to do with Increasing the number of doors, see below.
Anyway, I don't really have a problem with either formulation, if there's a majority. My problem is just that the 'game theorists' refuse to discuss this here on the talk page, even after having been explicitly invited. That was the reason for the last 8/10 and 8/15 reverts.
I haven't voiced my own opinion yet; I like 'decision theory'. The problem with 'game theory' is that there is only one player. Phaunt 00:44, 27 August 2005 (UTC)

My own understanding is that within the context of game theory, "Monty Hall Problem" refers to some modification of the problem from the one currently given in the introduction to this article. Specifically, some range of behaviors is permitted to the host. In this case, each of the two players chooses a strategy from within their ranges of possible behaviors, and the task is to identify a Nash equilibrium. The Mueser and Granberg paper describes this approach to the question. --Wmarkham 21:05, 5 September 2005 (UTC)

Reverted addition under Increasing the number of doors

I just reverted the additions by User:62.99.223.20 under Increasing the number of doors, because they were copied verbatim from . This page was linked to, but that doesn't change the copyvio. Also, it didn't really belong there (under aids to understanding) but rather under Variants -> n doors, where a shorter discussion of this variant already exists. I invite User:62.99.223.20 to expand on this if he feels it's too short (but without violating copyright, of course).

Congratulations

It's a bit late, but I want to congratulate and thank everyone who worked on this and helped it become a featured article. Good job! Phaunt 11:53, 27 August 2005 (UTC)

"the assumptions explicitly stated below"

The intro refers to "the assumptions explicitly stated below" that do not appear (to me) to exist in the article.

From the introduction to the article:

In this puzzle a player is shown three closed doors; behind one is a car, and behind each of the other two is a goat. The player is allowed to open one door, and will win whatever is behind the door. However, after the player selects a door but before opening it, the game host (who knows what's behind the doors) must open another door, revealing a goat. The host then must offer the player an option to switch to the other closed door. Does switching improve the player's chance of winning the car? With the assumptions explicitly stated below, the answer is yes — switching results in the chances of winning the car improving from 1/3 to 2/3.

The description of the puzzle appears to be quite clear on the constraints on the host's behavior, and I believe that any mention of assumptions is unnecessary here. I happen to be someone who does quibble over the statement of the problem, and I find this one to be quite clear, so this is probably not problematic.

Unfortunately, there are references to "the assumptions" throughout the article. Further unfortunately, my own, possibly nonstandard, position is that the "Monty Hall problem" is really a related family of problems. The ones in Selvin's letter and the Parade article each pose slightly different problems than the one stated above. The stated result (or any clear result) can only be obtained in those cases if additional assumptions are made. In my opinion, understanding the nature of some of the confusion surrounding the Monty Hall problem is made easier if the existence of these (nontrivial, IMO) assumptions is made clearer.

Since the article is already quite good, and my edits could be construed as having an agenda, my hope is that one of the perennial maintainers of the article is willing to adjust it in order to reflect my concerns, in a manner that is true to what the article describes. My guess is that the only changes needed are to move the existing comments about assumptions from the "Anecdotes" section to the introduction of the Parade article, and to eliminate the reference to assumptions from the intro. In fact, after consideration, I think it is safe for me to make the latter change myself. My observation is that these words currently serve no purpose one way or another, and hopefully I am a good representative of the view that the existence of assumptions is important. I will do this shortly.

--Wmarkham 19:44, 5 September 2005 (UTC)

The words referred to the "Mueser and Granberg" constraints in the next section, although I agree the current problem statement in the lead-in is unambiguous without this forward reference. As far as I can tell, the problem itself is generally viewed to be the specific one described in the lead-in and not the family of problems framed by any ambiguous statements of it. The Parade controversy, in particular, was not primarily due to the missing assumptions but the apparent inconsistency between there being two doors to choose but the choice not being 50/50. Marilyn vos Savant has said (in the 1991 NY Times article) that virtually no one complained that the assumptions were not clear and that she thought the constraints on the host others have added as explicit assumptions were implicit in the statement of the problem published in Parade magazine. The constraints on the host are described in "The solution". Perhaps this section and "Anecdotes" could be strengthened with slightly more discussion about the ambiguities in the Parade magazine problem statement, referencing the 1991 NY Times article. -- Rick Block (talk) 21:40, September 5, 2005 (UTC)

Why I think it's wrong...

When you're given the choice to switch or not, it still doesn't matter, because switching or not switching is a binary decision. To give the probability of 1/3 to the scenario where you decide not to switch doesn't make any sense; when the host gives you the option, the scenario and therefore the rules change. You're no longer choosing between three doors, you're choosing between two.

It'd be different, though, if you were not given the choice to switch. When one door is revealed to be a loser, you couldn't say "now my chances are 50/50", though from an intuitive standpoint you could say that; without any ability to act on the events that occur, the probability really doesn't change from the standpoint of the start. (If we were weather forecasters we would say the chances were now 50%, but this isn't a forecast that continually updates; gamblers want to know what their overall chances are from the start, because that's where they're stuck making their decisions.)

The question here is whether one bases the chance on the moment the situation is assessed or on the moment the choice was made. Gamblers will want to base it on when they made their choice, but weather forecasters will want to assess the situation continuously.

If you have read the entire article and still think this, then you missed the point. It's a scam: "the game host (who knows what's behind the doors) must open another door, revealing a goat". If you believe otherwise, let me explain one more time (I'm the scammer and you're the scamee, or mark). To make the example more obvious we are going to start with ten doors. You choose one of them, and by default you un-choose nine of them. Would you agree that your chances at this point are one in ten, or 10%? OK then, I then slowly and dramatically -ta dahh- open eight doors. It is critical at this point that you understand that I know where the prize is, so the eight doors that I (the scammer) open are known to me to be non-winners (goats). So, we're down to you with a 10% door and me with a 90% door. Do you still think it's a 50/50 choice? So, bring it back to three doors, and so long as you realize that the host (scammer) knows which door is a non-winner, which he proceeds to open, then you should obviously choose his remaining door whether you start with three or ten or one hundred doors. --hydnjo talk 21:33, 6 November 2005 (UTC)
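
The ten-door version described above is easy to simulate. The Python sketch below is only an illustration (door numbering, trial count and so on are arbitrary), but it shows the 10%/90% split empirically.

import random

trials = 100_000
stay_wins = switch_wins = 0

for _ in range(trials):
    doors = list(range(10))
    car = random.choice(doors)
    pick = random.choice(doors)
    # The host, who knows where the car is, opens eight doors that are neither
    # the player's pick nor the car, leaving exactly one other door shut.
    other_closed = [d for d in doors if d != pick and d != car]
    left_shut = car if pick != car else random.choice(other_closed)
    if pick == car:
        stay_wins += 1
    if left_shut == car:
        switch_wins += 1

print("stay  :", stay_wins / trials)     # about 0.10
print("switch:", switch_wins / trials)   # about 0.90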
Also note that it doesn't matter if you think it's wrong. It's objectively right. Probability is nothing but a measure of the fraction of attempts that produce a given outcome. Empirical studies have demonstrated that the answer given in this article is right. No amount of argument or logic can change the fact that this article's conclusion matches reality. --Doradus 22:10, 6 November 2005 (UTC)
It does matter to me. I'd like to think that we have explained this in a way that 69.246.138.166 comes away with an understanding rather than a dogma. But then... --hydnjo talk 00:36, 7 November 2005 (UTC)
That's a noble goal, but with all due respect, I wasn't talking to you.  :-) --Doradus 14:58, 7 November 2005 (UTC)
I think I see where you're getting confused, so I hope you'll let me try to explain. Let's generalize the problem as given to a new class of problems:
  1. The host presents X doors, one of which has the prize behind it, and the others of which are "misses".
  2. The player divides these doors into two sets, each of which must have at least one door in it.
  3. The host can then adjust either or both sets by removing "miss" doors or by adding new miss doors (the player cannot distinguish just-added doors from the doors that were originally there). There must still be at least one door in each set, and the host can only add or remove misses, not the door with the prize.
  4. Challenge: The player must correctly guess which of the two sets contains the door with the prize.
  5. Challenge: The player (assuming they picked the correct set) must correctly guess which door in the set contains the prize.
Now, we can quickly confirm that the Monty Hall problem is just a specific case of the general problem. The host presents three doors (step 1); the player divides them into a one-door set and a two-door set (step 2); the host then removes a miss door from the two-door set (step 3).
As for the challenges, let's look at the second (step 5) before the first (step 4). The effect of the host removing a miss door in step 3 is to eliminate any actual "challenge" from the challenge of step 5; you can't make a right choice or a wrong choice if you have no choice to make! This means that the odds for the first challenge, step 4, become the odds for the whole problem.
So what are the odds for step 4? Well, in step 2, the player divided the doors into two sets with no idea of which one was the prize door. There's a 1/3 chance that the prize door ended up in the one-door set, and a 2/3 chance that the prize door ended up in the two-door set. Now, 69.246.138.166's challenge to the correctness of the stated solution is that "when the host gives you the option, the scenario and therefore the rules change." My question in response is: "How?" The host can remove a miss door from a set; in our expanded general problem, he can remove multiple miss doors or add multiple miss doors to either set. But nothing he can do can change which set the prize door is in; therefore the odds of which set the prize door can be found in must be exactly the same in step 4 as they were after step 2. If you disagree, reply and spell out exactly how the prize door could change from one set to the other in step 3.
So let's recap. In the official Monty Hall problem, the player divides the doors into two sets; the set with one door has a 1/3 chance of containing the prize door; the set with two doors has a 2/3 chance. The host removes a miss door from the two-door set, and with it he removes the chance that the player could pass the first challenge and fail the second. Even though both sets are now down to one door each, the "two-door set" still has a 2/3 chance of containing the prize door. To pick the correct set is to pick the correct door, so picking the door that was in the two-door set gives you a 2/3 chance of winning. Again, if you disagree, don't just assert that the situation does change; explain how it could have changed. -- Antaeus Feldspar 20:06, 7 November 2005 (UTC)

Easy peasy

Jesus Christ, why are people so thick? It's like this...

  1. Choose a door; you're lucky, it's a goat! But you had a 2/3 chance of choosing a goat, so the odds were on your side.
  2. Monty reveals the other goat.
  3. You switch; it has to be the car as the other goat is gone.
  4. You win!
  5. If you had chosen not to switch you would have lost.
  6. By not switching you're stuck with that 2/3 chance of getting a goat. Switching meant that you turned it into a 2/3 chance of winning the car.
  7. Easy peasy. End of story. Nighty Night Jooler 02:09, 8 November 2005 (UTC)
But if we let the proles know how simple this is, they might start thinking for themselves, and then where will we be? Bonalaw 14:31, 22 November 2005 (UTC)
OK, how's this for a paraphrase: we do away with the numbers and use the layman's term "chances are." You pick a door. Chances are, it's a goat. Then, Monty opens a door that he knows is a goat door. Now, assuming you picked a goat, and of course Monty showed you a goat, the only door left is the car. You'd be a fool not to switch. Of course, it's much less likely that you'd pick the car at first, in which case you'd lose by switching. Dyfsunctional 17:44, 14 December 2005 (UTC)

I think this reasoning is too simplistic, and would give the wrong answer in some cases. For instance, would this reasoning apply to the "paragraph about Who Wants to be a Millionaire" section below? I think it would lead to the wrong answer. (The right answer is that your odds are 50-50 in that case.) --Doradus 21:31, 18 December 2005 (UTC)

Actual rules for the gameshow

I wonder if somewhere in the article it should be pointed out that under the actual rules for the “Lets Make a Deal” game show that this problem seems to be named after, switching doors didn’t actually increase your chances of winning. On the game show Monty would only offer the chance to switch ½ of the time if the player initially picked incorrectly, but would always offer the choice if the player initially picked correctly. This throws off the normal analysis in which the choice is always offered, since simply being offered the choice to switch increases the chances that your initial door pick was correct. The preceding unsigned comment was added by 128.227.7.193 (talk • contribs) .
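
Whether or not the show really worked this way, the rule described in the comment above is easy to test. The Python sketch below simulates it (the 1/2 offer rate is simply the figure from the comment) and shows that, given that a switch is offered, switching and staying each win about half the time.

import random

trials = 300_000
offers = stay_wins = switch_wins = 0

for _ in range(trials):
    picked_car = (random.randrange(3) == 0)   # the initial pick is right 1/3 of the time
    # Claimed rule: always offer the switch if the pick was right,
    # offer it only half the time if the pick was wrong.
    offered = True if picked_car else (random.random() < 0.5)
    if offered:
        offers += 1
        if picked_car:
            stay_wins += 1
        else:
            switch_wins += 1

print("given an offer, staying wins  :", stay_wins / offers)    # about 0.5
print("given an offer, switching wins:", switch_wins / offers)  # about 0.5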

  1. How do you know this to be true?
  2. New stuff goes at the bottom of this page.
  3. As this is your first edit from your IP I'll wait a day or so before moving it down so as to help you find this response. --hydnjo talk 19:46, 17 November 2005 (UTC)

Comment moved from article

  • Note: I'm not the original writer and I may be wrong, but it seems to me like the probability is actually 50/50. It's 2/3 for a person to lose. BUT say you're player 2. If player 1 is eliminated, he had the goat, which means that player 2 either got the car, or it's behind the remaining door. Which means there's an equal chance for him to win or lose by switching... right? -- 24.196.238.213
    • This question relates to the variant where there are two players, one is eliminated, and the question is "should you switch". Assuming you're not eliminated the answer is no, and the probability is 2/3 that you have the car. There are indeed two outcomes left, i.e. you have the car or switching gets it, but they have unequal probabilities in this variant - just like the two outcomes in the original problem have unequal probabilities. The key is the realization that N possible outcomes doesn't mean each one must have a 1/N chance. To make this one more obvious, consider a similar game with 10 doors, 1 car, and 9 contestants. If none of the 9 choose the car, 8 are eliminated randomly. If any of the 9 chooses the car, the other 8 are eliminated. This game ends up in the same situation, a player and a door to potentially switch to, with the same two possible outcomes, which I hope are clearly not equally probable. In the 3-door, 2-person case, I assume you agree the unchosen door has a 1/3 chance of having the car at the beginning of the game. Eliminating one of the players doesn't change this, but since the car is either behind the unchosen door (still 1/3 chance) or one of the players has it, when there's only one player left the probability is (1 - 1/3) = 2/3. -- Rick Block (talk) 14:47, 23 November 2005 (UTC)
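
A simulation of the two-player variant, under the assumptions spelled out above (a player holding the car is never eliminated; when neither player has the car, the eliminated player is chosen at random). The Python sketch is illustrative only; door and player labels are arbitrary.

import random

trials = 200_000
survived = your_door = unchosen_door = 0

for _ in range(trials):
    doors = [0, 1, 2]
    car = random.choice(doors)
    you, other = random.sample(doors, 2)   # the two players pick different doors
    if you == car:
        survivor = you
    elif other == car:
        survivor = other
    else:
        survivor = random.choice([you, other])
    if survivor == you:
        survived += 1
        if you == car:
            your_door += 1
        third = next(d for d in doors if d not in (you, other))
        if third == car:
            unchosen_door += 1

print("P(your door has the car | you survive)    :", your_door / survived)      # about 2/3
print("P(unchosen door has the car | you survive):", unchosen_door / survived)  # about 1/3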

paragraph about Who Wants to be a Millionaire

I deleted the following paragraph about the Who Wants to be a Millionaire show:

The game show "Who wants to be a millionaire" has the same problem. The player is given 4 possible answers to a question. You can ask the host to remove 2 wrong answers, leaving you with 2 answers, one right and one wrong. Assuming you have no idea which of the 4 is right you can guess one (Lets say A), remove two and be left with two . If A is not removed then there is a 1/4th chance that A is right and a 3/4th chance that the other one is right.

On the millionaire show, the contestant does not get to pick an answer and then have two wrong answers removed. Without picking, two are removed and if you then pick randomly there's a 50/50 chance. Even if you mentally (randomly) "pick" and your pick is one of the two remaining answers, the result is a 50/50 chance because your pick is not related to the process by which the other answers are removed. -- Rick Block (talk) 16:28, 16 December 2005 (UTC)

Agreed. No amount of meditation before the removal of two choices will affect the probability of the final outcomes. You need to tell Monty your pick and have that affect his actions. --Doradus 21:29, 18 December 2005 (UTC)

Note that (in the English version at least) the player can tell the presenter which of the 4 answers they think it is before the 50:50 removes two of them, but it is still possible that the one they picked could be taken away, leaving two and a true 50:50 chance of picking the correct answer at random. --JP Godfrey 21:10, 23 January 2006 (UTC)
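
To check the point made in this thread numerically, here is a Python sketch of the mental-pick scenario. It assumes the 50:50 lifeline keeps the correct answer plus one wrong answer chosen at random, taking no account of the player's (silent) guess; the answer labels are arbitrary.

import random

trials = 200_000
pick_survives = pick_is_right = 0

for _ in range(trials):
    answers = ["A", "B", "C", "D"]
    correct = random.choice(answers)
    mental_pick = random.choice(answers)   # a silent guess made before using the 50:50
    # The lifeline removes two wrong answers at random, ignoring the player's guess.
    kept_wrong = random.choice([a for a in answers if a != correct])
    remaining = {correct, kept_wrong}
    if mental_pick in remaining:
        pick_survives += 1
        if mental_pick == correct:
            pick_is_right += 1

print("P(pick is right | pick survived the 50:50):", pick_is_right / pick_survives)  # about 0.5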

this problem is easier than stated

Maybe I missed it, but it seems no one has brought up the claim that this is a pseudo anti-intuitive paradox. The weirdness of the probability is ONLY a result when a convergence (to that specified probability) is reached over an infinite number of cases. In this particular single case, when you are in a REAL game, switching the door doesn't change AT ALL YOUR specific-case chance of winning. "There are 3 kinds of lies in this world: lies, damn lies and statistics" The Procrastinator 14:14, 30 December 2005 (UTC)

In a real game that followed the rules set down in the problem, yes, switching the door does change your chance of winning. Picture the following situation: instead of doors, you and Monty have cards, and instead of three cards, you have ten cards, one of which is the Ace. Monty shuffles the cards, lets you pick one, and keeps the other nine in his hand. What are the chances that the Ace is in Monty's hand? Obviously, nine to one. Now, Monty discards eight non-Ace cards from his hand. Can the Ace possibly change from one hand to the other during this step? Clearly not, so the chance that the Ace is in Monty's hand is still nine to one, even though the actual size of Monty's hand is now one. -- Antaeus Feldspar 15:34, 30 December 2005 (UTC)

Let's revert to the fundamentals of probability

To determine the probability of an outcome you list all the possible outcomes and then count the number of times the outcome you are interested in turns up. Look at the decision tree under the Venn diagrams in the main entry. How many possible outcomes are there under each of the possible contestant choices? How many of these outcomes result in the contestant winning the car?

50% in all cases!

So how do you explain the observed fact that the contestant tends to win twice as often when he switches? --Doradus 16:40, 2 March 2006 (UTC)

Why?

Because the problem is mis-stated. When the contestant has chosen Goat 1 the quizmaster reveals Goat 2 - he doesn't have a choice. When the contestant has chosen Goat 2 the quizmaster reveals Goat 1 - he doesn't have a choice. When the contestant has chosen the car the quizmaster has to choose whether to reveal Goat 1 or Goat 2. These are independent possibilities and should not be selectively aggregated for the purposes of determining probabilities.

If you dispute this and believe that revealing Goat 1 or Goat 2 are aspects of the same event because the quizmaster only reveals one of them then you don't understand how probability works. The quizmaster also has to make a decision if the contestant picks the car and this decision must be included in the calculation of probabilities, as the decision tree correctly shows. The numbers allocated to the various decisions are meaningless, it is the counts that count.

Wherever you find a paradox, there's a fallacy lurking. (Hodgson's law - you read it here first) 80.47.80.51 01:45, 18 January 2006 (UTC) Graham Hodgson, 17 January 2006

Sorry, nope. They are not independent possibilities. There is a one-in-three chance that the contestant initially picks the car. The host can only choose between revealing Goat 1 or revealing Goat 2 when this one-in-three chance has already happened, and he must choose one of the two; therefore the total probability of these two chances must be exactly one-in-three. No more, no less. If we were to display the probabilities visually on a pie chart, we would see the total size of the user-picks-the-car slice stay exactly the same size as it was divided into the (not-significant) possibilities of "host reveals Goat 1" and "host reveals Goat 2". Are you seriously suggesting that that slice of the pie must actually get bigger because it's being divided into a larger number of slices? -- Antaeus Feldspar 02:18, 18 January 2006 (UTC)
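
The difference between counting the leaves of the decision tree and weighting them can be made concrete with a few lines of Python. The 1/6 figures assume the quizmaster picks either goat with equal chance when the contestant holds the car; as shown elsewhere on this page, that split does not affect the total.

from fractions import Fraction

# The four leaves of the tree: (contestant's pick, goat revealed, weight).
leaves = [
    ("goat1", "goat2", Fraction(1, 3)),   # forced reveal
    ("goat2", "goat1", Fraction(1, 3)),   # forced reveal
    ("car",   "goat1", Fraction(1, 6)),   # the quizmaster's free choice splits the 1/3
    ("car",   "goat2", Fraction(1, 6)),
]

winning_leaves = [leaf for leaf in leaves if leaf[0] != "car"]   # switching wins here
print("leaves where switching wins:", len(winning_leaves), "of", len(leaves))   # 2 of 4
print("probability switching wins :", sum(w for _, _, w in winning_leaves))     # 2/3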
The purpose of this page is to demonstrate that there's a sucker born every minute. The (number of ways to explain the problem logically and correctly) divided by the (number of dissenting opinions) will always be less than one. All you do by trying yet one more logically correct perspective in hopes of persuading only one more true believer is to perpetuate the ranks of non-believers by more than one. Thus the numerator will grow more slowly than the denominator and the ratio will continue to be less than one, (believers ÷ nonbelievers < 1), always. ;-) hydnjo talk 05:03, 18 January 2006 (UTC)
Of course - the chances of making a wrong initial choice are twice as great as the chances of making a correct initial choice, and therefore the chances of improving on the initial choice are also twice as great. I withdraw covered in confusion. 213.78.64.39 12:03, 18 January 2006 (UTC)Graham Hodgson
This has nothing to do with true-believers and non-believers, it's not a philosophical or moral dilemma, it's a simple mathematical problem whose solution is counterintuitive. Some understand it, some don't. Tailpig 19:37, 18 January 2006 (UTC)
I was using the terms believers and non-believers metaphorically for those that do understand and those that don't.  :-) hydnjo talk 19:48, 25 January 2006 (UTC)

Töff's analysis

This is my own plain-language explanation & debunk of "switching increases your odds." (It's essentially the Markov Chain). I hope Tailpig will read it and become a "believer." :)

Well, first off, let me say that I've never really liked that diagram; I know how the problem goes and what the 'trick' of it is and I find it incredibly difficult to see how that diagram relates to it.
With that said, however, your analysis is fundamentally flawed. Let me quote from your analysis: "Let's say you choose Goat1. The host shows Goat2. At that point, you have two paths: switch(Y) or not(N) ... and you have equal 50-50 chances to take either path." (emphasis in original) Well, that's the source of your confusion right there, because that is in no way the original problem. The essence of the problem is that the player chooses his strategy, whether to switch or stay -- he does not have it randomly selected for him with 50-50 probability! It's no wonder that your calculations show the player's chances as 50-50; if the player has one of two strategies randomly assigned to him, that will make his overall chances 50-50 no matter what the probability is for a given strategy (since exactly one of the two choices wins, the two strategies' chances of winning add up to 1, and a 50-50 coin flip between them averages out to 1/2).
If you doubt this point, let me illustrate. I will roll six ten-sided dice in a row. If and only if all six of them come up "10" will I put the prize in Box A; otherwise I'll put the prize in Box B. Elementary analysis should confirm that the strategy of picking Box B will pay off 999,999 times out of 1,000,000. If you are allowed to choose your strategy, you can win 999,999 times out of 1,000,000; if, however, I then randomly pick a 'strategy' and therefore a box for you, with equal probability of either, you now win the prize only 1 in 2 times.
Now that you know that the problem you've been dealing with isn't the actual Monty Hall problem, please let us know if you have any problem seeing why the answer of the actual Monty Hall problem is that switching gives you a 2/3 chance of winning. -- Antaeus Feldspar 00:08, 26 January 2006 (UTC)

Monty Hall is a Markov Chain

Well, I may be thick, but I certainly don't get it. After the door has been opened, I am left with two doors. The story so far has been very entertaining, but in fact it has given me no information which will indicate if my first choice was right or not.

However, if you were wrong, you now know which other door would have been right! And that is new information. -Scarblac 20060223

That means it is now 1:2. How we got to this situation is irrelevant, unless the process of getting there gives me information which is relevant, but I don't see how it does.
The article correctly says that for some statistical calculations the past can be ignored, while for others it cannot. The major fault of the article is that it does not go on to say why in this case the past is relevant. The article cites card counting as an example where the past cannot be ignored - if I know that some cards have already gone (and I know WHICH) then I have information which affects the probability of the next card being an ace, so the past is relevant for future probability. But if we have a sequence of events which are separate events, then we have a Markov chain, and previous events in the chain do not affect the next one: for example a series of coin tosses.
The point about Markov is that there is a difference in perspective before and after any event in the sequence. If the chances of tossing a coin and getting heads are 1:2, obviously the chances of doing it twice are 1:4, but if I toss a coin and get heads, the chances of now doing it a second time are 1:2, because the perspective after the first toss no longer takes the past probability into account. This applies to all sequences of probability, unless there is a CAUSAL link between the past event and the current probability (e.g. the last toss dented the coin so that it now falls differently). The Monty Hall problem looks to me like a Markov chain. If I am wrong, then the article has to show that. It does not address this problem, and I seriously doubt any of you can.
Think about this: I have three cards, and I pick two of them and put them on the table in front of you. I tell you to pick one, and if you pick the higher of the two cards, you win. We all agree that your chances are 1:2. The fact that I have three cards doesn't affect the choice I gave you, and your chances would be the same if I had four cards or only the two. The fact that there WAS another door has no bearing on the chances of getting the car NOW.
(BTW, the article says that "hundreds of maths professors" have attested that the probability is 1:2. Is that not something you 1:3 proponents should be worried about - this smugness is incredibly arrogant! If the article is right, it needs to give serious maths authorities as sources, not internet sites.) --Doric Loon 11:24, 24 January 2006 (UTC)

You say the problem looks to you like a Markov chain. I assure you it is not, for reasons already explained in the article. The CAUSAL link is the constraint that the host MUST open a door, CANNOT open the door you've picked, and the opened door MUST NOT show the car (i.e. the host is NOT opening a random door). Your initial pick is a random event, but the host's action is not. The Bayes' theorem section is effectively a proof of the result explained in numerious other ways in the article. The references section already cites "serious maths authorities, not internet sites" (as you request). I assume you've read the article and the previous discussions on this talk page, and you still think the probability is 50/50. If you seriously want to understand I suggest you either print this article and take it to your maths teacher to discuss, or I can try to help here. -- Rick Block (talk) 15:09, 24 January 2006 (UTC)

Oh sure, I stand by the "assume good faith" principle and wouldn't be here if I didn't really want to understand it. The point is, though, that what Markov proved is that the probability of a future action depends entirely on the present situation and not on how we got here. How we got here is only relevant if it alters the present situation; the probabilities involved in getting here are not in themselves relevant for the probability of the next event. Now, I understand that the show host has no choice. What you haven't explained to me is what I learn from that which makes my next choice (switch or no switch) into an informed choice rather than an arbitrary (i.e. 50:50) one.
I'm not a mathematician, so of course I know I could easily be stumbling in the dark. But I do wonder if you (and the article) are not confusing two different things. Remember Markov and the coin: The chances of tossing a coin heads up twice are 1:4, but after I have tossed it heads up once, the chances of doing it a second time are 1:2. Now is it not possible that here too there are two different phases with different probabilities:

  1. the game is about to begin, I have to choose between the three doors, and know that I will later get the chance to switch. Are my chances better if I plan to switch? Yes.
  2. we are in the middle of the game, the host has opened a goat-door, and I now have the chance to switch. Are my chances better if I switch? No.

In other words, in terms of game theory, if we do it many times, I can optimise my chances by having a switch policy, but in the particular case, when I stand before two doors, it is 50:50. As with Markov's coin tossing, this seems intuitively wrong, but makes sense mathematically. I think. If that is true, then it explains why there are two strongly held views. They are answering different questions. In that case, though, the top of the article needs to rephrase the problem. --Doric Loon 16:11, 24 January 2006 (UTC)

Doric, the Monty Hall problem is the probability equivalent of an optical illusion: the situation is carefully chosen to make the mind jump to false conclusions based on the interpretive shortcuts that speed up everyday processing.
In this case, the situation has tricked you into thinking that there's more than one random event. There isn't; the only random event is the player choosing one door out of the three. (Technically, you can argue that when the player picks the car, there's another random event because Monty has to choose which one of two doors both containing goats should be opened. However, since there is no distinction between the goats, this is not a significant random event; either way, the result is exactly the same, that Monty winds up with one remaining door which has a goat behind it.)
Now, let's look at that random event, of the player choosing one door out of the three. The effect of this choice is to divide the doors into two sets, the player's set of one door and Monty's set of two doors. The car is either in the player's set, or it's in Monty's set; it should be fairly easy to see that the chances are only one in three that the car is in the player's set.
Next comes the other part that most often tricks people: Monty opens a door from his set which he knows to contain a goat. This reduces the size of his set from two down to one; this often fools people into thinking that because the two sets are now the same size, they must have the same probability of containing the car. However, this is not the case: the car cannot move from one set to the other during this step, so obviously the chance of the car being in Monty's set must still be two in three, as it was when the only random event of the problem happened.
(Note: some people get fooled for a different reason when contemplating this step, especially if the problem is phrased incorrectly or ambiguously. Some people think that there is a chance under the rules of the game for Monty to open a door and reveal the car, and that when the problem says "Monty opens the door to reveal a goat", it means that we are to eliminate those cases as not having happened. However, the correctly stated problem makes it clear that Monty knows which doors contain goats, and will always choose a door which contains a goat.)
So, in summary: the player makes a guess, which he has a one-in-three chance of getting right, of where the car is. Monty then reduces the size of his set to one, so that choosing the right set is equivalent to choosing the right door. If the player guessed right the first time, staying wins; if the player guessed wrongly the first time, switching wins. The probabilities are still determined by that first and only random event, the one-in-three chance of the player getting the car in his set on the first try. -- Antaeus Feldspar 17:18, 24 January 2006 (UTC)
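A short simulation makes the same point empirically. This is only a sketch in Python (the labels and trial count are arbitrary), but it follows the rules as stated above: Monty always opens a goat door that the player did not pick:

 import random

 def play(switch, trials=100000):
     wins = 0
     for _ in range(trials):
         doors = ['car', 'goat', 'goat']
         random.shuffle(doors)                # the only random event that matters
         pick = random.randrange(3)
         # Monty opens a door that is neither the player's pick nor the car.
         monty = next(d for d in range(3) if d != pick and doors[d] != 'car')
         if switch:
             pick = next(d for d in range(3) if d != pick and d != monty)
         wins += doors[pick] == 'car'
     return wins / trials

 print("stay:  ", play(switch=False))   # about 1/3
 print("switch:", play(switch=True))    # about 2/3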

First of all, congratulations, that's the clearest presentation of your argument I have heard, and better than what is in the article. In particular, what you say about viewing both parts as one event is helpful. I am now comfortable with the 1:3 solution, provided we are talking about the probability of the whole. In my last comment I accepted that for the sake of argument, but now I accept it without difficulty. Standing at the beginning of the game I am cool about saying, my chances are improved by switching when the time comes.
But can you see my problem about coming into the thing half way through? The article begins by asking about the probability of picking the right door out of two AFTER a random choice has been made. Taking all three doors into account means we are including past events in the calculation (or past phases of the event, if you prefer). But that is selective. Perhaps, unbeknown to me, there were five doors, with three cars and two goats, and two car-doors were eliminated which I never heard about. That would reverse the probability. Taking the past into account is therefore dangerous. This is just instinct, but I sense your idea of viewing both parts as one event is only legitimate when you stand back and look at the whole thing, not when you are standing in the middle with one part done and the next part to be thought about.
But I WILL take your advice about asking a maths prof. --Doric Loon 18:48, 24 January 2006 (UTC)

If you come in half way through (two closed doors, one open, player having originally picked one), unless you know what has happened you would likely think the probability is 50/50. It's not. In the Markov case the next event (the next coin toss) is an independent random event, unrelated to previous coin tosses. In this case, the probability of the player's chosen door having the car is related to the conditions in effect at the time this choice was made. This is the only Markov event. By varying the initial conditions, we could make the probability with two doors left anything we'd like. Start with 100,000 doors and one car. You choose one. The host opens 99,998. There are now two left. It doesn't matter whether you watched this happen from the beginning, came in with 50,000 closed doors, or only at the very end. The initially chosen door has a 1:100,000 chance the whole time (just like when it was picked). At the end the other door has a 99,999:100,000 chance. When there are 101 doors left (the player's and 100 more), as a group the 100 that aren't the player's have a 99,999:100,000 chance, so each individually has about a 1,000:100,000 (roughly 1:100) chance. Start with 100,000 doors and 99,999 cars. The host opens 99,998 doors with cars. Now the selected door has a 99,999:100,000 chance the whole time (just like when it was picked) and the other door has a 1:100,000 chance. The point is opening the doors has no effect on the probability when the player picks, and unless we reveal enough information to remove any uncertainty (making the "probability" 1 or 0) this probability doesn't change (if doors are not randomly opened). -- Rick Block (talk) 21:36, 24 January 2006 (UTC)
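The same idea can be checked for any number of doors. Here is a rough Python sketch of the "host opens all but one of his doors" version (the door counts below are just examples):

 import random

 def play(n_doors, switch, trials=100000):
     wins = 0
     for _ in range(trials):
         car = random.randrange(n_doors)
         pick = random.randrange(n_doors)
         # The host opens every door except the player's pick and one other,
         # never showing the car; so the door he leaves closed is the car
         # unless the player already holds it.
         if car != pick:
             other = car
         else:
             other = random.choice([d for d in range(n_doors) if d != pick])
         final = other if switch else pick
         wins += final == car
     return wins / trials

 for n in (3, 10, 1000):
     print(n, "doors -- stay:", play(n, False), " switch:", play(n, True))

Staying wins about 1/n of the time and switching about (n-1)/n, whichever n you try.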


OK, I've got it. In maths I'm a plodder, but even plodders get there. Thanks both. --Doric Loon 07:11, 25 January 2006 (UTC)

The fallacy of distinct goats

Just wanted to note something:

  • The player picks goat number 1. The game host picks the other goat. Switching will win the car.
  • The player picks goat number 2. The game host picks the other goat. Switching will win the car.

Umm.. let's say:

  • The player picks the car. The game host picks goat number 1! Switching will lose.
  • The player picks the car. The game host picks goat number 2! Switching will lose.

The chances are now 50-50. As they logically would be. Your chance of winning the car has actually increased from 1/3 to 1/2. Now you have 2 choices and one of them contains a car.

I'm afraid not. You are confusing the fact that four possibilities can be enumerated separately with the idea that they must all be equally probable. That is not the case; since the last two possibilities you mention are both dependent upon the player initially picking the car, they can only divide between them the cases where that in fact occurred. Thus, the possibilities could be written like this:
  • The player picks goat number 1: 1/3
  • The player picks goat number 2: 1/3
  • The player picks the car (1/3) AND the host picks goat 1 (1/2): 1/6
  • The player picks the car (1/3) AND the host picks goat 2 (1/2): 1/6
If you still don't see how absurd it is, then let me pose this: when the door is opened, a goat could be clean or it could be dirty. So that means there are four goat possibilities: the host could pick a clean goat 1, a dirty goat 1, a clean goat 2, or a dirty goat 2! Just by considering whether the goat is clean or dirty, we've elevated the chance that the player initially picks the car to four out of six! Now you're saying to yourself "that's ridiculous -- whether the goat is clean or dirty can't change the probabilities!" Yes, exactly right -- and neither can considering which of two identical goats Monty shows change the probabilities. -- Antaeus Feldspar 16:04, 25 January 2006 (UTC)
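The four cases and their weights can also be tabulated exactly. A small Python sketch using exact fractions (the labels are only illustrative):

 from fractions import Fraction

 third, half = Fraction(1, 3), Fraction(1, 2)
 cases = {
     "player picks goat 1 (host must show goat 2)": third,
     "player picks goat 2 (host must show goat 1)": third,
     "player picks car, host chooses goat 1":       third * half,
     "player picks car, host chooses goat 2":       third * half,
 }
 for case, p in cases.items():
     print(case, "->", p)
 print("total:", sum(cases.values()))                            # 1
 print("switching wins:", cases["player picks goat 1 (host must show goat 2)"]
       + cases["player picks goat 2 (host must show goat 1)"])   # 2/3
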
Since you brought it up in my Discussion Post, I just wanted to point out that the 2 goats actually are physically separate. Solving a problem which involves some abstraction where the 2 goats are just 1 "not car" with a 2/3 chance of being chosen is not solving the original problem. It is solving a parallel problem which only gives the same answer when specific conditions are met: namely that only winning the car matters, and that the goats are revealed with the same probability and are both identical in being not cars. All the wiki explanations depend on this abstraction and give no mention of the dependencies which need to be met in order for this approach to work. Considering the goats separate (as they actually are) leads you to the 2/3 probability to get the car by switching. The preceding unsigned comment was added by 69.180.7.137 (talk • contribs) .
No, I'm sorry. You're really just wasting our time, here, because you keep introducing factors which are not in the problem and claiming that the problem hasn't been fully discussed if we aren't addressing your introduced variations. When we talk about a probability puzzle and we say "X rolls a six-sided die" we do not need to explicitly spell out that the die is not loaded. If the puzzle states that the die is loaded, then we address that factor, but it is just useless complication to insist that in all cases it be explored what happens if the die is loaded or if Monty really loves one goat a lot more than the other or whatever else it is. -- Antaeus Feldspar 18:18, 21 February 2006 (UTC)
I'm not sure anymore whether the argument is that switching improves your odds, or whether the whole business is a scam. It reminds me of the card game scam that Wednesday uses to cheat the waitress in American Gods. One thing I didn't like is that the sums of the probabilities at the bottom of the probability diagram add up to 200%, when they are only allowed to go to 100%. I.e., the odds when the player picks goat x and does whatever should only be a portion of the original slice. By the time you get to the end, you are pretending that you have 2 pies.
Pick goat 1 and switch: 1/6
Pick goat 1 and stand pat: 1/6
Pick goat 2 and switch: 1/6
Pick goat 2 and stand pat: 1/6
Pick car, see goat 1 and switch: 1/12
Pick car, see goat 1 and stand pat: 1/12
Pick car, see goat 2 and switch: 1/12
Pick car, see goat 2 and stand pat: 1/12
If you choose blindly between the last two doors, your odds of winning are 50/50. This stays true regardless of how many doors you initially begin with. That must be why the show has the 'no switching once you've picked' rule, otherwise they'd give away a lot of cars. Of course, you don't have to choose blindly; in the example you are allowed to choose the door with better odds. -- JethroElfman 23:37, 1 February 2006 (UTC)
Frankly, I think the diagram is not too good, and should be replaced with two trees -- one showing what happens if you use a "switching" strategy each time, one showing what happens if you use a "staying" strategy each time. The current diagram seems to confuse a lot of people into thinking that staying or switching is going to be picked for them, which is of course not the point. -- Antaeus Feldspar 02:51, 2 February 2006 (UTC)
I've tried to draw the full set of scenarios for this, and my finding is that both switching and not switching have the same odds of 50%. In total, I have 24 scenarios: half of them result in a loss, the other half are winning scenarios. Here they are:
Car is put at Door (D)1 - Player picks D1 - Host picks D2 - P is not switching - WIN
Car is put at Door (D)1 - Player picks D1 - Host picks D2 - P is switching - LOSE
Car is put at Door (D)1 - Player picks D1 - Host picks D3 - P is not switching - WIN
Car is put at Door (D)1 - Player picks D1 - Host picks D3 - P is switching - LOSE
Car is put at Door (D)1 - Player picks D2 - Host picks D3 - P is not switching - LOSE
Car is put at Door (D)1 - Player picks D2 - Host picks D3 - P is switching - WIN
Car is put at Door (D)1 - Player picks D3 - Host picks D2 - P is not switching - LOSE
Car is put at Door (D)1 - Player picks D3 - Host picks D2 - P is switching - WIN
(repeat the same scenario for Car is put at Door 2 and 3).
So, in total, we will have 24 scenarios with :
6 scenarios of player switching and win
6 scenarios of player not switching and win
6 scenarios of player switching and lose
6 scenarios of player not switching and lose
Therefore, the player's strategies of switching and not switching stand an equal chance of winning (50%)

202.152.170.254 10:12, 13 February 2006 (UTC) Hartono Zhuang

The events are not equal probability. For example, according to the above, when the car is at D1, the player will pick D2 2/8 times (1/4), but D1 4/8 times (1/2). The player has no reason to pick D1 more than door 2. This is a common fallacy: distinct events feel like they should have all the same probability. Clearly, here, they don't. --Mike Van Emmerik 10:32, 13 February 2006 (UTC)
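To see this concretely, here is a short Python sketch that walks the same 24 scenarios but weights each branch by its actual probability (each car position 1/3, each player pick 1/3, and the host's pick splitting the remaining probability only when he genuinely has two goat doors to choose from):

 from fractions import Fraction

 win_if_stay = win_if_switch = Fraction(0)
 for car in (1, 2, 3):
     for pick in (1, 2, 3):
         p = Fraction(1, 9)                     # 1/3 for the car position times 1/3 for the pick
         host_options = [d for d in (1, 2, 3) if d != pick and d != car]
         for host in host_options:
             q = p / len(host_options)          # the host's choice only splits probability
             if pick == car:                    # when he has two goat doors available
                 win_if_stay += q
             else:
                 win_if_switch += q

 print("stay:  ", win_if_stay)     # 1/3
 print("switch:", win_if_switch)   # 2/3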

Markov again

Coming back to Markov. I suspect the reason most people have difficulty is that at school they were taught the Markov principle (usually not under that name) and this really does look like it at first sight. Fooled me for long enough. As Antaeus Feldspar says above, it is an optical illusion in this respect. The article doesn't really help here. At the top of the "Aids to understanding" section it points to this issue, saying that "The most common objection to the solution is the idea that, for various reasons, the past can be ignored when assessing the probability." But I am not sure that what follows really helps most people see why it cannot be ignored: I certainly read it the first time with a sense of frustration that the key point was evading me. For me the eureka effect came with Antaeus' pointing to the (in retrospect obvious) fact that the car can't move. I've been mulling over how to explain the difference between Markov and Monty Hall. I think the difference is that after I toss a coin, I don't toss it a second time from where it landed, but rather I pick it up first: the way it landed last time doesn't affect the next toss because I return it to a neutral position before the next event in the sequence. The equivalent of returning to a neutral position would be if, after the first door has been opened, the game organisers were to remove the car and remaining goat and reallocate them by a random principle. THEN the second phase would be unaffected by the first, and that would be Markov. But they don't. What everyone understands is that if we play the game three times, the first guess will be right once and wrong twice; what is so easy to miss is that that cannot change unless the car moves. Now this all seems so obvious, but it is in fact the massive blind spot which makes the optical illusion trick people. I wonder if it would be worth having a short paragraph on Markov in this article, and discussing the difference properly - and perhaps less chaotically than I can do it. --Doric Loon 15:24, 31 January 2006 (UTC)

I think you've got a good point, Doric. I'm not even sure that really is the most common objection, either: my experience with explaining it to people tends to be evenly split between those who think that probability automatically corresponds to the size of the sets, and those who think that Monty's removing a door has "reset" the probabilities and has generated a new random event, with probabilities independent from previous random events. Though... now that I typed that out, I'm wondering if they're really the same thing, after all. -- Antaeus Feldspar 15:37, 2 February 2006 (UTC)

Staying or switching at random

"Note that switching at random is quite distinct from just keeping the original choice. Having arrived at A, B, C, or D, if the contestant then blindly flips a coin to choose whether to switch or stand pat, then there is a 50% chance of ending up with the car. However, the choice doesn't have to be made at random. The coin flip gives a 50% chance of being the option with 1/3 likelyhood, and 50% of being the option with 2/3 likelyhood, so they balance."

This has nothing to do with the Monty Hall problem. The Monty Hall problem is about whether choosing a particular strategy can increase or lower your chances. Talking about what would happen if a coin flip decided your strategy for you only confuses the issue and makes it harder for people to understand the real problem. -- Antaeus Feldspar 16:02, 2 February 2006 (UTC)

Yes, perhaps the old diagram was just irritating me too much. It showed the odds at 50/50 and was confusing. Still, I think people consider choosing at random to be a strategy in itself and extrapolate from that to the mistaken notion that both doors are equal. The number of cries for help here on this talk page indicates to me that there are still edits to be done so the article makes its point better. That's my idea with the random thing; to tell people that their gut instinct of it being 50/50 is correct, but only if they were choosing blindly. If the page keeps any of the new diagrams I'll make better images to replace these quickies. -- JethroElfman 02:47, 3 February 2006 (UTC)
I agree there's still room for improvement. I think the card game experiment is maybe the best aid to understanding, since people can actually carry it out themselves and see why Monty opening a goat-door doesn't change any probabilities. Would anyone object if I moved that to the top of the "Aids to understanding" section? -- Antaeus Feldspar 16:00, 3 February 2006 (UTC)
Hey, I think that would be a good move. In fact, I would eliminate the 3-card version (or make it the extrapolation) and begin straight away with a 52-card pack where the objective is to find the ace of spades. It is important to get this done early because the length of the article means people may not make it to the end. JethroElfman 16:58, 3 February 2006 (UTC)
I would have to disagree about making the 52-card rather than the 3-card version the primary version; I've seen a lot of people respond to attempts to prove the 2/3 result through simulation with "well, if your simulation comes out with what's obviously the wrong result, it proves that you mis-programmed your simulation!" I'd rather make the simulation correspond as precisely as possible to the actual problem so that there's less room for people to think that some significant factor differs between the door version and the car version. -- Antaeus Feldspar 18:00, 3 February 2006 (UTC)
Additionally, what about a "Common misperceptions" section, explaining common ways in which people either misunderstand the ground rules of the problem or misunderstand the effects of those ground rules on the probabilities? -- Antaeus Feldspar 16:02, 3 February 2006 (UTC)

Simply.

Looking only at winning possibilities.

If you are going to swap you must pick a goat in the first place and your odds are 2/3 of doing that. If you are not going to swap you must pick the car in the first place and your odds of doing that are 1/3.

--81.79.90.68 14:55, 4 February 2006 (UTC)Tim Robinson.

Tricky Host Scenario

I'm not sure I would switch. The Monty Hall problem states that the host allows you to switch only AFTER you have picked a door already (It was not part of the initial rules). So if the host knows that you picked a door with a goat, he COULD directly open that one. This would give you a much reduced probability of getting the car, and you could only get the car if you didn't switch.

No, sorry. That's not possible. Once you pick a door, the host must pick another door to open. He can't choose to open the door that you picked. -- Antaeus Feldspar 02:45, 4 February 2006 (UTC)
But that begs the question, why didn't he tell you about the switch in the first place.
How is that relevant? -- Antaeus Feldspar 03:13, 4 February 2006 (UTC)

Somebody please help - I think I understand the correct answer, but my mind is still struggling to get around the following: Imagine contestant A chooses door 1. Monty hall then opens door 3 to reveal a goat. At this point you introduce contestant B. Contestant B has no prior knowledge of the game. He is told he has been "allocated" door 1, he does not know why door 3 is open. He is effectively in the same position as contestant A, but he does not know that the game is fixed. This time both contestant A and B are offered the choice to switch or stick. Surely the percentage chance for contestant B is 50/50. If so how can the same two doors have different probabilities of a prize at the same time for two people standing in front of them? If contestant B does not have 50/50 why not?

If I may interject out of sequence... Imagine a third player C who has been told which door conceals the car. The probabilities for him are clearly 100% and 0% for the two doors. Does that mean it's also 100% and 0% for A and B? It depends on your point of view. Probability is nothing but a quantitative measure of uncertainty, and all three players A, B, and C have different levels of uncertainty because they have different information. --Doradus 02:54, 1 March 2006 (UTC)
Yes, the chances for B are 50/50; B is playing a different game. For A, there were initially three choices, all equally likely. For B, he sees one open door, with a goat in it. Note also how one of the rules of the game comes into play here: there is always ONE car and TWO goats. But for B, since door 3 is not available as a choice, there are only two choices, one of which has a car, the other a goat. The fact that B has been "allocated" the first door doesn't matter. Summary: A has three choices initially, all equally likely to have the car. B has two choices, both equally likely to have the car. After the host action, A has two choices, NOT equally likely (stay with door 1, or swap to door 2). B arrives after the host action, so there is no before and after for him. Look at it another way: the game is vastly different for B because the whole 1/3 chance that door three might have the car has been taken away, and distributed randomly into the two remaining doors. Lucky B! --Mike Van Emmerik 21:52, 6 February 2006 (UTC)
Oops! I agree I was wrong. Sorry for the misinformation. It's so easy to let your intuition lead you astray with this one. B's chances are the same as A's, because the allocated door is not chosen randomly, nor is the door which is shown to be open. --Mike Van Emmerik 22:18, 7 February 2006 (UTC)
If I may interject -- there is a very big difference, which I feel we're muddying here, between what one's chances actually are and what one perceives one's chances to be. B's chances actually are 2/3, the same as A. B may perceive his chances as 50/50, based on the information available to him, but we know in this case that there is important information that B does not have which changes the odds. If B, on some random whim, decided to adopt a strategy of "I'll always pick a door other than the one allocated to me", he would win 2/3rds of the time, even though his incomplete knowledge suggests incorrectly that he should only be winning 1/2 of the time. -- Antaeus Feldspar 23:53, 6 February 2006 (UTC)

No, B's chances of correctly choosing the winning door, when fully utilizing the information available to him, are 50/50 because he cannot differentiate one door from the other—that is, he doesn't know which of the two doors you picked. What is Monty's chance of picking the correct door, when fully utilizing the information available to him? It's certain, of course. So you cannot make the case that the odds can't change, because they are distinct between differing amounts of knowledge. Of course this is the case. A's chances are 2/3 because he is fully utilizing the information available to him. In his case, the extra bit of information is that he knows that of the two doors you didn't pick, Monty will never open the winning door for you, thus his actions are constrained by the fact that he knows something and his actions tell A partly what that information is. The one thing that A doesn't know that Monty knows, is anything more about the door he initially chose than he ever did. The odds of that particular door being correct are 1/3, which they have always been, and will continue to be.

I wrote the first specific web page on the MHP ten years ago, it has been referenced widely, and I have corresponded with countless people about this. No one way of presenting this problem is always effective at realizing comprehension. But one approach has had good success, and that's the million-door version of the problem. If you pick a door from among a million, obviously your odds of it being the winning door are a million to one. However, if Monty opens 999,998 of the remaining 999,999 doors, always excepting the winning door, and presents you with two doors, your choice and the remaining door, is it better to switch or to stay? And if someone else walks along at that moment, not knowing anything about which door you picked and which doors Monty opened, their best odds for correctly picking the right door are 50-50. But you, by switching, are nearly assured of choosing the correct door—it's the one door out of almost a million that Monty very specifically, knowingly, did not open. There's a very small possibility that he didn't avoid any particular door, because your door was the correct door. But the chances of you having correctly chosen the winning door from a million are 999,999 to one against. It's extremely unlikely. (kmellis@kmellis.com) 69.254.138.180 06:27, 7 February 2006 (UTC)

We need to remain clear about this. B's chances of winning if he follows the same strategy as A are 2/3, just as A's chances are. The fact that he has no way of knowing that there is an optimal strategy or what it is doesn't alter the chances he would have if he followed that strategy. We might as well flip a double-headed coin and say that B has a 50/50 chance of being right if he guesses it'll come up "tails".
Now if you ask a different question, which is "can an optimal strategy be deduced from the information which B has?" then the answer is "no". But two people in a row stated that B's chances are actually 50/50; they are not. The question is "is there an optimal strategy?" not "would someone deprived of certain vital pieces of information about the game know that there is an optimal strategy?" -- Antaeus Feldspar 15:41, 7 February 2006 (UTC)
Very true. In this scenario, nothing has changed; B is still better off switching. But for all B knows, the game is designed such that he's always allocated the winning door, another door is opened, and the host tries to trick him into switching. Or perhaps (in B's mind, anyway) the game is that the host always picks the correct door, then he's "allocated" one of two doors, neither of which will ever win. He simply doesn't know.
Still, the laws of probability don't change simply based on what you know. But as far as B can tell from the available information, switching or staying might be a better option, or it might not make a difference. This means his chances balance out to 50/50 (I think), but only if he has a 50/50 chance of switching or staying (1/2*1/3 + 1/2*2/3 = 1/2). And all it takes is telling B "you should always switch doors" to bump his odds back up to 2/3. – Wisq 16:07, 7 February 2006 (UTC)
"...laws of probability don't change simply based on what you know". In this sense, they certainly do. By the statement of the problem with regard to A and B, B enters the room after both A and Monty have made decisions. B has no information with which to differentiate A and B and has exactly a 50-50 of picking the winning door. I don't understand why people would try to determine some Platonic ideal probability value for the door and the prize independent of someone making a choice. It makes no sense. Or, if it doesn, the probability is null. If I flip a coin that is perfectly constructed, catch it in my hand and look at it, and you try to guess which side it is, based upon what you know, you have exactly a 50-50 chance of being correct. If I "guess", based upon what I know, I have a 100% chance of getting it right. There is no independent value for the probability of guessing outside the context of someone guessing.
In the Monty Hall Problem, you can understand the varying probabilities by evaluating what each person knows: Monty, A, and B. Monty knows everything there is to know, and B knows nothing (other than that there actually is a prize behind only one door). A, however, has a 66% probability of having been told by Monty's actions everything that Monty knows, specifically where the prize is. A third of the time Monty tells A nothing. 2/3 of the time, he tells A everything. If A bases his decision on the assumption that Monty has told him everything, he will win 2/3 of the time. He will only lose when his first choice of door was correct.
There are a couple of ambiguities in how you guys are talking about this. The first is the implicit assumption that we're talking about whether or not A or B knows there is a winning strategy. That's a very confusing diversion. It's sort of a meta-MHP. In discussing the problem, we're taking a God's eye view of the matter and are evaluating the odds of picking the winning door if using various strategies. In that context, all evaluation assumes that A assumes that a particular strategy is the winning strategy and acts upon it in each respective tree. That you want to talk about where the winning door is likely to be depending upon whether or not A knows the optimal strategy is a mixing of levels of analysis. And even though you know and understand, at least to some degree, the correctness of this answer to the MHP, your desire to mix levels is indicative of exactly why people's intuition about the problem is misleading. They don't know what perspective to take. Or, alternatively, they implicitly take B's perspective even though the problem statement allows them A's perspective. Or, alternatively, they attempt some sort of ideal perspective, which they assume is essentially B's. Here's why that's misleading: they could also incorrectly choose to take Monty's perspective (that is, they know where the prize is). What do the concepts of "staying" and "switching" and "winning" mean in that context? There's still "winning", but the concept of staying/switching seems absurd because it's deliberately ignoring the fact that from this perspective you already know where the prize is. Similarly, assuming B's perspective also renders the concept of staying/switching meaningless.
The reason we don't say that probabilities vary by "how much someone knows" is because when we evaluate probabilities we're assuming that everything that is possible to be known in a given problem statement is known, excepting the outcome. (We even do this with regard to past events, which is dubious, but that's a different discussion.) The MHP quite clearly states the problem as an evaluation of probabilities from A's perspective. Thus we assume that A knows everything that A can know and we proxy for A in evaluating various strategies. In doing so, we learn that when we're (or A is) in that exact situation, switching will on average allow us to win 2/3 of the time. If you want to talk about something that is very like the MHP but takes a different perspective, then you must very deliberately state the problem from that perspective so that it's clear how one should evaluate it to discover the answer.
I don't understand why people would try to determine some Platonic ideal probability value for the door and the prize independent of someone making a choice. Because that is the only way to compare apples to apples, instead of apples to oranges. If you look at the suggestion by 193.129.187.183 here which started us down this whole road of discussing B, you'll see that he/she asked "how can the same two doors have different probabilities of a prize at the same time for two people standing in front of them?" It was therefore important to clarify that in terms of what actual probabilities the doors had, they did not have different probabilities. It's only when you change the question from "what is the actual probability of the two doors" to "what is the probability that this person will find the winning door" that you actually see the probabilities being different for different people. -- Antaeus Feldspar 01:42, 9 February 2006 (UTC)


I think you should carefully consider what, if anything, your statement about "the 'actual' probability of the two doors" could possibly mean. To short-circuit the Socratic method, I'll just claim that if it means anything, it means that there is a probability of 100% that the prize is behind the door that it is behind. This system taken in isolation isn't probabilistic, it's already determined. The only probabilistic perspectives are those where an observer has less than complete information. And each one of those perspectives constitutes an independent problem. A, having picked a door and then watching Monty opening a door and knowing what that implies, should switch, and he'll win 2/3 of the time. B, not having picked a door, nor knowing which door Monty opened, sees only two doors that he cannot differentiate from each other in any way. There can be no "switching" or "staying" in a problem statement for this B fellow; the best he—and we—can do is say that if he randomly chooses a door, he has a 50% chance of choosing the right door. Finally, Monty, who knows where the prize actually is, would pick the winning door because he knows which door the winning door is. A problem statement about Monty would be something like "should Monty pick the door he knows hides the prize, or the door he knows hides the goat?" And of course the answer to that is that he'll always win if he chooses the winning door and he'll always lose if he chooses the losing door. These are three different problems. Only Monty's perspective is arguably the supposedly privileged perspective; but I suspect the most rigorous analysis would show that the only thing you could possibly mean by thinking of a privileged view, an inherent probability between the two doors, is a determined system that is completely known. You always "win" (Keith M Ellis, kmellis@kmellis.com, www.montyhallproblem.com).
Your analysis is correct but is only apt to confuse people by leading them away from the actual crux of the problem. Yes, if we have only as much information as B does, we only have a 50/50 chance of picking the right door. Yes, if we have the same information that A does (and apply it optimally, of course) then we have a 2/3 chance of picking the right door. And yes, if we have the same information that Monty does, then we have a 100% chance of picking the right door. What those who are not grasping the problem yet struggle with, however, is not why B's chances are 50/50 but why A's aren't. There are two ways to clarify this: One is to say "Monty gave A information about which door doesn't have the car; B actually has this same information too, but he is missing information about what happened before the number of doors was reduced to two, so he doesn't know that there were three different ways that they could have arrived at two doors and two of those three ways result in the car being behind Monty's door." The other is to go straight to the heart of the matter and say "There were three different ways that they could have arrived at two doors and two of those three ways result in the car being behind Monty's door." Obviously I think the latter is preferable. -- Antaeus Feldspar 18:28, 9 February 2006 (UTC)

(unwrapping back to left:) You state the following:

B, not having picked a door, nor knowing which door Monty opened sees only two doors that he cannot differentiate from each other in any way. There can be no "switching" or "staying" in a problem statement for this B fellow, …

But the scenario we were discussing states the following:

Contestant B has no prior knowledge of the game. He is told he has been "allocated" door 1, he does not know why door 3 is open. He is effectively in the same position as contestant A, but he does not know that the game is fixed.

Hence, there is indeed the concept of switching, due to the "allocation" of a door. If B picks to switch or stay randomly, he has 50-50 chances. If you tell B to switch, he has a two thirds chance. This is why I say the probability (of winning via switching versus via staying) does not change based on what you know. If you want to say that B doesn't even know what door has been allocated, you may, but that's a whole different problem. – Wisq 04:26, 10 February 2006 (UTC)
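A quick way to see both claims at once is to simulate contestant B directly. This is only a sketch (Python; the trial count is arbitrary): B is handed A's door, one goat door is already open, and B either flips a coin or follows a fixed "always switch" instruction:

 import random

 def contestant_b(policy, trials=100000):
     # policy is "coin" (flip to decide) or "switch" (always take the other closed door)
     wins = 0
     for _ in range(trials):
         doors = ['car', 'goat', 'goat']
         random.shuffle(doors)
         allocated = random.randrange(3)     # A's original pick, handed to B
         opened = next(d for d in range(3) if d != allocated and doors[d] != 'car')
         switch = True if policy == "switch" else random.random() < 0.5
         final = allocated
         if switch:
             final = next(d for d in range(3) if d != allocated and d != opened)
         wins += doors[final] == 'car'
     return wins / trials

 print("B flips a coin:     ", contestant_b("coin"))    # about 1/2
 print("B is told to switch:", contestant_b("switch"))  # about 2/3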

This page is monstrous

I edited this page some time ago to clearly state the problem and the answer. Since then it has been edited back into a state. The page is a mess because there is no single, clear problem and answer statement. It is long and rambling. There are several statements of the gameshow, alternative versions, multiple versions of the answer, anecdotes, and all manner of irrelevant nonsense, and the important information is simply lost. This is an example of an article where the lack of an authority and a team of expert researchers, as is found in a "real" encyclopaedia, leads to a polarisation of opinion and hence to more and more verbose explanation, written to convince those who fail to understand the right answer and to stop them from editing incorrectly.

By the way, the correct answer is yes, you should switch. If you don't believe it then set up the game yourself with an accomplice, try it 100 times and see what answer you get. Alternatively, use a computer simulation, or even do the probability theory from first principles (but do it properly). Any mathematical explanation that reaches a different answer has a flaw in it. Ignore your intuition. PK

I agree, but aside from deleting most of the page and then locking it, how can you solve this? Besides, although we know the correct answer, it can take a lot to convince someone else that their intuition is wrong — typically by explaining it to him or her in exactly the right terms. By debating it on the talk page, explaining it different ways until the opposing party "gets it", and then putting that method on the page, we have organically come up with a system that (judging from the reduced debate on this page) convinces most people of the correctness of the answer.
I would rather a long and overly verbose page, that expresses the truth in a way that almost anyone can understand, than a short and concise page that most people would look at and think "that's bogus" — and either edit it, or just walk away convinced that Misplaced Pages is full of lies. – Wisq 17:13, 10 February 2006 (UTC)

Dual data sets being ignored

On these types of problems, there are dual data sets at work. Set one is the information which allows us to determine the "odds" of finding the item being sought. Set two is the information which allows us to determine what the underlying statistical distribution of winning choices is. Originally, the "odds" of both data sets are the same, but they do deviate when additional information is acquired. The additional information is provided by the certainty that the removed choice is a loser. Because of this, the choosing party is no longer making a guess, but instead is making an informed calculation. Please look at the definition for guess: "To predict (a result or an event) without sufficient information". Please take note that when one has sufficient information, one is no longer guessing. The removal of one choice provides us with more information and we move from the position of a mere guess to that of an educated guess, which is not the same thing. Now as to the "statistical distribution" angle: If you have 10 shoe boxes on your desk and one of them contains an egg, the distribution percentage is 1/10 or 10%. Those numbers never change. However, when we gain more information about the contents - say by opening a few - our odds of finding the egg increase. People argue about these problems because of the tricky idea that there is true "guessing" involved when there is not. And also because they forget that the numbers regarding the original distribution of winning choices are fixed. Only the odds of finding the item improve, not the statistical likelihood that it actually existed. Merecat 05:48, 11 February 2006 (UTC)

It seems that no matter how this is explained (including your logical choice of words) the argument will not end. There are some who will follow the theatrics rather than the logic no matter what. Give it a go in the article if you think it will help. hydnjo talk 01:11, 12 February 2006 (UTC)

Another way to explain it

If you can accept the following fact, it may be easier to see that switching makes the probability of getting the car 2/3.

Fact: If you choose a door with a goat behind it, switching will get you the car; if you choose the door with the car behind it, switching will get you a goat.

Let's take it from the start. You choose a door, let's say door C. There is now a 1/3 chance that you chose the door with the car behind it. The host now reveals one of the doors with a goat behind it, let's say he reveals door A. There is now ONE goat behind a door you can't see, and ONE car behind a door you can't see. You now have 2 doors whose contents you don't know; the only thing you know is that one car and one goat are left, and they are hiding behind these two doors. (Just to make it perfectly clear: it COULD be that behind door C there is a goat, and therefore behind door B there is a car; it also COULD be that behind door C there is a car, and therefore there is a goat behind door B; it CANNOT be that there is a goat behind each of the two remaining doors, because the host has already revealed a door with a goat behind it.) Now comes the fact that will make it easier to understand: if you switch doors, it is GUARANTEED that IF there is a goat behind door C (the door you first chose), you are switching to the door with the car behind it (door B); IF there is a car behind door C, then it is GUARANTEED that you are switching to a goat (door B).

As you first chose door C, there was a 1/3 chance of selecting the door with the car behind it. When you switch, it is GUARANTEED that you are switching to a door whose content is not the same as that of the door you initially chose (read that sentence twice). And therefore, if you switch, there is a 2/3 chance of getting the car.

That seems like a pretty convoluted way of saying "picking a goat and switching will get you a car, and you have a 2/3 chance of picking a goat". :) Really, that's the absolute simplest explanation. – Wisq 00:10, 14 February 2006 (UTC)

OK, I agree; what I meant was that this is another way to explain it.

(deleted) Disagreement with explanations

This whole section has become an attack and counter-attack on each other's arguing techniques. If you want to continue this, you can do it on your user talk pages. --Doradus 17:21, 27 February 2006 (UTC)

One more time

Start with the usual. You pick ONE door and the host gets the remaining TWO doors. Stop. Now would you think it a good idea to swap your ONE door for the host's TWO doors? If you say NO, then I have nothing more for you, go away.

But if you say YES, "I want to swap my ONE door for the host's TWO doors", then you're almost there. That's it, except for the host's theatrics of opening a losing (and he knows it) door. What you're doing is swapping your ONE door for his TWO doors, even if he tried to mess with your head by opening one of them (oh, sure, like he's going to open a car door) and the diversions like the audience screaming "swap" - "don't swap" and the crew saying "cue the flashing lights" and the "host's silly grin". It's all about messing with your head, remember: you're swapping your ONE door for his TWO doors.

To dramatize the situation, start with ten doors. You get ONE and the host gets NINE (they would never do this as it would expose the whole thing). Now, the host (between commercials) opens EIGHT of his NINE doors (never ever opening a car door and he knows it). Wacha think now? hydnjo talk 02:25, 18 February 2006 (UTC)

Completeness of the explanations

The lesson to learn from the Monty Hall problem isn't one of mathematics, but of psychology -- that your instincts might be incorrect. With that in mind, the explanations given don't have to be complete. They have these two goals: 1) to be correct, 2) to be easy to understand. I tried wading into the Bayes theorem article and just couldn't make my way, so I don't think it meets goal #2 as a means of explanation.

I find it difficult to see in what way T.Z.K. disagrees with the article. I didn't like the chart that showed probabilities adding up to 200%, so I changed it. If T.Z.K. would like an article that actually is directed to mathematics, then I suggest expected number. I edited the grammar there, but the phrasing of the math is still weak. If he wants to revise wording then he can give that a shot too.

I like the extension he proposed to the problem. If one goat is blue and the other red, and you know that Monty always picks the red goat when he has a choice, then if he opens a door with a blue goat you must switch for it is 100% that your original pick was the red goat; if he shows a red goat it is now 50-50 whether or not to switch.
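That variant is easy to verify with an exact calculation. A small Python sketch (the colour labels are just for illustration), conditioning on which goat Monty shows when he always prefers the red goat:

 from fractions import Fraction

 third = Fraction(1, 3)
 # Joint probability of (player's original pick, goat Monty shows),
 # assuming Monty always shows the red goat when he has a choice.
 joint = {
     ('car',  'red'):  third,          # Monty has a free choice and prefers red
     ('car',  'blue'): Fraction(0),
     ('red',  'blue'): third,          # forced: player holds the red goat
     ('blue', 'red'):  third,          # forced: player holds the blue goat
 }
 for shown in ('red', 'blue'):
     total = sum(p for (pick, s), p in joint.items() if s == shown)
     stay = joint[('car', shown)] / total
     print("Monty shows the", shown, "goat -> stay wins", stay, ", switch wins", 1 - stay)

This prints 1/2 and 1/2 when the red goat is shown, and 0 and 1 when the blue goat is shown, matching the description above.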

Such extensions are better off left out of the article lest they add to confusion.

Lastly, I like Antaeus. He is remarkably patient with those who come here for help. JethroElfman 18:35, 22 February 2006 (UTC)

Thank you, Jethro -- you don't know how good that is to hear. Sometimes, even though I work hard at it, I feel like patience is what I'm worst at! -- Antaeus Feldspar 20:49, 22 February 2006 (UTC)
Well, you're certainly more patient than I — hence my current silence. ;) – Wisq 04:18, 23 February 2006 (UTC)
Antaeus, "The superior man is modest in his speech, but exceeds in his actions." -Confucius hydnjo talk 13:01, 23 February 2006 (UTC)
Actually it has nothing to do with mathematics. One thing that many self-centered pseudo-intellectuals such as Feldspar fail to recognize is that one person's intuition isn't everyone's intuition. Many people have read the current explanation and found it utterly useless because of obvious problems. The most glaring is that it ignores all information given when a goat is revealed as well as ignoring the fact that you are given a chance to change your choice with the new information and changed probabilities. This alone causes many people to outright reject the explanation because to them it is "obviously incorrect" regardless of whether or not the answer it arrives at is correct. In other words, their intuition is different from yours.
To see what I am talking about just make a real Venn diagram of the actual problem, with 3 circles: Car, Goat 1, and Goat 2. Then draw 2 ovals, one overlapping car and goat 1 (goat 2 revealed), and the other overlapping car and goat 2 (goat 1 revealed). Each circle has a total of 1/3 probability, and since a goat is always revealed, all probability is in a circle and an oval. The car's circle has 1/6 in each oval inside it (make sure the ovals aren't overlapping). Now when a goat is revealed, simply cover up the oval that represents the other goat being revealed. You should be left with a 1/6 car chance and a 1/3 chance that you chose the other goat. This is the real solution in a form that is easiest to see.
On one hand the solution is better because it is universal. Feldspar's argument is that to him it is obvious that the real problem is just like this other problem where the goats are one abstract "not car" with a 2/3 probability of being chosen. He would fail in convincing many people of that. However, no one can doubt a solution which looks at the problem this way, because it's the actual problem, i.e. the goats are physically separate and have separate chances of being chosen, etc.
As far as whether or not it is correct and easy to understand: because there is no omnipotent creature to evaluate whether or not something is correct and tell us, all we can do is make sure our methods are sound. If O.J.'s lawyer tells someone "If the glove don't fit, you must acquit!" there are people who would call this an easy to understand explanation that gives the correct answer. And if OJ just happened to be not guilty, it would be. Are you comfortable with such a "simple and correct explanation" even in that case? I hope not, because no one wants people setting murderers free in other cases because some random glove didn't fit. "Simple and correct" often means it sounds like it makes sense but gives the wrong answer.
As a simple warning, this Feldspar character is not at all what he tries to appear as. He resorts to very devious tactics in arguments, such as editing his opponents' writings to make them appear wrong about something, and then pretends to be objective. I am glad that people such as he only have limited power over something like Wikipedia. There are many forums run by opinion nazis like Feldspar that simply boot those who disagree and then come to the conclusion that since everyone on their forum agrees with them they must be right, and therefore they are justified in booting people who disagree. The sad thing is that type of system actually seems to work. Rather than fight about something, most people are apathetic enough to just agree with whoever is in charge wherever they are. Keep an open mind and never allow yourself to reason like "someone is correct because they are always correct" or "I agree because everyone agrees" etc., as they are circular arguments. Decide things for yourself carefully. Of course many people would not be convinced. —Preceding unsigned comment added by 69.180.7.137 (talkcontribs)
Two things, 69.180.7.137. First, you are badly violating the policy of Misplaced Pages:no personal attacks. Secondly, you have been repeatedly doing exactly what you accuse me of, "editing his opponents' writings to make them appear wrong about something", here, and several times before that. Please stop doing this; not only is it very rude and against policy, you can hardly imagine that, having been caught at it several times, you can get away with trying it again. -- Antaeus Feldspar 17:16, 27 February 2006 (UTC)

Number of doors vs. number of ways of getting there

The answer to the problem is NO. The logical fallacy in the reasoning which leads to the positive conclusion lies in considering the second goat when there is no longer a second goat.

Think of this: two doors remain. One has a goat, the other one has a car. Chances are 50/50. It does not matter whether my door conceals the car or the goat. My switching or not is like a new decision, a new choice. In this new choice, I choose to keep my door (which is the same as saying that I re-pick that door), or to pick the other door. The open door no longer counts. The chance to switch creates a new choice with new alternatives, erasing the old ones.

The answer really is YES. Even though there are only two doors left, there are three ways of getting there: by picking goat #1, by picking goat #2, or by picking the car. Two out of those three ways result in switching being the right answer. -- Antaeus Feldspar 21:27, 24 February 2006 (UTC)

Does it matter how Monty picks which goat to show?

Recently an editor brought up an interesting point for discussion: namely, the possibility that the answer to the problem might change depending on whether Monty picks a goat to show (when he has a choice) with equal probability for each goat, or with unequal probability. I'd like to discuss why the answer is "no, the answer to the problem does not change."

To see why not, let's start with a discussion of how to calculate probabilities in a case where one probability determines what situation you are in, and the situation determines your chance of "winning" from that situation. For instance, consider a game where you and an opponent each pick a card from separate decks; if your cards are both face cards (J, Q, K, A) or both non-face cards (2-10) you win; otherwise you lose. The way to calculate the total probability is to multiply your chance of getting into each situation by the chance of winning in that situation, and add those products together. So, in our example game, you have a 4 in 13 chance of picking a face card, which gives you a 4 in 13 chance that your opponent will pick a face card and you'll win, and similarly you have a 9 in 13 chance of getting a non-face card, which gives you a 9 in 13 chance of winning; the total probability of winning is thus ((4/13)*(4/13)) + ((9/13)*(9/13)) = 16/169 + 81/169 = 97/169.
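As a concrete check of that arithmetic, here is a tiny sketch with exact fractions (Python; purely illustrative):

 from fractions import Fraction

 p_face = Fraction(4, 13)     # J, Q, K, A among the 13 ranks
 p_non  = Fraction(9, 13)
 print(p_face * p_face + p_non * p_non)   # 97/169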

So let us say that Monty picks a goat to show (when he has a choice) with possibly-unequal probability: for every x times that he chooses to show Goat 1, he chooses to show Goat 2 y times. Thus, when he has a choice of which goat to show, he shows Goat 1 x/(x+y) of the time, and Goat 2 y/(x+y) of the time.

How often does Monty show Goat 1 in total, then? Well, Monty only has a choice about which goat to show when the player chooses the car initially. By the conditions of the problem, the player's initial choice is equally likely to be the car, Goat 1 or Goat 2. Therefore, if Monty has a choice of goats x+y times, then there are also x+y times when he must show Goat 1 because the player picked Goat 2. Therefore, Monty shows Goat 1 2x+y times in all -- x times because he chooses to, x+y times because he has to. By similar logic we can determine that Monty shows Goat 2 x+2y times.

The chance that the player sees Goat 1 is therefore (2x+y)/(3x+3y), and the chance that the player sees Goat 2 is likewise (x+2y)/(3x+3y). We can now look at what the chances are of winning if we employ a switching strategy in these situations. We already saw that Monty shows Goat 1 2x+y times in total. x+y of those times, he had to show Goat 1 because the player had chosen Goat 2; these are cases where the player must switch to get the car. The remaining x times, Monty had a choice of which goat to show; having a choice means that the player picked the car initially. Therefore, when the player is looking at Goat 1, his chance of winning with a switching strategy is (x+y)/(2x+y).

Before we do the same for Goat 2, let's stay with Goat 1 a little further. We already know we're going to multiply the probability of seeing Goat 1, (2x+y)/(3x+3y), by the probability of winning by switching when looking at Goat 1, (x+y)/(2x+y). Since the numerator of the first number is the same as the denominator of the second, they cancel out, and the product of the two probabilities is (x+y)/(3x+3y) -- which we can see always works out to 1/3, no matter what values x and y have! The same logic must apply to Goat 2, as well. This accords with what we already know about the problem: 1/3 of the time, the player has chosen Goat 1, and must switch to win; 1/3 of the time, the player has chosen Goat 2, and must switch to win. The remaining 1/3 of the time, the player chooses the car initially; no matter how Monty makes his choice about which goat to reveal, it does not change the fact that in this case, switching will lose and staying will win.
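The cancellation described above can also be verified for a few specific host biases. The sketch below is illustrative only (the helper name and the particular (x, y) pairs are arbitrary choices, not from the original comment); for each pair, both per-goat products come out to 1/3 and their sum comes out to 2/3:

 from fractions import Fraction
 
 def switching_chances(x, y):
     # Out of 3*(x+y) equally weighted games, the player sees Goat 1 2x+y times
     # and Goat 2 x+2y times; switching wins x+y times within each group.
     p_see_g1 = Fraction(2*x + y, 3*(x + y))
     p_win_g1 = Fraction(x + y, 2*x + y)
     p_see_g2 = Fraction(x + 2*y, 3*(x + y))
     p_win_g2 = Fraction(x + y, x + 2*y)
     return p_see_g1 * p_win_g1, p_see_g2 * p_win_g2, p_see_g1 * p_win_g1 + p_see_g2 * p_win_g2
 
 for x, y in [(1, 1), (3, 1), (9, 1), (1, 4)]:
     # Each per-goat product is 1/3; their sum is 2/3, regardless of the bias.
     print(x, y, switching_chances(x, y))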

(I want to address one more minor point on this matter, but I'm out of time at this computer; I'll be at another computer in about ninety minutes.) -- Antaeus Feldspar 21:27, 1 March 2006 (UTC)

All right...

There's one point that isn't addressed by the above, and that is the question of whether x and y could take on values such that the optimal strategy when looking at Goat 1 is different from the strategy that is optimal when looking at Goat 2. If this were the case, then one could devise an overall strategy superior to any strategy that didn't take the identity of the goat into account.

It can be shown that the answer is "no". We will start by assuming the contrary: that x and y have values which make staying the best strategy when looking at Goat 1 and switching the best strategy when looking at Goat 2; our overall strategy then would be "stay when looking at Goat 1; switch when looking at Goat 2."

In order for staying to be the best strategy, the following would have to be true: out of all of the times that the player ends up looking at Goat 1, more of them are due to the player having initially picked the car, and Monty making the choice to show Goat 1, than are due to the player having picked Goat 2. In terms of x and y, this works out to "x (the number of times that the player picked the car, and Monty picked Goat 1 to show) is greater than x+y (the number of times that Monty had to show Goat 1 because the player picked Goat 2)" -- in other words, x > x+y, which simplifies to 0 > y -- y is less than 0.

However, 0 is the lowest value possible for y; you could say "1 of every 5 times, Monty picks Goat 2" or "0 of every 5 times, Monty picks Goat 2", but "-1 of every 5 times, Monty picks Goat 2" has no meaning. The closest we can get to the situation we were trying to arrange is where y equals 0. This is the situation where Monty always picks Goat 1 if given a choice. Under these conditions, seeing Goat 2 means switching always wins; Monty must be showing Goat 2 because he has no choice, which means the player picked Goat 1. However, the player only sees Goat 2 1/3 of the time; the other 2/3 of the time, the player sees Goat 1, and the chances are exactly even between a) Monty had to show Goat 1 because the player picked Goat 2, and b) Monty had a choice of Goat 1 or Goat 2 (because the player picked the car) and showed Goat 1. There is no optimal strategy when the player is looking at Goat 1; no matter what strategy the player adopts, the chances of winning from this situation are 1 in 2. And as with every other set of values for x and y, the end result is 2/3: ((1/3)*(1/1)) + ((2/3)*(1/2)) = 1/3 + 1/3 = 2/3.
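Here is the y = 0 case worked through as a tiny enumeration (an illustrative sketch, not part of the original comment), treating the three possible initial picks as equally likely:

 from fractions import Fraction
 
 # Host policy: always show Goat 1 when he has a choice (the y = 0 case).
 # Each tuple is (goat shown, does switching win?) for one equally likely initial pick.
 outcomes = [
     ("goat 1", True),   # player picked Goat 2, host must show Goat 1
     ("goat 2", True),   # player picked Goat 1, host must show Goat 2
     ("goat 1", False),  # player picked the car, host chooses to show Goat 1
 ]
 
 p_switch_wins = Fraction(sum(win for _, win in outcomes), len(outcomes))
 goat1_cases = [win for shown, win in outcomes if shown == "goat 1"]
 p_switch_wins_given_goat1 = Fraction(sum(goat1_cases), len(goat1_cases))
 print(p_switch_wins)              # 2/3 overall for always switching
 print(p_switch_wins_given_goat1)  # 1/2 when looking at Goat 1

Seeing Goat 2 identifies the forced case where switching always wins; seeing Goat 1 leaves a 50/50 situation, and the weighted total for always switching is still 2/3, as in the calculation above.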

(Now, I ask others -- is there any point to addressing any part of this in the article itself?) -- Antaeus Feldspar 23:34, 1 March 2006 (UTC)

No! hydnjo talk 23:39, 1 March 2006 (UTC)

More "catchy"?... sexy etc. ?

When I first heard about the Monty Hall Problem, I was very drawn in by its simplicity... this article, while certainly fantastic, has an introduction so full of disclaimers and strings attached that the problem is convoluted and no longer interesting or catchy, IMHO... Is it possible that we can rework the article to state the problem simply at the beginning and then put all the assumptions and stuff a bit later? It just seems to me that over time people have inserted little disclaimer words to the point where the problem itself is no longer fascinating. -Abscissa 05:36, 4 March 2006 (UTC)

The problem is that it tends to go in a cycle: someone comes along, thinks the description is too verbose and complicated, and pares it down. Someone else comes along, thinks that it fails to account for this possibility or that one, and expands it again. -- Antaeus Feldspar 22:42, 4 March 2006 (UTC)
What I am suggesting is a very simple statement of the problem at the beginning which, IMHO, the article currently lacks. -- THEN a detailed description of the constraints, etc. ? ... - Abscissa 04:44, 6 March 2006 (UTC)
That's actually what I'm talking about: at times when we have had a very simple statement of the problem, people have asserted that it needs changing because it doesn't specify one constraint or another. Which constraints are you thinking should be moved or removed to make the initial description simple? -- Antaeus Feldspar 17:54, 6 March 2006 (UTC)

==Game theory redux==

I know there was some conflict about this some time ago, so I thought I would send out some feelers. Would anyone mind if I took this article out of Category:Game theory and put it in Category:Decision theory? Thanks! --best, kevin 05:12, 8 March 2006 (UTC)

It is my opinion that it belongs in neither category. The problem is specifically one of probability. One could just as easily say it belongs in the category "English Language Problems" or "Game Show Related Trivia", etc. It is a probability problem, plain and simple, not a game. - Abscissa 17:34, 8 March 2006 (UTC)
I agree that it's not a game theory problem, but can you explain why you think it's not a problem in decision theory? --best, kevin 20:01, 8 March 2006 (UTC)
This problem is designed to illustrate a principle in probability, not one in decision theory. It's not a "real" problem or game. Decision theory and game theory have a branch of "theory" because their problems are related to real-life examples. In the same way, I would say the three cards problem is not related to game theory or decision theory. I am glad you agree that it is not part of game theory... the MHP *definitely* has nothing to do with game theory. I 100% support removing it from the Game Theory category. If you think it should be moved to Category:Decision theory I will abstain from any objection... but as it stands I strongly, strongly support anyone who wants to remove all references to game theory from the article. In fact, the article, after re-reading it, is really a huge, huge mess IMHO. -Abscissa 05:02, 9 March 2006 (UTC)
And in only eight months since its Mainpage-FA exposure! hydnjo talk 14:31, 9 March 2006 (UTC)
Don't know if that is sarcasm or not -- but this article in its current state is nothing remotely like the way it was when it was an FA. - Abscissa 14:58, 9 March 2006 (UTC)
My point exactly. The hundreds of edits since July 23 2005 have not made this article better, and I suspect that it would not withstand FA scrutiny in its current state. It may be time to bring it through the process once again; it is a good subject. hydnjo talk 04:45, 10 March 2006 (UTC)
Not sure, but perhaps this is related to Second law of thermodynamics. -- Rick Block (talk) 04:55, 10 March 2006 (UTC)
Time to vote for a complete rewrite? I'd do it, but someone would revert it I'm sure... -Abscissa 04:51, 10 March 2006 (UTC)
Rick, I think that the reason for this (besides the Second law) is because as the non-believers raise their arguments, someone goes in and makes a patch to address that particular nit or pick, and so the article no longer flows in a coherent way, death by a thousand cuts or something. And Abscissa has a point about a rewrite, which is why bringing it to FAC status may be the way to go. Just think Rick, you get to do it all over again.  :-) I'd like to hear from Antaeus about this, he has been putting a great deal of effort into this "project". hydnjo talk 05:12, 10 March 2006 (UTC)

==Article Rewrite==

There is some discussion above about rewriting the article, from scratch, incorporating most of what is currently in the article but editing it heavily... (see above) but there are some strange things about the article right now that someone needs to look at. I propose the new article look something like (or at least incorporate these elements in some order, if someone can rework it to be slightly better):

* A brief statement of the problem (and why it might be considered significant) (FOUR SENTENCES MAX!)
* The origin of the problem, the history of the problem. But NOT variants of the problem. -- I would start with Marilyn. And probably finish with her.
* A statement of the problem with the express constraints.
* The solution to the problem, with subsections on the various ways of understanding the problem. Perhaps also starting with the simplest and ending with the most complex. Like Bayes's Theorem? WTF? Is someone who does not understand the solution at this point seriously going to understand that?
* Similar problems with:
* Further history of the problem (e.g. Gardner, &c.)
* Links to similar problems (three card problem, boy girl problem, &c.)

In summary, I think this article could be 50% of the size it is now and 200% of the quality. Are there some people who would be willing to work on this with me? Or others who think it is a bad idea and that the article is best in its current form? - Abscissa 06:27, 10 March 2006 (UTC)

I have no argument against the need but I am somewhat troubled by the suggested process. The proper way of dealing with a seriously deteriorated FA is to list it at WP:FARC (see some of the candidates there) with the reasons that it no longer meets WP:WIAFA. Only after a consensus is achieved should the article be rewritten. After that, if we have the desire to do so, the article can be submitted for WP:PR and/or WP:FAC. --hydnjo talk 12:34, 10 March 2006 (UTC)
Sure, sounds good to me. - Abscissa 12:52, 10 March 2006 (UTC)