### A Random Walk through Middle Land

##### How randomness rules our world and why we cannot see it, Part 2

Imagine that you are a contestant on the classic television game show *Let’s Make a Deal*. Behind one of three doors is a brand-new automobile. Behind the other two are goats. You choose door number one. Host Monty Hall, who knows what is behind all three doors, shows you that a goat is behind number two, then inquires: Would you like to keep the door you chose or switch? Our folk numeracy — our natural tendency to think anecdotally and to focus on small-number runs — tells us that it is 50–50, so it doesn’t matter, right?

Wrong. You had a one in three chance to start, but now that Monty has shown you one of the losing doors, you have a two-thirds chance of winning by switching. Here is why. There are three possible door configurations: (1) good, bad, bad; (2) bad, good, bad; (3) bad, bad, good. In (1) you lose by switching, but in (2) and (3) you win by switching. If your folk numeracy is still overriding your rational brain, let’s say that there are 10 doors: you choose door number one, and Monty shows you door numbers two through nine, all goats. Now do you switch? Of course, because your chances of winning increase from one in 10 to nine in 10. This type of counterintuitive problem drives people to distraction, including mathematicians and statisticians, who famously upbraided Marilyn vos Savant when she first presented this puzzle in her *Parade* magazine column in 1990.
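The two-thirds claim is also easy to check by brute force. Here is a minimal simulation sketch in Python (the function names and trial counts are mine, not the column's); it models Monty's forced goat reveal explicitly:

```python
import random

def play_round(switch):
    """One round: Monty opens a door that is neither your pick nor the car."""
    car = random.randrange(3)
    pick = random.randrange(3)
    opened = random.choice([d for d in range(3) if d not in (pick, car)])
    if switch:
        # Move to the one door that is neither your pick nor the opened door.
        pick = next(d for d in range(3) if d not in (pick, opened))
    return pick == car

def win_rate(switch, trials=100_000):
    return sum(play_round(switch) for _ in range(trials)) / trials

print(f"stay:   {win_rate(False):.3f}")   # ~0.333
print(f"switch: {win_rate(True):.3f}")    # ~0.667
```

Staying wins about a third of the time and switching about two thirds, matching the count over the three door configurations.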

The “Monty Hall Problem” is just one of many probability puzzles that physicist Leonard Mlodinow of the California Institute of Technology presents in his delightfully entertaining new book *The Drunkard’s Walk* (Pantheon, 2008). His title employs the metaphor (sometimes called the “random walk”) to draw an analogy between “the paths molecules follow as they fly through space, incessantly bumping, and being bumped by, their sister molecules,” and “our lives, our paths from college to career, from single life to family life, from first hole of golf to eighteenth.” Although countless random collisions tend to cancel one another out because of the law of large numbers — where improbable events will probably happen given enough time and opportunity — every once in a great while, “when pure luck occasionally leads to a lopsided preponderance of hits from some particular direction … a noticeable jiggle occurs.” We notice the improbable directional jiggle but ignore the zillions of meaningless and counteracting collisions.

In the Middle Land of our ancient evolutionary environment, which I introduced in Part 1 of this column last month, our brains never evolved a probability network, and thus our folk intuitions are ill equipped to deal with many aspects of the modern world. Although our intuitions can be useful in dealing with other people and social relationships (which evolved as common and important for a social primate species such as ours when we were struggling to survive in the harsh environs of the Paleolithic), they are misleading when it comes to such probabilistic problems as gambling. Let’s say you are playing the roulette wheel and you hit five reds in a row. Should you stay with red because you are on a “hot streak,” or should you switch because black is “due”? It doesn’t matter, because the roulette wheel has no memory, yet gamblers notoriously employ both the “hot streak fallacy” and the “dueness fallacy,” much to the delight of casino owners.

Additional random processes and our folk numeracy about them abound. The “law of small numbers,” for example, causes Hollywood studio executives to fire successful producers after a short run of box-office bombs, only to discover that the subsequent films under production during the producer’s reign became blockbusters after the firing. Athletes who appear on *Sports Illustrated*’s cover typically experience career downturns, not because of a jinx but because of the “regression to the mean,” where the exemplary performance that landed them on the cover is itself a low probability event that is difficult to repeat.

Extraordinary events do not always require extraordinary causes. Given enough time, they can happen by chance. Knowing this, Mlodinow says, “we can improve our skill at decision making and tame some of the biases that lead us to make poor judgments and poor choices … and we can learn to judge decisions by the spectrum of potential outcomes they might have produced rather than by the particular result that actually occurred.” Embrace the random. Find the pattern. Know the difference.

November 10th, 2008 at 5:04 pm

The fact that a door whose contents are unknown has acquired the property of being “chosen” does not change the door’s contents. When another door’s contents are revealed, that clearly changes the calculations, but it doesn’t change the facts. Once it is made clear that one of the two remaining doors hides the prize, what is also clear is that switching or not switching is a coin flip. If I stay, I have a fifty percent chance of winning. If I move, same. Imagine that NO door had been chosen before the one goat (or the eight goats) was revealed. What are my odds? The situation is exactly the same as the scenario in which I have already chosen, new information is revealed, and I am offered a chance to change my selection. Either way, I am making a coin-flip choice. The notion that my previous selection, made with less information, has any weight at all on the current circumstances is almost magical in its irrationality.

November 12th, 2008 at 1:06 am

hilarie,

Easiest way to see this is, imagine 10 doors, you pick no. 1, Monty then opens all other doors except no. 6. What’s so special about no. 6, you wonder? I thought no. 1 was a good choice, but Monty seems to think no. 6 is special–and he knows where the car is!

November 12th, 2008 at 3:53 am

sure barry – but michael is presenting the problem as mathematical (erroneously!) – your solution is psychological, and i think any poker player would tend to accept that and switch. good call!

November 12th, 2008 at 4:57 am

barry is correct-

Imagine a deck of cards face down. You guess which is the Ace of Spades. Your odds are 1 in 52. Now suppose Monty turns over all the remaining cards except one, without the Ace of Spades being revealed, leaving only your card and the other remaining one face down. The odds are still 1 in 52 that your card is the Ace, but 51 in 52 that the remaining face-down card is the Ace of Spades.

The three door version is harder to see because it looks as if it is a 50-50 chance.

November 12th, 2008 at 5:32 am

Actually, Hilarie, your first sentence is correct, but the rest of your reasoning does not follow. Revealing the contents of another door does *not* change the calculations.

Your original odds are still the same. When you chose the first door, you had a one-third chance of being correct, and a two-thirds chance of being incorrect. The opening of the additional door (because it is *not* random) has not changed those odds. You still have a one-third chance of being correct and a two-thirds chance of being incorrect in your initial choice.

Look at it this way. Initially, there are three situations: Car behind Door 1, Car behind Door 2, Car behind Door 3. Let’s call these “1”, “2”, and “3”. Further, let us assume that each situation is equally likely. Say you select Door 1. Then, each situation plays out as follows (remember, he knows where the car is and will always reveal a goat):

1. Monty opens one of the other doors (it doesn’t matter which one). You win if you stay, and lose if you switch.

2. Monty opens Door 3, showing you a goat. You lose if you stay, you win if you switch.

3. Monty opens Door 2, showing you a goat. You lose if you stay, you win if you switch.

In two of the three situations (which, you will recall, were equally likely), you win by switching. What is throwing you, and most other people, is that the door was *not* opened at random. Let us say we randomly chose which door to open (in this case, say it came up as Door 2), then the situations would play out thus:

1. Monty opens Door 2, showing you a goat. You win if you stay, and lose if you switch.

2. Monty opens Door 2, showing you the car. You lose if you stay, you lose if you switch (you have already lost).

3. Monty opens Door 2, showing you a goat. You lose if you stay, you win if you switch.

Now, in the two situations that you still have a choice, it is indeed a fifty-fifty proposition. The odds have changed because situation 2 has been eliminated. The whole thing is a bit counterintuitive, but it is correct.
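The contrast between a forced reveal and a random one can be checked numerically. Here is a small Python sketch (names and trial counts are my own) that throws away the games where the random opening exposes the car and tallies the rest:

```python
import random

def random_reveal(trials=300_000):
    """Monty opens one of the two unpicked doors at random.
    Returns (stay win rate, switch win rate) among the games
    where he happened to reveal a goat."""
    stay = switch = live = 0
    for _ in range(trials):
        car = random.randrange(3)
        pick = random.randrange(3)
        opened = random.choice([d for d in range(3) if d != pick])
        if opened == car:
            continue                    # car exposed; game voided
        live += 1
        remaining = 3 - pick - opened   # door labels 0+1+2 sum to 3
        stay += (pick == car)
        switch += (remaining == car)
    return stay / live, switch / live

print(random_reveal())   # roughly (0.5, 0.5)
```

With the reveal made random, the surviving games really are fifty-fifty, exactly as the case analysis above says; it is Monty's knowledge that makes the original game two-thirds.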

If you are ever in this situation, by the way, just remember that you cannot make your chances worse by switching ;)

November 12th, 2008 at 5:53 am

The above argument is nonsense. The odds of either the last card or the first card being the ace of spades are exactly the same: one in 52 if you have not turned up any other cards yet, or 1 in 2 (i.e., 50-50) if you have turned up all the cards already. It does not matter which card you chose first.

Choosing to stay or choosing to move both have the same odds of winning. You are still choosing between two random unknowns.

Hilarie is correct.

I hope Michael’s tongue is in his cheek when he claims that switching increases your odds of winning.

November 12th, 2008 at 6:37 am

Anthony,

Both Michael Shermer and Chris are correct.

You would be right, if all the other cards had been turned by chance, and none of them happened to be the ace of spades. In that case, both the first and the last card would have a 1/2 chance of being the ace of spades.

However, remember that the point is that the person is choosing which cards to turn; he knows which one is the ace of spades, and he would never choose that one. In this case, the probabilities are 1/52 for the first card, and the remaining probabilities converge to the last card, that is, 51/52. His choosing (which is not random) reveals that information.

November 12th, 2008 at 6:43 am

Michael’s tongue is not in his cheek.

I would leave it to the other readers to show that Anthony’s version is a straw man substitution for the stated problem.

November 12th, 2008 at 6:45 am

The key to the Monty Hall Problem is the fact that Monty ALWAYS reveals a goat and he NEVER reveals the door with the car. This changes the normal probabilistic nature of the problem. I don’t think Michael made this clear enough, but http://montyhallproblem.com/ does.

November 12th, 2008 at 6:47 am

I would add that the problem of so-called “folk” innumeracy is not our inability to deal with probability. It is that Monty Hall is cheating in order to make the game more interesting for advertisement-watching viewers. He is playing on the contestant’s admirable power to evaluate probability in small numbers without making it clear that the game is rigged.

November 12th, 2008 at 6:55 am

Anthony is correct. The point is that the assessment of the odds changed as each door was opened. 1 in 3 when no doors are revealed, 1 in 2 after one door is opened.

And with the deck of cards as well, the chances were 1 in 52 when no cards were revealed. But the odds changed because the choices/options changed as each card was turned and shown (and thereby were removed from the list of unknowns). Odds are not static, they are fluid. Can anybody really believe that when 50 of 52 cards are revealed that the chances of an uncovered ace of spades being revealed are still 1 in 52? We have known unknowns and unknown unknowns and….

R.

November 12th, 2008 at 7:18 am

I was confused by this problem for a long time. Here’s how it finally made sense to me. Stop thinking about the correct choice and think about the probability of being wrong. There are three choices, A, B, C. You choose A. The probability of the car being in B or C is 2/3. Now Monty opens C, because he knows it has a goat. The probability of a car in B or C is still 2/3, but now you know it isn’t C. The probability of a car in A is 1/3, same as always. The probability of B or C is still 2/3. The probability of a car in C is zero. The math becomes pretty easy at this point.

November 12th, 2008 at 7:23 am

I have read about this off and on for about 10 years. I still do not get it. While I accept some responsibility for my own muddleheadedness, I also suggest that the explainers have their own responsibility to make their answers clearer. Using phrases like “of course” is not an explanation. The 52-card “explanation” is the same as the 10-door “explanation”: no explanation at all to those of us who still do not get it. We await the better explainer (not to be confused with the “Intelligent Designer”).

November 12th, 2008 at 10:04 am

This is one of my favorite problems, and it’s hard to explain to people because (1) they make assumptions that aren’t true, and (2) there are mechanics of the real problem that aren’t explicitly stated.

The real crux of the problem is that Monty knows where the goats are and will always show you a goat, because there’s always at least 1 goat to be shown. His reveal is not random.

Here’s how I best was able to communicate it to the most ardent of deniers:

Okay, so you make your choice. At this point there are two unchosen doors left. After you pick your one door, if Monty automatically offered to trade you what’s behind your door for what’s behind both other doors put together, would you take it?

Of course; you know intuitively that that raises your win chance from 1/3 to 2/3.

What if I told you that at least one of those two doors has a goat behind it? Would that change your mind?

No, because you know there’s only one car. There has to be at least one goat over there. You’ll still take it.

*Why should it matter that I then show you the goat*?

That’s what the “trade” does. It gives you everything behind all the other doors — they just show you all (or all but one) of the bad choices in that big group with a better chance of winning.

Again, expand to the deck of cards. I shuffle and give you one card face down. If you end up with the Ace of Spades you win. You can keep your one card, or trade with the rest of the deck. Obviously, you want to switch. Does it matter that I then look at the big stack and show you 50 cards that AREN’T the ace of spades? You knew there were at least 50 already there.
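The deck version tallies the same way. A short sketch (assuming, as described, that the dealer always turns over 50 cards that are not the ace):

```python
import random

def keep_your_card(trials=100_000):
    """You are dealt one of 52 positions; keeping it wins only if
    it happens to be where the ace of spades sits."""
    wins = 0
    for _ in range(trials):
        ace = random.randrange(52)    # position of the ace of spades
        dealt = random.randrange(52)  # position of the card you keep
        wins += (dealt == ace)
    return wins / trials

print(keep_your_card())   # about 1/52, i.e. roughly 0.019
```

Trading for the rest of the deck wins the complementary 51/52 of the time, and the dealer's 50 reveals never change that.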

November 12th, 2008 at 10:08 am

btw, this is me above — different from the original Chris.

November 12th, 2008 at 11:59 am

Scott’s link, http://montyhallproblem.com/ explains this very well.

From that page, here is an exaggerated version of the problem:

Imagine that there were a million doors. Also, after you have chosen your door, Monty opens all but one of the remaining doors, showing you that they are “losers.” It’s obvious that your first choice is wildly unlikely to have been right. And isn’t it obvious that of the other 999,999 doors that you didn’t choose, the one that he didn’t open is wildly likely to be the one with the prize?

November 12th, 2008 at 12:15 pm

There are three doors; let’s call them A, B and C. The car will be behind A one-third of the time, behind B one-third of the time and behind C one-third of the time. Let’s say you play this game several times and always pick A. If Monty doesn’t open a door, you will be right one-third of the time. If Monty opens a door (either B or C) it doesn’t change the fact that the car will STILL be behind Door A only one-third of the time. It will not suddenly start appearing behind Door A 50% of the time because Monty opens a door. It was placed there before the game started and nothing Monty does will change that. If you stick with A, you will therefore be right only one-third of the time. If you switch, you will be right two-thirds of the time (since one-third = two-thirds = 1.)

November 12th, 2008 at 12:17 pm

Correction: The last line should read “since one-third + two-thirds = 1.’

November 12th, 2008 at 4:19 pm

Oregonians for Science and Reason (O4SR.org) demonstrates the Monty Hall Problem each year at the annual “da Vinci Days” expo in Corvallis. Hundreds of visitors are put through the problem and then record the results of their choices (by putting a filbert in one of four tubes corresponding to switch or stay, won or lost). As the photos in the following link clearly show, statistically you are more likely to win the prize if you switch when given the opportunity to choose between the final two boxes. http://explorepdx.com/dvd2005.html

November 12th, 2008 at 4:57 pm

Right up there with The Unexpected Hanging for guaranteeing tons of responses to a question. Along with, Do We Have Free Will?

November 12th, 2008 at 10:49 pm

Robert Neary brings up a good point: if all else fails, actually playing the game out is possibly the best way to be convinced, if not to actually understand.

Just remember to do enough iterations to tell the difference between 33%, 50%, and 67%, and keep track of them.

November 13th, 2008 at 10:23 am

It may be easier to think of it as a choice of sets, rather than of individual doors. You pick a door; call that set {a}. Monty Hall gives you the option to switch to the set of all other doors, {b, c}, and if the car is behind any one of the doors in the second set, you get the car. The fact that Monty opens all the goat doors in the set {b, c} simply makes explicit the fact that if you switch and the car is in the set {b, c}, you get the car. Since switching allows you to functionally pick more doors, your odds of winning go up. You have a 1/3 chance if you stay with set {a}, and a 2/3 chance if you pick two doors by going to set {b, c}.

If that makes some sense, but not completely, consider the same scenario, but with n doors, where n > 3. You pick a door, so your original set is {1}, with a 1/n chance of winning. Monty Hall then offers to let you switch to the set of {2…n}. Since the set you would be switching to contains all but the door that you picked, your chance of winning is (n-1)/n. Since (n-1)/n > 1/n, you are clearly better off switching.
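The (n-1)/n claim can be spot-checked in a few lines. A sketch (relying on the fact that once every goat door but one is opened, switching wins exactly when the first pick was wrong):

```python
import random

def switch_rate(n, trials=100_000):
    """n doors, one car: win rate for the always-switch policy."""
    wins = 0
    for _ in range(trials):
        car = random.randrange(n)
        pick = random.randrange(n)
        wins += (pick != car)   # switching wins iff the first pick missed
    return wins / trials

for n in (3, 10, 100):
    print(n, round(switch_rate(n), 3))   # approaches (n-1)/n
```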

November 13th, 2008 at 10:32 am

I already have a nice car.

I want the goat. : )

November 14th, 2008 at 8:58 am

There are many good ways to explain the correct answer to the Monty Hall Problem (always switch). However, if all else fails simply ask the naysayer to participate in a little experiment. 3 identical cups and a cotton ball should do the trick. Simple perform 100 trials where the player (naysayer) stands pat on the original choice and 100 trials of switching. The results will be obvious. Better yet, you can create a betting game (ie lay 2/1 odds that the switching strategy will be more successful over 100 trials). Sometimes when all else fails the brute force approach is best!

November 15th, 2008 at 7:46 am

Please listen to hilarie! The flaw in most responses is that the 1/3 odds when all 3 doors are closed change as soon as any door is opened. New data are presented. On opening one of the two non-chosen doors, the player’s odds change to either zero or 1/2. (If the opened door showed the prize, the odds become zero.) Many comments are just obfuscation.

Suppose a new player takes over for the old one when just two doors are closed, what are his odds? Suppose there are two players, one taking door A and one taking door C. Door B is then opened and shows no prize. Should both players switch to both increase their odds?

November 15th, 2008 at 2:08 pm

Maybe I can’t understand it, but I can show that you do indeed have about a two-thirds chance of winning if you change your mind!

I made a little program which chooses a random number and puts the prize behind door 1, 2, or 3. You can then choose your door, and Monty will ask you if you want to change your mind. In the end you will see if you won or not!

For statistics, that has to be done several times. So I’ve done that too :)

Next, the program will choose random numbers for the winning door and for the participant (so you don’t have to) and do that 100 times.

To convince you for good, it will repeat the same thing 5 times and print the results.

Here it is:

http://drop.io/montyhallproblem

From there download montyhall.exe

If some of you is interested in source code, download:

montyhall.c (you can open it in wordpad)

November 15th, 2008 at 6:55 pm

Better yet, try it yourself. There are several online applets like this one:

http://montyhallgame.shawnolson.net/

Pick a strategy, stay or switch, and run through 20 or 30 trials. Your win percentage will quickly converge to 33% or 66%, respectively. It’s not a proof, but it may convince some of the 50/50-ers to think harder about it.

November 16th, 2008 at 9:36 pm

The deck of cards suggestion is the easiest; the mental trap is leaving one card hidden.

Select 1 of the 52 cards. Now turn over ALL the other 51 cards. How often will you see the Ace of Spades? No numbers needed. Answer: most of the time. How can turning over 5, 10, 15, or any number of cards KNOWN to be NOT the Ace of Spades change anything?

November 17th, 2008 at 8:38 am

How about a trillion doors? You choose door one. Opening all doors but the chosen one and the last door shows no prize. Do you really believe that your odds for winning by a change to the last door are (trillion-1)/trillion (nearly certain) versus 1/trillion (virtually a loser) for not changing? THE KEY ISSUE IS WHEN A DOOR IS OPENED SHOWING NO PRIZE, YOUR ODDS CHANGE. Confirmation of this is when the Monty Hall middle door is opened and IF it CONTAINED THE PRIZE, what are your chances now? Zero, of course, because the new information changes your odds. Opening a door changes your odds. 1/3 NO LONGER APPLIES.

November 17th, 2008 at 11:07 am

Boris Says: “How about a trillion doors? You choose door one. Opening all doors but the chosen one and the last door shows no prize. Do you really believe that your odds for winning by a change to the last door are (trillion-1)/trillion (nearly certain) versus 1/trillion (virtually a loser) for not changing?”

Yes, I actually believe this because it is TRUE. However, you don’t need a trillion doors to prove this. As others have suggested, conduct the experiment yourself (or use an online version) with a few cards, cups, etc. and a willing partner.

November 17th, 2008 at 11:16 am

The challenge with this problem is a matter of perspective. If you were to walk into the room when only 2 doors remained unopened, without any prior knowledge of which door was originally picked, and were free to choose either one….then your odds are 50/50. But that is NOT the Monty Hall Problem. You have to commit to your door BEFORE any additional information (ie a goat) is revealed.

November 17th, 2008 at 11:33 am

frigo, the problem with your 50/50 comment (#31) is that the odds of where the prize is after one door is opened are independent of when a contestant comes into the room. The odds are 50/50 for Monty Hall’s contestant, and your late contestant’s odds are 50/50. THE ODDS CHANGE WHEN NEW INFORMATION IS AVAILABLE, believe it or don’t.

November 17th, 2008 at 3:19 pm

The thing that people are failing to realize is that when Monty opens an empty door it isn’t random. If the car was behind door 2, he would open door 3. If the car was behind door 3, he would open door 2. That is why the other door still has the 2/3 chance of being the correct one. Once the first door is chosen there are only so many possibilities: the car is behind 1, 2, or 3. If the person chooses 1 and the car is behind 1, then they lose by switching. If they choose 1 and the correct door is 2 or 3, then they win by switching. That doesn’t change. So the correct answer is to switch to maximize your odds of winning.

November 18th, 2008 at 7:22 am

When Monty Hall removes one of the doors from consideration, we now have a new problem: there are two doors and only one car. So the chance the car is behind either door is 50%. It matters not which door you selected previously.

November 18th, 2008 at 11:59 am

Well, Ted (and all the others who seem to think that switching has no effect at all on the odds), you should run through all the possible scenarios and then total up the odds. I’ll start it for you, since you seem to think you’ve figured it out already and might not want to give the effort.

There are three doors, a, b, and c. I will represent the winning door with a capital A, B, or C. There are three possible starting scenarios:

1) A b c

2) a B c

3) a b C

Now, for scenario one, {A, b, c}, the contestant gets one pick, let’s denote it with a *.

I) A* b c

II) A b* c

III) A b c*

Finally, we can compare the odds for staying versus switching.

For case I, staying wins, switching loses.

For case II, staying loses, switching wins.

For case III, staying loses, switching wins.

So, in summary of scenario 1, a policy of staying wins 1 out of 3 cases. A policy of switching wins 2 out of 3 cases. Repeat, staying wins 33% of the time, switching wins 66% of the time.

Before you respond, try it for the other scenarios. I can guarantee that the switching policy is effective 66% of the time in all 3 cases for both scenarios 2 and 3. If you can show that this isn’t the case, please explain.
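The challenge above can also be settled exhaustively rather than by argument. A deterministic sketch (door labels are arbitrary) that counts all nine equally likely (car, pick) pairs:

```python
from itertools import product

def all_cases():
    """Count stay wins and switch wins over every (car, pick) pair.
    Monty's goat reveal never moves the car, so switching wins in
    exactly the pairs where the first pick missed."""
    stay = switch = 0
    for car, pick in product(range(3), repeat=2):
        stay += (pick == car)
        switch += (pick != car)
    return stay, switch

print(all_cases())   # (3, 6): staying wins 3 of 9 cases, switching 6 of 9
```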

November 18th, 2008 at 12:27 pm

Switching is the correct way to go in the Monty Hall scenario. Think of it this way: if there are 3 doors and you pick one, you have a 33.33% chance of being correct. The other two doors combined have a 66.66% chance of being correct. Let’s say you could trade your door for the other two doors, without Monty showing you anything. You would obviously switch, and now you have two doors instead of one. You know there’s at least one goat behind your two doors. Does it matter if he shows you that goat now? This is really the same scenario as him showing you a goat and then asking you to switch. You are really picking two doors, at least one of which hides a goat. It’s a two-for-one deal even though it seems like a one-for-one deal.

November 19th, 2008 at 11:06 am

It’s 50/50. Try this. There are 3 doors. You can either pick door #1 or the other 2 doors (#2 and #3) as a GROUP! Your odds are either 1/3 or 2/3. You OBVIOUSLY pick the group of 2 doors. Then, surprise, Monty opens one of your 2 doors and no prize is displayed. Whether or not he knows where the prize is does not affect the location of the prize. You now have just door #3 of your chosen group. Should you change to door #1? Are the odds that you have the prize 2:1 or 50/50? If you cannot accept that the odds change when a door is opened and NEW information is now available to you, work with yesterday’s odds and stay confused. ANY analysis that does not change the odds after a door is opened will give you the erroneous 1/3-2/3 result.

November 19th, 2008 at 12:24 pm

If you change the rules (e.g. “you can pick door #1 or the other 2 doors (#2 and #3) as a group), then you are creating a different problem. This does not help you solve the original problem.

November 21st, 2008 at 6:34 pm

As some one who frequently finds meaningful patterns in meaningless noise, I do sometimes wonder why you write so diligently about this probability nonsense.

November 23rd, 2008 at 2:19 pm

While I can follow each side of this discussion and do tend to side with the change school of thought, albeit after initially agreeing with the 50/50 group, it seems to me the real crux of this problem, statistically, is what percentage of the time Monty used the goat and his associated influential verbiage to encourage a win versus a loss. Had everyone who switched won, or even 66% (which may have happened; one could review this), it seems this percentage would have become common knowledge, everyone would have switched, and the show would have spent a great deal on prizes (a small % of advertising revenues?); the show may have been profitable to be on but less exciting to watch. Ultimately, Monty was a televised “carney” and, like real carnies, or the Las Vegas version, he knew how to tip the odds in his favor.

November 23rd, 2008 at 2:22 pm

Whoops. That was Monty’s influential verbiage, not the goat’s.

November 25th, 2008 at 10:28 am

Variant problem: Two brothers, Tom and Dick, are allowed to play at once. They each must pick a different door. Monty opens Dick’s door, shows a goat, and says “Sorry. Thanks for playing.” Then Monty offers Tom a chance to stay with his pick or switch to the remaining door. What should he do?

Answer: Tom should stand pat.

This is an exact dual of the original problem. But even when you understand the original, this can be surprising.
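This two-player variant is easy to simulate. A sketch (I put Tom on door 0 and Dick on door 1; the labels are arbitrary) that keeps only the games where Dick's door hid a goat:

```python
import random

def tom_and_dick(trials=300_000):
    """Tom holds door 0, Dick holds door 1; Monty opens Dick's door.
    Returns (stay win rate, switch win rate) for Tom over the games
    where Dick's door turned out to hide a goat."""
    stay = switch = live = 0
    for _ in range(trials):
        car = random.randrange(3)
        if car == 1:
            continue          # Dick had the car; the variant never shows this
        live += 1
        stay += (car == 0)    # Tom stays with door 0
        switch += (car == 2)  # Tom switches to door 2
    return stay / live, switch / live

print(tom_and_dick())   # roughly (0.5, 0.5)
```

Both rates come out near one-half, so staying and switching are a wash here: Dick's pick, unlike Monty's reveal in the original, carried no knowledge of where the car is.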

November 26th, 2008 at 11:37 am

Boris, to be astounded by this is normal, but to persist in denying that switching in the original scenario is the better strategy, even after you can read many concrete proofs (some even on this very page), is foolishness.

Mike, you said that your variant “is an exact dual of the original problem.” In your variant, though, you’re positing that Dick always picks a goat. This removes exactly 1/3 of the choices from the table, hence leaving a scenario where switching is a break-even.

December 11th, 2008 at 9:58 am

I’m astounded that despite written and visual explanations (like the Oregon tube photos which are a slam dunk!) people cannot accept that their intuition, like mine, was wrong.

Which, of course, is the central theme of Michael’s argument.

December 18th, 2008 at 4:52 pm

I had to go thru all the off site notes and still had to do my own thinking to get this figured out.

The clue is look at the process and see what is happening with the choices.

The player has 1 in 3 chance of initially choosing the car.

Now, put yourself in Monty’s shoes: Monty will show you a goat 100% of the time, but he chooses from either two goats (when your pick was right) or one goat and one car (when your pick was wrong). Thus two-thirds of the time his choice was forced “away” from the car.

Because two-thirds of the time his choice was forced away from the car, a switch ends up in your favor.

December 26th, 2008 at 8:25 pm

Steve,

#12. Thank you. That’s the easiest way to think about it.

December 26th, 2008 at 9:06 pm

If I may offer yet another explanation, the key isn’t that Monty shows you a goat, it’s that he offers you a chance to switch.

If one picks door #1 and thinks of doors 2 & 3 as a group, then he knows the odds of the car being behind 2 or 3 are 2/3, while the odds of the car being behind door 1 are 1/3.

This remains true, even after the goat is revealed behind door 3.

Basically Monty is simply offering a chance to switch from your initial pick (1/3 chance) to the other group consisting of both doors (2/3 chance).

Of course you can’t actually trade for both doors, but if you think of it that way -as trading door 1 for doors 2&3, then you’re always being given a chance to change your odds for the better.

December 28th, 2008 at 1:25 am

Suppose two people are watching Let’s Make A Deal at home, playing along. One person picks door #1. The other person picks door #3. Monty reveals a goat behind door #2. When the contestant on TV is offered a chance to trade, the people playing along at home may also trade – even though they may have picked different doors than the contestant on TV. At this point, one of the people playing at home is going to win, and one is going to lose, but they both increase their chances of winning by making the trade. Now that’s counterintuitive!

January 4th, 2009 at 4:03 pm

I still don’t get it …

Since a goat was shown behind door 2, the prize must be behind (1) or (3). Why should I change my mind about which door I chose? If the host always shows one of the losing doors and asks the player if they want to switch, the player may have the prize door and lose by switching, or have a goat and lose by staying. (But hey, personally I wouldn’t mind having a goat to milk or cook.) So with the evidence presented, I just can’t see how the odds of winning are actually improved by changing the decision.

The way I see it:

- a door was chosen with a 1-in-3 chance of winning.

- a losing door was shown; had you known that door was a loser beforehand, your odds would have been 50/50.

- so you now know one losing door – big deal – you may or may not have the other losing door – *how* does knowing where one losing door is improve your chances of winning over the original 1-in-3?

I can see how the scheme may work if you had 10 doors and the host always showed you all but two doors (the one you chose plus one other). In that case it would make sense to switch doors. However, with a mere 3 doors I just can’t figure out the advantage in the switch.

Has anyone written a small computer program and played the game a few hundred times to verify this claim that the overall probability goes from 1/3 to 2/3 if you switch?
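Such a program is easy to write. Here is a minimal Monte Carlo sketch (in Python, since no language was specified) that plays the full game, goat reveal and all:

```python
import random

def play_round(switch):
    """Play one round of the Monty Hall game; return True on a win."""
    doors = [0, 1, 2]
    car = random.choice(doors)      # door hiding the car
    pick = random.choice(doors)     # contestant's initial choice
    # Monty opens a door that is neither the pick nor the car.
    monty = random.choice([d for d in doors if d != pick and d != car])
    if switch:
        # Switch to the one remaining closed door.
        pick = next(d for d in doors if d != pick and d != monty)
    return pick == car

trials = 100_000
stay = sum(play_round(False) for _ in range(trials)) / trials
switch = sum(play_round(True) for _ in range(trials)) / trials
print(f"stay:   {stay:.3f}")    # comes out close to 1/3
print(f"switch: {switch:.3f}")  # comes out close to 2/3
```

Run it a few hundred thousand times and the stay rate settles near 0.333 while the switch rate settles near 0.667, just as claimed.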

January 4th, 2009 at 4:14 pm

Ah, OK.

Thanks to Chris for providing a better explanation. I think I’ll go play a game and amuse myself with the results anyway. I’ve always struggled with assigning probabilities to anything other than a coin toss.

February 23rd, 2009 at 10:30 am

I teach high school math. Here’s the explanation that best helps my students understand that there is a 2/3 chance of winning by switching.

The probability of picking the right door at the beginning is 1/3. In other words, most of the time, you’d lose this game if it ended here. Keep that in mind: most likely, you picked a loser. Well, if you most likely picked a loser, why not start with a different door? Because no matter which door we choose, it’s most likely a loser. Remember: YOUR ORIGINAL CHOICE IS MOST LIKELY WRONG. Now, once one of the other doors is shown to be a loser, and you know YOUR ORIGINAL CHOICE IS MOST LIKELY WRONG, the other remaining door is most likely the winner.

August 15th, 2009 at 10:48 am

The Monty Hall problem is not as complicated as you and others make it. The following reasoning refutes the answer of 2/3 that you give.

Clearly, at the beginning the contestant has a 1/3 chance of picking the right door when he picks #1. And when Monty Hall shows him that a goat is behind door #2 he has in effect changed the probability space from 3 to 2 possibilities. So when presented a chance to change his choice the contestant has in effect started over, i.e. it’s as though there was never a door #2. He now has 2 possible choices so he clearly has improved his chances of picking the right door from 1/3 to 1/2.

As is often the case according to Occam’s razor, the simplest solution/explanation is usually the right one. Many things in life and mathematics are counterintuitive but this is not one of them.

February 25th, 2010 at 8:53 am

I finally got it. I looked at it as a contest between my original card and the rest of the deck. No matter what I choose, I am a 51 to 1 underdog. The “rest of the deck” will beat me 51 times out of 52. Once Monty the dealer gets rid of 50 certain losers, the remaining card will win 51 times out of 52. Switch!
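Under the rules described here (the dealer always discards 50 known losers), trading your card wins whenever your first pick was not the winner, so a simulation can fold the discard step into a single comparison. A minimal Python sketch of this 52-card version:

```python
import random

# 52-card Monty Hall: pick one card, the dealer removes 50 certain
# losers from the rest of the deck, and you may trade for the one
# card left over. That leftover card is the winner unless your
# original pick already was.
def card_game(switch):
    winner = random.randrange(52)   # position of the winning card
    pick = random.randrange(52)     # your initial pick
    if switch:
        return pick != winner       # the leftover card wins 51/52 of the time
    return pick == winner           # keeping your card wins 1/52 of the time

trials = 100_000
switch_rate = sum(card_game(True) for _ in range(trials)) / trials
# switch_rate comes out near 51/52, i.e. about 0.98
```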

I have to admit that it was very hard for me to accept this at the three-door level, but unless you picked the right door (only once every three attempts), Monty basically tells you where it is! So this is how I inferred that Monty tells you where the new car is two out of three times, giving legitimacy to Dr. Shermer’s argument.

February 25th, 2010 at 9:07 am

“Suppose there are two players, one taking door A and one taking door C. Door B is then opened and shows no prize. Should both players switch to both increase their odds?”

When there are two players, they both have a 50/50 chance right from the start. They pick, Monty reveals a goat, and someone wins.

Perhaps what makes all this so hard to fathom is the fact that your odds don’t really “change”. Your odds of winning are based on the rules of the game, not so much your initial guess, which means nothing, since you have to choose again after seeing a goat. What I mean is that your odds of winning on the first try are zero. You may pick the right door, but you haven’t won the prize yet, because the game is not over.

September 29th, 2011 at 7:29 am

How the BELIEF BIAS works in this case, is “by making counterintuitive” the fact that BEFORE you made your choice each DOOR had a 33% possibility of having a car, but AFTER you made your choice, and EVEN BEFORE THE HOST SHOWS A GOAT, the possibility of you door having a Car increased to 50%, as it will play against the REMAINING door, BECAUSE you know one of the other two doors will have a Goat and will be eliminated.

Michael Shermer’s article is in itself a demonstration of how easy it is to fall into a Belief Bias…

Let’s list the possibilities:

Alternative 1, first choice A, C shows a goat, A has the car.

Door A / Door B / Door C

Car / Goat / Goat n=1 (50%)

Goat / Car / Goat n=2 (50%)

Goat / Goat / Car n=3 (0%, we know C doesn’t have a car)

Not switching wins. Switching loses.

Alternative 2, first choice A, C shows a goat, B has the car.

Door A / Door B / Door C

Car / Goat / Goat n=1 (50%)

Goat / Car / Goat n=2 (50%)

Goat / Goat / Car n=3 (0%, we know C doesn’t have a car)

Switching wins. Not switching loses.

Alternative 3, first choice A, B shows a goat, A has the car

Door A / Door B / Door C

Car / Goat / Goat n=1 (50%)

Goat / Car / Goat n=2 (0%, we know B doesn’t have a car)

Goat / Goat / Car n=3 (50%)

Not switching wins. Switching loses.

Alternative 4, first choice A, B shows a goat, C has the car

Door A / Door B / Door C

Car / Goat / Goat n=1 (50%)

Goat / Car / Goat n=2 (0%, we know B doesn’t have a car)

Goat / Goat / Car n=3 (50%)

Switching wins. Not switching loses.

Alternative 5, first choice B, C shows a goat, B has the car.

Door A / Door B / Door C

Car / Goat / Goat n=1 (50%)

Goat / Car / Goat n=2 (50%)

Goat / Goat / Car n=3 (0%, we know C doesn’t have a car)

Not switching wins. Switching loses.

Alternative 6, first choice B, C shows a goat, A has the car.

Door A / Door B / Door C

Car / Goat / Goat n=1 (50%)

Goat / Car / Goat n=2 (50%)

Goat / Goat / Car n=3 (0%, we know C doesn’t have a car)

Switching wins. Not switching loses.

Alternative 7, first choice B, A shows a goat, B has the car

Door A / Door B / Door C

Car / Goat / Goat n=1 (0%, we know A doesn’t have a car)

Goat / Car / Goat n=2 (50%)

Goat / Goat / Car n=3 (50%)

Not switching wins. Switching loses.

Alternative 8, first choice B, A shows a goat, C has the car

Door A / Door B / Door C

Car / Goat / Goat n=1 (0%, we know A doesn’t have a car)

Goat / Car / Goat n=2 (50%)

Goat / Goat / Car n=3 (50%)

Switching wins. Not switching loses.

Alternative 9, first choice C, B shows a goat, C has the car.

Door A / Door B / Door C

Car / Goat / Goat n=1 (50%)

Goat / Car / Goat n=2 (0%, we know B doesn’t have a car)

Goat / Goat / Car n=3 (50%)

Not switching wins. Switching loses.

Alternative 10, first choice C, B shows a goat, A has the car.

Door A / Door B / Door C

Car / Goat / Goat n=1 (50%)

Goat / Car / Goat n=2 (0%, we know B doesn’t have a car)

Goat / Goat / Car n=3 (50%)

Switching wins. Not switching loses.

Alternative 11, first choice C, A shows a goat, C has the car

Door A / Door B / Door C

Car / Goat / Goat n=1 (0%, we know A doesn’t have a car)

Goat / Car / Goat n=2 (50%)

Goat / Goat / Car n=3 (50%)

Not switching wins. Switching loses.

Alternative 12, first choice C, A shows a goat, B has the car

Door A / Door B / Door C

Car / Goat / Goat n=1 (0%, we know A doesn’t have a car)

Goat / Car / Goat n=2 (50%)

Goat / Goat / Car n=3 (50%)

Switching wins. Not switching loses.

Not switching wins alternatives 1, 3, 5, 7, 9, and 11 = 6 out of 12. Not switching loses alternatives 2, 4, 6, 8, 10, and 12 = 6 out of 12.

50/50

The result is the same for any number of doors with only one car (see simulator)… Can you find more alternatives? If the percentages I left showing throw you off, forget them, and look at the possible configurations; there aren’t others, so there are only twelve alternatives and not switching wins in 6 of them. In four of them the car is in A, in another four it is in B, and in the remaining four you find the car in C. Conversely, you chose A 4 times, B 4 times, and C 4 times. You will find that this arithmetic cannot be knocked down by any objection you might try.

Not switching has 6 wins of 12 alternatives, making it 50/50.

Now, why would you believe that not switching would only win in 4 alternatives and lose in the other eight? Belief bias, that’s why.

September 29th, 2011 at 9:08 am

Final mathematical explanation: The probability that I had chosen the door (one of n doors) with the car changes (increases) with every losing door that is opened: it starts as 1/n; one door opens and it goes to 1/(n-1); a second door opens and it changes to 1/(n-2); and so on, until it becomes 1/(n-[n-2]) = 1/2.

The minor premise is that only losing doors are opened. The major premise is that there is only one door with a car behind it. The mechanism is that (n-2) doors will be opened. The alternatives that we will find after the offer to switch is made are 2xn! for our n choices… but the result is always that half of them will make not switching a loser, while the other half will make not switching a winner. Properties of series affecting our statistical belief… LOL

Irrational belief? I think that Shermer would have something to say about it… ;-)

October 11th, 2012 at 3:28 am

“..the fact that BEFORE you made your choice each DOOR had a 33% possibility of having a car, but AFTER you made your choice, and EVEN BEFORE THE HOST SHOWS A GOAT, the possibility of you door having a Car increased to 50%”

This statement has got to be one of the most ridiculous I’ve read; it shows a complete ignorance of probability theory. After that I couldn’t bring myself to read the rest of Frank’s comment.
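The flaw in the twelve-alternative table is easy to pin down: the alternatives are not equally likely, because Monty’s hand is forced whenever the first pick is wrong, while he splits his chances between two doors whenever it is right. A minimal Python sketch that enumerates every (pick, car, opened door) outcome with its exact probability:

```python
from fractions import Fraction

# Enumerate every (pick, car, opened door) outcome with its exact
# weight. When the first pick is right, Monty can open either of two
# doors (each half the weight); when it is wrong, his door is forced.
switch_wins = Fraction(0)
total = Fraction(0)
for pick in range(3):
    for car in range(3):
        p_config = Fraction(1, 9)   # uniform pick and car placement
        options = [d for d in range(3) if d != pick and d != car]
        for monty in options:
            p = p_config / len(options)   # Monty's choice splits the weight
            total += p
            if pick != car:               # switching lands on the car
                switch_wins += p
print(switch_wins)  # 2/3
print(total)        # 1
```

Counting the twelve rows as equal weights is exactly what smuggles in the 50/50 answer; weighted correctly, switching wins 2/3 of the time.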