The Monty Hall problem is a counter-intuitive statistics puzzle:

- There are 3 doors, behind which are two goats and a car.
- You pick a door (call it door A). You’re hoping for the car of course.
- Monty Hall, the game show host, examines the other doors (B & C) and always opens one of them with a goat. (If both doors have goats, he picks one at random.)

Here’s the game: Do you stick with door A (original guess) or switch to the other unopened door? Does it matter?

Surprisingly, the odds aren’t 50-50. If you switch doors you’ll win 2/3 of the time!

Today let’s get an intuition for *why* a simple game could be so baffling. The game is really about re-evaluating your decisions as new information emerges.


## Play the game

You’re probably muttering that two doors mean it’s a 50-50 chance. Ok bub, let’s play the game:

Try playing the game 50 times, using a “pick and hold” strategy. Just pick door 1 (or 2, or 3) and keep clicking. Click click click. Look at your percent win rate. You’ll see it settle around 1/3.

Now reset and play it 20 times, using a “pick and switch” approach. Pick a door, Monty reveals a goat (grey door), and you switch to the other. Look at your win rate. Is it above 50%? Is it closer to 60%? To 66%?

There’s a chance the pick-and-hold strategy does decently over a small number of trials (under 20 or so). If you had a coin, how many flips would you need to convince yourself it was fair? You might get 2 heads in a row and think it was rigged. Just play the game a few dozen times to even it out and reduce the noise.
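If you’d rather not click a few dozen times, the two strategies can be compared with a quick Monte Carlo run. This is a minimal sketch (the function name and structure are mine, not from the simulator on the page):

```python
import random

def play(switch, n_doors=3, trials=10_000):
    """Simulate Monty Hall games and return the win rate for one strategy."""
    wins = 0
    for _ in range(trials):
        car = random.randrange(n_doors)
        pick = random.randrange(n_doors)
        # Doors that are neither your pick nor the car: the goats Monty can open.
        goats = [d for d in range(n_doors) if d != pick and d != car]
        # Monty opens all but one of the other doors; the one left closed is
        # the car (if you missed it) or a random goat (if you picked the car).
        other = car if car != pick else random.choice(goats)
        final = other if switch else pick
        wins += (final == car)
    return wins / trials

print(f"pick and hold:   {play(switch=False):.2f}")  # settles near 1/3
print(f"pick and switch: {play(switch=True):.2f}")   # settles near 2/3
```

With 10,000 trials the noise is small enough that the 1/3 vs 2/3 gap is unmistakable.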

## Understanding Why Switching Works

That’s the hard (but convincing) way of realizing switching works. Here’s an easier way:

**If I pick a door and hold, I have a 1/3 chance of winning.**

My first guess is 1 in 3 — there are 3 random options, right?

If I rigidly stick with my first choice no matter what, I can’t improve my chances. Monty could add 50 doors, blow the other ones up, do a voodoo rain dance — it doesn’t matter. The best I can do with my original choice is 1 in 3. The other door must have the rest of the chances, or 2/3.

The explanation may make sense, but it doesn’t explain *why* the odds “get better” on the other side. (Several readers have left their own explanations in the comments — try them out if the 1/3 stay vs 2/3 switch doesn’t click).

## Understanding The Game Filter

Let’s see why removing doors makes switching attractive. Instead of the regular game, imagine this variant:

- There are 100 doors to pick from in the beginning
- You pick one door
- Monty looks at the 99 others, finds the goats, and opens all but 1

Do you stick with your original door (1/100), or the other door, which was filtered from 99? (Try this in the simulator game; use 10 doors instead of 100).

It’s a bit clearer: Monty is taking a set of 99 choices and *improving* them by removing 98 goats. When he’s done, he has the top door out of 99 for you to pick.

Your decision: Do you want a *random* door out of 100 (initial guess) or the *best* door out of 99? Said another way, do you want 1 random chance or the best of 99 random chances?

We’re starting to see why Monty’s actions help us. He’s letting us choose between a generic, random choice and a *curated, filtered* choice. Filtered is better.
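The 100-door filter can be simulated the same way as the 3-door game. Here’s a small sketch (my own function name; always-switch is hard-coded since that’s the strategy under test):

```python
import random

def filtered_game(n_doors=100, trials=10_000):
    """Win rate when you always switch to the one door Monty leaves closed."""
    wins = 0
    for _ in range(trials):
        car = random.randrange(n_doors)
        pick = random.randrange(n_doors)
        # Monty opens 98 goat doors among the 99 you didn't pick. The "champ"
        # door left closed is the car, or a random goat if you picked the car.
        goats = [d for d in range(n_doors) if d != pick and d != car]
        champ = car if car != pick else random.choice(goats)
        wins += (champ == car)
    return wins / trials

print(f"always switch, 100 doors: {filtered_game():.2f}")  # near 0.99
```

Switching only loses when your initial 1/100 guess happened to be the car.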

But… but… shouldn’t two choices mean a 50-50 chance?

## Overcoming Our Misconceptions

Assuming that “two choices means 50-50 chances” is our biggest hurdle.

Yes, two choices are equally likely when you know *nothing* about either choice. If I picked two random Japanese pitchers and asked “Who is ranked higher?” you’d have no guess. You pick the name that sounds cooler, and 50-50 is the best you can do. You know nothing about the situation.

Now, let’s say Pitcher A is a rookie, never been tested, and Pitcher B won the “Most Valuable Player” award the last 10 years in a row. Would this change your guess? Sure thing: you’ll pick Pitcher B (with near-certainty). Your uninformed friend would still call it a 50-50 situation.

Information matters.

## The more you know…

Here’s the general idea: **The more you know, the better your decision.**

With the Japanese baseball players, you know more than your friend and have better chances. Yes, yes, there’s a *chance* the new rookie is the best player in the league, but we’re talking *probabilities* here. The more you test the old standard, the less likely the new choice beats it.

This is what happens with the 100 door game. Your first pick is a random door (1/100) and your other choice is the champion that beat out 99 other doors (aka the MVP of the league). The odds are the champ is better than the new door, too.

## Visualizing the probability cloud

Here’s how I visualize the filtering process. At the start, every door has an equal chance — I imagine a pale green cloud, evenly distributed among all the doors.

As Monty starts removing the bad candidates (in the 99 you didn’t pick), he “pushes” the cloud away from the bad doors to the good ones on that side. On and on it goes — and the remaining doors get a brighter green cloud.

After all the filtering, there’s your original door (still with a pale green cloud) and the “Champ Door” glowing nuclear green, holding the combined probability of the 99 doors you didn’t pick.

Here’s the key: Monty does not try to improve your door!

He is purposefully *not* examining your door and trying to get rid of the goats there. No, he is only “pulling the weeds” out of the neighbor’s lawn, not yours.

## Generalizing the game

The general principle is to re-evaluate probabilities as new information is added. For example:

A Bayesian Filter improves as it gets more information about whether messages are spam or not. You don’t want to stay static with your initial training set of data.
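The Bayes’ rule update behind such a filter fits in a few lines. This is an illustrative sketch, not a real spam filter; the word likelihoods below are made-up numbers, not measured rates:

```python
def bayes_update(prior, p_word_given_spam, p_word_given_ham):
    """Posterior P(spam | word seen) via Bayes' rule."""
    num = p_word_given_spam * prior
    den = num + p_word_given_ham * (1 - prior)
    return num / den

# Hypothetical likelihoods: "FREE" appears in 60% of spam, 5% of ham.
p = 0.5                       # start agnostic, like the untrained filter
for _ in range(3):            # see the word in three successive messages
    p = bayes_update(p, 0.6, 0.05)
print(round(p, 4))            # confidence climbs well past 0.99
```

Each new message re-evaluates the earlier estimate, just as Monty’s reveal should re-evaluate your door.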

Evaluating theories. Without any evidence, two theories are equally likely. As you gather additional evidence (and run more trials) you increase your confidence that theory A or B is correct. One aspect of statistics is determining “how much” information is needed to have confidence in a theory.

These are general cases, but the message is clear: more information means you re-evaluate your choices. The fatal flaw of the Monty Hall paradox is *not taking Monty’s filtering into account*, thinking the chances are the same before and after he filters the other doors.

## Summary

Here are the key points to understanding the Monty Hall puzzle:

- Two choices are 50-50 when you know nothing about them
- Monty helps us by “filtering” the bad choices on the other side. It’s a choice between a random guess and the “Champ door” that’s the best on the other side.
- In general, more information means you re-evaluate your choices.

The fatal flaw in the Monty Hall paradox is not taking Monty’s filtering into account, thinking the chances are the same before and after. But the goal isn’t to understand this puzzle — it’s to realize how subsequent actions & information challenge previous decisions. Happy math.

## Appendix

Let’s think about other scenarios to cement our understanding:

**Your buddy makes a guess**

Suppose your friend walks into the game after you’ve picked a door and Monty has revealed a goat — but he *doesn’t* know the reasoning that Monty used.

He sees two doors and is told to pick one: he has a 50-50 chance! He doesn’t know why one door or the other should be better (but you do). The main confusion is that we think we’re like our buddy — we forget (or don’t realize) the impact of Monty’s filtering.

**Monty goes wild**

Monty reveals the goat, and then has a seizure. He closes the door and mixes all the prizes, including your door. Does switching help?

No. Monty started to filter but never completed it — you have 3 random choices, just like in the beginning.

**Multiple Monty**

Monty gives you 6 doors: you pick 1, and he divides the 5 others into a group of 2 and 3. He then removes goats until each group has 1 door remaining. What do you switch to?

The group that originally had 3. It has 3 doors “collapsed” into 1, for a 3/6 = 50% chance. Your original guess keeps its 1/6 (≈17%), and the group that had 2 has a 2/6 = 33% chance of being right.

## Comments

1082 Comments on "Understanding the Monty Hall Problem"

You could explain it like this too:

If you stay with the door you picked initially, you succeed if the initial door has a car, which has a chance of 1/3. If your strategy is to switch, then you succeed if your initial pick is a goat, which has a chance of 2/3.

These two sentences were much more clear than pages upon pages of other verbose and intricate explanations, thank you. E.B. White would be proud.

All that I just read and although interesting still left me puzzled, this explanation was a lot clearer and removed the confusion, thanks.

Welcome back!

I actually blogged about this a while back:

http://blog.amhill.net/2009/07/24/from-the-archives-monty-hall/

And one of my uncles was dead-sure that I was wrong about it, so I wrote up an application (source code included) to show the percentage differences between switching and not switching. It’s linked to from that blog post above.

I like your flash version though!

I never understood this before, but as I read your explanations, the following came to mind:

The doors can be divided into two groups; the one you picked, and the ones you didn’t pick. When you pick one out of 100 doors, chances are very high that the prize is in the group you didn’t pick, you just don’t know which of those 99 doors has the car.

Luckily for you, Monty narrows it down for you by opening all but one door. The odds of your selected door having the prize haven’t changed at all. You just know a lot more about the doors you didn’t pick.

Great article – I especially like the visualization part with the green clouds. Sometimes I think the hardest part with mathematical concepts is being able to visualize them.

@uwe: Nice, I like seeing the win and lose probabilities next to each other like that!

@Aaron: Cool, I like the automated way to go through hundreds of trials :).

@Gary: Yes, that’s exactly it. I need to think of a good concise way to get that point across — something along the line of “getting rid of the weeds in the neighbor’s garden”, i.e. improving the choices in the items you *didn’t* pick. Great observation.

@CL: Thanks, and I totally agree. I often end up with some mental picture of what’s happening when I think about math, I wish we’d all share what we “really” think about when solving a problem!


Yes, great post. After I had time to think about this problem a bit, I thought about it similar to Aaron/uwe. But I use slightly different wording: 1st, I pick a door (1 out of 3). Then, I’m given the opportunity to either 1) keep my pick or 2) choose the other 2 doors (i.e., knowing that Monty would take one away for me later so to speak). Thanks for the articles, they are generally my favorite in the blogosphere.

Another great post. When I first encountered this problem in college it nearly drove me crazy for the first couple of minutes until my lecturer simply said.. imagine it’s 1 million doors. It immediately became clear.

I like to use the Monty Hall problem when watching ‘Who wants to be a Millionaire?’ When a question is asked and I don’t know the answer, I pick one at random, and hope the actual contestant goes 50/50.

If they do, and my randomly chosen answer is still available, the correct answer more often than not is the other one.

Funny, I came across this problem a couple of years ago and never *intuitively* understood it; I was all over the internet for a better explanation. Now, I just read the problem statement (not the entire reasoning) and I immediately understood (intuitively) that switching the door is the way to go, and I’m wondering why I got so confused back then.

The way I visualize this problem is with an adversarial scenario. First I imagine it’s 100 doors, then I think of it this way:

Imagine you’re playing AGAINST Monty Hall for a car. You each must pick a door. You go first. You are forced to pick your door at random.

Monty Hall, however, has the luxury of looking at every door other than yours, and not only can but MUST pick the car door if he finds it.

He has a wide grin on his face (99 out of 100 times) and is already celebrating his new acquisition when all of a sudden you are offered to trade places with Monty and get to pick his door! Do you switch then? I think so!

@lewikee: Awesome, I like that formulation! Monty is basically picking the best door he can out of the remaining choices. Very nice ;).

I tend to a view that a two-in-three approach is clearest, combined with the fact that the host KNOWS which is the prize window, but will never reveal it.

If there are two windows NOT the prize, but there is only one window THE PRIZE, then the first guess is two-in-three likely NOT to be the prize.

If the first guess is NOT the prize, then the host, who knows which IS the prize, but will never reveal it, has only one option of window to reveal, the second window ALSO not the prize.

If you switch, you win.

That is two-in-three likely.

On one occasion in three, the first guess will BE the prize, and then switching will lose.

You are twice as likely, but not guaranteed, to win if you switch.

If you stick, you have straight one-in-three chance of landing the prize.

That doesn’t make any sense. If that is a part of the game, the host revealing one of them for you, how is this exactly improving your chances by switching on over?

So if two people play the game at the same time: one starts with door number 1, the other with door number 3. Monty opens door number 2 and reveals a goat. Both players now go on and swap their picks. Do both have a higher chance to win? The odds change on paper because one variable has been taken out of the equation, but in reality it changes nothing.

You are confusing results with probability. Yes, they both would be increasing their odds of winning, but you don’t think so because one person will still get the goat. Think of your same problem with 100 doors. They each pick one and then the other 97 doors are opened. Should they both switch? Yes, as it increases their odds.

You are also leaving out the situation where each person picks a door with a goat, so Monty can’t open the remaining door. If they both realize this, then they would both win, which shows a situation where both people’s odds were increased and both win.

You should read the problem carefully before providing such flawed opinions. The original problem clearly states that only after you choose a door does Monty examine the two remaining doors knowing he is sure to find at least one goat for him to reveal to you. From your statement, if two people play at the same time and both of them choose different doors, Monty could not examine but one door, making it possible for him to find a car in there and not be able to open the door, letting you both know where the car is. So no, your example is way off base.

If you have a 2/3 chance of choosing the wrong door, then you probably did. When he gets rid of one wrong door, you still most likely have a wrong door, so with no more wrong doors left to pick, you should switch to the probably-right one.

@ktr: Glad you liked it! Yes, I like thinking of it as “Pick 1 door or the best of the other 2”.

@Mike: Wow, that’s a really clever use of the paradox! I wasn’t sure if it had a ‘real-world’ use :).

Another way to think about it: If you pick a random answer (A B C D) you have a 3/4 choice of being wrong. If it’s still left after the 50-50 elimination, more often than not switching to the other choice will work out (since most of the time, your random guess is just there to be the wrong answer). Quite often your random guess will be eliminated, but if it remains it’s a good bet to go the other way. Neat!

@Srikanth: Cool, glad it clicked! Yes, sometimes it takes a second reading to have it snap into place.

Nicely analysed and explained.

Ridiculously complicated. The probability the car is not behind ‘my’ door is 2/3. And now there’s only one door to pick. Change doors!

Interesting and crazy. I’ve read it twice and still don’t get it.

@3rojka: Thanks!

@Seldon: The 1-line explanation explains the “how” mechanics, but not the “why”. If Monty randomly revealed a door and it was a goat (i.e, he wasn’t trying to look at the other two doors and pick the best, he just randomly opened one for some reason), you’d still have an equal chance of being wrong and it wouldn’t give an _advantage_ to switch. So, the secret is in the process Monty uses to reveal, not just the fact that he leaves you with 1 other door.

@geld: It is a tricky problem. You might try playing the game several times (in real life with coins under cups, 2 pennies [goats] and a quarter [car]). You’ll eventually see that your initial guess is right only 1/3 of the time.

I think this is overly wordy. What helped me is:

if you picked the door with a goat, the other doors have a car and a goat. Monty then CAN’T OPEN the door with the car because that would ruin the game.

the probability that you picked the right door out of 3 is 1/3. so you have a 2/3 of having picked the WRONG door, at which juncture, Monty opens the door with the goat. that means the other door has a 2/3 chance, and your door has a 1/3 chance.

@thegnu: Thanks for the comment. Yep, the goal of the article isn’t to just understand why switching works. As you mention, it can be done in a paragraph.

The more interesting principle is seeing the role of the filter itself, so you can handle alternative scenarios also (like your buddy playing, Monty mixing the doors again, Monty giving you 2 sets of doors to pick from).

And from there, we can see that these filters (new information that impacts earlier choices) exist all over the place in real life, from spam filters to analyzing experimental evidence. Hope this helps!

I still don’t get it and disagree. You have no extra information? It’s not a more informed decision. Maybe I’m wrong, but this seems exactly like the situation of taking past occurrences into an equation when they have absolutely no relevance. It’s a NEW decision/outcome. If 4 reds in a row come up on the roulette table, it’s no more likely to be black than 50/50 (negating zero) than it ever was. Same situation here, no?

Roulette, dice, coins, these are known in probability as independent events. One outcome does not alter the next.

Continued selections from the same instance, the same decreasing pool, are dependent events. Classically, consider a deck of cards. If I draw a bunch of black cards in a row, are you more likely to now draw a red card? Definitely. My first card draw was normal, and nothing can change that. A subsequent draw is different. There are no longer 52 cards. The ratio of black to red is no longer equal. You might say the rules have changed. They have changed in your favor.

Selecting a second time, from fewer gameshow doors, has different odds from selecting from the initial total of doors.

This is how I thought about it: imagine that door A is the one you pick. There are 3 equally likely scenarios. 1: Door A is correct. In this case, you would be better off staying. 2: Door B is correct. Monty would open C and you would be better off switching. 3: Door C is correct, Monty would open B and you would be better off switching. Two out of three times, you are better off switching.

Well, that’s correct up until he opens one of the doors. Then that choice goes out the window and we are left with only 2 choices

I like the way you encourage visualizing the numbers with both shape and color. Great stuff.

Or you could, you know, figure out the really simple system used on this and get about 70% without all the close examining you just made me do.

The combinations of your first choice are:

- Car
- Goat(1)
- Goat(2)

That’s a 1/3 chance of winning the Car — as you would expect.

After you make your first choice, the combinations behind the remaining two doors are:

- Goat(1) & Goat(2)
- Car & Goat(2)
- Car & Goat(1)

If Monty always picks a Goat, then the remaining door contains:

- Goat (1 or 2)
- Car
- Car

Hence if you switch there is a 2/3 chance of picking the Car.

Best concise and clearly laid out explanation. Better than that 10 pages-long contrived and non-convincing explanation above.

Initially we make a random choice with a 33% probability that the car is behind the door. The other two doors contain 66% of the probability. Once the goat is revealed the 66% probability is condensed into a single door. No extra probability is handed to the first door until a new random choice is made. Thus if no new random choice is made the original choice maintains its 33% probability but the other door now has all of the 66% probability of its original set. That’s why switching to the other door wins 66% of the time. If a new random choice is made of those two remaining doors, then it’s truly a 50% chance.

Everything hinges on making a random choice. Those that argue it is a 50% chance problem after the goat is revealed fail to see where the original probability of 33% came from. It came from making a random choice. If there was no random choice initially then there’d be no 33% chance. If the contestant was told what to pick, they’d claim their chances were not even. So after the goat is revealed, the only way to have a 50% chance is to flip a coin and make another random from the two remaining doors. But since 66% chance of a win was held in the other two doors and condensed from those 2 doors into 1, it makes sense to just pick that other door that has the 66% chance and NOT make any more random choices.

Why are both doors examined before they are opened? I find that part confusing. So did he open both doors or only one?