19 Comments
Paul Boisvert:

Hi, Adam,

As a mathematician, I've long followed the MH problem. I basically used the exhaustive approach when explaining it to others, but your "grouping" approach is far better--a perfect conceptual shortcut. Thanks!

White, D:

Agree. I have always used the grouping approach when trying to explain it to those who are not so 'math-savvy', and it seems to be the best way to convince them!

Ran:

I think it's relevant that you're not trying to convince a naive person who's a total blank slate with no idea what the answer might be — good luck finding someone like that! — but rather, you want to convince someone who has good reason to think the answer is other than it is.

So while it's definitely helpful to have multiple explanations for why the right answer is right, I think you'll have more success if you also explain why the intuition goes awry, so that the person can successfully set their intuition aside and be open to a different answer.

To that end, I'd suggest something like:

> The obvious answer is "It doesn't matter": each door originally had equal probability of hiding the car, and Monty Hall hasn't moved the car, so they still have equal probability.

> But by that argument, you could also say that it's just as good to switch to the door where Monty Hall just revealed a goat! After all, he didn't change whether it had hidden a car, so surely it still has a one-third chance of hiding a car, even though you can plainly see that it doesn't?

> The resolution is to see that the car never literally had a one-third chance of being behind each door — it was behind a specific door, so that door had a 100% chance and the others had a 0% chance — it's just that you had no information about which door it was behind, so from your perspective it was *as if* each door had the same chance. If you played this game many times, your first guess would be right about one-third of the time.

> Monty Hall has now changed the situation: he's given you some information (that such-and-such door hid a goat), and crucially, he decided what information to give you *based on what door you had chosen*. By choosing one door, you guaranteed that Monty Hall wouldn't open it, so you affected what information he could give you. If you chose a door with a goat, then you forced him to show you the other door with a goat, whereas if you chose a door with a car, then you let him freely choose which goat to show you. So he's more likely to show you a given goat if you chose the door with the other goat than if you chose the door with the car; and now he's showing you a goat, so working backwards, this means it's more likely that you picked the other goat than that you picked the car.

> (Of course, the car is still behind one specific door, so it still technically has a 100% probability of being behind that door and a 0% probability of being behind the other. But given the information you now have, it's *as if* it has a one-third chance of being behind one door and a two-thirds chance of being behind the other. If you play this game many times using this strategy, you'll win about two-thirds of the cars.)
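If it helps to see the "working backwards" step done explicitly, here is a minimal enumeration sketch (Python; it assumes the player has picked door 0 and that Monty flips a fair coin when both goat doors are available to him):

```python
from collections import defaultdict
from fractions import Fraction

# Exact enumeration of the standard game. Assumptions (mine, for illustration):
# the player has picked door 0, the car is equally likely behind each door,
# and Monty chooses uniformly when both goat doors are available to him.
posterior = defaultdict(lambda: defaultdict(Fraction))  # opened door -> {car door: probability}

for car in (0, 1, 2):
    openable = [d for d in (0, 1, 2) if d != 0 and d != car]  # Monty avoids the pick and the car
    for opened in openable:
        posterior[opened][car] += Fraction(1, 3) * Fraction(1, len(openable))

for opened, dist in sorted(posterior.items()):
    total = sum(dist.values())
    for car, p in sorted(dist.items()):
        print(f"Monty opens door {opened}: P(car behind door {car}) = {p / total}")
```

Either way Monty goes, the door you picked keeps its 1/3 and the door he passed over carries 2/3, precisely because his choice was constrained by yours.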

Jacques Pasche:

Once you understand that "if you played this game many times, your first guess would be right about one-third of the time", which means it would be wrong about two-thirds of the time, it's easy to see that you had better always change your guess, so that it is right about two-thirds of the time!

Olynpuss:

What happens if, like me, you’d prefer to win a goat?

Richard Careaga:

To start, the contestant has a 1 in 3 chance of having picked the right door. Three doors, one prize. If that doesn't resonate with the student, it's unlikely that anything that follows will.

A useful heuristic is often to restate the starting condition. To start, the contestant has a 2 in 3 chance of having picked the wrong door, which means that the two doors he didn't pick have, between them, a 2 in 3 chance of hiding the prize.

The host then reveals that one of those two doors does not contain the prize. This is not a surprise, because we knew from the start that only one of the two could contain it. What is new is that we now know the remaining unopened door is the sole possessor of that 2 in 3 chance.

Nothing else has changed except our information about the two doors we didn't pick. With that information we can double our chances from 1 in 3 to 2 in 3 by switching.

There's a psychological barrier to doing this: loss aversion. Would I be happier having swapped the losing door for the winning door than I would be sad having swapped the winning door for the losing one? Fear of regret is a powerful barrier to rationality.

Andrew Colman:

I worked out what I believe is a succinct and convincing explanation for my entry on the Monty Hall Problem for the Oxford Dictionary of Psychology. Here it is: There is one chance in three that the car is behind your originally chosen door, and accepting the invitation to switch wins the car if your original choice was wrong. The probability that your original choice was wrong is 2/3, and if it was wrong, then you are certain to win the car if you switch.

Mike S:

I tend to use the proof by exhaustion method to convince people, since they often struggle with calculations of probabilities and maths (as do I).

I use the analogy of a deck of cards. I ask them to pick one [and hold on to it], then ask what the chance is that it's the ace of spades. They generally understand that the probability is 1 in 52, and you can get them to accept that those odds don't change when you flip over 50 of the remaining 51 cards and none of them is the ace either.
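For anyone who wants to check the card version numerically rather than take it on faith, here's a rough simulation sketch (Python; the mechanics follow the description above, and the helper names are mine):

```python
import random

# Rough Monte Carlo of the card version: the player keeps one card, the dealer
# (who knows where the ace is) turns over 50 of the remaining 51 non-ace cards,
# and the player may then switch to the one card still face down.
def play(switch, n_cards=52):
    ace = random.randrange(n_cards)    # position of the ace of spades
    pick = random.randrange(n_cards)   # player's card
    if not switch:
        return pick == ace
    # The card left face down is the ace whenever the player missed it;
    # otherwise it is an arbitrary non-ace card.
    left_down = ace if pick != ace else (pick + 1) % n_cards
    return left_down == ace

trials = 100_000
for switch in (False, True):
    rate = sum(play(switch) for _ in range(trials)) / trials
    print(f"switch={switch}: win rate ~ {rate:.3f}")   # ~0.019 vs ~0.981
```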

White, D:

I did a card explanation once too, it went like this:

I took three cards, two jacks and one queen, and shuffled them. Then I dealt one card to the person I was explaining to, and two cards to myself. I explained that the 'winner' is the person who has the queen. I was allowed to look at my cards, but the other player was NOT allowed to look at theirs. But I allowed them to choose to keep their one card, or switch to my two. Of course they immediately said they wanted to switch. I then asked them if they were REALLY sure... and I showed them a Jack I had in my hand... did seeing that card change their mind about what to do?

Of course, it did not. They still wanted to switch. For some reason, when they thought of the MH game in this way, it was easier for them to understand why they should switch.

Edgar:

The initial player choice imposes a limit on Monty's choice of door in 2 out of 3 plays. The information conferred by Monty's constraint will inform a player's second choice in 2 out of 3 plays.

Steve Fifield:

Thanks. Reframing as groups really works for me. Only one door has a car behind it. Choose to open just one door, or choose to open BOTH of the other doors?

It mysteriously becomes a no-brainer when previously it seemed baffling.

White, D:

I am worried there may be a hidden issue in the Monty Hall problem that I have never seen discussed. Maybe someone can shed some light on this. Here is the problem scenario.

Imagine we have 3 players in our new Monty Hall game: Alice, Bob and Charlie. None of them knows of the others' existence. They each believe that only they are playing the game. They are each placed in a room with a single window that looks out into a shared central courtyard. In the courtyard our 3 boxes are visible.

The three boxes are randomly assigned to each player. In this game we'll say Alice has Box 3, Bob has Box 1 and Charlie has Box 2. The game 'hosts' know where the prize is, but the contestants do not.

The hosts open one of the two boxes containing a goat. In this case that was Box 1. Bob is eliminated from winning the prize.

Both Alice and Charlie are then given the chance to switch between the remaining boxes.

There's the dilemma. To each of them... this IS the Monty Hall problem! But they can't BOTH end up with 2/3rds odds of winning a single game if they both switch!

So... is there a mathematical difference between playing once vs. playing repeatedly... meaning that the people who claimed there was 'no benefit to switching' in the original MH problem were actually not wrong? Or does this simply highlight how difficult it is to prove equivalence between two sets of conditions, and the example I came up with is actually NOT the MH problem for Alice and Charlie?

This stuff freaks me out.... any help would be greatly appreciated!

Ray Bamford:

Great question. The key insight (I believe) is that you have simply described a different game. In the original MH, the player never loses in round 1... Hence, there IS new information revealed by the host's choice... i.e., the prize is NOT in the open box, and the player gets a 2/3 chance of winning by switching.

In your "modified MH", each player "loses" 1/3 of the time in round 1 (hence each player has a 2/3 chance of passing round 1)... The new information from the host's choice is whether you passed round 1 or not. There is NO new info about whether the prize is more likely in one or the other unrevealed boxes. Hence, the chances for each of the unrevealed boxes are equal. So if they pass round 1, each of the surviving players has a 1/2 chance of winning if they switch in round 2.
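One way to settle it is to simulate the courtyard game directly, from Alice's point of view. A rough sketch (Python; it assumes the hosts flip a fair coin when both goat boxes are available, which the scenario doesn't specify):

```python
import random

# Rough simulation of the three-player courtyard variant, seen from Alice's side.
# Assumption (mine): when both goat boxes are available, the hosts open either
# one with probability 1/2.
trials = 100_000
survived = stay_wins = switch_wins = 0
ALICE = 2  # Alice always holds Box 3 (index 2)

for _ in range(trials):
    prize = random.randrange(3)
    opened = random.choice([b for b in range(3) if b != prize])  # hosts open a goat box
    if opened == ALICE:
        continue                       # Alice is eliminated in round 1
    survived += 1
    other = next(b for b in range(3) if b not in (ALICE, opened))
    stay_wins += (prize == ALICE)
    switch_wins += (prize == other)

print(f"Alice survives round 1:         {survived / trials:.3f}")      # ~2/3
print(f"Given survival, staying wins:   {stay_wins / survived:.3f}")   # ~1/2
print(f"Given survival, switching wins: {switch_wins / survived:.3f}") # ~1/2
```

The 1/3 of games in which Alice's own box gets opened is exactly what is missing from the standard game, and it is where her "extra" probability goes.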

White, D:

In my example, the game actually IS the standard MH game to the two players I described. You could just as easily describe only a standard MH game... and then after it is completely done say "By the way, unbeknownst to you, two other players were playing and here is what happened to them!" Why would THAT change anything?

And for the 'each player loses 1/3 of the time'... I agree... and what that seems to mean is that you can't compare single games to games being repeatedly played... so as I said... perhaps the people who said 'there is no benefit to switching' in the original MH problem were not actually wrong... because in that original description it never said 'if you play multiple times'... it simply asked for the best strategy on ONE play!

Claus Wilke:

I like the 100 doors example, but I also like the grouping proof, and I usually formulate it slightly differently: There are two possible strategies I can follow, switch doors or not switch doors. Let's assume I pick one of those strategies ahead of time. If my strategy is "not switch", then I have to choose the door with the car initially. I have a 1/3 chance to do so. But if my strategy is "switch", then I have to choose a door _without the car_ initially. That's easier, as I have a 2/3 chance to do so.

Mentally where people go wrong is that they think the goal is always to initially pick the door with the car. But if your strategy is "switch," your goal is actually to pick a losing door initially. For most people I think it would be intuitive that it's easier to pick a losing door than the winning door, even if there are only three doors.
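The "commit to a strategy before you see anything" framing also drops straight into a little simulation, if anyone wants to see the 1/3 vs 2/3 come out of repeated play (a minimal Python sketch; the uniform random first guess is my assumption):

```python
import random

# Minimal sketch of the fixed-strategy framing: the switcher wins exactly when
# the first guess lands on a goat, the stayer exactly when it lands on the car.
def play(switch):
    car = random.randrange(3)
    guess = random.randrange(3)
    return (guess != car) if switch else (guess == car)

trials = 100_000
print("stay:  ", sum(play(False) for _ in range(trials)) / trials)  # ~1/3
print("switch:", sum(play(True) for _ in range(trials)) / trials)   # ~2/3
```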

Gary Cornell:

I'm also a (retired) mathematician and have long thought that the best explanation is the one that uses a larger number of doors. It also teaches the well-known principle, which mathematicians use instinctively, of trying extreme (i.e., boundary) cases. If one imagines a million doors, as I told students when I taught it, almost all of them immediately grasp the answer. Of course, actually calculating how much better switching is requires probability, and I found that trees are the best way to do that.

Sven Lundquist:

Great post. I'm happy enough with the proof by exhaustion, but have always yearned for a more intuitive explanation (I gather I'm in exalted company with Paul Erdös here), and your proofs 3&4 help here. Do you have any similar wisdom to offer on the Sleeping Beauty problem? Thirder or halfer?

While I'm here, I very much enjoyed "Proof", but this sentence on spherical triangles in Chapter 2 has been really bugging me: “Eventually we’ll end up with a triangle that has one right angle (at the North Pole), and two 45° angles”. Surely we'll never end up there? Isn't this a geometrical equivalent of Cauchy's "this last is called the limit of all the others", which Weierstrass took exception to, and which you'd just been discussing?

Terry Clay:

Well, I have for many years thought I needed to learn a much more rigorous approach to probability and statistics, but this argument, and everyone who supports it, have convinced me that it would be a complete waste of time. In addition, given where our climate is heading, a goat might be a greater asset than a car.

Ronald Turnbull:

How do I know that Monty always reveals a goat after my door choice? Maybe Monty's strategy is (a) if I choose a goat door, go "shucks, you chose a goat", and (b) if I choose the car door, show me a goat and hope I switch?
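And the assumption really does matter. Here's a rough sketch (Python; the policy names and uniform door choices are my assumptions) comparing the standard host, who always opens a goat door and offers the switch, with a host who only makes the offer when I'm already sitting on the car:

```python
import random

# Compare two host policies. Under the adversarial policy a switch is only
# offered when the contestant already has the car, so switching always loses.
def switch_win_rate(always_offers, trials=100_000):
    offers = wins = 0
    for _ in range(trials):
        car, pick = random.randrange(3), random.randrange(3)
        if always_offers or pick == car:      # when does the host show a goat and offer a switch?
            offers += 1
            wins += (pick != car)             # switching wins iff the original pick was a goat
    return wins / offers

print("standard Monty, always switch:   ", switch_win_rate(True))   # ~2/3
print("adversarial Monty, always switch:", switch_win_rate(False))  # 0.0
```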
