Maurice Allais, a Nobel prize winning economist, died earlier this month. In this post, I’m going to focus on one of his many intellectual contributions, as it profoundly influenced modern psychology. It’s known as the Allais Paradox, and it was first outlined in a 1953 Econometrica article. Here’s an example of the paradox:
Suppose somebody offered you a choice between two different vacations. Vacation number one gives you a 50 percent chance of winning a three-week tour of England, France and Italy. Vacation number two offers you a one-week tour of England for sure.
Not surprisingly, the vast majority of people (typically over 80 percent) prefer the one-week tour of England. We almost always choose certainty over risk, and are willing to trade two weeks of vacation for the guarantee of a one-week vacation. A sure thing just seems better than a gamble that might leave us with nothing. But how about this wager:
Vacation number one offers you a 5 percent chance of winning a three week tour of England, France and Italy. Vacation number two gives you a 10 percent chance of winning a one week tour of England.
In this case, most people choose the three-week trip. We figure both vacations are unlikely to happen, so we might as well go for broke on the grand European tour. (People act the same way with lotteries: we typically buy the ticket for the biggest possible prize, regardless of the odds.)
Allais presciently realized that this very popular pair of decisions – almost everybody made them – violated the rational assumptions of economics. Instead of making decisions that could be predicted by a few mathematical equations, people acted with frustrating inconsistency. After all, both questions involve the same halving of probability (from 100 percent to 50 percent in the first, and from 10 percent to 5 percent in the second), and yet they generated completely opposite responses. Our choices seemed incoherent.
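To see just how inconsistent the pattern is, it helps to compute the expected value of each gamble. Here is a quick sketch – assuming, purely for illustration, that we value vacation-weeks linearly, which a real traveler need not do (the `expected_weeks` helper is my own, not anything from the original experiments):

```python
def expected_weeks(prob, weeks):
    """Probability-weighted number of vacation weeks."""
    return prob * weeks

# Choice 1: 50% chance of 3 weeks vs. a certain 1 week
print(expected_weeks(0.50, 3))   # 1.5 weeks
print(expected_weeks(1.00, 1))   # 1.0 week

# Choice 2: 5% chance of 3 weeks vs. 10% chance of 1 week
print(expected_weeks(0.05, 3))   # 0.15 weeks
print(expected_weeks(0.10, 1))   # 0.1 week
```

On raw expected value the three-week gamble wins both times; the paradox is that most people take the sure thing in the first framing and the gamble in the second, even though each option's probability was simply cut in half.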
Our economic system rests on a free market, which in turn assumes rational decisions. But people do not have an intuitive understanding of probability, so they make decisions that are often simply wrong, based on the math. This may be changing – our younger citizens seem to grasp this better, perhaps because of video games – but it can skew all sorts of decisions we make unless we consciously examine the math.
Mathematically, the choice between 1 percent and 2 percent is no different from the choice between 25 percent and 50 percent: one is exactly half as likely to happen as the other. Everyone who actually noticed that, raise your hand. Not many, because we just do not really ‘get’ probabilities.
But what the psychologists Daniel Kahneman and Amos Tversky found is that we hate losses and will accept some pretty bad deals if they minimize the losses. Look at this little humdinger:
The U.S. is preparing for the outbreak of an unusual Asian disease, which is expected to kill 600 people. Two alternative programs to combat the disease have been proposed. Assume that the exact scientific estimates of the consequences of the programs are as follows: If program A is adopted, 200 people will be saved. If program B is adopted, there is a one-third probability that 600 people will be saved and a two-thirds probability that no people will be saved. Which of the two programs would you favor?
When this question was put to a large sample of physicians, 72 percent chose program A, the safe-and-sure strategy, and only 28 percent chose program B, the risky strategy. In other words, physicians prefer a sure good thing over a gamble that risks utter failure. They are acting just like the people who choose the certain one-week tour of England. But what about this scenario:
The U.S. is preparing for the outbreak of an unusual Asian disease, which is expected to kill 600 people. Two alternative programs to combat the disease have been proposed. Assume that the exact scientific estimates of the consequences of the programs are as follows: If program C is adopted, 400 people will die. If program D is adopted, there is a one-third probability that nobody will die and a two-thirds probability that 600 people will die. Which of the two programs would you favor?
These two questions describe identical dilemmas: saving one third of the population is the same as losing two thirds. But when Kahneman and Tversky framed the scenario in terms of losses, physicians reversed their previous decision. Only 22 percent voted for program C, while 78 percent opted for program D, the risky strategy that might save everyone. Of course, this is a ridiculous shift in preference, as nothing substantive has changed in the scenario. But our choices are guided by our feelings, and losses just make us feel bad. Because the coldhearted equations of classical economics neglect emotion, their description of our decisions remains woefully incomplete.
The two scenarios have exactly the same mathematical properties; simply by rewording the outcomes as losses, people reversed their choice.
I think this is why so many people get the ‘Monty Hall Problem’ wrong. People almost always assume they picked the right door, so they are afraid to change; they would feel awful if they switched doors and ended up with the goat. Yet the chance of that actually happening is exactly the same as the chance of picking the car and winning without switching – one out of three. The chance of losing if you switch equals the chance of winning if you stay. Yet people weigh the potential loss more heavily, so they stand pat and take their chances on the car.
And in doing so they make the wrong decision.
But what if you come at it from the more likely scenario – you chose the wrong door? You will do this two out of three times. If you do not switch, you will win only one third of the time. But if you switch, you will win two thirds of the time.
I got this almost immediately, but I had a hard time explaining it to people until I constructed a table showing all the possibilities and walked them through it.
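If a table of possibilities doesn't persuade someone, brute force usually does. Here is a small simulation I sketched (the player always starts at door 0, and the host deterministically opens the lowest goat door – neither choice affects the win rates):

```python
import random

def monty_hall_trial(switch):
    """One round: car behind a random door; player picks door 0;
    host opens a goat door; player stays or switches."""
    car = random.randrange(3)
    pick = 0
    # Host opens a door that is neither the player's pick nor the car.
    opened = next(d for d in range(3) if d != pick and d != car)
    if switch:
        # Switch to the one remaining closed door.
        pick = next(d for d in range(3) if d != pick and d != opened)
    return pick == car

trials = 100_000
stay_wins = sum(monty_hall_trial(switch=False) for _ in range(trials))
switch_wins = sum(monty_hall_trial(switch=True) for _ in range(trials))
print(stay_wins / trials)    # close to 1/3
print(switch_wins / trials)  # close to 2/3
```

Run it a few times: staying wins about a third of the time, switching about two thirds, exactly as the table predicts.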
Now, what would you do if it were the Lady or the Tiger with three doors? If most people followed their inclinations and stuck with their first pick, 67 percent of them would die. If they understood the probabilities and switched, 67 percent of them would live.
That is why understanding uncertainties, losses and probabilities can be so very important.