The neglect of probability, a type of cognitive bias, is the tendency to disregard probability when making a decision under uncertainty and is one simple way in which people regularly violate the normative rules for decision making. Small risks are typically either neglected entirely or hugely overrated. The continuum between the extremes is ignored. The term probability neglect was coined by Cass Sunstein.[1]
There are many related ways in which people violate the normative rules of decision making with regard to probability, including the hindsight bias, the neglect of prior base rates effect, and the gambler's fallacy. This bias is different, however: rather than using probability incorrectly, the actor disregards it altogether.
"We have no intuitive grasp of risk and thus distinguish poorly among different threats," Dobelli has written. "The more serious the threat and the more emotional the topic (such as radioactivity), the less reassuring a reduction in risk seems to us."
In a 1972 experiment, participants were divided into two groups, with the former being told they would receive a mild electric shock and the latter told that there was a 50 percent chance they would receive such a shock. When the subjects' physical anxiety was measured, there was no difference between the two groups. This lack of difference remained even when the second group's chance of being shocked was lowered to 20 percent, then 10 percent, then 5 percent. The conclusion: "we respond to the expected magnitude of an event...but not to its likelihood. In other words: We lack an intuitive grasp of probability."[2]
Baron (2000) suggests that the bias manifests itself among adults especially when it comes to difficult choices, such as medical decisions. This bias could make actors drastically violate expected-utility theory in their decision making, especially when a decision must be made in which one possible outcome has a much lower or higher utility but a small probability of occurring (e.g. in medical or gambling situations). In this aspect, the neglect of probability bias is similar to the neglect of prior base rates effect.
Cass Sunstein has cited the history of Love Canal in upstate New York, which became world-famous in the late 1970s owing to widely publicized public concerns about abandoned waste that was supposedly causing medical problems in the area. In response to these concerns, the U.S. federal government set in motion "an aggressive program for cleaning up abandoned hazardous waste sites, without examining the probability that illness would actually occur," and legislation was passed that did not reflect serious study of the actual degree of danger. Furthermore, when controlled studies were publicized showing little evidence that the waste represented a menace to public health, the anxiety of local residents did not diminish.[3]
One University of Chicago study showed that people are as afraid of a 1% chance as of a 99% chance of contamination by poisonous chemicals.[2] In another example of near-total neglect of probability, Rottenstreich and Hsee (2001) found that the typical subject was willing to pay $10 to avoid a 99% chance of a painful electric shock, and $7 to avoid a 1% chance of the same shock. They suggest that probability is more likely to be neglected when the outcomes are emotion-arousing.
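The gap between the observed payments and what probability-weighted reasoning would predict can be made explicit. A short Python sketch, using only the figures reported by Rottenstreich and Hsee:

```python
# Under probability-weighted (expected-value) reasoning, willingness to
# pay to avoid a shock should scale roughly with its probability.
normative_ratio = 0.99 / 0.01  # a 99% chance is 99 times as likely as a 1% chance

# Observed payments: $10 to avoid a 99% chance, $7 to avoid a 1% chance.
observed_ratio = 10 / 7  # about 1.43, nowhere near 99
```

The nearly flat observed ratio is what is meant by near-total neglect of probability: subjects priced the feared outcome, not its likelihood.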
In 2013, Tom Cagley noted that neglect of probability is "common in IT organizations that are planning and estimating projects or in risk management." He pointed out that there are available techniques, such as the Monte Carlo analysis, to study probability, but too often "the continuum of probability is ignored."[4]
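The kind of Monte Carlo analysis Cagley refers to can be sketched in a few lines of Python. The task estimates below are invented for illustration; the point is that the simulation reports a range of percentiles, keeping the continuum of probability visible, rather than a single point estimate:

```python
import random

def simulate_project(tasks, trials=10_000):
    """Monte Carlo estimate of total project duration.

    Each task is a (minimum, most likely, maximum) duration estimate;
    each trial samples a triangular distribution per task and sums them.
    """
    totals = sorted(
        sum(random.triangular(lo, hi, mode) for lo, mode, hi in tasks)
        for _ in range(trials)
    )
    # Report percentiles so the spread of outcomes, not just one
    # number, informs the plan.
    return {p: totals[int(trials * p / 100)] for p in (50, 80, 95)}

# Hypothetical three-task project, durations in days.
estimates = simulate_project([(2, 4, 9), (1, 2, 5), (3, 5, 12)])
```

A planner who quotes only the 50th-percentile figure is back to ignoring the continuum; quoting the 80th or 95th percentile makes the probability of overrun explicit.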
In 2016, Rolf Dobelli presented a choice between two games of chance. In one, you have a one in 100 million chance of winning $10 million; in the other, you have a one in 10,000 chance of winning $10,000. It is more reasonable to choose the second game, whose expected value ($1) is ten times that of the first ($0.10); yet most people would choose the first. This is why lottery jackpots keep growing.[2]
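Expected value makes the comparison concrete. A two-line Python check, using the probabilities and prizes given by Dobelli:

```python
# Expected value = probability of winning * prize.
game_one = (1 / 100_000_000) * 10_000_000  # one in 100 million for $10 million
game_two = (1 / 10_000) * 10_000           # one in 10,000 for $10,000
# game_one is $0.10 per play; game_two is $1.00 per play.
```

Choosing the first game means paying attention to the size of the jackpot while neglecting its probability.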
Dobelli has described the United States Food Additives Amendment of 1958 as a "classic example" of neglect of probability. The law – which prohibited carcinogenic substances in food, no matter how low the probability that they would in fact result in cancer – led to the substitution of those substances by ingredients that, while not causing cancer, stood a far greater chance of causing some sort of medical harm.[2]
In 2001, there was widespread panic around the U.S. over shark attacks, even though there was no evidence to show any increase in their occurrence. Legislation was actually enacted to address the issue.[3] Neglect of probability also figures in the purchase of lottery tickets.[4] Cass Sunstein has pointed out that terrorism is effective partly because of probability neglect.[3] "Terrorists show a working knowledge of probability neglect," he wrote in 2003, "producing public fear that might greatly exceed the discounted harm."[5]
The neglect of probability bias is especially pronounced among children. In a 1993 study, Baron, Granato, Spranca, and Teubal presented children with the following situation:
Susan and Jennifer are arguing about whether they should wear seat belts when they ride in a car. Susan says that you should. Jennifer says you shouldn't... Jennifer says that she heard of an accident where a car fell into a lake and a woman was kept from getting out in time because of wearing her seat belt, and another accident where a seat belt kept someone from getting out of the car in time when there was a fire. What do you think about this?

Jonathan Baron (2000) notes that subject X responded in the following manner:
A: Well, in that case I don't think you should wear a seat belt.
Q (interviewer): How do you know when that's gonna happen?
A: Like, just hope it doesn't!
Q: So, should you or shouldn't you wear seat belts?
A: Well, tell-you-the-truth we should wear seat belts.
Q: How come?
A: Just in case of an accident. You won't get hurt as much as you will if you didn't wear a seat belt.
Q: OK, well what about these kinds of things, when people get trapped?
A: I don't think you should, in that case.
It is clear that subject X, in making the decision, completely disregards the probability of an accident happening versus the probability of being harmed by the seat belt. A normative model for this decision would apply expected-utility theory to decide which option would likely maximize utility: weighing the change in utility under each possible outcome by the probability that the outcome will occur, which is precisely what subject X fails to do.
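The normative calculation can be sketched as follows. The probabilities and utilities here are purely hypothetical (the study supplies neither); they encode only the qualitative facts that a crash is vastly more likely than being trapped by a belt, and that a belt greatly reduces harm in a crash:

```python
def expected_utility(outcomes):
    """Weigh the utility of each outcome by its probability and sum."""
    return sum(p * u for p, u in outcomes)

# Hypothetical numbers, for illustration only.
P_CRASH = 1e-3    # probability of a crash on a given trip
P_TRAPPED = 1e-7  # probability of being trapped by the belt

wear = expected_utility([
    (P_CRASH, -10),      # belted crash: moderate harm
    (P_TRAPPED, -1000),  # trapped by belt: severe harm
])
no_wear = expected_utility([
    (P_CRASH, -100),     # unbelted crash: severe harm
])
# wear > no_wear: wearing the belt maximizes expected utility.
```

Under any remotely realistic probabilities, the rare trapped-by-belt outcome cannot outweigh the common crash outcome, which is the step subject X skips by treating the outcomes as if they were equally likely.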
Another subject responded to the same question:
A: If you have a long trip, you wear seat belts half way.
Q: Which is more likely?
A: That you'll go flyin' through the windshield.
Q: Doesn't that mean you should wear them all the time?
A: No, it doesn't mean that.
Q: How do you know if you're gonna have one kind of accident or the other?
A: You don't know. You just hope and pray that you don't.
Again, the subject disregards probability in making the decision, treating each possible outcome as equally likely in his reasoning.
Cass Sunstein has noted that for a long time after 9/11, many people refused to fly because they felt a heightened sense of fear or peril, even though, statistically, most of them "were not at significantly more risk after the attacks than they were before." Indeed, those who chose to drive long distances instead of flying thereby put themselves at an increased risk, given that driving is the less safe form of transportation.[3]
In a 2001 paper, Sunstein addressed the question of how the law should respond to the neglect of probability. He emphasized that it is important for government to "create institutions designed to ensure that genuine risks, rather than tiny ones, receive the most concern". While government policies in regard to potential dangers should focus on statistics and probabilities, government efforts to raise public awareness of these dangers should emphasize worst-case scenarios in order to be maximally effective. Moreover, while it seems advisable for government to "attempt to educate and inform people, rather than capitulating to unwarranted public fear", that fear will remain a real phenomenon and may thus cause serious problems, for example leading citizens to undertake "wasteful and excessive private precautions". In such cases, certain kinds of government regulation may be justified not because they address serious dangers but because they reduce fear.

At the same time, government should "treat its citizens with respect" and "not treat them as objects to be channeled in government's preferred directions", so focusing on worst-case scenarios that feed on irrational fears would amount to "unacceptable manipulation".[3] In a 2003 article, however, Sunstein concluded that "As a normative matter, government should reduce even unjustified fear, if the benefits of the response can be shown to outweigh the costs."[5]