Debiasing is the reduction of bias, particularly with respect to judgment and decision making. Judgment and decision making are biased when they systematically deviate from objective standards such as facts, logic, and rational behavior, or from prescriptive norms. Biased judgment and decision making occur in consequential domains such as medicine, law, policy, and business, as well as in everyday life. Investors, for example, tend to hold onto falling stocks too long and sell rising stocks too quickly. Employers exhibit considerable discrimination in hiring and employment practices,[1] and some parents continue to believe that vaccinations cause autism despite knowing that this link is based on falsified evidence.[2] At an individual level, people who exhibit less decision bias have more intact social environments, reduced risk of alcohol and drug use, lower rates of childhood delinquency, and superior planning and problem-solving abilities.[3]
Debiasing can occur within the decision maker. For example, a person may learn or adopt better strategies by which to make judgments and decisions.[4] Debiasing can also occur as a result of changes in external factors, such as changing the incentives relevant to a decision or the manner in which the decision is made.[5]
There are three general approaches to debiasing judgment and decision making and to reducing the costly errors with which biased judgment and decision making is associated: changing incentives, nudging, and training. Each approach has strengths and weaknesses. For more details, see Morewedge and colleagues (2015).
Changing incentives can be an effective means of debiasing judgment and decision making. This approach is generally derived from economic theories suggesting that people act in their self-interest by seeking to maximize their utility over their lifetime. Many decision making biases may persist simply because they are more costly to eliminate than to ignore.[6] Making people more accountable for their decisions (increasing incentives), for example, can increase the extent to which they invest cognitive resources in making them, leading to less biased decisions when people generally know how a decision should be made.[7] However, "bias" may not be the appropriate term for these "strategy-based" errors, which occur simply because the necessary effort outweighs the benefit. If a suboptimal choice instead stems from an actual bias, incentives may exacerbate the problem: they may simply lead the person to perform the suboptimal behavior more enthusiastically.
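As a rough illustration of this strategy-based account (a formalization assumed here, not taken from the cited work), a decision maker rationally retains a cheap heuristic h over an effortful strategy s whenever the added effort cost exceeds the expected reduction in error loss:

```latex
% Illustrative assumption: c(\cdot) is effort cost, \ell(\cdot) is error loss.
% The heuristic h is kept whenever the extra effort of strategy s
% outweighs the expected errors it would prevent.
c(s) - c(h) \;>\; \mathbb{E}[\ell(h)] - \mathbb{E}[\ell(s)]
```

On this reading, raising accountability raises the stakes: the expected loss from the heuristic grows relative to the effort cost, so the careful strategy becomes worthwhile. When the error stems from a genuine bias rather than low effort, raising the stakes changes neither side of the inequality in the right direction.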
Incentives can be calibrated to shift preferences toward more beneficial behavior. Price cuts on healthy foods increase their consumption in school cafeterias,[8] and soda taxes appear to reduce soda consumption by the public. People are often willing to use incentives to change their own behavior by means of a commitment device. Shoppers, for example, were willing to forgo a cash-back rebate on healthy food items if they did not increase the percentage of healthy foods in their shopping baskets.[9]
Incentives can backfire when they are miscalibrated or are weaker than social norms that were preventing undesirable behavior. Large incentives can also lead people to choke under pressure.[10]
Nudges, changes in how information is presented or in the manner by which judgments and decisions are elicited, are another means of debiasing. People may choose healthier foods if they can better understand their nutritional content,[11] and may choose lower-calorie meals if they are explicitly asked whether they would like to downsize their side orders.[12] Other examples of nudges include changing the default option to which people are assigned if they do not choose an alternative, placing a limit on the serving size of soda, and automatically enrolling employees in a retirement savings program.
Training can effectively debias decision makers over the long term.[13][14] To date, training has received less attention from academics and policy makers than incentives and nudges because initial debiasing training efforts met with mixed success (see Fischhoff, 1982 in Kahneman et al.[15]). Decision makers can be effectively debiased through training in specific domains. For example, experts can be trained to make very accurate decisions when decision making entails recognizing patterns and applying appropriate responses, as in firefighting, chess, and weather forecasting. Evidence of more general debiasing, across domains and different kinds of problems, however, was not found until recently. The lack of domain-general debiasing has been attributed to experts' failure to recognize the underlying "deep structure" of problems presented in different formats and domains. Weather forecasters can predict rain with high accuracy, for example, but show the same overconfidence in their answers to basic trivia questions as other people. An exception is graduate training in scientific fields that rely heavily on statistics, such as psychology.[16]
Experiments by Morewedge and colleagues (2015) found that interactive computer games and instructional videos can produce long-term debiasing at a general level. In a series of experiments, training with interactive computer games that provided players with personalized feedback, mitigating strategies, and practice reduced six cognitive biases by more than 30% immediately and by more than 20% as long as three months later. The biases reduced were anchoring, bias blind spot, confirmation bias, fundamental attribution error, projection bias, and representativeness.
Training in reference class forecasting may also improve outcomes. Reference class forecasting is a method for systematically debiasing estimates and decisions based on what Daniel Kahneman calls the outside view. As Kahneman points out in Thinking, Fast and Slow (p. 252), one reason reference class forecasting is effective for debiasing is that, unlike conventional forecasting methods, it takes into account the so-called "unknown unknowns." According to Kahneman, reference class forecasting "has come a long way" in practical implementation since he originally proposed the idea with Amos Tversky (p. 251).
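To illustrate the outside view, the following is a minimal sketch of a reference-class adjustment, assuming a hypothetical set of historical outcome ratios; the function name, data, and 80th-percentile choice are illustrative assumptions, not a procedure specified by Kahneman.

```python
def reference_class_forecast(inside_view_estimate, reference_outcomes, percentile=0.8):
    """Adjust an inside-view estimate using the distribution of outcomes
    from a reference class of similar past cases.

    reference_outcomes: ratios of actual to estimated cost (or duration)
    percentile: how much historical overrun risk to absorb
                (0.8 = enough to cover 80% of past cases)
    """
    ratios = sorted(reference_outcomes)
    # Take the outcome ratio at the chosen percentile of the distribution.
    index = min(int(percentile * len(ratios)), len(ratios) - 1)
    uplift = ratios[index]
    return inside_view_estimate * uplift

# Hypothetical example: similar past projects ran 0.9x to 2.1x their budgets.
past_ratios = [0.9, 1.0, 1.1, 1.2, 1.3, 1.4, 1.6, 1.8, 2.0, 2.1]
print(reference_class_forecast(100_000, past_ratios))  # outside-view budget: 200000.0
```

The sketch mirrors the method's basic steps: identify a reference class of similar past cases, establish the distribution of their outcomes, and position the current estimate within that distribution rather than relying on the inside view alone.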