Base Rate Fallacy: Definition, Examples, and Impact

The base rate fallacy is a cognitive bias that occurs when people ignore general statistical information, or “base rates,” and rely too heavily on specific details of the current situation.

Key Takeaways

  • The idea of the base rate fallacy dates to Daniel Kahneman and Amos Tversky’s studies in which participants judged, from a brief personality sketch, whether someone was an engineer or a librarian.
  • This bias can lead to incorrect judgments and decisions, especially when dealing with ambiguous or unfamiliar situations.
  • There are several strategies that can help mitigate this fallacy, such as using Bayesian reasoning and considering multiple sources of information.

What Is the Base Rate Fallacy?

The base-rate fallacy is a decision-making error in which information about the rate of occurrence of some trait in a population (the base-rate information) is ignored or not given appropriate weight.

For example, given a choice of the two categories, people might categorize a woman as a politician rather than a banker if they heard that she enjoyed social activism at school—even if they knew that she was drawn from a population consisting of 90% bankers and 10% politicians (APA).
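To see how the base rate should constrain this judgment, here is a minimal sketch in Python using Bayes’ theorem. Only the 90%/10% split comes from the example above; the likelihood figures (how common “enjoyed social activism” is in each group) are hypothetical numbers chosen purely for illustration.

```python
# Minimal sketch of how a 90%/10% base rate should temper the "social activism" clue.
# The likelihoods below are hypothetical illustrations, not figures from the APA example.

def posterior(prior_a, likelihood_a, prior_b, likelihood_b):
    """P(A | evidence) via Bayes' theorem for two mutually exclusive hypotheses."""
    joint_a = prior_a * likelihood_a
    joint_b = prior_b * likelihood_b
    return joint_a / (joint_a + joint_b)

# Base rates from the example: 10% politicians, 90% bankers.
p_politician = posterior(
    prior_a=0.10, likelihood_a=0.80,  # assumed: activism is common among politicians
    prior_b=0.90, likelihood_b=0.20,  # assumed: activism is less common among bankers
)
print(f"P(politician | enjoyed activism) = {p_politician:.2f}")  # ~0.31
```

Even when the clue is assumed to be four times as likely for a politician, the 90% base rate of bankers keeps “banker” the more probable category.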

Daniel Kahneman and Amos Tversky conducted a study where participants were presented with a personality sketch of a fictional graduate student referred to as Tom W.

This sketch contained information about Tom W’s interests, and participants were asked to guess if he was a librarian or an engineer.

Participants were also given base-rate information about the two populations (librarians and engineers) (Bar-Hillel, 1983).

Despite this information, the overwhelming majority of participants (around 95%) guessed that Tom W. was an engineer rather than a librarian, a judgment inconsistent with the base-rate evidence.

Although the base rates they had been given made the other category far more common, participants based their predictions on the personality sketch (the individuating information) rather than on the base-rate information.

This is one example of the base-rate fallacy in action, and it highlights how human beings often ignore relevant statistical or probability information when making decisions, particularly when they have other cognitive biases operating at the same time (Bar-Hillel, 1983).

Why It Happens

There have been a number of explanations for why the base rate fallacy occurs. One explanation is that people tend to use heuristics or mental shortcuts when making decisions.

This can lead to errors in judgment, as people do not take the time to process all of the available information and weigh it up properly (Bar-Hillel, 1980).

Another explanation for the base rate fallacy is that people tend to focus on the specific case or example at hand rather than thinking about the broader picture.

This reflects the availability heuristic: we overestimate how likely an event is because cases in which it occurred are easier to recall than the overall rate at which it occurs.

Finally, the base rate fallacy may also occur because people have a general bias against thinking about probability and statistics.

This means that they are less likely to take into account base rate information when making decisions (Bar-Hillel, 1980).

Relevance

Maya Bar-Hillel, in her 1980 paper, “The base-rate fallacy in probability judgments,” presented an alternate explanation for the base rate fallacy: relevance.

Kahneman and Tversky’s study, and others like it, she argued, did not demonstrate the base-rate fallacy.

Rather, they showed that people were using different standards of judgment—what Bar-Hillel called the “evidential meaning” of the base rate as opposed to its “informational content.”

Informational content is the probability that something is true, given the evidence. Evidential meaning is the probability that the evidence is true, given that something is true. For example, imagine you are told that a new drug has a 90% success rate.

If the drug had been tested on only a handful of people, the evidential meaning of this base rate would be low; if it had been tested on a large number of people, its informational content would be high (Bar-Hillel, 1980).

Bar-Hillel used this distinction to explain why people sometimes ignore base rate information when making decisions.

She argued that they were often conflating evidential meaning and informational content, jettisoning information that they simply believed to be irrelevant to the judgment that they were trying to make. She contended that, before making a judgment, people categorize the information given to them into different levels of relevance.

If something is deemed irrelevant, they discard it and do not factor it into the conclusions they draw.

One major influence on whether a piece of information is deemed relevant is specificity: the more specific information is to the situation at hand, the more relevant it seems.

Because individuating information, such as information about someone’s personality, is very specific, it is treated as highly relevant.

Meanwhile, base rate information is very general and tends to be categorized as low-relevance information (Bar-Hillel, 1980).

Avoiding the Base Rate Fallacy

The base-rate fallacy is a cognitive bias that leads people to make inconsistent and illogical decisions.

It occurs when individuals ignore or underweight information about the probability of an event occurring (the base rate) in favor of case-specific information that is less diagnostic of the outcome.

This cognitive bias can lead to irrational decisions and behavior. For example, someone who hears that one person in a group of 100 has contracted a fatal disease may overestimate their own risk and book extra medical checkups.

Although that reaction does not seem harmful in itself, at scale it can contribute to an overburdened healthcare system and, thus, various negative effects (Macchi, 1995).

The base rate fallacy can also have negative implications on someone’s financial decisions.

For example, if an investor is presented with two investment opportunities that differ in projected risk and return, they may be more likely to choose the riskier option if they do not have, or do not seek out, sufficient information about each option’s historical performance.

This could lead to a number of negative outcomes, such as poor returns on investments or financial losses (Macchi, 1995).
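As a toy illustration of why historical base rates belong in that comparison (all figures below are made up), an option with a dramatic potential payoff can still have the worse expected return once its historical rate of success is taken into account.

```python
# Toy comparison of two hypothetical investments (all numbers made up):
# the "exciting" option looks attractive only if its historical success rate is ignored.

def expected_return(success_rate, gain_if_success, loss_if_failure):
    """Probability-weighted return, using the historical success rate as the base rate."""
    return success_rate * gain_if_success + (1 - success_rate) * loss_if_failure

steady = expected_return(success_rate=0.80, gain_if_success=0.08, loss_if_failure=-0.02)
risky = expected_return(success_rate=0.20, gain_if_success=0.40, loss_if_failure=-0.15)

print(f"steady: {steady:+.3f}, risky: {risky:+.3f}")  # steady: +0.060, risky: -0.040
```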

There are various strategies that can help individuals avoid falling prey to the base rate fallacy. One strategy involves actively seeking out relevant statistical or probability information when making decisions rather than simply relying on general assumptions or cognitive biases.

Another effective strategy is consulting with experts or professionals in different fields who can provide insight into factors like base rates and probabilities related to specific decisions or situations.

By being aware of the potential pitfalls of decision-making and taking proactive steps to avoid them, it is possible to mitigate the negative effects of the base rate fallacy (Macchi, 1995).

One mathematical approach to avoiding the base-rate fallacy is Bayesian decision-making. This approach takes into account both the base-rate information and the individuating information, rather than overweighting one type of information over the other.

This approach has been shown to be more effective in avoiding the base-rate fallacy and making decisions that are more closely aligned with reality (Macchi, 1995).
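As a rough sketch of what taking both kinds of information into account can look like, Bayes’ theorem in odds form makes the weighting explicit: the posterior odds equal the prior odds (from the base rate) multiplied by the likelihood ratio (from the individuating evidence). The numbers below are placeholders rather than values from any cited study.

```python
# Sketch of Bayesian updating in odds form: the prior odds carry the base rate,
# the likelihood ratio carries the individuating evidence.

def update(prior_prob, likelihood_ratio):
    """Posterior probability after combining a base rate with one piece of evidence."""
    prior_odds = prior_prob / (1 - prior_prob)
    posterior_odds = prior_odds * likelihood_ratio
    return posterior_odds / (1 + posterior_odds)

# Placeholder numbers: a 5% base rate and evidence that is three times as likely
# when the hypothesis is true as when it is false.
print(f"posterior = {update(prior_prob=0.05, likelihood_ratio=3.0):.2f}")  # ~0.14
```

Neither the 5% base rate nor the evidence dominates on its own; the posterior of roughly 14% reflects both, which is what guarding against the base-rate fallacy amounts to.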

Examples

Medical Diagnoses

The base-rate fallacy has also been implicated in diagnostic errors made by doctors.

For example, consider a situation in which a doctor is trying to decide whether a patient has a rare disease. The doctor knows that the base rate of the disease is 1 in 10,000.

The doctor also knows that there is a test for the disease that is 90% accurate. That is, if a person has the disease, the test will correctly identify them as having the disease 90% of the time.

Similarly, if a person does not have the disease, the test will correctly identify them as not having the disease 90% of the time (Heller, 1992).

Now, suppose that the patient tests positive for the disease. What is the probability that the patient actually has the disease? Many might say that there is a 90% chance that the patient has the disease.

However, this answer is incorrect. Given a 1-in-10,000 base rate and a 90% accurate test, the probability that the patient actually has the disease is only about 0.09%, roughly 1 in 1,100. The intuitive 90% answer is a base-rate fallacy.
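The arithmetic behind that figure can be checked directly. The short sketch below applies Bayes’ theorem to the numbers stated above: a 1-in-10,000 base rate and a test that is 90% accurate for both sick and healthy patients.

```python
# Worked check of the rare-disease example: P(disease | positive test).
base_rate = 1 / 10_000       # prior probability of having the disease
sensitivity = 0.90           # P(positive | disease)
false_positive_rate = 0.10   # P(positive | healthy): the test is 90% accurate on healthy people

p_positive = sensitivity * base_rate + false_positive_rate * (1 - base_rate)
p_disease_given_positive = sensitivity * base_rate / p_positive

print(f"P(disease | positive) = {p_disease_given_positive:.4%}")  # ~0.0899%
```

Out of roughly 1,100 people who test positive, only about one actually has the disease; the rest are false positives drawn from the much larger healthy population.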

When making a diagnosis, doctors (and other professionals) sometimes focus too much on the individual test results and less on the base-rate information.

As a result, they may make incorrect decisions, leading to unnecessary and even harmful treatment (Heller, 1992).

Twenty-One

The base-rate fallacy has often been illustrated with the game Twenty-One, a card game that was popularized in the United States in the early 1900s.

In this game, players are dealt cards face down and must guess whether the next card will be higher or lower than the current one. If they guess correctly, they win money; if they guess incorrectly, they lose money.

According to some accounts, gamblers who fell victim to the base-rate fallacy did so because they focused on the individual cards that were dealt rather than on the overall distribution of cards.

For example, if a player is dealt a series of low cards, he might think that a high card is more likely to be dealt next, regardless of the fact that high cards are rarer in general (Koehler, 1996).
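Taking the higher-or-lower description of the game above at face value, the sketch below (with a hypothetical deck state) shows that the relevant base rate is simply the composition of the cards still in the deck; a run of low cards matters only through what it removes from that deck.

```python
# Sketch: in a higher-or-lower guessing game (as described above), the chance that the
# next card beats the current one depends only on the make-up of the remaining deck,
# not on any feeling of being "due" for a high card after a run of low ones.
from collections import Counter

def prob_next_is_higher(current_rank, remaining_ranks):
    """P(next card outranks the current card), given the ranks still in the deck."""
    counts = Counter(remaining_ranks)
    higher = sum(n for rank, n in counts.items() if rank > current_rank)
    return higher / sum(counts.values())

# Hypothetical deck state: ranks 2-14 (ace high), four of each, minus a few low cards
# that have already been dealt.
remaining = [rank for rank in range(2, 15) for _ in range(4)]
for dealt_card in (2, 3, 4, 5):
    remaining.remove(dealt_card)

print(f"P(next card is higher than a 7) = {prob_next_is_higher(7, remaining):.2f}")  # ~0.58
```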

While this account of the base-rate fallacy is often cited in decision-making discussions, it may not actually be a valid explanation. Some evidence suggests that people who fall victim to the base-rate fallacy do so because they have difficulty interpreting and integrating information about rates.

In other words, they may not accurately assess the likelihood of an event occurring, even when given information about the base rate (Koehler, 1996).

References

Bar-Hillel, M. (1980). The base-rate fallacy in probability judgments. Acta Psychologica, 44(3), 211-233.

Barbey, A. K., & Sloman, S. A. (2007). Base-rate respect: From ecological rationality to dual processes. Behavioral and Brain Sciences, 30(3), 241-254.

Bar-Hillel, M. (1983). The base rate fallacy controversy. In Advances in Psychology (Vol. 16, pp. 39-61). North-Holland.

Heller, R. F., Saltzstein, H. D., & Caspe, W. B. (1992). Heuristics in medical and non-medical decision-making. The Quarterly Journal of Experimental Psychology Section A, 44(2), 211-235.

Kahneman, D., Slovic, P., & Tversky, A. (Eds.). (1982). Judgment under uncertainty: Heuristics and biases. Cambridge University Press.

Kahneman, D., & Tversky, A. (1972). Subjective probability: A judgment of representativeness. Cognitive Psychology, 3(3), 430-454.

Kahneman, D., & Tversky, A. (1973). On the psychology of prediction. Psychological Review, 80(4), 237.

Koehler, J. J. (1996). The base rate fallacy reconsidered: Descriptive, normative, and methodological challenges. Behavioral and Brain Sciences, 19(1), 1-17.

Macchi, L. (1995). Pragmatic aspects of the base-rate fallacy. The Quarterly Journal of Experimental Psychology, 48(1), 188-207.

Tversky, A., & Kahneman, D. (1973). Availability: A heuristic for judging frequency and probability. Cognitive Psychology, 5(2), 207-232.


Saul McLeod, PhD

BSc (Hons) Psychology, MRes, PhD, University of Manchester

Editor-in-Chief for Simply Psychology

Saul McLeod, PhD., is a qualified psychology teacher with over 18 years of experience in further and higher education. He has been published in peer-reviewed journals, including the Journal of Clinical Psychology.


Charlotte Nickerson

Research Assistant at Harvard University

Undergraduate at Harvard University

Charlotte Nickerson is a student at Harvard University obsessed with the intersection of mental health, productivity, and design.
