What is Availability Bias?
Availability bias is a human cognitive bias that causes people to overestimate the probability of events associated with memorable or vivid occurrences. Because memorable events are further magnified by media coverage, the bias is compounded at the societal level. Two prominent examples are estimates of how likely plane crashes are and how often children are abducted. Both events are quite rare, but the vast majority of the population believes they are more common than they are and behaves accordingly.
In reality, people are much more likely to die in an auto accident than in a plane crash, and children are far more likely to die in an accident than to be abducted. The majority of people think the reverse is true, however, because the less likely events are more "available," that is, more memorable. A look at the literature, or even at the interactions of daily life, reveals countless examples of availability bias in action.
Availability bias is at the root of many other human biases and culture-level effects. For instance, medieval medicine was probably barely more effective than leaving a malady to heal on its own, but because the cases in which a therapy "worked" were more available in people's minds, practicing medicine was generally considered effective whether or not it really was.
The study of this bias was pioneered by psychologists Amos Tversky and Daniel Kahneman, who founded the field of "heuristics and biases" and developed a model called prospect theory to explain systematic bias in human decision-making. Kahneman subsequently won the 2002 Nobel Prize in Economics for his work, despite having never taken an economics class. Tversky, his long-time partner in the research of heuristics and biases, died in 1996.
A concept intimately connected to availability bias is base-rate neglect. Base-rate neglect occurs when a probability judgment ignores or underweights the background frequency of an event (the base rate) in favor of vivid but less diagnostic case-specific information. An example would be admitting someone to a college based on an interview, when empirical studies have shown that past performance and grades are the best available predictors of future performance, and that interviews merely cloud the assessment. Because people like "seeing things for themselves," however, interviews are likely to continue to take place, even in the absence of any support for their effectiveness.
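The arithmetic behind base-rate neglect can be made concrete with Bayes' rule. The sketch below uses purely hypothetical numbers, not figures from any study: it assumes 1% of applicants are truly outstanding (the base rate), that a "good interview" identifies 90% of them, and that 20% of ordinary applicants also interview well. It only illustrates how a low base rate dominates the calculation.

```python
# A minimal sketch of base-rate neglect via Bayes' rule.
# All rates below are hypothetical and chosen only for illustration.

def posterior(base_rate: float, true_positive: float, false_positive: float) -> float:
    """P(condition | positive signal), computed with Bayes' rule."""
    p_positive = true_positive * base_rate + false_positive * (1 - base_rate)
    return (true_positive * base_rate) / p_positive

# 1% base rate of outstanding applicants; a good interview catches 90% of
# them, but 20% of ordinary applicants also interview well.
print(posterior(0.01, 0.90, 0.20))  # ~0.043

# The probability that a good interviewee is truly outstanding is only
# about 4%, yet an interviewer neglecting the base rate intuits a figure
# closer to 90%, because the vivid interview crowds out the statistics.
```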
Discussion Comments
Statistics make it clear that there is little or no danger in boarding a plane or leaving the house. There is danger in living, because at all times there is the potential of dying. This should not cause fear, however, because we all die at some point, and there is no use trying to reassure ourselves of safety through statistics.
When a disaster occurs, it is often examined in the light of Black Swan theory, which holds that rare, high-impact events are often unquantifiable in normal terms. Many unpredictable factors influence large-scale events, and these would slip past people who have no availability bias, or who hold a false, statistics-based assurance of safety.
Overconfidence bias happens when a person is convinced they are right on a particular issue. Sometimes such people are reassured by a large amount of recognition from others who are equally uninformed. Their confidence is a facade behind which there is little or no competence.
I think there can be an opposite effect, in which people dismiss the legitimacy of these events as a reaction to availability bias. Some say such things could never happen simply because everybody else is afraid of them happening. Then, when they actually do occur, the person is taken by surprise. This is the case with many people who go to war: they see others dying around them but may still feel invulnerable.