In general, a heuristic is a rule of thumb, a mental shortcut that helps guide our decisions. Our brains use these shortcuts to make split-second judgments when processing complex information. Heuristics are useful, but they can produce systematic errors known as cognitive biases. The availability bias arises when we judge the likelihood of an event by how easily examples of it come to mind.
Origin of Availability Heuristic
In the late 1960s and early 1970s, Amos Tversky and Daniel Kahneman began examining human judgment under uncertainty. Until then, the predominant view treated humans as rational actors. Kahneman and Tversky discovered that judgment under uncertainty relies on a limited number of simplifying mechanisms rather than extensive cognitive processing. One such mechanism is judging an event by how many similar instances come to mind. In 1973, Tversky and Kahneman labeled this phenomenon the availability heuristic.
What is Availability Bias?
The availability heuristic is a mental shortcut that relies on the immediate examples that come to our mind when evaluating a specific topic, concept, method, or decision. We tend to use readily available facts as the basis for our beliefs about comparably distant concepts, and to assume that future events will closely resemble our recent experience. Both the frequency of an event's occurrence and the ease with which we can recall it shape our judgment: we give greater weight to information that comes to mind most easily. Because of the availability bias, our perceptions of risk can be badly skewed, and we may worry about the wrong risks, sometimes with disastrous consequences. Ease of recall suggests that if something is more easily remembered, it must occur with higher probability. This distorts our understanding of real risks. For example:
- Most people judge shark attacks to be far more likely than they are, while underestimating much more common risks such as traffic accidents
- Periods of very warm or very cold weather affect our beliefs on climate change
- A movie about a nuclear disaster might convince us that a nuclear war / accident is highly likely
Decision-making bodies, from families to governments, spend inordinate time addressing unfounded fears. In doing so, they ignore much more common and controllable threats, and misdirect resources that could be put to better use elsewhere.
Demagogues have long understood the coercive power of our availability biases. They rouse crowds to fever pitch and keep public anxiety perpetually high. Historically, demagogues have leveraged this bias, tapping into our fears to shape public perception and manipulate the populace. Joseph Goebbels, the Nazi propaganda minister, mastered the technique in moving the German population and the country toward World War II:
If you tell a lie big enough and keep repeating it, people will eventually come to believe it. The lie can be maintained only for such time as the State can shield the people from the political, economic and/or military consequences of the lie. It thus becomes vitally important for the State to use all of its powers to repress dissent, for the truth is the mortal enemy of the lie, and thus by extension, the truth is the greatest enemy of the State.
On plane crashes
Perhaps you have just read a news article about a massive plane crash. The fear-invoking headline, paired with the image of a wrecked plane wreathed in flames, leaves an easily recalled impression, leading you to wildly overrate your chances of dying in a similar crash. This is the availability heuristic at work: you rate air travel as more dangerous than it is. The modes of transportation Americans chose in the aftermath of the September 11 attacks demonstrate this vividly. The availability bias instilled fear of air travel in large segments of the American population, even though flying remained far safer than driving.
Consider the numbers. In 2016, 10 plane crashes were reported by the press around the world. What the media did not emphasize is that these were 10 out of roughly 40 million flights that landed without incident that year; in fact, 2016 was the second safest year in aviation history. Alas, reporters rarely care to report such good news. So, when you are not in any immediate danger but your alarm bells chime, don't blindly submit to your fears. Assess your risks when you are calmer.
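A quick back-of-the-envelope calculation makes the point concrete. Using the article's rough 2016 figures (10 reported crashes against roughly 40 million flights; the exact numbers are illustrative, not authoritative), the per-flight risk works out to a vanishingly small probability:

```python
# Back-of-the-envelope estimate from the article's approximate 2016 figures.
# These inputs are illustrative round numbers, not official statistics.
crashes = 10
total_flights = 40_000_000

per_flight_risk = crashes / total_flights

print(f"Per-flight crash probability: {per_flight_risk:.8f}")
# -> 0.00000025, i.e. about 1 crash per 4,000,000 flights
print(f"Roughly 1 crash per {total_flights // crashes:,} flights")
```

Seen this way, a handful of vivid headlines stands against millions of uneventful flights; the availability heuristic makes the headlines loom far larger than the base rate warrants.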
People tend to assess the relative importance of issues by the ease with which they are retrieved from memory—and this is largely determined by the extent of coverage in the media.
This heuristic also manifests itself in business, stock markets, economics, weather forecasting, and more, where our short-term analyses are often not only invalid but also unhelpful and misleading. For example, a legion of economists pronounced the 2007-2008 financial crisis unthinkable right up until it happened. The booming economy and the US housing bubble that preceded it were read as positive indicators of an upward economic trajectory, offering no apparent justification for the worst-case scenario of a global economic crisis, which nevertheless followed. In this case, the economists fell prey to the availability heuristic.
On a smaller scale, a study by Karlsson, Loewenstein, and Ariely (2008) showed that people are more likely to purchase insurance to protect themselves after a natural disaster they have just experienced than they are to purchase insurance on this type of disaster before it happens. In support of this study, Max Bazerman adds:
This pattern may be sensible for some types of risks. After all, the experience of surviving a hurricane may offer solid evidence that your property is more vulnerable to hurricanes than you had thought or that climate change is increasing your vulnerability to hurricanes.
The longer we preoccupy ourselves with an event, the more available it becomes in our minds, and the more probable we believe it to be. The problem is that certain events tend to stand out in memory more than others. Excessive media coverage can cause this, as can the novelty or drama surrounding an event. Because the event is so unusual, it takes on greater significance, leading us to incorrectly assume that it is much more common than it really is.
The attention which we lend to an experience is proportional to its vivid or interesting character; and it is a notorious fact that what interests us most vividly at the time is, other things equal, what we remember best. (William James)
Heuristics play an important role in how we make decisions and act on information in the world around us. The availability heuristic can be a helpful tool, but it is important to remember that it can lead to incorrect assessments. Just because something looms large in our memories does not mean it is more common. Hence, it helps to rely on multiple tools, reliable data, and sound decision-making strategies when making decisions and choices.