Denominator neglect is a behavioural phenomenon that affects how people respond to data presented in particular ways. We are good at thinking in anecdotes; our brains are wired to spot patterns and potential causal connections among specific events. But when our brains are tasked with predicting chances and dealing with uncertainty, they easily succumb to confusion.
Intrinsically, our brains substitute an easier problem for a complex one and answer that instead, without realizing it. The tendency extends to mathematical problems: as humans, we fail miserably at appraising large-scale phenomena, such as evaluating the risk of a global pandemic or predicting election outcomes.
The idea of denominator neglect helps explain why different ways of communicating risks vary so much in their effects.
To demonstrate denominator neglect, Kahneman presents two situations in his book.
Scenario: Urn game
There are two large urns full of white and red marbles. If you pull a red marble from an urn, you are a winner.
- The first urn has 10 marbles in it, 1 of which is red
- The second urn has 100 marbles in it, 8 of which are red
Which urn would you choose?
Statistically, you should try your luck with the urn that has 10 marbles: 1 out of 10 (10%) of its marbles are red, whereas in the second urn only 8 out of 100 (8%) are. It shouldn't be a tricky decision, since your chances of drawing a red marble from the first urn (10%) are greater than from the second (8%).
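The arithmetic becomes trivial once the denominators are made explicit; here is a minimal sketch in Python, using the urn sizes from the scenario above:

```python
# Urn game: compare the win probability of each urn.
red_small, total_small = 1, 10    # first urn: 1 red out of 10
red_large, total_large = 8, 100   # second urn: 8 red out of 100

p_small = red_small / total_small  # 0.10
p_large = red_large / total_large  # 0.08

print(p_small > p_large)  # True: the small urn gives better odds
```

The vivid "eight winning marbles" only looks better while the totals stay out of sight.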
As Daniel Kahneman describes in Thinking, Fast and Slow:
About 30-40% of students (the survey participants) choose the urn with the larger number of winning marbles, rather than the urn that provides a better chance of winning…Vivid imagery contributes to denominator neglect…When I think of the small urn, I see a single red marble…When I think of the larger urn, I see eight winning marbles.
This is an example of denominator neglect – when somebody focuses on the headline number indicating the occurrence of an event (picking the red marble – the numerator), with little consideration for the number of times that event could occur (all the marbles – the denominator).
In other words, our view of the probability of an event occurring is skewed by the disproportionate attention we give to the absolute number of winning marbles.
The effects of strategically choosing the right scale (i.e. the right denominator) can be dramatic. In one study, respondents judged a disease that kills 1,200 out of every 10,000 afflicted individuals to be more dangerous than one that's twice as lethal, killing 24 out of every 100. Rationally speaking, the second disease is clearly worse, since it kills twice the proportion of afflicted people. Yet people still found the first disease more dangerous, simply because the number 1,200 is much bigger than the number 24. In the process, they completely neglected the denominator and blundered into the wrong conclusion.
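Making both mortality rates explicit shows how far the intuitive judgment drifts from the numbers; a quick sketch using the figures from the study described above:

```python
# Mortality rates under the two framings from the study.
deaths_a, afflicted_a = 1_200, 10_000  # "1,200 out of every 10,000"
deaths_b, afflicted_b = 24, 100        # "24 out of every 100"

rate_a = deaths_a / afflicted_a  # 0.12, i.e. 12%
rate_b = deaths_b / afflicted_b  # 0.24, i.e. 24%

print(rate_b / rate_a)  # 2.0: the second disease is twice as lethal
```

The headline number 1,200 dominates attention, even though the rate it belongs to is half as large.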
Case – Vaccination study
The denominator neglect effect is more pronounced in cases of rare events, i.e., events with a very low probability of occurrence, such as drug side effects, terrorist attacks, plane crashes, earthquakes, shark attacks, etc. For example:
3,000 dead in the attacks on the World Trade Center is easier to imagine than saying that there is a 0.001% probability of falling victim to an attack
The effect is so powerful that even experts regularly fall for it. In fact, this effect also explains how our perceptions of risk vary so much. As Kahneman writes:
You read that a vaccine that protects children from a fatal disease carries a 0.001% risk of permanent disability. The risk appears small. Now consider another description of the same risk: One of 100,000 vaccinated children will be permanently disabled. The second statement does something to your mind that the first does not: it calls up the image of an individual child who is permanently disabled by the vaccine; the 999,999 safely vaccinated children have faded into the background.
The larger point is that people concentrate on absolute values in most cases and don’t take the denominator into account. This also explains why people are more likely to spend in a stronger currency than a weaker one. As Gilovich and Ross write:
People are more likely to buy expensive brand-name products when they are priced in a strong currency like the British pound that results in a relatively small price tag (318 pounds for an Apple iPad with retinal display) than when priced in a weak currency like the Mexican peso that results in a relatively large price tag (6,395 pesos for the same iPad).
Manipulation with denominator neglect
The clear and bite-sized representation of statistical facts for our System 1 – the subconscious – offers a lot of room for manipulation. If I want to make a risk appear particularly low, I should always choose an appropriate mathematical formulation.
The psychologist Paul Slovic presents a few vivid examples.
Each year in the United States, approximately 1,000 people are murdered by mentally ill people who have not taken their medication regularly. Now compare that with an alternative framing: 1,000 out of 273,000,000 Americans die this way every year, or the annual probability of being murdered by a mentally ill person is 0.00036%. The first framing immediately produces an uneasy feeling of fear: you start to suspect a knife-wielding sociopath around every corner.
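Converting the vivid absolute count into the probability framing takes a single division; a sketch using Slovic's figures from the example above:

```python
# Reformulating an absolute count as an annual probability
# (figures from the example: ~1,000 victims, ~273 million Americans).
victims = 1_000
population = 273_000_000

probability = victims / population
print(f"{probability:.6%}")  # 0.000366% per year
```

Seen as a rate rather than a body count, the risk shrinks to near-invisibility.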
In the second case you have to strain your brain [System 2] to realize that the probability of encountering such a murderer is extremely low. In any case, the probability of dying from diabetes, cardiovascular disease, cancer or road traffic is significantly higher!
Indians and Gadgets
This effect also explains why many Indians buy gadgets when they go abroad. In many cases, the goods Indians buy abroad are cheaper, but in several other cases, they simply appear cheaper because they are priced in a much stronger currency.
Overcoming denominator neglect
To overcome the neglected denominator, Daniel Kahneman suggests a fundamentally simple procedure:
Whenever a probability is presented to you as absolute numbers (individual fates), reformulate the statement and derive a mathematical probability from it. Of course, this presupposes that you engage your System 2 – conscious thinking. Especially in very turbulent times, you shouldn't let your emotions and fears guide you. Actively pursue the facts and follow recognized subject matter experts rather than self-proclaimed YouTube scientists.
When dealing with data, check whether the presentation is fair and balanced; if not, take the time to understand the truth behind the numbers, and work against any insidious bias at play. This is especially important when dealing with risky situations: emotive subjects push us towards instinct and judgement, even when the fractions – the actual rates behind the absolute numbers – tell a completely different story.