Idea in short

The straight-line instinct describes our tendency to assume that a trend will continue along a straight line. In reality, trends rarely progress in a straight line forever. In Factfulness, Professor Hans Rosling explains this pattern using the world population as an example. Despite popular misconceptions, the world population does not rise linearly. According to the Gapminder Foundation, factfulness is remembering that straight-line trends are rare in reality and recognizing such trends when they do occur. The world is more than just straight lines; trends can follow many other types of curves, including asymptotic growth, the S-curves of technology adoption, and exponential growth (especially dangerous when it comes to diseases). In short, important trends are best represented by curves. To understand a trend, we should understand what type of curve it follows. Simply assuming a straight line leads to false conclusions.
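To make the contrast concrete, here is a minimal sketch of how a straight-line projection and an S-curve projection can agree early on and then diverge wildly. All numbers are illustrative, chosen for the example; they are not real population or adoption data.

```python
import math

def linear_projection(t, start=1.0, rate=0.5):
    """Straight-line instinct: assume constant absolute growth per period."""
    return start + rate * t

def s_curve_projection(t, cap=20.0, steepness=0.3, midpoint=10.0):
    """Logistic S-curve: growth accelerates, then slows as it nears a cap."""
    return cap / (1 + math.exp(-steepness * (t - midpoint)))

# Early on, the two projections are nearly indistinguishable...
for t in range(6):
    assert abs(linear_projection(t) - s_curve_projection(t)) < 0.5

# ...but extrapolating the straight line far into the future overshoots
# badly, while the S-curve levels off at its cap.
print(linear_projection(100))   # grows without bound
print(s_curve_projection(100))  # approaches the cap of 20
```

The point is not the specific parameters but the shape: observing the early data points alone cannot tell you which curve you are on, which is exactly why the straight-line assumption must be tested rather than taken for granted.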

Case for statistical literacy

When asked by The Harvard Gazette about the one thing he would like to change about our world, psychologist Steven Pinker opines that many people succumb to:

the cognitive bias of assessing the world through anecdotes and images, rather than data and facts.

Instead of being swayed by startling news reports and pithy tweets, we need to educate people to make informed decisions based on reliable data. He also advocates for factfulness becoming an inherent part of modern culture.

We need to make “factfulness” (as Hans, Ola, and Anna Rosling call it) an inherent part of the culture of education, journalism, commentary, and politics.

A proponent of reducing statistical illiteracy, Steven Pinker wishes that:

An awareness of the infirmity of unaided human intuition should be part of the conventional wisdom of every educated person. Guiding policy or activism by conspicuous events, without reference to data, should come to be seen as risible as guiding them by omens, dreams, or whether Jupiter is rising in Sagittarius.

The relatable aspect of this instinct is the human need to find patterns and to expect that those patterns will continue into the foreseeable future. In doing so, we overlook the other developments that influence the current trajectory of events. Trends can increase linearly, decrease, or simply follow a random course. Hence, one cannot hold the current pattern as a baseline for predicting the future.

The Population Bomb

In 1968, the American biologist Paul R. Ehrlich published The Population Bomb. The book portends doom and gloom; it made dire predictions and triggered a wave of repression around the world. It is also one of the best-known examples of the straight-line instinct and its associated fallacy. The author projected population growth between 1970 and 1980 and predicted the death of hundreds of millions of people from famine and disease.

According to Ehrlich, before the 20th century was over, there would be millions upon millions of starvation deaths across the globe. He also said that we would run out of oil and that there would be shortages of many other resources. As a biologist, Ehrlich studied animal populations. If an animal population grows unchecked, it eventually outstrips its food supply, and the result is massive starvation. Ehrlich argued that this would eventually happen to humans. He prophesied that it was only a matter of a decade or two until this mass human starvation would occur.

Hundreds of millions of people are going to starve to death. Nothing can prevent a substantial increase in the world death rate.

Published at a time of tremendous conflict and social upheaval, this book argued that many of the most alarming events had a single, underlying cause:

Too many people, packed into too-tight spaces, taking too much from the earth. Unless humanity cut down its numbers—soon—all of us would face “mass starvation” on “a dying planet.”

Much to everyone's relief, it became clear by the end of the 1970s that this prediction had not come true. The straight-line instinct got the better of us. In making his prediction, Ehrlich overestimated famines and death rates; these had, in fact, declined over the years.

The power and misuse of statistics

Lies, damned lies, and statistics (Benjamin Disraeli)

Mark Twain popularized this saying in Chapters from My Autobiography, published in the North American Review in 1907:

Figures often beguile me, particularly when I have the arranging of them myself. There are three kinds of lies: lies, damned lies, and statistics.

Numbers have persuasive power, especially when used appropriately to bolster one’s viewpoint. On the other hand, statistics, when used deceptively, can beguile a casual observer into believing something to be true when the data prove otherwise. As Mark Suster quipped in a 2010 Business Insider article, 73.6% of all statistics are made up.

In some cases, the misuse may be accidental. In others, the perpetrator may purposefully doctor the facts to further a hidden agenda. As history has shown, dictators have understood the power of statistics to foment discontent and radicalize people’s sentiments:

The death of one man is a tragedy. The death of millions is a statistic. (Joseph Stalin)

Tragedy of the commons

Our brains are hardwired to operate in very limited ways. We hastily make generalizations based on our experience and limited observations. For example, if something we do today gets us what we want, tomorrow we’ll do more of the same and expect the same outcome. If we don’t get what we want, we are disappointed. Most of us can appreciate this abstract example from a metaphysical standpoint. However, our inertia and emotional misgivings inhibit us from pursuing an alternate course of action.

Insanity is doing the same thing over and over again and expecting different results. (Albert Einstein)

Homo economicus and the tragedy

In game theory, homo economicus – the rational human being – is modeled under the premise of perfect rationality. The theory assumes that we always act in a way that maximizes utility and that we are capable of making complex deductions to achieve the best possible outcome. In other words, it assumes that we will always think through all possible outcomes and choose the course of action that leads to the best possible result. However, the tragedy of the commons proves otherwise:

This phenomenon describes the human behaviors in a shared-resource system where individual users, acting independently according to their own self-interest, behave contrary to the common good of all users by depleting or spoiling the shared resource through their collective action.

Example of a tragedy of the commons

Assume that today I throw out a medium-sized net and catch 500 fish. Motivated by my harvest, tomorrow I’ll throw out a larger net and catch 1,000 fish. My fellow fishers will emulate my success by casting wider nets. Soon, we’ll all end up catching 1,000 fish each day.

Before long, the average size and number of fish we catch will start to diminish. As we’ve already caught most of the big fish, we’ll be netting smaller ones. Soon we’ll catch all of these too, before we realize that there are no more fish left to catch.

In this example, my fellow fishers and I based our actions on the linear assumption that the future will follow the same trajectory as the past.
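The dynamic above can be sketched as a toy simulation. Everything here is hypothetical: the stock size, the regeneration rate, and the rule that each fisher doubles their target after a good day are invented solely to illustrate how straight-line extrapolation of catches can deplete a shared resource.

```python
def simulate_fishery(fishers=5, stock=100_000, regen=0.2, days=30):
    """Toy model: each fisher doubles their effort after a successful day
    (straight-line thinking), ignoring that the shared stock is finite."""
    target_per_fisher = 500          # everyone starts with a medium net
    catches = []
    for _ in range(days):
        demand = fishers * target_per_fisher
        caught = min(demand, stock)  # can't catch fish that aren't there
        stock -= caught
        stock = int(stock * (1 + regen))  # only the remaining stock regrows
        catches.append(caught)
        if caught == demand:
            target_per_fisher *= 2   # success breeds bigger nets
        else:
            target_per_fisher = max(caught // fishers, 1)
    return catches, stock

catches, remaining = simulate_fishery()
print(max(catches))   # catches boom at first...
print(catches[-1])    # ...then the fishery collapses to nothing
```

In this sketch the catches grow geometrically for a few days, then the stock is wiped out faster than it can regenerate and every subsequent catch is zero: individually rational effort, collectively ruinous outcome.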

Danger of the straight-line instinct

As the above example shows, cognitive biases such as the straight-line instinct make our lives difficult. On the one hand, we assess things much more dramatically than they really are. Such misinterpretations lead to negative outcomes in the future.

On the other hand, we derive comfort from foreseeing our future through our intrinsic straight-line instinct. From the comfort zone we are familiar with, we envision and play out scenarios that may positively or negatively impact the future, and we make plans to deal with the adverse scenarios and effect positive outcomes.

Tips for consultants

As consultants, we, consciously or otherwise, make decisions on behalf of our clients. This comes with great responsibility. Recommendations based on shaky assumptions that the future will resemble a familiar past may adversely impact your client. It stands to reason that our decisions lose their value if the predictions behind them turn out to be false. So, stress-test your assumptions.

Validate your hypothesis through market research, client conversations, benchmarks, playing devil’s advocate, war games, or any other methodology at your disposal. Seek disconfirming evidence. If the data prove your assumption wrong, revise the recommendation and gracefully accept the flaw.

When using hypotheses, consultants should seek to invalidate them, not confirm them. Unfortunately, this is not always the case. There are many cases of consultants interpreting data in a biased manner until they arrive at the result their clients expect to see. Such confirmation bias is problematic; the client will continue ineffective programs and initiatives past the point of no return. Confirmation bias also wastes a huge amount of time and funding. We must not take client opinions at face value and must be aware of the role of biased reporting.

In the long run, clients value honest opinions that shape their future more than ones that merely confirm their short-term beliefs. Make sure you educate your client stakeholders on the underlying assumptions to avoid a faux pas later. The last thing you should do is pursue the status quo and lead your client to ruin. Neither you nor your client would want that!