Biases: why you are not as rational as you like to think

Because of our cognitive biases, we tend to make systematic errors in our thinking.

Solve the following simple puzzle.

 

A table tennis bat and ball together cost 1.10 euros.

The bat costs 1 euro more than the ball.

How much does the ball cost?

 

Answer the question before reading on.

 

The plausible, intuitive, but wrong answer to this puzzle is 10 euro cents. If the ball cost 10 euro cents, the bat would cost 1.10 euros (1 euro more) and the total would be 1.20 euros, not 1.10 euros. The correct answer is 5 euro cents.
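A little algebra shows why. Call the price of the ball x. The bat then costs x + 1, so together they cost x + (x + 1) = 2x + 1 = 1.10 euros. That gives 2x = 0.10 euros, so x = 0.05 euros: the ball costs 5 cents, the bat costs 1.05 euros, and the total is exactly 1.10 euros.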

 

If you answered 10 euro cents, you followed your intuition and didn't bother to check whether your answer was correct. You may take comfort in the fact that more than 50% of students at three leading US universities gave the same incorrect answer; at less prestigious US universities, more than 80% did. These are remarkable figures, because checking the answer requires only a few seconds of moderate mental effort.

 

If you came up with the right answer, you probably thought of the intuitive but incorrect answer first and managed to resist it. Perhaps the plausible answer made you suspicious: the puzzle couldn't be that simple, could it? Or maybe you are simply more sceptical of your intuition and less easily satisfied with superficially plausible answers; in other words, more rational.

 

We are not as rational or logical as we like to think

The human mind is inherently lazy and reluctant to put in more effort than is strictly necessary. It generally follows the law of least effort: it takes the least demanding course of action. We tend to rely on our intuitive hunches (some people more than others) because questioning them requires mental effort, which we find somewhat unpleasant and avoid whenever possible. As a result, we stop thinking as soon as a superficially plausible answer occurs to us.

 

When information is scarce, our mind operates as a machine for jumping to conclusions. To make sense of the world, it continually constructs the best possible coherent story based on the information available to it. It obviously cannot take into account information it doesn’t have, so only the available evidence counts. The less our mind knows, the easier it is to build a coherent story, because there are fewer elements to connect. Insufficient evidence can lead to very convincing stories.

 

The subjective confidence that people have in their mind-constructed stories is based on the coherence of these stories (confidence through coherence), not on the quantity or quality of the evidence behind them. Feelings of high confidence mean that the constructed story is coherent, not necessarily that it is true. It is the consistency of the information that matters, not its completeness. Some of our most important beliefs are not based on evidence at all, but solely on the fact that people we love and trust hold them.

 

Jumping to conclusions is an efficient strategy if the leap saves a lot of time and effort, the conclusions are usually correct, and the occasional errors have acceptable costs. It is a risky strategy in unfamiliar situations, when the stakes are high and when there is insufficient time to gather the necessary information.

 

Often our mind-constructed stories are close to reality and support reasonable judgements and decisions. For example, when we buy a house after carefully weighing all the relevant pros and cons. Or when we respond to an insult in an emotionally mature way after first calming ourselves by taking a few slow, deep breaths and counting to ten. Or when we choose a degree programme after extensive research into the aspects that matter most to us.

 

But because of our cognitive limitations, we regularly make systematic thinking errors that lead to unreasonable judgements and decisions. For example, we jump to wrong conclusions because we don’t check whether our conclusions are correct. Or we move to a Mediterranean climate, only to discover that our focus on the sunny climate blinded us to the downsides of the move.

 

Heuristics

Heuristics are mental shortcuts that allow us to make judgements and decisions quickly and efficiently by using rules of thumb rather than gathering all information and weighing all relevant factors. Heuristics are often applied automatically and without conscious awareness. They protect us against information overload and analysis paralysis.

 

The essence of intuitive heuristics is that when faced with a difficult question, we often answer an easier question instead, usually without noticing the difference. For example, the affect heuristic is a largely unconscious mental shortcut through which we answer the easy question ‘How do I feel about this?’ instead of the much more difficult question ‘What do I think about this?’.

 

Heuristics usually serve us well, but they can produce systematic errors in our thinking (cognitive biases), often without us being aware of these errors. For example, the availability heuristic is our tendency to judge the relative importance of issues by the ease with which examples are retrieved from memory. Violent crimes, plane crashes and shark attacks appear to be more common than they actually are because these events receive a lot of media attention and are easily recalled. The affect heuristic is our tendency to base our judgements and decisions directly on feelings of like and dislike. This makes us susceptible to government fear campaigns and to advertising designed to evoke positive emotions, such as environmental organisations using cute animals in their appeals.

 

Biases

Psychological scientists define bias as our tendency to make systematic errors when judging or making decisions, due to perceptual, cognitive, social, or cultural processes. Biases are a natural part of human perception and cognition and affect everyone to some extent. The processes that create biases can distort our interpretation and understanding of reality, and lead to judgements and decisions that are inconsistent with logic, evidence, or our own interests. Biases are usually not intentional, but the result of automatic and unconscious processes. They can be aggravated by external factors such as stress, time pressure and emotional states. 

 

Cognitive biases are systematic errors in our thinking that predictably recur in specific circumstances. These thinking errors can distort our interpretation and understanding of reality, which can influence our judgements, decisions and actions. Cognitive biases can result from various cognitive processes and mechanisms, such as the use of heuristics, memory distortions, motivated reasoning, anchoring, valuation of gains and losses, and ease of retrieval from memory.

Perceptual biases are systematic errors in the way we perceive sensory information from the environment. These perceptual errors can distort our interpretation and understanding of sensory input, which can impact our judgements, decisions, and actions. An example of a perceptual bias is the contrast principle.

 

Being constantly vigilant against the negative consequences of biases is impractical and mentally tiring, but it can be worth the effort in situations where the stakes are high. By becoming aware of biases and consciously working to reduce their negative impact, we can make better judgements and decisions: for example, by actively questioning our assumptions, seeking additional information, evaluating options, considering evidence objectively, seeking input from others, or considering alternative perspectives (frames).

 

Common biases and heuristics

This section provides a brief description of some common biases and heuristics. For each of these a link is provided to a detailed description of the bias, including techniques to reduce its impact. 

 

👉 The affect heuristic is our tendency to make judgements and decisions based on our current emotions.

 

👉 The anchoring effect is our tendency to make estimates that are biased toward the initially proposed value (the anchor).

 

👉 The authority principle is our tendency to comply with requests and follow instructions from people in positions of authority.

👉 The availability heuristic is our tendency to assume that things that come to mind easily are more common than they really are.

👉 Confirmation bias is our tendency to search for and favour information that supports our pre-existing beliefs, and to ignore or devalue information that contradicts these beliefs.

👉 The consistency principle is our tendency to act consistently in accordance with what we have previously said or done.

  

👉 The contrast principle is our tendency to perceive differences between two things to be greater than they actually are when those things are experienced one after the other.

 

👉 The focusing illusion is our tendency to overestimate the importance of something because we focus our attention on it.

 

👉 The framing effect is our tendency to be unjustifiably influenced by the way information is formulated (framed).

 

👉 The fundamental attribution error is our tendency to overestimate the role of personality traits and underestimate the role of environmental influences in explaining the behaviour of others.

👉 Hindsight bias, also known as the ‘I-knew-it-all-along’ effect, is our tendency to view past events as more predictable than they actually were.

👉 The liking principle is our tendency to be more easily persuaded by people we like.  

 

👉 Loss aversion is our tendency to prefer to avoid losses rather than achieve equivalent gains.

 

👉 The narrative fallacy is our tendency to create incorrect cause-and-effect stories out of one or more events.

 

👉 Negativity bias is our tendency to pay more attention and respond more emotionally to negative stimuli than to positive ones. 

 

👉 Optimism bias (or optimistic bias) is our tendency to overestimate the likelihood that good things will happen to us and underestimate the likelihood that bad things will happen to us. 

 

👉 The planning fallacy is our tendency to underestimate the time, costs, and risks of future tasks, while overestimating their benefits.

👉 Present bias is our tendency to prefer immediate rewards at the expense of future rewards.

 

👉 The reciprocity principle is our strong tendency to feel obligated to reciprocate favours received.

👉 The scarcity principle is our tendency to assign a higher value to things that are perceived as scarce.

👉 The social proof principle is our tendency to look at the actions or beliefs of others to determine what is appropriate or correct in a given situation.

👉 The sunk cost fallacy is our tendency to keep investing resources in something, even when doing so is no longer rational or beneficial, simply because we have already invested unrecoverable resources in it.

👉 Survivorship bias is our tendency to focus on the successful outcomes (survivors, winners) of a particular situation, while overlooking the unsuccessful outcomes (failures, losers).

👉 Truth bias is our tendency to assume that others are telling the truth unless we have reason to doubt them.

👉 The unity principle is our tendency to favour those we consider to be one of us.

👉 The WYSIATI (What You See Is All There Is) bias is our tendency to base judgements and decisions solely on the information available, without considering what may be missing.

References

Thinking, Fast and Slow, by Daniel Kahneman

Influence, New and Expanded: The Psychology of Persuasion, by Robert B. Cialdini

5 Common Mental Errors That Sway You From Making Good Decisions, by James Clear, https://jamesclear.com/common-mental-errors

12 Common Biases That Affect How We Make Everyday Decisions, by Christopher Dwyer, Psychology Today, https://www.psychologytoday.com/us/blog/thoughts-on-thinking/201809/12-common-biases-that-affect-how-we-make-everyday-decisions

What Are Heuristics?, by Kendra Cherry, Verywell Mind, https://www.verywellmind.com/what-is-a-heuristic-2795235

Why do we take mental shortcuts?, The Decision Lab, https://thedecisionlab.com/biases/heuristics
