A story about two minds: the vast difference between real and perceived risk

ElevenPaths, 22 October 2018
“In our society it is generally not considered justifiable to make a
decision purely on an emotional response. We want to be considered scientific
and rational, so we come up with reasons after the fact to justify our choice.”
—Raj Raghunathan, McCombs School of Business, The University of Texas at Austin

Look carefully at the following figures. Which one looks larger to you: the one
on the right or the one on the left?

[Image: the Ebbinghaus visual illusion]

Yes, they are in fact identical. However, even once you know it, and even if you measure them with a ruler, you cannot help seeing the one on the right as larger, can you?


This is the well-known Ebbinghaus visual illusion, and I am sure you have seen many
more like it. But did you know that not all the illusions that captivate us
in everyday life are visual? There are others that are much more dangerous:
illusions of the mind, or "cognitive" illusions.

These cognitive illusions make us more afraid of taking a flight than of
driving a car, or lead us to think that more people die of accidents than
of heart disease. We are awful at assessing risk. According
to psychologists, we lead ourselves astray in several ways:

  1. Habit distortion: the more familiar you are with a risk, the more you are
     exposed to it and the more used you are to mitigating it, the less risky it
     seems. Nothing happens when you use your mobile phone while driving, right?
     Conversely, you tend to overestimate exceptional or unexpected risks.
  2. Time distortion: you under-react to a slowly growing risk or to a long-term
     risk. Those cigarettes you know you shouldn't smoke… one by one, they seem
     less harmful. You know what I am talking about, right? Conversely, you tend
     to overreact to immediate risks.
  3. Intentionality distortion: people overreact to risks that are personified,
     intentional or heavily covered by the media. Am I right in saying that your
     reaction is different when someone deliberately throws a stone at you than
     when the same stone hits you because it came loose from a cornice or was
     blown by the wind? Conversely, you under-react to chance events and natural
     risks.

If we are rational animals, the apex of evolution, why are we so awful
at assessing risk, and why do we make such bad decisions? Because we are not
as rational as we think. Because, in fact, we do not have one mind: we have two!

Dionysus and Apollo coexisting in the same brain

We have two modes of thought: one is intuitive and automatic,
the other reflective and rational. They are known as the AUTOMATIC and REFLECTIVE
systems, or SYSTEM 1 and SYSTEM 2.
[Image: cover of Thinking, Fast and Slow]

As Daniel Kahneman explains in his great book Thinking, Fast and Slow:

“System 1 operates automatically and quickly, with little or no effort and
no sense of voluntary control.

System 2 allocates attention to the effortful mental activities that demand
it, including complex computations. The operations of System 2 are often
associated with the subjective experience of agency, choice, and concentration.

When we think of ourselves, we identify with System 2, the conscious,
reasoning self that has beliefs, makes choices, and decides what to think about
and what to do. The automatic operations of System 1 generate surprisingly
complex patterns of ideas, but only the slower System 2 can construct thoughts
in an orderly series of steps.”

Some examples of operations from both systems: System 1 detects that one object is more distant than another, completes the phrase "bread and…", and drives a car on an empty road; System 2 focuses on the voice of a particular person in a crowded and noisy room, fills in a tax form, and checks the validity of a complex logical argument.

There is no dichotomy between System 1 and System 2: they are not two homunculi
sitting on our shoulders and whispering in our ears. We are neither perfectly
rational nor completely emotional and instinctive. We are both, all the time.
They are the intertwined components of a single system. It is true that
sometimes we use one more than the other, but both are engaged in risk
assessment.

If you can't answer a difficult question, just replace it with an easy one

Why did our brain evolve toward this division of labour between System 1 and
System 2? Because it is remarkably efficient: it minimizes effort and optimizes
performance. Most of the time this division of labour serves us well because, as
Daniel Kahneman puts it in Thinking, Fast and Slow:

“System 1 is generally very good at what it does: its models of familiar
situations are accurate, its short-term predictions are usually accurate as
well, and its initial reactions to challenges are swift and generally
appropriate. System 1 has biases, however, systematic errors that it is prone
to make in specified circumstances.”

Where do these biases come from? Answer the following Yes/No questions and it
will become clearer to you.

  1. Do you think that cyberterrorism is a big threat to citizen security?
  2. Do you think that crypto mining represents a serious threat to citizen
     security?
  3. Do you think that using a smartphone with an Internet connection is a grave
     threat to citizen security?
  4. Finally, the most important one: do you have at your disposal the data and
     facts required to give a full, critical and reasoned answer to the first
     three questions?

I don't know your answers to the first three questions, but I would bet
that you answered the last one with a resounding NO. No one thinks they
have all the facts needed to answer them! Even so, you answered Yes or No
to the first three, because you have intuitive knowledge of the risk posed by
those threats, built from your experience and reading.

This is how we operate most of the time: we must continually make judgements
and decisions, even when we lack the necessary data and facts, the time to
collect them, or the intellectual capacity to process them fully. Our
rationality is limited, or "bounded", as Herbert Simon called it.


The modern world is too complex, and our minds too limited, to process all the
information before making a decision. That is why, instead of seeking optimal
procedures that maximize utility functions, we use heuristics. When we face a
difficult question, we often answer an easier one instead, usually without
realising it. A heuristic is a simple procedure that helps us find adequate,
though often imperfect, answers to difficult questions.

That is the origin of our biases. That is why there is often a vast gap between
our assessment of a risk and the real risk. Our Systems 1 and 2 developed in an
environment where threats were relatively easy to understand: a predator leaping
on you, a fire spreading, or a member of another tribe looking at you with a
grim face while holding a hidden object. When assessing risk in the context of
modern society, System 1 often fails miserably, while System 2 is unable to take
control. Our brains are stuck with heuristics from hundreds of thousands of
years ago, suited to the primitive life of small social groups living in nature.
They have not had time to release an update for the 21st century.
We need to run a new operating system on hardware that is over 100,000 years old
This old software, riddled with bugs and poorly patched, is error-prone. When a
heuristic fails, our feeling of security drifts away from the reality of
security. Sometimes we pay more attention to the most publicized or most
threatening risk instead of the most prevalent but less newsworthy or striking
one. We even create new risks while trying to avoid old ones. Security is always
a trade-off: if the severity of a risk is misjudged, the resulting security will
be inadequate. This is why it is important to learn to overcome cognitive biases
when making security-related decisions.
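As a purely illustrative sketch (again with invented figures), the following snippet shows what "security is always a trade-off" means in practice: if a fixed budget is split according to perceived severity rather than real expected loss, the most damaging threats end up under-protected.

```python
# Toy illustration of a mis-calibrated security trade-off; all figures are hypothetical.

BUDGET = 100.0  # arbitrary units to split across mitigations

# threat: (real expected annual loss, perceived severity score)
threats = {
    "targeted malware":        (5.0,  80),  # feels scary, heavily reported
    "lost or stolen laptops":  (40.0, 15),  # mundane but frequent
    "weak internal passwords": (55.0,  5),  # invisible until it hurts
}

total_perceived = sum(perceived for _, perceived in threats.values())

for name, (real_loss, perceived) in threats.items():
    share = BUDGET * perceived / total_perceived
    print(f"{name:25s} real loss={real_loss:5.1f}  budget share={share:5.1f}")

# With these invented numbers, 80% of the budget goes to a threat responsible
# for only 5% of the real expected loss: the security obtained is inadequate
# not because the spending is small, but because the severity was misjudged.
```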
To sum up, risk perception is a single but multifaceted system: each facet
contributes to our judgments about the threats hanging over us. In upcoming blog
entries we will explain why it is so difficult to think statistically and why,
as a result, we are so bad at assessing risk, which leads to bad security
decisions. We will take a closer look at how the brain works, in order to
understand the limits of our bounded rationality and to be on guard against our
most devastating thinking errors.
Gonzalo Álvarez de Marañón

Innovation and Labs (ElevenPaths)
