# The Confirmation Bias: we seek information that confirms our decisions and reject the evidence against them

##### ElevenPaths10 December, 2018
Imagine yourself taking part in a lab experiment. You’re asked to analyze the following number sequence:
2, 4, 6
This sequence follows a rule. What do you think the rule is? You may propose further three-number sequences to the experiment leader, who will tell you whether each proposed sequence follows the rule or not. You can propose as many three-number sequences as you wish. As soon as you believe you have discerned the rule, announce it to the experiment leader, who will tell you whether you got it right.
So, what is the first three-number sequence you would propose in order to discern the rule behind 2, 4, 6?
I’m sure that, as soon as you saw the sequence 2, 4, 6, the first rule that sprang to mind was “even numbers increasing by two”. I have run this experiment many times with scientists from all fields as well as with security professionals. So far, in 100% of cases, people propose sequences such as 8, 10, 12. That is, they put forward three consecutive even numbers in order to confirm their hypothesis. Would you have proposed a similar sequence?
I confirm that, indeed, sequences such as 8, 10, 12 or 10, 12, 14 follow the rule. Then they put forward sequences such as 100, 102, 104. Would you have proposed something like this as well?
Once they have suggested two or three more sequences like these, they firmly believe they have got it and announce: “The rule is even numbers increasing by two”.
Of course, this is not the rule! At this stage, they change their hypothesis and suggest sequences such as 11, 13, 15. Once again, I confirm that these follow the rule, so they feel encouraged and put forward 101, 103, 105. It follows the rule as well. Now they announce: “The rule is numbers increasing by two”.
But that is not the answer either! Some then suggest 5, 3, 1, but this sequence does not follow the rule. And this is how, little by little, they finally reach the real rule. Have you guessed it already?
The rule is: any sequence of numbers in ascending order, regardless of the difference between each number and the next, for instance 1, 100, 1000.
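To make the setup concrete, here is a minimal sketch of the experiment in Python (an illustration of my own, not part of Wason’s original verbal task). The hidden rule is simply “strictly ascending”:

```python
# The experimenter's hidden rule: any three numbers in strictly
# ascending order, whatever the gaps between them.
def follows_rule(seq):
    a, b, c = seq
    return a < b < c

# Sequences chosen to confirm "even numbers increasing by two" all pass...
print(follows_rule((8, 10, 12)))      # True
print(follows_rule((100, 102, 104)))  # True

# ...but so does a sequence that hypothesis would never generate.
print(follows_rule((1, 100, 1000)))   # True
```

Note that every confirming probe returns True, so it cannot distinguish the participant’s narrow hypothesis from the much broader hidden rule.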
What’s happening here? Why is it so difficult to find such a simple rule? Because people try to prove that their hypotheses are right: they choose examples that confirm their hypothesis instead of examples that would disprove it. The reality is that no hypothesis can ever be completely validated, while a single counterexample is enough to reject it. The first black swan discovered in Australia disproved the long-held European belief that “all swans are white”. In science this happens continuously: new findings displace old theories, rejecting hypotheses sometimes supported for centuries.
In conclusion: if you want to show that your hypothesis is right, you must fail in your attempts to disprove it. In other words, the experiments you design should not be aimed at proving your hypothesis, but at refuting it. This is where most people fail, including scientists. Because we cling to our hypotheses and our ideas, we seek to confirm our beliefs.
Coming back to the sequence experiment: if your initial hypothesis is “even numbers increasing by two”, which sequence should you propose? Instead of sequences that confirm it, such as 10, 12, 14, put forward sequences that would refute it, such as 9, 15, 29. Can you see it? The second sequence contains odd numbers that do not increase by two. If it follows the rule, your hypothesis is thereby disproved, and you move towards the right answer. Otherwise, no matter how many confirming sequences you propose, you will remain stuck in your error. This is the heart of the scientific method: you try to refute your theories, not to prove them. And this, dear friends, is a daunting task for humans.
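The refutation strategy can be sketched in a few lines of Python (a toy illustration of my own, assuming the hidden rule is “strictly ascending”): encode your hypothesis as a predicate, then deliberately probe with a sequence your hypothesis forbids.

```python
def follows_rule(seq):
    a, b, c = seq
    return a < b < c  # the hidden rule: strictly ascending

def my_hypothesis(seq):
    a, b, c = seq
    # "even numbers increasing by two"
    return a % 2 == 0 and b == a + 2 and c == b + 2

# A confirming probe tells you nothing new when it passes:
assert my_hypothesis((10, 12, 14)) and follows_rule((10, 12, 14))

# A disconfirming probe: a sequence my hypothesis says should fail.
probe = (9, 15, 29)
assert not my_hypothesis(probe)

if follows_rule(probe):
    print("Hypothesis refuted: the rule accepts a sequence my hypothesis forbids.")
```

Because the hidden rule accepts the probe while the hypothesis rejects it, one disconfirming test does what no number of confirming tests could: it falsifies the hypothesis outright.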
If you think that something is true, don’t seek to confirm it, but instead try to disprove it
In the following video from Veritasium you can watch several people tackling the 2-4-8 experiment:
Did you notice how they persevere in their hypotheses? Even when their initial proposals for the rule are rejected, they keep putting forward three-number sequences that are variants of their initial hypothesis. And they always, always, always suggest sequences to confirm the hypothesis, never to disprove it.
This experiment was designed, and its results first published, by the psychologist Peter Wason in the 1960s. It was he who coined the term “confirmation bias” to refer to our tendency to favor information that confirms our hypotheses, personal beliefs and ideas, whether or not they are true.
Unfortunately, even though I am explaining that you are swayed by this confirmation bias, you will keep looking for information that confirms your hypotheses and rejecting information that disproves them. Merely being aware of the bias will not protect you from it. Don’t believe me? Here is another logical reasoning challenge, again created by Wason:
You are shown four cards placed on a table: two show a letter and two show a number:
A D 3 7
Each card has a number on one side and a letter on the other. The challenge is to decide which cards you must turn over to verify the following rule:
Any card having an A on one side has a 3 on the other side
This time I will not give you the answer. You are invited to leave your answer in the comments below. I will only give you a hint: don’t try to validate your hypothesis, try to disprove it.
We seek the evidence that confirms our position
These experiments show that, once you take a position on an issue, you are more likely to seek out or give credit to evidence supporting your position than to evidence against it. And don’t think we only work like this when drawing up scientific theories. We are swayed by this bias in everyday life, at any moment, whenever we perform a task or interact with other people. The more attached you are to a hypothesis, the more difficult it becomes to consider opposing ones.
The explanation is quite simple. Assessing information is cognitively costly. Our brain is lazy and prefers to use mental shortcuts: this is how it saves time when making choices, especially under pressure or in the face of great uncertainty. As a result, we tend to prioritize information that lets us quickly reach the conclusion we already favor.
This tendency to seek confirmatory information can lead to all kinds of false beliefs and bad choices, since you will always be able to find evidence supporting (almost) any idea. Do you smoke and want to believe it isn’t so bad for your health? I’m sure you had a relative who smoked a carton of cigarettes a day and died at 98. Do you lead a sedentary lifestyle and think sport is not that healthy? You certainly had another relative who was extremely careful about his health and died of a heart attack at 38. Do you use the same password for all your services and think there’s no need to change it? You have done so for years and nothing has happened, so why should anything happen tomorrow?
As you may have realized, the existence of evidence supporting a claim is not enough to reach a conclusion, since there may be other evidence against it: people who have died of lung cancer as a direct consequence of smoking, people who have died of a heart attack after years of sedentary living and obesity, or people whose accounts were hacked because they reused the same password across different services.
The greatest risk of confirmation bias is that if you look for only one kind of evidence, you will certainly find it. You need to look for both kinds, including the evidence that refutes your position.
You are not as open to change as you like to think
According to Professor Stuart Sutherland, author of Irrationality: The Enemy Within, changing our ideas and hypotheses about reality is extremely difficult for several reasons:
1. People consistently avoid exposing themselves to evidence that might disprove their beliefs.
2. On receiving evidence against their beliefs, they often refuse to believe it.
3. The existence of a belief distorts people’s interpretation of new evidence in such a way as to make it consistent with the belief.
4. People selectively remember items that are in line with their beliefs.
5. To these four reasons one might add a fifth, the desire to protect one’s self-esteem.
Confirmation bias is ubiquitous in cybersecurity work. We can see it every day:
• If you are a technologist, you may think that technology is the solution to your security problems. When technology fails, you will blame the people who manage or use it, or the processes implemented. You will seek out and highlight technology’s successes just as you ignore and underestimate its failures. In doing so you will be, consciously or not, inflating its effectiveness.
• During a security audit, it is quite common to draw conclusions from just a few pieces of evidence. You find something and quickly construct an explanation for it. Once an opinion forms early in the assessment of a system’s security, you will spend more time seeking evidence that confirms your first impression than evidence that disproves it.
• If you want to hire a security professional for your organization and believe that candidates holding certifications such as CISSP, CEH or CISM are better qualified, you will find all sorts of evidence to support that belief.
• If you are responsible for information security within a company and your CEO thinks it’s important to invest in security, the focus will be placed on your department’s achievements. If instead your CEO thinks it’s an unnecessary expense, their decisions will focus on your errors and gaps, ignoring the successes you have achieved.
• Security experts within an organization, as well as externally hired ones, are exactly that: experts. So it’s quite normal that they want to be seen as such. This halo of “expert” means everyone trusts them, so looking for alternative solutions seems unnecessary. After all, if the expert considers a given solution the right one, why look further? The expert, in turn, will tend to reject solutions that might threaten their role.
• In a similar vein, nothing is more dangerous than a group of experts together in the same room, because the well-known “groupthink” will almost certainly appear: each member tries to conform their opinion to what they perceive as the group consensus, until the group agrees on a course of action that each member individually considers inadvisable. When everyone thinks the same, it is because no one is thinking.
• Closely related to this is the “false consensus”: we often invite people to a meeting precisely because we know they agree with us and share our ideas.
A survival guide for cybersecurity professionals
Whether you like it or not, we are all conditioned by confirmation bias. Here is a checklist you can run through before making important decisions, based on the advice given by Stuart Sutherland in his book Irrationality.
• I have actively sought evidences that disprove my beliefs.
• I have looked for more than a single hypothesis, so I have considered a minimum of two opposed hypotheses.
• I have invested time and attention in seriously considering information that contradicts my beliefs, instead of dismissing it out of hand.
• I have not distorted evidence gathered after forming my initial hypothesis: I have carefully considered whether it could be interpreted as a refutation of my beliefs rather than a confirmation.
• I have not trusted my memory: I am aware that facts matching our way of thinking are more easily remembered. That is why I have asked other people and checked notes and data from past events.
• I have counted on the support of a devil’s advocate, who has questioned all my hypotheses.
As we explained in a previous article in this series, biases are an inherent characteristic of human thinking. The first step towards avoiding them is knowing them. Confirmation bias in particular can become a problem when making complex decisions; use this checklist whenever you must make an important choice. And remember: changing your mind in the light of new evidence is a sign of strength, not weakness.
Gonzalo Álvarez Marañón
Innovation and Labs (ElevenPaths)