Security is both a feeling and a reality
Daniel Gardner opens his book The Science of Fear with the shocking aftermath of the September 11 attacks in the US:
And so in the months following the September 11 attacks, as politicians and journalists worried endlessly about terrorism, anthrax, and dirty bombs, people who fled the airports to be safe from terrorism crashed and bled to death on America’s roads. And nobody noticed. […] It turned out that the shift from planes to cars in America lasted one year. Then traffic patterns went back to normal. Gigerenzer also found that, exactly as expected, fatalities on American roads soared after September 2001 and settled back to normal levels in September 2002. With these data, Gigerenzer was able to calculate the number of Americans killed in car crashes as a direct result of the switch from planes to cars. It was 1,595.
What killed all those victims? Fear.
We all know that flying is safer than driving. In fact, as statistics show, the most dangerous part of flying is the car journey to the airport. So why are we more afraid of flying than of driving? Because risk acceptance is not based only on technical estimates of risk and benefit, but also on subjective factors, such as feelings.
Our beliefs about the world are shaped by our emotional preferences
The Affect Heuristic lets someone make a decision based on an affect (that is, a feeling) instead of on a rational consideration. This heuristic works according to the following substitution:
If your feelings towards a situation are positive, you are more likely to judge its risks as low; if your feelings towards a situation are negative, you will tend to perceive its risks as higher.
You use your affective response to a risk (for instance, how do I feel about genetically modified food, nuclear power, breast cancer or firearms?) to infer how serious that risk is for you (for example, how many people die of breast cancer or from firearms per year?). Often, you will find a significant gap between actual and perceived risk.
In our brains, risk is associated with a number of psychological factors that determine whether we are more or less afraid. How can these factors be measured?
Paul Slovic, one of the best-known researchers in risk analysis, proposed a psychometric paradigm to measure perceived levels of risk according to the affective response to different threats. In his early research, Slovic proposed 18 characteristics for measuring risk perception quantitatively. For simplicity, the following table includes only the risk perception factors most directly related to cybersecurity:
| People overreact to risks that: | People downplay risks that: |
| --- | --- |
| Strike fear | Do not strike fear |
| Are uncontrollable | Are under their control |
| Are globally catastrophic | Affect only a few people |
| Affect others, not the agent of the activity (inequitable) | Affect the agent of the activity (equitable) |
| Are externally imposed | Are voluntary |
| Are unknown | Are known |
| Are difficult to understand | Are well understood |
| Are new or infrequent | Are old or common |
| Have immediate consequences | Have long-term effects |
Let’s revisit the example of flying versus travelling by car from this new perspective. If you evaluate each of the previous factors for both activities, you will reach a result similar to the one represented in the following graphic:
Perhaps it is clearer now why we are more afraid of flying than of travelling by car, despite what statistics and studies on accidents and mortality show: we are emotional beings!
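To make the factor-by-factor comparison concrete, here is a toy sketch in Python. The factor names come from the table above, but every numeric score is a made-up illustration for the flying-vs-driving example, not data from Slovic's studies:

```python
# Toy model (all scores are hypothetical illustrations, not Slovic's
# actual instrument): each activity gets a 0-1 score per perception
# factor, where 1 means "amplifies perceived risk".
FACTORS = [
    "strikes fear", "uncontrollable", "catastrophic", "inequitable",
    "imposed", "unknown", "hard to understand", "new/infrequent",
    "immediate consequences",
]

# Illustrative ratings, one score per factor above.
ratings = {
    "flying":  [0.9, 0.9, 0.9, 0.5, 0.6, 0.7, 0.7, 0.8, 0.9],
    "driving": [0.2, 0.1, 0.1, 0.4, 0.1, 0.1, 0.2, 0.1, 0.8],
}

def perceived_risk(scores):
    """Average the factor scores into a single 0-1 perceived-risk index."""
    return sum(scores) / len(scores)

for activity, scores in ratings.items():
    print(f"{activity:8s} perceived risk = {perceived_risk(scores):.2f}")
```

Even though driving is statistically far deadlier per journey, the factor profile pushes flying's perceived risk well above driving's: exactly the gap between actual and perceived risk described above.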
Your risk perception of threats is conditioned by fear and familiarity
Later, delving deeper into the study of these factors, Slovic found that two main dimensions underlie all of them: fear and familiarity. Both dimensions can be represented graphically to make risk classification simpler.
If we focus on these two factors, the Affect Heuristic may be redefined as the following substitution:
When evaluating two threats A and B, the more fear one of them strikes into you and the less familiar it is to you, the higher you will perceive its risk relative to the other.
Unconsciously, you are making a judgement: flying is more frightening and less familiar than travelling by car, so it must be riskier. You therefore place the flight in the bottom right quadrant (High Risk) and the car in the top left (Low Risk). And not even all the statistics in the world will change this affect. You can try it out on your brother-in-law.
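The two-dimensional shortcut can be sketched the same way. In this hypothetical toy model, the activities, their 0-1 scores and the 0.5 thresholds are all illustrative assumptions; the point is only that the quadrant follows mechanically from fear and familiarity, with no statistics involved:

```python
# Toy version of the fear/familiarity map: perceived risk rises with
# fear and falls with familiarity. Scores and thresholds are invented.
def quadrant(fear, familiarity):
    """Place an activity on a simplified fear/familiarity grid."""
    if fear >= 0.5 and familiarity < 0.5:
        return "high perceived risk"
    if fear < 0.5 and familiarity >= 0.5:
        return "low perceived risk"
    return "intermediate perceived risk"

activities = {
    "flying":  (0.8, 0.2),   # frightening and unfamiliar
    "driving": (0.2, 0.9),   # unscary and deeply familiar
    "e-mail":  (0.1, 1.0),   # so familiar it no longer feels risky at all
}

for name, (fear, fam) in activities.items():
    print(f"{name:8s} -> {quadrant(fear, fam)}")
```

Under this toy rule, flying lands in the high-perceived-risk quadrant and driving (and e-mail) in the low one, regardless of what accident or incident statistics say.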
This heuristic applies especially when you must make quick decisions. When you are under pressure and out of time, you cannot avoid affective or emotional reactions towards most of the options. Of course, in addition to affect, other psychological shortcuts also leap into action, helping you determine whether a risk seems high or low: the cognitive biases and heuristics we have been examining in previous articles.
Familiarity is a key factor in risk assessment. The more familiar you are with an activity or event, the less attention you pay to it. Your brain is bombarded with millions of inputs and needs to filter them, extracting the relevant information. In general, relevant means new: anything that involves a change. Over time, when your brain responds to the same stimulus again and again, it gets used to it and ends up ignoring it.
Habituation is a wonderful phenomenon that allows you to get along in your everyday life without having to pay attention to everything. The downside is that you become desensitized to frequent stimuli. The more familiar an activity is, the less risky it ends up seeming to you. This is why you can smoke, eat ultra-processed food, WhatsApp while driving and cross the road while reading Facebook on your phone EVERY DAY! You are so used to these activities (they are familiar to you) that they no longer seem risky to you.
The surprising relationship between our judgements of risk and benefit
But the story does not end there. Paul Slovic's risk psychometric paradigm led to more than the conclusions described above. He also discovered surprising relationships between our judgements of risk and benefit:
In the world, risk and benefit are positively correlated, while in people’s minds (and judgements), risk and benefit are negatively correlated. […] People base their judgments of an activity or a technology not only on what they think about it but also on what they feel about it. If they like an activity, they are moved to judge the risks as low and the benefits as high; if they dislike it, they tend to judge the opposite: high risk and low benefit.
The paradigmatic example here is nuclear power. As everybody knows, nuclear power is a Bad Thing, so it must involve high risk. How beneficial is nuclear power? Given that it is a Bad Thing, it must involve low benefit. X-rays, however, are a Good Thing, since doctors use them to save lives, so they must involve Low Risk and High Benefit. This is how our brain works. What about data? We do not need it; the decision is already made. Data would only be useful for confirming the initial position. The result is that we overestimate the risks of nuclear power and underestimate the risks of X-rays.
Under this model, affect is the first reaction and guides our judgements of risk and benefit. If a general affective view guides perceptions of risk and benefit, providing information about benefit should change perception of risk and vice versa.
Make risk take its rightful place in your employees’ affect
All studies on risk perception confirm that experts in the field being assessed succumb to the Affect Heuristic to a lesser extent. After all, they have greater awareness of the field, gained through experience and study. That is, they know more accurately the probabilities and nature of the threats, as well as the impact of incidents. In short, they are better equipped to assess the actual risk: the gap between actual and perceived risk is smaller for them than for laypersons in the field.
The conclusion is clear: if you want to help your employees make better security decisions, you must raise their Information Security Awareness (ISA). This conclusion is so obvious that it is almost embarrassing to write down. Whether it is acted upon, however, is another story. And one of the greatest challenges of awareness work is re-educating users about technologies that are thoroughly familiar and helpful to them, because they end up losing sight of the actual risk.
Therefore, one of the key points of any awareness program must be dishabituation. The more familiar employees are with a technology and the more helpful they perceive it to be, the less risky it will seem to them. Cybercriminals exploit precisely this high familiarity, low fear and high perceived benefit of a number of technologies to turn them into attack vectors. Some examples of this type of familiar, friendly and helpful technology are:
- E-mail, a technology we use every single day at any time.
- USB drives, those small and innocent-looking devices that store so much useful information.
- Office files (Word, Excel, PowerPoint, PDF), on which we spend our time every day and which we happily share.
- Ads on legitimate websites, which we see everywhere and which are really annoying, even if they sometimes advertise interesting things.
- Games and apps downloaded onto the smartphone, so fun, useful and cute.
- Photos and videos shared on social networks.
- The company’s employees themselves, with whom we drink coffee every morning and whose children we know.
There is no harm in running security campaigns from time to time to remind employees that e-mail, USB drives, office files, browsing, games, multimedia, colleagues themselves, and so on are the main entry points for cyberattacks, however familiar and friendly they may seem.
Finally, your perception of security is not merely a rational matter, but an emotional one as well. You cannot fight the Affect Heuristic head-on, because this is how our brains work. Instead, you can guide your employees’ affect towards the various technologies by raising their awareness level.
Gonzalo Álvarez de Marañón
Innovation and Labs (ElevenPaths)