Here’s how to markedly increase the effectiveness of cyber security awareness campaigns
Some years ago, a team of firefighters was tackling a domestic kitchen fire in Cleveland.
After locating the fire, the firefighters doused the modest flames with water. But, for some reason, the small fire refused to relent.
So the team blasted the fire a second time. Again, nothing happened.
The firefighters were puzzled. Suddenly, the team’s lieutenant sensed something was wrong.
Without thinking, he screamed at his team to exit the building. Sensing his panic, the team quickly obliged.
Moments later, the building’s floor collapsed.
What was diagnosed as a simple kitchen fire was in fact an expansive blaze careering through the basement underfoot. The lieutenant’s snap decision had saved his team members’ lives.
When it comes to cyber security, we rarely make good snap decisions.
Perhaps that’s where we’re going so wrong.
When interviewed later, the lieutenant admitted he was unaware of the fire in the basement, and believed he’d been blessed with some form of extra-sensory perception. It was this, he believed, that had kept him and his team safe.
The psychologist Daniel Kahneman might think otherwise.
Kahneman has dedicated his life to studying how and why we make decisions, and his seminal work eventually earned him the 2002 Nobel Prize in Economic Sciences.
According to Kahneman’s research, our thinking is governed by two systems. System 1 is fast, unconscious and influenced by emotion. System 2, meanwhile, is slow, conscious and logical – and it often plays second fiddle to system 1.
Thanks to system 1, we rarely have to engage in conscious thought. Instead, system 1 picks up cues from our environment and helps us act on instinct – keeping us safe and secure.
Kahneman would argue it was system 1 that kept the lieutenant’s team safe from harm that day in Cleveland.
So perhaps it’s time we started thinking about how system 1 might keep us safe from harm online.
The power of emotion
How might we go about training our system 1 thought processes to improve security?
One way might be with emotions.
System 1 – responsible for unconscious thought – can be shaped by emotional experiences. It follows, then, that awareness programmes hoping to influence system 1 would do well to engage people’s emotions.
Victims of simulated attacks receive an instant emotional jolt; a jolt that no doubt does as much to help people recognise future attacks as any classroom-based training module.
And yet, some courses rule out simulated attacks on ethical grounds, believing it’s unfair to involve people’s emotions in training exercises.
Ironically, it seems as though the incredible power of emotions is precisely what’s keeping some security awareness campaigns from making an emotional impact in the first place.
Using emotion in cyber security
Whether through simulated attacks or otherwise, psychological research suggests awareness campaigns that connect with people on an emotional level will do far more good than those that don’t.
That might be through simulated attacks – but it doesn’t need to be. Stories have been shown to arouse people’s emotions, as has highlighting the personal benefits of security training. Bringing someone’s family into just about any equation can immediately conjure up a host of powerful emotions, all of which shape system 1. Making better use of any of these (or any other emotion-evoking methods) in security training is likely to improve the effectiveness of awareness campaigns.
The precise method of bringing emotions into cyber awareness campaigns is unimportant. What is important is that it happens.
Far too many cyber awareness programmes today appeal to reason and logic only. In doing so, they fail to arm people with the tools they need to stay safe online. It’s a situation that needs to change.
Unconscious thoughts keep us safe from physical harm.
By bringing emotions into cyber awareness campaigns (such as through the simulated attacks on the CybSafe platform), and by making every effort to connect security with people’s personal and family lives, unconscious thoughts can keep people safe from cyber-harm, too.