Resident CybSafe psychologist, Tom Cross, looks into when simulated attacks are of most use to companies
More than a few articles have been circulating of late suggesting simulated phishing does little to increase cyber security awareness.
‘Links are meant to be clicked on, attachments are meant to be opened,’ such articles say. ‘[Sometimes a] job consists almost entirely of opening attachments from strangers, and clicking on links in emails’ – thus there is no point in running educational phishing programmes.
We understand the sentiment. But, actually, here at CybSafe, we don’t fully agree. We don’t believe in simulated phishing exercises in isolation or for their own sake, but we do think they can have a place as part of a broader focus on the human component of cyber security.
To be clear, simply phishing your employees and then forcing anyone who clicks a link to sit through a 20-minute video is not raising security awareness – it’s just making people resent their security teams. However, phishing is exactly what we want to teach people to recognise. The fact that some people’s ‘job consists almost entirely of opening attachments from strangers, and clicking on links in emails’ is exactly the point. People need to be better at spotting cyber scams, and to do this they need to be trained. It’s not about looking the other way. You don’t train for penalties in football without a goalkeeper.
CybSafe’s (optional) simulated attacks recreate many of the conditions that people face in reality. The CybSafe platform then educates, increasing awareness and understanding of how best to be a safe and secure cog in a company’s socio-technical systems. This is made far more achievable by helping staff understand why secure online behaviour matters to them personally as well as professionally, and where phishing fits in. It’s therefore equally important that simulated attack activity is coordinated with other thoughtful awareness activity if you are to get the true benefit from it.
In our opinion, setting appropriate and agreed expectations beforehand for how an awareness campaign will run creates a spirit of ‘we’re all in this together’. Getting the culture right from the outset fosters an internal siege mentality that companies can use to combat internal and external threats. With the right foundations, companies could even create departmental competition, with rewards for the ‘safest’ department – without ruffling any feathers whatsoever.
‘I’m not sure’
Experts have now developed an ‘I’m not sure’ button as something of a halfway house that people can click to flag a message as a potential phishing scam. It’s a positive step. But some argue that it still diverts responsibility away from the individual to the IT department – is this really what we want?
Maybe it is. But, owing to the (relatively accepted) trade-off between security consciousness and lost productivity, there exists a case to say otherwise.
Elsewhere, some experts have raised the idea of slapping those who click malicious links with disciplinary procedures. A wise move?
In our opinion, such procedures nurture a culture of fear – which can lead to mistakes being hidden, attacks being concealed and breaches going unreported. It’s a culture that does little to combat the growing cyber security risk.
By contrast, at CybSafe we advocate a culture that’s the mirror opposite: one in which people are trained to know what to look for, are able to ask for help and advice and quick to report (and counter) their mistakes.
The latter is a culture focused on a ‘growth mindset’. It’s about learning. It allows people to make mistakes, learn quickly and move forward with more awareness and understanding, and it’s one in which people are less likely to make the same mistake twice.
It is here we link into what some refer to as psychological safety. Amy Edmondson, an Associate Professor at Harvard Business School, reports: ‘Psychological safety means no one will be punished or humiliated for errors, questions, or requests for help, in the service of reaching ambitious performance goals.’ Combining the aforementioned growth mindset with a culture of psychological safety allows companies to run simulated awareness campaigns with the positive intention of becoming a more cyber secure and robust entity.
There are three key behavioural influences we target with this approach:
Priming – increasing awareness, driving a culture of learning and creating a company-wide priming effect by being explicit and relentless about the cyber security behaviour expected of employees
Norms – framing secure behaviour as something the whole company is undertaking together, and something that will move it ahead of its competitors (the siege mentality again)
Messenger effect – ideally, Chief Execs and MDs need to own this, and their voices need to be a constant presence throughout the internal up-skilling campaign
Clearly, culture varies dramatically from company to company. The recent flurry of articles denouncing simulated attacks seemingly relates to companies and cultures with a fixed mindset (see Carol Dweck’s work on growth mindset) – cultures that don’t like to speak out and don’t like to show vulnerability.
In such cultures, the focus is invariably on completing training as quickly as possible. However, it’s sometimes worth going slowly initially in order to go faster in the future.
By that we mean focusing on people and culture hand in hand with cyber security processes and policies – investing in them, taking time to set expectations, explaining the why, showing the negative consequences that can result from poor cyber security practices – the list continues.
Ultimately, this leads to ‘bought-in’ people who are ambassadors for cyber security behaviour as part of the culture and who propagate the fact that security consciousness is the way things should be done.
Making simulated attacks work
In our opinion, simulated phishing is seldom counterproductive if done well and contextualised. And it can most definitely increase cyber resilience. The trick lies in encouraging a culture that allows it to do so.
To do that, companies should:

- Encourage a growth mindset
- Be explicit with intentions and expectations, and share the reality of the cyber threat while keeping Protection Motivation Theory (Rogers & Prentice-Dunn, 1997) in mind
  - It’s also worth noting Modic & Anderson (2014), who state that warning effectiveness can be improved by:
    - Providing a clear, concrete and non-technical description of the threat
    - Using social influence, ie referencing an authority or social group
- Understand mistakes, and allow for reflection, education and action
- Develop psychological safety – a climate in which people are comfortable being (and expressing) themselves (Edmondson, 2003) – ideally led by C-suite Execs and the IT department
As ever, making simulated attacks work requires multi-layer interventions that link into the essence of a company. Cyber security interventions should also ‘talk’ to people – interventions need to engage the emotional brain, as research shows that people remember feelings far more than they do thoughts, facts or figures.
In the right environment, it’s clear to us that simulated phishing emails – and, equally important, other forms of simulated attack – can do this. In the right company, with a culture that embraces learning from failure and supports everyone having a voice, simulated phishing can do a great deal to increase cyber security awareness. That said, failing to contextualise simulated phishing properly, or implementing thoughtless phishing campaigns, runs the risk of p@*sing people off and possibly making the situation worse.