Technology has made scams easier to carry out, but it’s not the reason why they continue to be effective. For that, we need to look to human psychology. What makes some people susceptible to fraudsters’ tactics? What is the relationship between impulsivity and falling for scams? And why does general security awareness training not work for everyone?
Let’s remind ourselves of the types of scams involved. Phishing continues to be a major risk. These emails are usually designed to look like legitimate messages, which makes them harder to spot. Sometimes, they’re aimed at tricking people into sending money. Central Bank of Ireland data on payment fraud shows that the value of fraudulent payments rose by 26 per cent between 2022 and 2023, from €100 million to €126 million.
Other times, phishing scams aim to fool people into sharing their logins or user credentials. The well-respected Verizon Data Breach Investigations Report (DBIR) consistently highlights phishing as a major factor in data breaches. Its 2024 report found that phishing and pretexting via email accounted for 73 per cent of all breaches: almost three quarters of the total.
No time to decide
What’s more, the median time for users to be tricked by phishing emails was less than 60 seconds. I mention time because it’s an important factor that I’ll come back to later.
Scams can affect us everywhere, not just in workplace environments. Romance scams are common. A female friend of mine was using a calorie-counting app that has a community function. Even though she hadn’t posted a photo on the app, she started getting messages from male members of the community that resembled those on dating apps. One of them was from an alleged doctor working in Gaza. After running a reverse search on the sender’s picture, she discovered that the photo actually belonged to a real doctor with an extensive social media presence. It had all the signs of catfishing, where a fraudster creates a false identity, usually an authority figure, to trick someone, typically into giving money.
Scams on an industrial scale
While researching this, I discovered that there are companies that specialise in industrialised romance scams. They hire people who think they’re going to work in a call centre, but in reality, their job is to defraud people, including some high-profile targets. Often, multiple people will pretend to be one person to make the scam as elaborate as possible. The BBC has uncovered how people in poor countries who are desperate for work are forced to become criminals. It goes to show that there’s potentially more than one victim in all this.
Another common tactic is purchase scams. I’ve seen this myself: I like to buy things from Vinted, the second-hand clothes site. Recently, I started to see ads on Instagram for brands I follow, featuring heavily discounted items that looked too good to be true. I know I’m not the only one seeing these ads. Offers of ultra-low prices usually mean that, at the very least, the goods aren’t genuine. The Irish Times has started covering this trend of online stores that are just e-commerce fashion scams.
The emotional factor in online fraud
What all these scams have in common is that they exploit strong emotions. That could be shame: the other person is threatening to expose some secret about me (whether or not it’s true). Fear: if I don’t act now, something bad will happen. Excitement: I have to avail of this special offer, intended just for me, before time runs out.
The ability to regulate emotions is an important factor in preventing ourselves from falling for scams. Scammers know this: it’s why they carefully craft their messages to seem urgent. I mentioned the time factor earlier, and this element is critical. It’s intended to trick victims into acting fast – before reflecting, or regulating their emotions.
You might expect that this affects older adults most, but the emotional stakes are common to all ages. Contrary to conventional wisdom, young people’s comfort with technology doesn’t make them any less susceptible to scams. Catherine Knibbs, a human behaviour technologist, argues that scammers deliberately target young people more than adults because they’re less capable of regulating their emotions. This makes them easier to manipulate.
Why cybersecurity awareness doesn’t always work
My research also suggests that people with ADHD might have a higher propensity to fall for scams. The psychologist Caralyn Bains describes the traits of people with ADHD in relationships, which include impulsivity. Individuals with lower self-control (impulsive buyers, for example) might be more susceptible to scams too. That’s why I argue that general cybersecurity awareness training doesn’t work in all cases. Knowledge without impulse control means a person has no time to apply what they know before they act. Knowing how to run a marathon is not the same as having the skills to do it. You need to train.
I have seen awareness training that instructs people not to react to potential scams. But how does that help a person who is neurodivergent, or who struggles with emotional reactivity or impulsive decision-making? It’s not enough to know what a scam looks like. A person’s state of mind also matters: someone who is stressed or tired is more likely to respond to a scam text or a phishing email.
We need to give people the skills to act correctly: to learn emotional regulation and practise it. What if security awareness training incorporated these elements? Don’t just say ‘stop and breathe’; teach people how to stop and breathe. When people with ADHD work with a behaviour specialist or CBT therapist, they learn skills that help them regulate their emotions, such as mindfulness, breathing techniques and self-reflection. I believe techniques like these could be used to train people to be less susceptible to scams.
Designing more effective security training
In my opinion, security professionals need to rethink how they design awareness training and how they measure its effectiveness. It’s common to carry out phishing tests to check whether people have learned to recognise the signs of a possible scam. Trainers see statistics on how many people clicked on the link, “proving” that they would fall for a genuine phishing email. On one level, that’s a useful guide to whether the training worked. But if a person repeatedly clicks on the link, it doesn’t necessarily mean they failed to acquire the knowledge. I would argue it’s wrong to conclude that these people “failed” the test. Worse, it can trigger feelings of shame in someone who repeatedly clicked on a fake phishing link after awareness training.
It is arguable that drawing attention to behaviour that is a result of undiagnosed ADHD, for example, effectively “punishes” someone for being that way. Neurodivergent people have protection under the law: in Ireland, the Employment Equality Acts 1998-2015 prohibit discrimination in the workplace based on disability, including during training.
Instead of instinctively punishing “insecure” behaviour, I would urge managers and trainers to be curious about the potential cause (without trying to diagnose). It might mean a person is emotionally reactive and needs a different kind of training.
It’s more instructive for managers to ask whether the awareness training, as designed, is productive for that person or group. Does pointing out “mistakes” reinforce the behaviour, or does it nurture positive actions? Instead of blaming the individual, it’s better to start from the question: does the training help them act in a more security-conscious way? This approach also leads to a more positive security culture, where everyone has the skills to spot scams or phishing and takes the right course of action to keep the organisation safe.
Bozena Jaslan is a Cyberpsychologist and Operations Supervisor with BH Consulting.
