I’m sure (hope) you don’t need me to tell you how computers can pose a risk to both you and your organisation, be that through local malware or internet attacks.

The number of cases where organisations have been disrupted, crippled or simply inconvenienced by ransomware, viruses and the like is significant. So too is the number of data breaches and other web-based attacks that we’ve seen recently.

But.

And it’s a big but.

Not all attacks are successful because of the underlying technology. Or should that be in spite of it?

Nope, many attacks succeed because of design faults, be they in the design of hardware or the coding of software, or through criminals taking advantage of human nature (think phishing and other socially engineered attacks).

But there is another way in which humans pose significant risks, and that’s plain human error.

Take, for instance, a story highlighted by the BBC yesterday: Auntie reported how non-technical mistakes accounted for 9 of 17 serious errors by public authorities or communications service providers in 2014.

The highlights from that list of errors include:

- the arrest of one person in connection with a child sex investigation to which they had no link;
- five police searches of innocent people’s homes; and
- the tracing of the wrong email user in connection with a child sexual exploitation case, because someone missed out an underscore in the email address (the police subsequently paid the innocent person a visit and searched their home).

Sir Anthony May, the Interception of Communications Commissioner, who recently released a detailed report on his findings, said:

Any police action taken erroneously in such cases, such as the search of an individual’s house who is unconnected with the investigation or a delayed welfare check on an individual whose life is believed to be at risk, can have a devastating impact on the individuals concerned.

May revealed how some ‘victims’ had been “incredibly understanding” while others had called in lawyers. Putting that into a business context, both attitudes come with a cost, whether in brand damage, financial consequences, or both.

So what can we learn from this and what is the answer?

Well, it’s a given that every organisation should be securing its technological assets against bad actors, but I think this story highlights how the human element can throw all sorts of unexpected risks into the equation.

While simple, honest mistakes can be unpredictable and hard to guard against, there is much to be said for building a culture of security into any organisation.

By simply increasing staff awareness of all security risks, and especially those surrounding the handling and control of data, you can go a long way towards preventing mistakes. That’s not to say they won’t ever happen, though, so now may also be a good time to check that your organisation has a good level of oversight in place, especially if it handles highly sensitive data. You may also wish to check that you have a strong incident response plan and that everyone is familiar with it and with their role in making it happen.
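To make that oversight idea concrete, here’s a minimal sketch of one kind of safeguard: a double-entry check that flags near-miss identifiers (like the missing underscore above) before anyone acts on them. It’s written in Python purely for illustration; the function name and threshold are my assumptions, not a description of any real police or service-provider system.

```python
# A minimal sketch of a double-entry check for sensitive identifiers.
# All names here are hypothetical; this illustrates the principle only.
from difflib import SequenceMatcher

def confirm_identifier(first_entry: str, second_entry: str,
                       near_miss_threshold: float = 0.9) -> str:
    """Require an identifier to be keyed in twice (ideally by two
    people) before it is acted on, and flag near-misses instead of
    silently accepting either version."""
    a = first_entry.strip().lower()
    b = second_entry.strip().lower()
    if a == b:
        return a  # both entries agree; safe to proceed
    similarity = SequenceMatcher(None, a, b).ratio()
    if similarity >= near_miss_threshold:
        # Almost identical -- e.g. one omitted underscore -- which is
        # exactly the class of slip behind the wrongful search above.
        raise ValueError(f"Entries nearly match ({similarity:.0%}); "
                         "verify against the source record before acting.")
    raise ValueError("Entries do not match; re-check the source record.")

# A single omitted underscore is caught rather than acted upon:
try:
    confirm_identifier("john_smith@example.com", "johnsmith@example.com")
except ValueError as err:
    print(err)
```

The design point worth noting is that a near-miss is treated as more suspicious than an outright mismatch: a 98% match is precisely the kind of error a tired human will wave through.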
