Technological flaws are often blamed for enabling cyber breaches, but humans remain the weakest link in most organisations’ digital armour – and this human element is all too often forgotten.
In tech terms, this is often known as a PICNIC – not a basket and sandwiches, but a “Problem in Chair, Not in Computer”. In other words, the greatest risks to cyber security originate from people, not technology.
According to the EY Global Information Security Survey 2018-19, the biggest perceived cyber threats to organisations are phishing (22%) and malware (20%). For a phishing attack to succeed, the individual in the chair must click on a malicious link, giving cyber attackers access to a company’s information systems.
Likewise, malware attacks usually require a human to actively access infected material for the system to be impacted.
While customer, financial and strategic information are reportedly the three most valuable pieces of information that organisations wish to protect, those at the forefront of handling such data will often be employees, consultants or third-party service providers.
Attackers target these “ordinary people” with the aim of exploiting their well-meaning efforts to keep up with the pressures of digital workflows, often oblivious to the cyber risks they create in the process.
Of course, there are many other PICNIC manifestations, including the trifling problem of the email sent to the wrong recipient.
What’s in your PICNIC?
Given the extent to which effective cyber security relies on good cyber discipline within organisations, it’s probably not surprising that the majority of cyber attacks are carried out by insiders. While many will be the work of disgruntled or malicious employees, others may be motivated by a genuine sense of moral purpose.
If an employee accidentally clicks a malicious link, or unwittingly discloses confidential data, how would you approach the situation? Though you may wish to take disciplinary action, an employer must act reasonably at all times, so providing training might be more appropriate – especially if the incident resulted from a genuine mistake, or from an absence of adequate training in the first place.
However, nowhere is the moral PICNIC risk more evident than in the case of a whistleblower, who leaks sensitive information and breaches cyber security to highlight and improve what they believe to be poor corporate practice.
There are strict UK rules on how companies may treat a whistleblower, whose confidentiality should be protected and who should not be punished for raising legitimate concerns about illegal or immoral activity within an organisation.
The Public Interest Disclosure Act 1998 (PIDA) introduced the protection of whistleblowers, and this was expanded by the introduction of the Enterprise and Regulatory Reform Act 2013 (ERRA). If a worker makes a qualifying disclosure (i.e. discloses a certain type of wrongdoing, including a criminal offence such as fraud), which is in the public interest (affecting others, such as the general public), then they will be protected under UK legislation, and should not be subjected to any detriment for making the disclosure.
Reducing the risk of whistleblowing is therefore a question of corporate practice and culture: not even the best firewall will prevent an employee from speaking out if they feel misdeeds need to be exposed.
However, having strong internal reporting practices in which employees have confidence should allow companies to retain control of such situations and solve problems before they blow up, reducing the risk of a morally motivated cyber breach.
Grudges: the rogue employees
Other motivations for cyber attacks, such as anger and malice, are potentially trickier to guard against.
UK supermarket Morrisons suffered a serious cyber security breach in 2013, when a disgruntled employee copied and distributed the payroll data of almost 100,000 staff to a public file-sharing site.
The Court found Morrisons vicariously liable for the breach, concluding that there was a sufficient connection between the employee’s position and his wrongful conduct.
The Court accepted that the question was not whether Morrisons itself had done anything wrong, but whether the employee’s acts were closely connected with his employment when he committed the breach.
This creates real issues for employers as to whether they can realistically monitor and prevent the actions of rogue employees, and whether they should be held responsible for an employee intent on causing a breach.
The Court reasoned that, as Morrisons could be expected to be insured, it was more likely to have the means of providing compensation. However, this neither avoids nor cures the issue.
Ultimately, insurance is not a fix for everything. The recovery of information is also vital, and may involve seeking injunctions, agreeing undertakings, or requiring employees to deliver up devices for search and deletion.
As with a well-meaning employee, due consideration should be given to organisational improvement. However, given the malicious intent, you may wish to take clear disciplinary action through a dismissal – but this should be preceded by a robust digital and forensic investigation to prove culpability, ensuring that you have acted in a responsible manner at all times.
Skill deficits: the accidental breach
Lack of appropriate skill is also a challenge, although this is perhaps more easily surmountable than moral and grudge-driven issues.
Despite the widespread adoption of cloud-based services, according to analysis by KPMG, there continues to be an industry-wide deficiency of skilled, adequately trained personnel capable of protecting a company’s sensitive data, intellectual property and other processes.
Moreover, EY’s survey found that 89% of organisations questioned said they did not have a cyber security function in place that met their needs.
As an employer, whilst you should invest in your tech, remember that there is always a human element to implementing any technical solution – whether SSL decryption (to inspect encrypted traffic for malware threats) or system scanning – from the correct use of email to the responsible use of software.
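To illustrate the point that technical controls still rely on people, here is a minimal, purely illustrative sketch of an email link scanner. The function name, the regular expression, and the `BLOCKLIST` domains are all assumptions for this example – a real deployment would draw on a maintained threat-intelligence feed rather than a hard-coded set – and even then, a flagged message still needs a trained person to act on the warning.

```python
import re

# Hypothetical blocklist of known-malicious domains -- illustrative only.
# A real system would use a maintained threat-intelligence feed.
BLOCKLIST = {"malicious.example", "phish.example"}

# Capture the host portion of any http(s) URL in the message body.
URL_RE = re.compile(r"https?://([^/\s]+)", re.IGNORECASE)

def flag_suspicious_links(email_body: str) -> list:
    """Return blocklisted domains found in an email body, sorted.

    The tool can flag a message, but a human still decides whether
    to click -- the control does not remove the PICNIC risk on its own.
    """
    domains = {m.group(1).lower() for m in URL_RE.finditer(email_body)}
    return sorted(d for d in domains if d in BLOCKLIST)

print(flag_suspicious_links(
    "Please verify your account at https://phish.example/login today"
))  # a flagged message should still be escalated to a trained person
```

The design choice here mirrors the article’s argument: the scanner narrows the field, but training determines what happens next.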
Likewise, in your efforts to monitor employee cyber behaviour, you must ensure that any intrusion into employees’ privacy remains justified and proportionate, and compliant with the applicable data protection and employment laws. Often, periodic situational training might be more appropriate, and just as effective.
Conclusion: Turning a possible PICNIC into a walk in the park
Cyber security is a people issue. Organisations need to treat it as a form of business protection. It should form part of induction, with ongoing controls such as education and training (particularly on identifying a breach), as well as clear policies and practices on dealing with breaches.
Early cyber monitoring and improvement should also take place to protect the organisation from internal, as well as external threat.
This has been a summary of the headline issues, but given that each breach is unique, covering matters from data use and cyber monitoring to the recovery of information, we would recommend that you get in touch with our cyber team to help secure and support your business needs.
Cyber security requires a holistic approach, particularly in an age of remote working and constant innovation – it only takes one bad PICNIC to leave you out in the cold.
David was assisted in compiling this article by Ammar Thair, currently a trainee solicitor with Fieldfisher