Do you trust your technology? At the most recent CW500 event, experts discussed the concept of trust, data, systems and people in a digital world.
It is impossible to escape technology – it’s everywhere. Consumers and companies are putting more and more trust in digital technologies, but as the experts pointed out at the event, it is the people behind them that can fail us.
Robert Carolina, executive director at the Institute for Cyber Security Innovation, Royal Holloway, University of London, said there is a complete breakdown of understanding “between people who are working very diligently to try to build trust into a system, but their build ends at the technological line”.
“There is a lot of confusion between what it means to trust a system, and what it means to trust a person,” he said. “A lot of technologists spend a huge amount of time worrying about the concept of trust in terms of systems, but they divorce that from the human element.
“You have to have both the human element and the technological element correct. Ensuring that people do the right thing is a very different art from assuring that machines do what you expect them to do.”
Carolina said that when people use the term trust, they often conflate “the computer science or computer engineering conceptualisation of trust” with what he, as a lawyer, thinks of as trust.
“I think of trust and I think of ethics as an issue of relationships between human beings,” he said. “But I recognise that people who are highly skilled in computer science think of trust as how different systems interact and this is often some combination of authentication, integrity and possibly even confidentiality.”
David Tidey, CIO at Wandsworth Council, said he saw himself as a trusting person, but the need for trust had changed since we moved into a more digital age. “Most of us probably work for organisations where you have an online presence and many are pushing all our customers to make that their only interaction,” he said.
“As IT people, we think security and trust is all about SSL certificates, data security and data sharing protocols, but I think it’s slightly different sometimes.
“We build trust with our customers, our residents, whatever they are, by making systems easy to use, and we build trust by delivering on the promises we make.”
Who do you trust?
As CIO of a local authority, Tidey also has to put his trust in IT suppliers to make sure the council’s IT is up to scratch and, most importantly, safe.
And with more and more organisations moving all their data to the cloud, trust is even more important.
“We are looking at hosted solutions and cloud solutions because we think it is the sensible thing to do,” he said. “Often we make that decision because we think – for the right reason – that the supplier of the system is obviously the expert – so why wouldn’t they make a good job of running it?
“We pay them a fee for carrying out that work, and we have probably got a robust contract in place to ensure they are delivering, so what could possibly go wrong?”
Wandsworth Council is working on moving a business-critical application that holds sensitive information about vulnerable adults and children to a hosted environment.
But its relationship with the company is breaking down and the application is not performing well, said Tidey.
“If the supplier can’t work out how much memory to allocate to a server to make the system perform, do I trust them with my data?” he asked.
“Trust between us and the supplier is breaking down. We need to rebuild it and work out how we are going to improve the performance of the system. At some point, I need to satisfy myself that I can tell my board that we still trust this supplier to keep hold of this really incredibly sensitive data.”
Data protection regulation
Robert Bond, head of the data protection and cyber group at law firm Charles Russell Speechlys, said that although he mostly trusts technology, he is not sure he trusts how people use it.
Big data itself raises “a whole load of trust issues” in terms of the merger of one dataset with another, and the root of those issues is people, said Bond.
There could always be “an idiot in the organisation who will still do something stupid”, which causes the rest of the organisation to lose trust, he said.
The new General Data Protection Regulation (GDPR), which will come into force on 25 May 2018, and the Network and Information Security (NIS) Directive will certainly have an impact on both trust and ethics.
Bond said the GDPR would be enforced with eye-watering fines, but for most companies, it is not the size of the fine that matters, but the loss of trust.
But trust can be rebuilt, said Tidey, who pointed to the TalkTalk data breach that exposed more than a million customers’ details.
“They realised they had screwed up, and actually they rebuilt trust with their customers by being honest, telling them what had happened and admitting to failure,” he said.
The GDPR threatens fines of up to €20m or 4% of annual worldwide turnover for groups of companies, whichever is greater. Even for the world’s biggest companies, that is a lot of money.
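The “whichever is greater” rule means the €20m figure acts as a floor, with the 4% of turnover figure taking over for larger groups. A minimal sketch of the calculation (the function name is illustrative, not from any official source):

```python
def gdpr_max_fine(annual_worldwide_turnover_eur: float) -> float:
    """Upper bound of a GDPR administrative fine for the most
    serious infringements: the greater of EUR 20m or 4% of
    annual worldwide turnover."""
    return max(20_000_000, 0.04 * annual_worldwide_turnover_eur)

# A group turning over EUR 10bn faces a cap of EUR 400m,
# well above the EUR 20m floor.
print(gdpr_max_fine(10_000_000_000))  # 400000000.0

# A smaller firm turning over EUR 100m stays at the floor,
# since 4% would only be EUR 4m.
print(gdpr_max_fine(100_000_000))  # 20000000
```

For any group with turnover above €500m, the 4% figure is the binding cap.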
But some big players will simply say it’s just the risk of doing business.
However, what the new regulation, particularly the privacy impact assessment (PIA), does is to require companies to implement data protection by design and to have much stronger methods of securing data.
“It means you have to carry out in-house data protection assessments,” said Bond. “So whether you are doing something different with somebody’s personal data, whether it’s data analytics, whether it’s the use of IoT, Fitbits or connected vehicles collecting large volumes of data, the PIA should make you as a business say: stop until we can convince ourselves there is no significant risk to the individual.
“Without the PIA as part of your arsenal, you can expect to be more heavily fined than if you had it in place.”
Carolina said emerging technologies also bring significant challenges.
“For example, with the internet of things, we are connecting more and more things to each other, but there seems to be less and less attention paid to the security aspect of those things,” he said. “Things that were never designed for broader global connectivity are nonetheless being connected to the broader global internet – and that is creating huge problems.”
From a legal perspective, the question about liability comes first, he said, and there is a concept that people are responsible, not machines.
“So when a machine goes wrong, courts, lawyers and policy-makers are always going to be looking for the people who caused this mistake to happen,” he said.
Bond added that just because technology enables you to do something, it doesn’t always mean you should do it. He pointed to the example of connected cars.
“Just because you can use your wearable tech to program this new connected autonomous vehicle – so that if you have type 2 diabetes and your blood-sugar level is out of kilter, the car will take you to the side of the road – it doesn’t mean you should,” he said.
“What if you have not built in the security and you are a diabetes sufferer who is also a senior cabinet minister? Somebody could hack the car and basically override the inbuilt safety you put in, force you to fall asleep and crash. You are going to have to have enhanced security.”