The Information Commissioner’s Office has ruled the Royal Free NHS Foundation Trust failed to comply with the Data Protection Act when it provided patient details to Google-owned artificial intelligence (AI) firm DeepMind.
The ICO began an investigation into the deal in May 2016, after receiving a complaint from a member of the public.
The deal gave DeepMind access to the healthcare records of 1.6 million patients that pass through three hospitals in North London, which fall under the care of the Royal Free Hospital Trust, as part of a trial to test an alert, diagnosis and detection system for acute kidney injury.
Despite assurances that Google could not use the data in any other part of its business, that the data was stored in the UK by a third party, and that all data would be deleted when the agreement expires at the end of September 2017, the data-sharing agreement raised concerns when it was first brought to public attention.
At the time, a spokesperson for the Royal Free said that, although patients would not be aware that data was being made available, it would be encrypted and such an arrangement was standard practice.
“Our arrangement with DeepMind is the standard NHS information-sharing agreement set out by NHS England’s corporate information governance department, and is the same as the other 1,500 agreements with third party organisations that process NHS patient data,” the Royal Free said in a statement.
But the ICO investigation found several shortcomings in how the data was handled, including that patients were not adequately informed that their data would be used as part of the test.
As part of the investigation, the ICO asked national data guardian Fiona Caldicott to provide advice on the use of implied consent, which means that patients are not asked to consent each time their data is used.
In her letter, Caldicott said that as the data was being used to test a detection and diagnosis system, and not for direct patient care, her opinion remained that “it would not have been within the reasonable expectation of patients that their records would have been shared for this purpose”.
Acting in line with the law
As a result of the investigation’s findings, the ICO has asked the trust to commit to changes ensuring it is acting in line with the law by signing an undertaking.
Elizabeth Denham, information commissioner, said there was no doubt about the huge potential that creative use of data could have for patient care and clinical improvements. “But the price of innovation does not need to be the erosion of fundamental privacy rights,” she said.
Calls for transparency
According to Denham, the trust could and should have been far more transparent with patients as to what was happening with their data. “We’ve asked the Trust to commit to making changes that will address those shortcomings, and their co-operation is welcome.
“The Data Protection Act is not a barrier to innovation, but it does need to be considered wherever people’s data is being used,” she said.
Following the ICO investigation, the Trust has been asked to:
- Establish a proper legal basis under the Data Protection Act for the Google DeepMind project and for any future trials;
- Set out how it will comply with its duty of confidence to patients in any future trial involving personal data;
- Complete a privacy impact assessment, including specific steps to ensure transparency; and
- Commission an audit of the trial, the results of which will be shared with the Information Commissioner, and which the Commissioner will have the right to publish as she sees appropriate.
Denham said the undertaking the ICO has asked the Trust to sign, and the letter outlining the conclusions of the ICO’s investigation, have both been published.
She has also outlined four lessons to be learnt from the case in a blog post:
It’s not a choice between privacy or innovation
The shortcomings the ICO found were avoidable, said Denham. “The price of innovation didn’t need to be the erosion of legally ensured fundamental privacy rights,” she said.
Denham said she is confident the trust can comply with the changes the ICO has asked for and still continue its valuable work. “This will also be true for the wider NHS as deployments of innovative technologies are considered,” she said.
Don’t dive in too quickly
Privacy impact assessments are a key data protection tool and play an increasingly prominent role in data protection, said Denham. “They’re a crucial part of digital innovation. Our investigation found that the Trust did carry out a privacy impact assessment, but only after Google DeepMind had already been given patient data. This is not how things should work,” she said.
Instead, Denham said privacy impact assessments should be carried out as early as possible, so that their findings can be factored in from the start to help meet legal obligations and public expectations.
New cloud processing technologies mean you can, not that you always should
Just because evolving technologies enable organisations to do more does not mean these tools should always be fully utilised, particularly during a trial initiative. “In this case, we haven’t been persuaded that it was necessary and proportionate to disclose 1.6 million patient records to test the application,” she said.
NHS organisations need to remember that, when dealing with the medical information of real patients, they should consider whether the expected benefits are likely to be outweighed by the data protection implications for those patients. “Apply the proportionality principle as a guiding factor in deciding whether you should move forward,” she said.
Know the law and follow it
When setting out to test the clinical safety of a new service, NHS trusts need to remember to follow data protection rules, said Denham. “Whether you contact the ICO or obtain expert data protection advice as early as possible in the process, get this right from the start and you’ll be well-placed to make sure people’s information rights aren’t the price of improved health.”