
NHS must transform to reap the benefits of AI

Artificial intelligence could ease the pressures on the NHS, but first the service needs fully integrated health records and to tackle challenges around trust and data

Artificial intelligence (AI) has the potential to transform the way the NHS delivers care, but to take advantage of the technology, the health service needs a culture change, say experts.

At the AI Summit in London last week, there were many examples of how AI could improve care – but the NHS is far from ready to benefit from these advances.

Dominic King, clinical lead for Google DeepMind, told the audience at a techUK workshop during the summit that AI could “really help the NHS and the current challenges it faces”, but that it was “challenging to point to any scaled-up example of how AI is impacting in any specific clinical pathway at the moment in the NHS”. 

“If the truth be told, we have been very unsuccessful at tackling some of these big NHS health policy challenges,” he said.

King said most of the healthcare delivered in the NHS is not evidence-based, with one in 10 patients coming into the NHS and other health systems being “harmed in some way and exposed to an avoidable medical error”. According to King, AI could tackle some of these issues, but only if the service takes a “practical approach”.

The problem is that the NHS is still largely paper-based, despite the drive to introduce electronic records across the country, he said. And where hospitals do have electronic systems, they often do not talk to each other.

“The NHS is still very much reminiscent of a pre-smartphone age and, in most cases, it is entirely not ready for the deployment of AI algorithms,” said King.

“There are silos of data all over the place, captured in hundreds of different electronic systems that don’t speak to one another, and a large majority also still exists on paper.”

Most NHS hospitals still use paper observation charts and paper-based prescribing, he added.

“Even if we were to be successful in generating AI-driven alerts and notifications, there is actually a real problem with how you would deliver them to the clinician or the patient,” he said.

Royal Free pilot

DeepMind is currently running a pilot with the Royal Free NHS Foundation Trust, using the company’s Streams app to alert clinicians as soon as a patient is at risk of acute kidney injury (AKI).

However, even this system does not incorporate AI. Although DeepMind is confident it could build algorithms to detect AKI, King said it was “not useful to go down that route until we had a better way of delivering an AI-driven alert to clinicians”.
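The alerts Streams delivers are rule-based rather than learned. As a minimal sketch only, and not DeepMind’s implementation, a check of that kind compares a new serum creatinine result against the patient’s baseline and raises an alert when the ratio crosses staging thresholds of the sort used in published AKI guidance; the field names, thresholds and delivery step below are illustrative assumptions.

```python
# Illustrative only: a rule-based AKI alert driven by a creatinine-ratio
# check. Data structures, thresholds and the alerting step are assumptions
# for the sketch, not a description of Streams itself.
from dataclasses import dataclass
from typing import Optional


@dataclass
class CreatinineResult:
    patient_id: str
    value_umol_l: float      # new serum creatinine result, micromol/L
    baseline_umol_l: float   # baseline creatinine chosen per local policy


def aki_stage(result: CreatinineResult) -> Optional[int]:
    """Return a possible AKI stage (1-3) if the ratio against baseline
    crosses a threshold, or None if no alert is warranted."""
    ratio = result.value_umol_l / result.baseline_umol_l
    if ratio >= 3.0:
        return 3
    if ratio >= 2.0:
        return 2
    if ratio >= 1.5:
        return 1
    return None


def maybe_alert(result: CreatinineResult) -> Optional[str]:
    stage = aki_stage(result)
    if stage is None:
        return None
    # In a real deployment this message would be pushed to the responsible
    # clinician's device; here it is simply returned as a string.
    return f"Possible AKI stage {stage} for patient {result.patient_id}"


if __name__ == "__main__":
    print(maybe_alert(CreatinineResult("demo-patient",
                                       value_umol_l=180.0,
                                       baseline_umol_l=80.0)))
```

King’s point is that even a correct alert like this is of little value until there is a reliable channel for getting it in front of the right clinician in time to act.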

Meanwhile, IBM has partnered with Alder Hey Children’s Hospital, which is using the company’s Watson AI platform to support children before, during and after outpatient appointments.

The hospital offers an app that reminds children and their parents of appointments, informs them about aftercare, and enables children to ask questions about their hospital visit.

Change the experience

Andreas Haimboeck-Tichy, director for health and life sciences at IBM UKI, said the AI-driven app not only helps children learn about the process and what is likely to happen during their appointment, but that “at the same time, the hospital is learning what the patient is concerned about, and over time, as data points build up, they can change the experience in terms of how they perform their outpatient practices, for example”. 

Haimboeck-Tichy added: “The other piece where it helps is that once the outpatient appointment is completed, six weeks later, the child needs to come back for a checkup. Often, clinicians found the follow-up appointment was a waste of time because either the child was fine and therefore didn’t need to come, or they had been suffering for six weeks.

“So we can now use machine learning as a way of communicating with the hospital and actually intervene when we need to, rather than in six weeks’ time when we could have done something earlier.”
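Haimboeck-Tichy did not describe how that triage works under the hood, but one way to picture it is a simple classifier over answers collected through the app that flags children who should be seen before the routine six-week check. The sketch below is purely hypothetical: the features, data and model are illustrative assumptions, not the Alder Hey or IBM implementation.

```python
# Hypothetical sketch: triaging post-appointment questionnaire answers so
# children who report problems are reviewed before the routine six-week
# follow-up. Features, labels and model choice are illustrative only.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Synthetic training data: [pain score 0-10, days since appointment,
# reported wound problem 0/1]; label 1 = needed early intervention.
X = np.array([
    [1, 7, 0], [2, 14, 0], [0, 21, 0], [3, 10, 0],
    [8, 5, 1], [7, 12, 1], [9, 3, 0], [6, 20, 1],
])
y = np.array([0, 0, 0, 0, 1, 1, 1, 1])

model = LogisticRegression().fit(X, y)


def needs_early_review(pain: int, days: int, wound_problem: int) -> bool:
    """Flag a child for earlier review when the predicted risk is high."""
    risk = model.predict_proba([[pain, days, wound_problem]])[0, 1]
    return risk > 0.5


print(needs_early_review(pain=8, days=6, wound_problem=1))   # likely True
print(needs_early_review(pain=1, days=30, wound_problem=0))  # likely False
```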

The healthcare sector is “ripe for disruption and needs to change”, said Haimboeck-Tichy, but to fully reap the rewards of AI, the NHS needs to move to a completely different model, and that requires leadership from the top. 

But not all AI projects in healthcare are a success. Earlier this year, Public Health England launched a trial of an AI chatbot for breastfeeding mothers, called the Breastfeeding Friend. However, NHS England’s digital and social innovation lead, Dominic Cushnan, said that although the intention was there, it often didn’t work.

When a mother asked a question, the AI chatbot was responding with phrases such as “I don’t understand what you’re saying, I’m only a chatbot”, said Cushnan, which was “not helpful for our patients, and it’s not helpful for future conversations around the use of AI”.

NHS needs to join up

Cushnan said the NHS also suffers from a chronic disease known as “pilotitis”, where many small, and often successful, pilot programmes are run in different trusts, but never see large-scale deployment. Cushnan said that when he first joined the NHS, a colleague told him the service had “more pilots than British Airways”.   

NHS England is trying to combat this problem by getting trusts and clinical commissioning groups (CCGs) to join forces, link health records and share best practice, but Lydia Drumright, clinical informatics lecturer at Cambridge University’s Department of Medicine, said: “We are supposed to be one NHS, but we’re not. Trusts don’t share across trusts.”

Drumright said the NHS suffers from variable electronic data sources and poorly linked infrastructure. And despite the government making all the right noises about the potential of AI, “we have to be realistic”, she said.

“We need fully integrated electronic health records and social records for this to work well,” she added. “Imagine what we could do with fully integrated records. We need to link.”  
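Drumright did not go into the mechanics of linking, but a common first step is deterministic record linkage: matching records held by different trusts on a shared identifier such as the NHS number, with a cruder demographic fallback when it is missing. The sketch below is illustrative only; the field names and sample data are assumptions.

```python
# Illustrative deterministic record linkage across two trusts' extracts:
# match on NHS number where present, otherwise fall back to (surname, date
# of birth). Field names and data are hypothetical.
from collections import defaultdict

trust_a = [
    {"nhs_number": "9434765919", "surname": "Jones", "dob": "2010-04-02", "record": "outpatient"},
    {"nhs_number": None,         "surname": "Patel", "dob": "1984-11-30", "record": "gp_referral"},
]
trust_b = [
    {"nhs_number": "9434765919", "surname": "Jones", "dob": "2010-04-02", "record": "a_and_e"},
    {"nhs_number": "4857773456", "surname": "Patel", "dob": "1984-11-30", "record": "pathology"},
]


def link(records_a, records_b):
    """Pair up records that appear to describe the same patient."""
    by_nhs = defaultdict(list)
    by_demo = defaultdict(list)
    for rec in records_b:
        if rec["nhs_number"]:
            by_nhs[rec["nhs_number"]].append(rec)
        by_demo[(rec["surname"], rec["dob"])].append(rec)

    links = []
    for rec in records_a:
        if rec["nhs_number"] and rec["nhs_number"] in by_nhs:
            matches = by_nhs[rec["nhs_number"]]
        else:
            # Demographic fallback is error-prone: different patients can
            # share a surname and date of birth, which is one reason a
            # shared identifier matters so much.
            matches = by_demo.get((rec["surname"], rec["dob"]), [])
        for match in matches:
            links.append((rec["record"], match["record"]))
    return links


print(link(trust_a, trust_b))
# [('outpatient', 'a_and_e'), ('gp_referral', 'pathology')]
```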

Data ownership and consent 

Applying AI to rich and joined-up data could improve clinical care, give clinicians better decision support and help population health management, said Drumright, but the UK needs to standardise information governance and ethics policies.

“There is a cultural barrier associated with information governance,” she said. “People are afraid to release data even where it is legally and ethically appropriate. Ethics structures need to be dealt with.”

The use of personal data, albeit anonymised, has a chequered history in the NHS – most notably concerning the Care.data programme, which was intended to extract anonymised patient data from GPs to a central database held by the Health and Social Care Information Centre (HSCIC). The aim was for the data to be used for better analytics to understand health trends, feeding into improved preventive healthcare and research into new drugs.

But after a disastrous few years, including a botched publicity campaign that saw NHS England criticised for brushing off the public’s concerns about data privacy, the project was scrapped last year.

Cushnan said the PR fallout from that project was “not helpful for building trust”, and Drumright said Care.data had caused a “huge and painful burn”.

Drumright also raised the issue of consent and data ownership. The EU’s General Data Protection Regulation (GDPR), which comes into force in May 2018, aims to bring in stronger safeguards around the use of personal data. Drumright said the GDPR would introduce increased accountability and transparency, but added: “The one concern I have is that it’s going to be potentially harmful to the concept of informed consent.”


She said the regulation is causing hype around the need for everyone to consent every single time a patient goes into hospital – but there is a difference between consent and informed consent. 

“From an ethics point of view and a research point of view, if it is not informed consent, it’s unethical,” she said. “It’s far better to say ‘this is what we do with your data, we anonymise it and use it. If you’re unhappy, opt out’.

“That is far more ethical than getting people to agree to something they don’t fully understand and getting them to sign off on that.”

Drumright added that data ownership is a “problematic concept” and the idea that someone owns their personal health data creates a problem. 

“There is no such thing as a data owner,” she said. “Data that is about you is generally not collected by you or held by you. My care record is about me, but actually that sits with whoever collected it, and it’s not information that I generated. It’s the doctor’s perception of me.

“If I need surgery, I go to see the doctor and he makes an assessment of me. Is that data his or mine? It is a transaction between the two of us. Nobody can own that, but somebody has to protect the transaction.”

Drumright said she would like to see an ethics committee set up to handle the use of human data. 

“If we look at culture, trust and confidence, the best way to do this is to involve and engage the people you are working with,” she said.
