
AI may be a solution to the social care crisis, but what are the legal concerns?

Artificial intelligence and robotics have great potential to assist in social care environments, but due care must be given to data protection and privacy, among other legal concerns

It is well documented that the UK’s social care sector is in crisis, with a funding shortage at its heart leading to increasing difficulties in recruiting and retaining trained staff.

Workforce development charity Skills for Care regularly investigates and reports on the state of the adult social care sector and workforce in England. Its latest report estimated that the staff turnover rate was a remarkable 30.7%, equivalent to about 390,000 people leaving their roles in the previous 12 months. 

There are 110,000 vacancies at any given time, 76,000 of which are care worker roles. With a continuing shortfall of skilled carers, patients are being short-changed. Staffing shortages mean carers are unable to spend enough time with patients, regardless of need. Patients are often seen on a prioritised basis – essential care is given, but at the expense of an emotional relationship, which often leaves residents isolated and lonely.

In response, the University of Bedfordshire and Advinia Health Care are collaborating on a proposed solution in a £2.5m European Union (EU)-funded trial using humanoid companions in care homes. The 4ft-tall robot, known as Pepper, was designed by SoftBank Robotics in Japan and is intended to interact with residents.

Over time, the artificial intelligence (AI) learns residents’ favourite music, videos and games. It can hold a conversation and, using facial recognition software, can tell whether a person is interested in a topic and change the subject if not – adapting to the needs of the resident.

And if the robot companions can successfully recognise a person in distress, they will be able to alert a care worker. This could offer the best of both worlds – addressing the shortage of qualified care workers and giving residents greater interaction, which leads to increased mental wellbeing.
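To make that behaviour concrete, a minimal sketch of the adapt-or-alert loop might look like the Python below. The function names, thresholds and perception models are illustrative assumptions – SoftBank’s actual Pepper software is proprietary and nothing here comes from its SDK.

```python
# A minimal sketch of the adapt-or-alert loop described above.
# estimate_engagement and estimate_distress are stand-ins for whatever
# perception models a real robot would run on camera frames.

import random
from dataclasses import dataclass, field

@dataclass
class Resident:
    name: str
    favourite_topics: list = field(
        default_factory=lambda: ["music", "family photos", "gardening"])

def estimate_engagement(face_frame) -> float:
    """Placeholder for a facial-expression model scoring interest in [0, 1]."""
    return random.random()

def estimate_distress(face_frame) -> float:
    """Placeholder for a distress classifier scoring distress in [0, 1]."""
    return random.random()

def alert_care_worker(resident: Resident, reason: str) -> None:
    print(f"ALERT for {resident.name}: {reason}")

def conversation_step(resident: Resident, topic: str, face_frame) -> str:
    # Safety first: any strong distress signal is handed to a human.
    if estimate_distress(face_frame) > 0.8:
        alert_care_worker(resident, "possible distress detected")
    # If the resident seems disengaged, switch to a learned favourite topic.
    if estimate_engagement(face_frame) < 0.3:
        return random.choice(resident.favourite_topics)
    return topic

resident = Resident("Margaret")
print(conversation_step(resident, topic="the weather", face_frame=None))
```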

This is not the first time that AI has been tested in care homes. In 2018, robotic pets attended to elderly residents at Kenwith Castle care home in Devon. KC the dog and Keno the pony engaged with residents, responding to their petting and noises and providing effective interaction with dementia patients.

They were not considered a replacement for the live animals that visited the home, but an additional resource. One family was so impressed that it purchased a robot dog for a resident. These AI companions do not require feeding (save for batteries and charging) and there is no risk of neglect should the elderly person become ill or incapacitated for any length of time.

Uses of AI

AI was a big topic at the recent AI for Good summit in Geneva, Switzerland. Of particular interest was the Microsoft chatbot, or virtual assistant. Designed to support doctors, it has access to medical content created by trusted third parties and can offer a diagnosis using a general symptom-checking capability.

The chatbot can also use its natural language processing capability to assess whether a patient is upset and, if heightened emotion is detected, refer them on to a human, much like the robot companion Pepper.
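A toy sketch of that escalation logic follows, using keyword scoring as a crude stand-in for real natural language processing; the names and threshold are assumptions, not Microsoft’s actual chatbot API.

```python
# A toy sketch of "detect heightened emotion, then hand off to a human".
# The keyword scoring is a crude stand-in for real NLP; the escalation
# logic is the point of the example.

DISTRESS_CUES = {"scared", "alone", "pain", "can't cope", "help"}

def emotion_score(utterance: str) -> float:
    """Crude emotion estimate: how many distress cues appear in the text."""
    text = utterance.lower()
    hits = sum(cue in text for cue in DISTRESS_CUES)
    return min(1.0, hits / 2)

def handle_utterance(utterance: str, threshold: float = 0.5) -> str:
    if emotion_score(utterance) >= threshold:
        # Heightened emotion detected: stop automated advice, refer on.
        return "ESCALATE: routing the conversation to a human"
    return "CONTINUE: automated symptom checking"

print(handle_utterance("I'm in pain and I feel so alone"))   # escalates
print(handle_utterance("What are the symptoms of a cold?"))  # stays automated
```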

Another practical use of AI may be to recognise signs of abuse – either direct incidents that could be recorded and relayed to authorities, or indirect incidents resulting from changes recognised in the patient, whether physical or emotional.

The technology could provide a report, supported by video evidence, to family members or those with legal responsibility for care, such as attorneys or deputies, who could then review the material. Consent to such filming could easily become part of a care home contract, although it is vital that footage is handled sensitively and deleted regularly to protect residents’ privacy.
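A hedged sketch of that retention discipline might look like the following, assuming an illustrative 30-day window and list of authorised roles rather than any legal standard:

```python
# Sketch of the retention discipline suggested above: footage is kept only
# long enough to be reviewed by authorised parties, then deleted.
# The 30-day window and role names are assumptions for illustration.

from datetime import datetime, timedelta

RETENTION = timedelta(days=30)  # assumed policy, not a legal standard
AUTHORISED_ROLES = {"attorney", "deputy", "family_member"}

class IncidentStore:
    def __init__(self):
        self._incidents = []  # (recorded_at, video_ref, summary)

    def record(self, video_ref: str, summary: str) -> None:
        self._incidents.append((datetime.utcnow(), video_ref, summary))

    def report_for(self, role: str):
        """Release incident material only to legally responsible reviewers."""
        if role not in AUTHORISED_ROLES:
            raise PermissionError(f"{role} may not view incident footage")
        self.purge_expired()
        return list(self._incidents)

    def purge_expired(self) -> None:
        """Delete footage past its retention window to protect privacy."""
        cutoff = datetime.utcnow() - RETENTION
        self._incidents = [i for i in self._incidents if i[0] >= cutoff]

store = IncidentStore()
store.record("cam4/2019-08-12T10:31", "possible rough handling observed")
print(store.report_for("deputy"))  # authorised reviewer sees the summary
```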

A delicate balance must be maintained between protection against potential abuse and the individual right to privacy.


These developments have not been without controversy and concern, however – mainly over the loss of human touch and the risk of privacy breaches. Judy Downey of the Relatives and Residents Association charity believes that using robots in care homes “is treating people like commodities”, and there is a concern that the mere presence of a robot will be treated as a box ticked for a person’s emotional needs being met.

To address this concern, Soumya Swaminathan of the World Health Organization has called for a global governance framework setting minimum standards for AI in healthcare.

For the AI to work in these examples, a huge global database of anonymised health data would need to be created. This throws up serious concerns about privacy and permission. 
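Part of why the concern is serious: simply stripping names is pseudonymisation, not anonymisation, and pseudonymised data is still treated as personal data under the GDPR. The toy sketch below, with an assumed per-deployment salt, shows how such records remain linkable to one individual:

```python
# Sketch of why "anonymised" is harder than it sounds: replacing a name
# with a salted hash (pseudonymisation) still leaves a stable identifier,
# so records remain linkable across the database. The salt and field
# names are assumptions for illustration.

import hashlib

SALT = b"per-deployment-secret"  # assumed: one secret per data controller

def pseudonymise(name: str) -> str:
    return hashlib.sha256(SALT + name.encode()).hexdigest()[:16]

record = {"patient": pseudonymise("Jane Doe"), "condition": "dementia"}
print(record)  # the name is gone, but the same person always maps to the
               # same token, so their records can still be joined together
```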

Sensitive data

In the EU, the General Data Protection Regulation (GDPR), which came into force in 2018 with the aim of ensuring that our personal data is protected, continues to be a hot topic. It contains a special category known as “sensitive personal data”, which covers, among other things, a person’s physical, physiological, genetic, mental and biometric data. Anything that reveals a person’s health status is covered by enhanced rules on the collection and processing of such data.
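In code terms, handling special category data might be gated like the illustrative sketch below. The enum values paraphrase Article 9(2) of the GDPR; this is an assumption-laden toy, not a real compliance library or legal advice.

```python
# Illustrative sketch of treating health data as GDPR "special category"
# data: processing is refused unless an explicit lawful basis is recorded.

from enum import Enum, auto
from typing import Optional

class LawfulBasis(Enum):
    EXPLICIT_CONSENT = auto()       # Art. 9(2)(a)
    VITAL_INTERESTS = auto()        # Art. 9(2)(c)
    HEALTH_OR_SOCIAL_CARE = auto()  # Art. 9(2)(h)

SPECIAL_CATEGORIES = {"health", "genetic", "biometric"}

def store_record(category: str, value: str,
                 basis: Optional[LawfulBasis] = None) -> None:
    # Ordinary data passes through; special category data needs a basis.
    if category in SPECIAL_CATEGORIES and basis is None:
        raise PermissionError(f"no lawful basis to process {category} data")
    print(f"stored {category} record under basis {basis}")

store_record("favourite_music", "jazz")           # ordinary data: allowed
store_record("health", "blood pressure reading",
             basis=LawfulBasis.EXPLICIT_CONSENT)  # special category: allowed
```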

Applying this to care home residents, is it right that, through interaction with Pepper or a similar AI, a full database of someone’s sensitive personal data would be stored? An individual may be able to consent to this – but what would happen if they lost the capacity to give ongoing consent? A person who has never had sufficient mental capacity will be unable to give consent.

It is possible that a court could provide the required consent – but this is unlikely because it may have no way of knowing whether it would be overriding the individual’s wishes.

These developments show that AI has the potential to play a valuable role in supporting the most vulnerable people in our society, acting as a companion and lending a digital voice to those who cannot speak up for themselves. But we must never forget who is at the heart of these considerations – the legal framework needs to catch up with the technology, both to protect them and to give it a viable chance of success.

Lindsay Taylor is a lawyer in the technology sector at law firm Coffin Mew.
