
Cambridge Cognition CTO on applying server monitoring principles to suicide prevention

Cambridge Cognition CTO, Ricky Dolphin, discusses why it pays to think outside the box when trying to find technological answers to business and customer problems

The user community is not averse to finding alternative uses for products, sometimes far removed from what their creators originally designed them to do.

The technology world is no different, but such repurposing is often talked about negatively, or as part of a discussion about the dangers of products falling into the wrong hands or being misappropriated by organisations with nefarious intentions.

In an enterprise IT context, there is sometimes reluctance from IT leaders to use technologies in ways that stray too far away from their original purpose or ideal deployment environment for licensing, support and data protection reasons.

Sometimes, though, looking beyond the original use case or intended deployment setting for a technology can yield huge benefits – both for businesses and society, says Ricky Dolphin, chief technology officer (CTO) of UK-based cognitive assessment software provider, Cambridge Cognition.

For this reason, companies that might be struggling to find a technology that will solve a particular business or customer problem they are facing should never limit their search to products and services that are designed specifically to fit the bill, he tells Computer Weekly.

“There are plenty of examples throughout history where products that were never designed to fulfil a particular function have actually been very successful when they get utilised in that way,” he says.

“The challenge is to get individuals to look at technology and question what problems it solves outside of its current application area.

“We should be challenging [IT leaders] looking for software solutions to not just look at products designed for that particular purpose, but more generally about software out there that does a similar job, even if it was never originally designed for that purpose.”

By the same token, technology firms should also take a broader, more open-minded view of their own product and service portfolios, in the hope of unearthing use cases or vertical markets their offerings might prove a good fit for.

“For people like myself who are responsible for setting the strategic [technology] direction in a company, we should look at our own software, product and hardware lines and think out of the box in terms of how something can be used,” says Dolphin.

“This could give us opportunities to put products to use in ways we never envisaged, which could be both good for our business and society.”

Expanding use cases

Dolphin’s views on this subject are informed by his own experience of repurposing a technology that is traditionally used by enterprises to monitor the health and status of their IT infrastructure, and putting it to use in a healthcare setting.

Cambridge Cognition makes software that supports clinical trials pertaining to the development of safer and more effective treatments for brain-related health conditions.

Its products are used by academic institutions, big pharmaceutical companies and healthcare organisations to design their clinical trials, find the right patients to take part in them and gain a better understanding of how participants are responding to the treatments being applied to them.

“As an organisation, we’re always looking at technology with a view to thinking, ‘This is great technology, but how does this help our business?’. I look at pieces of technology and think about how can we apply it to the cognitive space and to clinical trials,” says Dolphin.

“It’s not unusual for us to look at something and say, ‘This wasn’t designed to do what we’re going to do with it, but it will do what we want it to do’, and then twist things to make it more relevant to us.”

Some of the participants in the clinical trials the company’s software assists with present with debilitating, incurable conditions, which its clients are attempting to find new treatment options for.

In many instances, patients with these conditions may also suffer from related mental health difficulties, prompting Cambridge Cognition’s clients to request additional functionality from the firm that would make it easier for those difficulties to be tracked during trials too, says Dolphin.

“It is quite common across incurable or rare diseases, such as Alzheimer’s, Parkinson’s and multiple sclerosis, to see people experience depression because they are dealing with a disease that is quite debilitating,” he says.

“There is an obligation [for clients] to monitor the whole disease and the individual, and as part of that we are often asked if we can monitor depression, and provide an ability to alert if somebody our customers are studying is ideating suicide.”

This is usually monitored by getting an individual to fill out a form, or by a clinician asking a participant a series of questions based on the Columbia Suicide Severity Rating Scale (C-SSRS), to assess the individual’s suicide risk.

If someone is deemed to be at heightened risk of suicide, time is of the essence in getting them the support and help they may need, prompting calls from its customers for Cambridge Cognition to implement some form of alerting system.

This is particularly true as the healthcare industry has moved in recent years to a model that increasingly supports clinical trials taking place in the home, which means – depending on where the participant is taking part in the trial – they may not have a clinician on hand to offer immediate support.

“If you do have a clinician there, you want to ensure that any indication that an individual may be actively planning suicide is flagged up and action taken, but you don’t necessarily want to rely on the clinician having to identify that alone,” Dolphin adds.

“So you need a backstop in a face-to-face setting, but you also need a way of detecting and alerting when you’re not face-to-face, and that’s really where the technology came in.”

An application for escalation

The technology in question is PagerDuty’s incident alert system, which Cambridge Cognition already uses in its more “traditional sense”, as Dolphin terms it, for alerting the company’s IT operations teams to any issues that might be arising in the cloud and server infrastructure underpinning its software systems.

If a problem starts to arise in the organisation’s server infrastructure, an alert will go off and a member of the firm’s IT operations team is notified so they can take remedial action. However, if the person misses the alert or does not respond quickly enough, the system will escalate the issue to the next person in the chain of command until someone steps in.
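
Conceptually, the pattern is simple: page a responder, wait for an acknowledgement, and move down the chain if none arrives. PagerDuty implements this through configurable escalation policies rather than user-written code, so the following is purely an illustrative sketch of the behaviour, with hypothetical responder and timing parameters:

```python
import time
from typing import Callable, Optional

def notify_with_escalation(
    alert: str,
    escalation_chain: list[str],
    page: Callable[[str, str], None],        # sends the alert to one responder
    is_acknowledged: Callable[[str], bool],  # checks whether anyone has responded
    ack_timeout_s: float = 300.0,
    poll_s: float = 5.0,
) -> Optional[str]:
    """Page each responder in turn, escalating to the next person in the
    chain whenever the alert goes unacknowledged within the timeout."""
    for responder in escalation_chain:
        page(responder, alert)
        deadline = time.monotonic() + ack_timeout_s
        while time.monotonic() < deadline:
            if is_acknowledged(alert):
                return responder  # someone has stepped in
            time.sleep(poll_s)
    return None  # the whole chain was exhausted without a response
```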

Given how highly regulated the healthcare industry is, and how critically important real-time reporting of how participants respond to treatments is in a clinical trials context, downtime has to be kept to a minimum, says Dolphin, which is where PagerDuty helps.

“When our software is used during a clinical trial where a drug is first used in humans, there will be very tight parameters in terms of when to collect the information needed to do cognitive testing or to collect data,” he says.

“There might be a two-minute window, and for our software not to be available or not to function is not acceptable. The same is true if the software fails to collect the information, because there is no way to go back and collect it again.”

There are other systems that could provide similar support from a suicide prevention perspective, says Dolphin, but they have their shortcomings.

“There are plenty of alerting systems for hospitals designed to alert personnel around the hospital, but they’re not really designed to alert someone’s mobile phone or escalate things,” he says.

Using SMS messaging technology would partially solve the problem, but it lacks the escalation element that sees the PagerDuty system come into its own in a clinical trials setting.

“With SMS messaging, my immediate thought is, ‘How do you know if the clinician has received the message and is doing something about it?’, which is like the situations we find ourselves in with IT support,” he says.

“I’ve been on the backend of getting support requests in the middle of the night or the early hours of the morning to say servers have gone down and something needs to be done about it, and it is easy to miss the call or message.

“But it’s one thing when your servers are having problems or going down and you don’t manage to get hold of somebody in IT operations to do something about it, but it is quite another if somebody is actively planning suicide. You don’t have the luxury of time.”

The benefits of repurposing tech

The process of readying the PagerDuty platform for its deployment in this alternative setting was relatively straightforward, says Dolphin.

“In our cloud environments, if a server goes down, we get an alert triggered by a piece of logic, and that’s where the application programming interface [API] in PagerDuty came in useful,” he says.

“By looking at that, we worked out that we could build the logic into our electronic forms [questionnaire] and that logic can send an API call to PagerDuty, asking us to raise specific instances of an alert.

“That can then start a sequence of events where the right clinician is notified, based on the information that has been passed in the API call – and if that person doesn’t respond, PagerDuty will take care of the escalation.”
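
Cambridge Cognition has not published its integration code, but the flow Dolphin describes maps closely onto PagerDuty’s public Events API v2, which accepts a JSON “trigger” event over HTTPS. A rough sketch of what a form-scoring service might send is below; the routing key, participant fields and risk threshold are hypothetical:

```python
import requests

# PagerDuty Events API v2 endpoint (documented public API).
EVENTS_URL = "https://events.pagerduty.com/v2/enqueue"

# Hypothetical integration (routing) key, issued when a service
# integration is configured in PagerDuty.
ROUTING_KEY = "YOUR_INTEGRATION_KEY"

RISK_THRESHOLD = 3  # illustrative cut-off on the questionnaire score

def maybe_raise_alert(participant_id: str, risk_score: int) -> None:
    """Send a trigger event to PagerDuty when a completed questionnaire
    crosses the (hypothetical) risk threshold; PagerDuty then handles
    notification and escalation according to its configured policy."""
    if risk_score < RISK_THRESHOLD:
        return
    event = {
        "routing_key": ROUTING_KEY,
        "event_action": "trigger",
        "payload": {
            "summary": f"Elevated risk flagged for participant {participant_id}",
            "source": "clinical-trial-questionnaire",
            "severity": "critical",
            "custom_details": {
                "participant_id": participant_id,
                "risk_score": risk_score,
            },
        },
    }
    response = requests.post(EVENTS_URL, json=event, timeout=10)
    response.raise_for_status()
```

Because the event carries custom details, PagerDuty can route it to the appropriate clinician and escalate automatically if nobody acknowledges it – exactly the behaviour the platform provides in the server-monitoring case.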

According to Dolphin, the repurposing of PagerDuty’s technology in this way has been enthusiastically received by the client it was originally trialled with, as well as several others that have since gone on to deploy it.

This feedback has prompted Dolphin to think beyond the confines of clinical trials about how this functionality could be put to use in other settings to make it easier for those grappling with suicidal thoughts to get access to help and support services in a timely way.

“What we really need as a society is to have this type of technology as an app on an iPhone or Android where an individual can go in, complete one of these forms and [alert] an individual who can take action on it,” he says. “The key is to link the technology to organisations that can help and have the bandwidth to help.”

Suicide prevention charities, for example, could reach out to an individual when an alert is received and let them know someone is looking out for them, and provide additional support as needed.

Beyond that, there is scope for artificial intelligence (AI) and machine learning technologies to be incorporated into this process too, particularly when it comes to analysing and acting on social media posts, for example, that suggest the author might be contemplating taking their own life.

“There’s always a delicate balance to be struck between privacy and the correct use of information,” says Dolphin. “It would mean getting the balance right by getting people to consent to that level of monitoring to provide them with help should they need it without being invasive.”

One way of achieving this could be by training machine learning models and AI tools on social media or internet forum posts to look out for certain trigger words that indicate the author might be experiencing suicidal thoughts.

When those are detected, the tech could generate a response that asks the author if they want to fill out a questionnaire so their feelings can be explored a little deeper.

“If somebody clicks on a link to that questionnaire because it’s been picked up that they’re feeling suicidal, or there is something in the language they’re using that has triggered the monitoring technology to ask if they would like to complete one of these forms, [that means] people are actually consenting to complete it and get help,” he says.
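
As a toy illustration of the trigger-word screening Dolphin envisages, the first step might look like the sketch below. The phrase list and function names are entirely hypothetical, and a real system would rely on a clinically validated model and explicit user consent rather than keyword matching:

```python
# Hypothetical phrase list, for illustration only; a production system
# would use a clinically validated model, not hand-written keywords.
TRIGGER_PHRASES = {
    "no reason to live",
    "better off without me",
    "want to end it all",
}

def should_offer_questionnaire(post_text: str) -> bool:
    """Return True if a post contains language that should prompt an
    offer to complete a structured, consent-based risk questionnaire."""
    lowered = post_text.lower()
    return any(phrase in lowered for phrase in TRIGGER_PHRASES)

if should_offer_questionnaire("some days it feels like there's no reason to live"):
    print("Offer the author a link to the questionnaire and support services")
```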
