
Human vs digital therapy: AI falls short when IT pros need help

Stressed IT and cyber professionals are turning to AI chatbots for support, but is handing people's mental wellbeing over to algorithms really such a bright idea?

Over half of cyber security professionals lose sleep due to work-related stress, according to research by the Chartered Institute of Information Security (CIISec: 2022/23 State of the profession survey). They suffer from sleep loss and other symptoms similar to those we treat in combat veterans at PTSD Resolution, the UK ex-Forces mental health charity.

Yet increasingly, these stressed IT professionals are turning to AI chatbots for mental health support, largely because they are unable to access proper therapeutic help, or simply because it seems easier.

To us, this is very concerning. We appear to be facing a mental health crisis in the IT sector, and instead of addressing root causes, we are handing people over to algorithms.

The AI therapy market reality

The numbers are alarming: more than 1.6 million people are on a mental health waiting list in England, and the NHS estimates that up to eight million people with diagnosable conditions receive no treatment. Tech entrepreneurs have stepped in to fill this gap, at least in part, with AI-powered mental health and companion platforms, which promise a sympathetic ear and even a ‘relationship’ with a chatbot.

We can understand the appeal. These systems are available 24/7, seemingly cost-effective, and for IT professionals working irregular hours under constant pressure, they may offer immediate relief.

But accessibility is not the only consideration when dealing with vulnerable people. In fact, PTSD Resolution successfully pioneered the delivery of therapy over the internet during the Covid-19 pandemic, and we continue to offer this service today, in addition to in-person sessions.

For IT workers, some of whom are ex-military personnel who've moved into cyber security, the stress patterns can mirror combat trauma: the constant vigilance, the high-stakes decisions, and the responsibility for protecting others. These aren't simple problems that an automated, algorithmic response can solve.

The human advantage

The risks are evident, although specific cases of harm inflicted by therapy chatbots are harder to pin down. Many of these AI services claim to embed suicide-screening algorithms, automatic helpline signposting and, in at least one case, human escalation.

But unlike human therapists bound by ethical codes and professional oversight, most consumer chatbots lack mandated clinical oversight and have only rudimentary crisis-escalation scripts.

From an evolutionary viewpoint, human distress has always required a human response. Our ancestors needed others who could read facial expressions, interpret vocal nuances, and understand contextual factors. This is how our brains are wired to process and heal from trauma.

AI chatbots lack these capabilities. They cannot observe body language during panic attacks, detect subtle voice changes indicating deception about mental state, or understand the complex interplay between work pressures and personal circumstances. Unlike AI, a human may notice that someone in distress who claims to be OK might be masking.

General-purpose chatbots may lack safety parameters or any reliable way of identifying when an issue needs to be taken over by a therapist. For IT professionals dealing with moral injury, such as being forced to implement surveillance systems against their values, or making decisions affecting thousands of users' data security, this contextual understanding is crucial.

There is also the problem of automation bias. IT professionals may be particularly susceptible to trusting algorithmic advice over human judgment, creating a dangerous feedback loop in which those most likely to use these systems are most vulnerable to their limitations.

Privacy and security concerns

IT professionals should be particularly alarmed by privacy implications. Human therapists operate under strict confidentiality rules, protected by laws and regulations. But ChatGPT acknowledges that engineers “may occasionally review conversations to improve the model.”

Consider the implications: your most private thoughts, shared in a moment of vulnerability, potentially reviewed by programmers optimising for user engagement rather than therapeutic outcomes – or even by a state intelligence organisation or criminal gang hacking that data for their own nefarious purposes.

Human Givens therapy

The human therapy alternative has been tested and proven effective. PTSD Resolution uses a therapy developed by the Human Givens Institute (HGI), and all 200 therapists in the charity’s network are qualified members. HGI recognises that humans have innate emotional needs: security, autonomy, achievement, meaning, and others. When these needs aren't met, psychological distress follows.

Tony Gauvain, an HGI therapist and retired army colonel who chairs PTSD Resolution, explains: “Executive burnout and military trauma share similar symptoms – depression, anger, insomnia. It's about feeling overwhelmed and unable to cope, whether from a military incident or stressful encounters with management.”

HG therapy acknowledges the fundamentals of human psychology: we are pattern-matching creatures. Skilled therapists can identify metaphors in language, recognise processing patterns, and work with imagination to reframe traumatic experiences. Crucially, they adapt in real-time based on the client's often very subtle responses – something no algorithm can replicate. At least not yet.

There is clear evidence for this approach. PTSD Resolution achieves a 68% reliable improvement rate with 80% treatment completion, typically delivered in around six sessions, according to a King's College London study, published in Occupational Medicine in March 2025.

At £940 per treatment course – delivered free of charge to UK Forces’ veterans, reservists and their families – it is highly cost-effective compared with the long-term impacts of untreated trauma, and even with other person-to-person therapies. We are very lean in our operation, owning no assets and channelling donations to pay for the therapists’ time for each session.


Real-world success

We've seen this approach work with IT professionals experiencing constant fight-or-flight mode due to work pressures, but unable to take the natural action their stress response demands. Unlike our ancestors who could fight or flee threats, modern workers must sit at desks pretending everything's fine while their nervous systems are in overdrive.

Through our Trauma Awareness Training for Employers (Tate) programme, the charity has worked with companies like Anglo American. Following training, 100% of delegates reported significantly increased confidence in identifying and supporting colleagues experiencing trauma.

The King's College evaluation found that our therapy clients showed sustained improvement, despite often working with people who had complex post-traumatic stress disorder (PTSD) and had been failed by other services.

Most recently, we formed a strategic partnership with CIISec, with services now available to its membership of more than 10,000 cyber security professionals. This collaboration provides both mental health support through trauma awareness training and access to professional therapy.

The bottom line

AI may have supplementary roles – perhaps for basic education or support between therapy sessions. But as a replacement for human therapists? No. No AI chatbot has regulatory approval in the UK, or FDA approval in the US, to treat mental health conditions, and the documented risks are too significant.

For IT professionals struggling with burnout, depression, or work-related trauma, the solution is not better algorithms – it's better access to qualified human therapists who understand this industry's unique pressures.

Ultimately, healing happens in a relationship. It occurs when one human truly understands another's experience and guides them towards meeting fundamental emotional needs. No algorithm can replicate that.

The choice is not between convenience and inconvenience – not when a full HG therapy session is available over Zoom, often within days of a first exploratory contact call. The choice is, in fact, between genuine help and a digital simulation of care.

Malcolm Hanson is clinical director at PTSD Resolution.
