Fears over drive to replace mental health counsellors with AI to clear the waiting list of over a MILLION patients

The NHS must reconsider its plans to replace mental health counsellors with artificial intelligence (AI), experts have warned.

Smartphone apps designed to support people with anxiety and depression are being rolled out in parts of England, with the software even being offered to some patients on NHS waiting lists as part of an ongoing trial.

The interactive ‘chatbots’ help people with mental illness by guiding them through cognitive behavioural therapy – a form of talking therapy – and meditation and breathing exercises to ease their suffering.

But the initiative – first proposed by former Health Secretary Matt Hancock in June 2021 – has raised alarm that some patients in need of proper psychiatric care may be turning to the apps instead of getting the help they really need.

And some experts fear that the lack of human involvement could actually worsen mental health problems in vulnerable people.

The British Association for Counselling and Psychotherapy (BACP), the leading professional body for mental health workers, told The Mail on Sunday that the NHS should not try to tackle the national shortage of such workers by simply replacing them with AI-powered chatbots.

The organisation has called on the NHS to focus on recruiting more staff instead. ‘We do not believe AI can recreate and replace the human elements of therapy,’ said Martin Bell, head of policy and public affairs at the BACP.

‘Counselling is based on a deeply human process that involves complex emotions. The relationship between therapist and client plays a crucial role in therapy.’

Around five million Brits suffer from anxiety or depression, and around 1.2 million are waiting to see an NHS mental health specialist. This includes almost 700,000 children, government figures show. Waiting times are so long that thousands of patients are turning up at A&E to seek help, the Royal College of Psychiatrists says.

Experts claim that AI chatbots are now being used to tackle this growing crisis. Smartphone app Wysa has already been made available to thousands of teenagers in West London to help them cope with mental illness.

When a user logs in, the app asks how their day is going. If they say they are feeling anxious, for example, the chatbot guides them through meditation and breathing exercises to ease their mood, using language designed to convey empathy and support.

The app is also being used in a £1 million trial for patients on the NHS mental health waiting list in North London and Milton Keynes, comparing their wellbeing with that of patients on the waiting list who do not have access to the app.

However, many published studies highlighting the benefits of Wysa – and another widely used app called Woebot – have been conducted by the companies themselves. Experts worry this means the evidence that the software works may be unreliable.

‘Some people may feel less embarrassed talking to a chatbot about their mental state,’ says Dr Elizabeth Cotton, senior lecturer at Cardiff School of Management and author of a forthcoming book titled UberTherapy: The New Business Of Mental Health.

‘But a chatbot cannot deal with clinical depression. They don’t do much more than just say, “cheer up, honey”. And they are no help to teenagers living in poverty, suspended from school, or dealing with abusive parents.’

Another concern is that some chatbots ‘hallucinate’, meaning they invent responses when they cannot provide an appropriate answer – something that is inherently dangerous for someone in a fragile mental state.

Meanwhile, AI was also blamed in the case of 21-year-old Jaswant Singh Chail, who was jailed for nine years last month for breaking into Windsor Castle in 2021 intending to kill the Queen with a crossbow.

The trial at the Old Bailey heard that Chail had exchanged more than 5,000 messages with an online bot he created through an app called Replika – which describes itself on its website as ‘an empathetic friend’.

And the National Eating Disorders Association in the US was forced earlier this year to pull the plug on Tessa, a chatbot it developed to replace care providers. It followed claims from Sharon Maxwell, an eating disorder sufferer from San Diego, that the bot told her a good way to cope was to weigh herself regularly and even measure her body fat with calipers – all steps likely to make her condition worse.

‘If I had used this chatbot when I was in the throes of my eating disorder… I wouldn’t be alive today,’ she wrote on Instagram.

A Wysa spokesperson said the AI chatbot is programmed to give only responses approved by doctors. ‘Wysa’s responses are predetermined by doctors, which reduces the chance of inappropriate things being said,’ she added. ‘We do not market ourselves as an app suitable for someone who is experiencing suicidal thoughts or intends to harm themselves.’

An NHS spokesperson said: ‘The National Institute for Health and Care Excellence has made it clear that the digital therapies it has provisionally approved for mental health care are not a replacement for NHS therapists.’