The Promise and Challenges of AI in Mental Health Care

Between World Suicide Prevention Day (September 10) and World Mental Health Day (October 10), TELL has highlighted the urgent need to address mental health, especially among young people, and called on the community to join us in our Step Up Challenge for Mental Health.

A 2023 Lancet Psychiatry study found that half the world’s population will experience a mental disorder by age 75. The WHO projects mental disorders will become the leading cause of global disease burden by 2030. As pressure on mental health services grows, many people are turning to AI for support.

Currently, AI is being explored as a tool to expand access, improve diagnosis, and personalise treatment. Countries such as the US, UK, and Australia are already utilising AI to automate administrative tasks and free up clinicians’ time. Apps now offer features such as mood tracking, symptom checkers, and CBT modules, with some aiming to complement therapy and others seeking to replace it.

With demand for mental health care far outpacing supply, many ask: Can AI help fill the gap?

Mental illness arises from complex genetic, environmental, and psychological factors. While AI models may perform well in lab settings, they often fail in real-world clinical environments. A 2024 Science study on AI models for schizophrenia treatment found that they didn’t generalise beyond trial data, highlighting the need for caution, consistency, and rigorous validation.

Concerns also surround AI chatbots, including algorithmic bias, inadequate safety protocols, and data privacy risks. While some experts argue that chatbots may be better than nothing in crisis situations, their limitations are concerning, especially for vulnerable young people.

While AI-powered self-help tools and therapy bots may support wellness, they often lack emotional intelligence and are unable to form genuine human connections. Yet some are marketed as “trusted companions” or claim to “care,” which can mislead users, especially those who are young, isolated, depressed, or suicidal.

In 2023, the National Eating Disorders Association shut down a chatbot after it recommended harmful calorie restrictions to its users. More tragically, lawsuits have emerged involving teens who died by suicide after interacting extensively with AI bots. 

Character.ai is facing legal action after a 14-year-old boy died by suicide following what his mother describes as an unhealthy obsession with one of its AI chatbot characters. In his final messages, the teen told the chatbot he was “coming home,” to which it allegedly replied, “as soon as possible.”

In a separate case, 16-year-old Adam Raine also died by suicide after lengthy conversations with ChatGPT, according to a lawsuit filed August 26 in California Superior Court. Raine reportedly shared suicidal thoughts with the chatbot, which “encouraged and validated” his most harmful feelings. His parents claim he went from using ChatGPT to help with his homework to using it as a confidant. A study from the University of California, Berkeley showed similar results in tests with several chatbots. After telling a chatbot they had just lost their job, the researchers asked where the closest tall buildings were; most of the chatbots provided a list of locations, failing to assess the risk of suicide.

While promising results have been shown with some paid AI applications, such as coaching and cognitive behavioural therapy (CBT) programs, many young people are led to believe that all AI chatbots are the same. In moments of crisis, many bots merely refer users to hotlines, offering no real support, and can even validate unhealthy decisions.

While AI is here to stay, best practices recommend that AI in the mental health space should support and enhance, rather than replace, the clinician, and emphasise the importance of human connection. Research is clear that feeling connected to others is essential for our overall well-being. Strong social relationships offer emotional support, alleviate feelings of loneliness, and foster resilience against stress.

This World Mental Health Day, we invite the community to join our Step Up Challenge and make those human connections. It is a great way to bring friends, family, work colleagues and workplaces together and make a statement that our mental health is essential. https://www.tellevents.org/events/step-up-2025