By Dr Bibiana Chan

AI-assisted therapy, also known as AI therapy, uses artificial intelligence to provide mental health support. It can help with mental health concerns through automated conversations, therapeutic exercises, preliminary diagnosis, consultations, and guidance on treatment options. AI therapy offers benefits such as increased accessibility, cost-effectiveness, and 24/7 availability, particularly in remote or underserved areas. It can also support early mental health screenings and approximate diagnoses for conditions such as depression and anxiety, drawing on data streams from voice, mobile phone activity, or interactions with games.
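To make the screening idea concrete, here is a deliberately minimal sketch of how passive phone-activity signals might feed a simple risk score. The feature names, weights, and threshold below are hypothetical illustrations, not any published or clinically validated model:

```python
import math

# Hypothetical passive-signal features a screening tool might derive from
# phone activity. The weights below are illustrative inventions; a real
# system would learn them from clinical data and validate them carefully.
def screening_score(night_screen_hours: float,
                    daily_messages_sent: float,
                    places_visited: float) -> float:
    """Toy logistic score between 0 and 1. A high score would only
    suggest a follow-up questionnaire (e.g. PHQ-9), never a diagnosis."""
    z = (0.8 * night_screen_hours      # late-night phone use
         - 0.05 * daily_messages_sent  # proxy for social withdrawal
         - 0.3 * places_visited        # proxy for reduced mobility
         + 0.2)                        # bias term
    return 1 / (1 + math.exp(-z))

if __name__ == "__main__":
    score = screening_score(night_screen_hours=3.0,
                            daily_messages_sent=4,
                            places_visited=1)
    print(f"screening score: {score:.2f}")  # e.g. flag for follow-up if > 0.7
```

Even in a real system, a score like this would only prompt a follow-up questionnaire or a conversation with a clinician; it is a screening aid, not a diagnosis.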
AI therapy uses machine learning and advanced algorithms to offer personalized support, but it is not intended to replace human therapists. AI lacks the emotional intelligence and nuanced understanding that trained mental health professionals bring to therapy. Some popular AI therapy chatbots include Woebot, Youper, Tess, and Wysa.
While AI therapy offers benefits, concerns remain about its effectiveness and ethics. Critics argue that AI therapists cannot provide the essential human connection that traditional therapy offers. For many users, however, AI therapy is a convenient, inexpensive alternative. For example, Wysa, an emotionally intelligent chatbot launched in 2016, has 3 million users and has been deployed in the UK’s NHS and Singapore’s pandemic response. The chatbot has received Breakthrough Device designation from the U.S. Food and Drug Administration (FDA) for treating depression and anxiety associated with chronic pain, a signal of its potential for therapeutic use.
One notable study from 2017 assessed Woebot, a widely used app, in a trial of 70 college students and found a significant reduction in depression symptoms after two weeks of use, though the intervention was short-term and lacked follow-up. Researchers have also raised concerns about whether these bots can handle crises appropriately: in one instance, Woebot failed to respond appropriately to a user expressing suicidal thoughts. The app is designed to tell users during onboarding that it is not a crisis service and to direct them to emergency services when necessary.

A 2020 review of mental health chatbots concluded that while they have potential, there isn’t enough evidence to support their widespread effectiveness. There is also concern that AI bots could create an illusion of help. Experts recommend strict regulation to ensure these apps accurately communicate their capabilities. For instance, Woebot was found to inadequately respond to reports of child sexual abuse, underscoring the need for ethical considerations in AI development.
Despite these issues, experts acknowledge that AI chatbots can be a valuable tool for some users. They can offer easily accessible resources on managing mental health issues, incorporating principles like cognitive behavioral therapy (CBT) to help users reframe negative thinking patterns. While AI therapy may not suit everyone, it can provide support for those who may not otherwise have access to traditional therapy.
AI therapy chatbots have evolved from earlier programs like ELIZA, a 1966 text-based program designed to simulate a therapist. Today’s bots, like Woebot and Wysa, are more sophisticated, using natural language processing to analyze user input and select pre-approved, evidence-based responses. While these bots can simulate human conversation, they do not compose original responses and are limited by the data and scripts they were built on.
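The pre-approved response design can be illustrated with a short, ELIZA-style sketch: scan the user’s message for keywords and return a matching canned reply, checking safety-critical patterns first. The patterns and wording here are hypothetical simplifications; production chatbots pair trained intent classifiers with clinician-reviewed content:

```python
import re

# Pre-approved replies keyed by simple keyword patterns, checked in order.
PREAPPROVED_RESPONSES = [
    # Safety-critical patterns come first so they always take priority.
    (re.compile(r"\b(suicid\w*|kill myself|end my life)\b", re.I),
     "I'm not a crisis service. Please contact emergency services or a "
     "crisis line (such as 988 in the US) right away."),
    (re.compile(r"\b(anxious|anxiety|worried)\b", re.I),
     "It sounds like you're feeling anxious. Would you like to try a "
     "short breathing exercise together?"),
    (re.compile(r"\b(sad|down|depressed)\b", re.I),
     "I'm sorry you're feeling low. Can you tell me about a thought "
     "that has been on your mind today?"),
]

# Fallback when nothing matches: invite the user to say more.
FALLBACK = "I hear you. Could you say a little more about what's going on?"

def respond(user_input: str) -> str:
    """Return the first pre-approved reply whose pattern matches."""
    for pattern, reply in PREAPPROVED_RESPONSES:
        if pattern.search(user_input):
            return reply
    return FALLBACK

if __name__ == "__main__":
    print(respond("I've been feeling really anxious about my exams."))
```

The point the sketch captures is that the bot selects among pre-written responses rather than composing new ones, which is both a safety feature and the root of its limitations.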

For some individuals, these AI bots provide valuable emotional support. As one researcher noted, interacting with a chatbot can give users a sense of being heard without judgment, an experience that can be powerful, especially for those who have never felt listened to before. However, concerns about the accuracy and quality of advice remain, as flawed responses may erode trust in the technology.
The future of AI therapy remains uncertain. While some experts are optimistic about its role in mental health care, they caution against premature celebration. AI may one day complement traditional therapy, but it is unlikely to replace human therapists entirely. As the field continues to develop, more robust data will be needed to determine the efficacy of AI therapy in supporting mental health.
In conclusion, AI-assisted therapy presents a promising new approach to mental health care, offering accessible, cost-effective support. However, it is important to acknowledge its limitations and the ethical concerns that come with its use. While AI can assist in improving mental health care, it cannot replace the genuine human connection crucial for effective therapy.
Recommended readings:
https://www.wired.com/story/mental-health-chatbots
https://seas.harvard.edu/news/2024/05/coming-out-chatbot
https://pulitzercenter.org/stories/when-your-psychologist-ai
