Is Artificial Intelligence for Mental Health Therapy Good or Bad?
With the rise of Artificial Intelligence (AI), many people, especially young people, are turning to AI for mental health advice. What counts as using “AI for mental health therapy” is often loosely defined: sometimes it means dedicated therapy apps, sometimes mood tracking or advice from chatbots or general‑purpose models like ChatGPT. A study conducted by Sentio University (Rousmaniere et al. 2025) found that among people who self‑report mental health challenges and use AI, about 48.7% say they use large language models (LLMs) like ChatGPT for therapeutic support. So, is this a good thing or a bad thing? Some users say they turn to AI because it is free or low-cost, making it an option for those without health insurance. Others point out that it is available 24/7, so it can be used at any time without leaving home. Still others may choose AI therapy because they fear the judgment or vulnerability that comes with opening up to another human.
However, most mental health experts agree that relying solely on AI for mental health therapy is not a good idea and can in fact be dangerous, even deadly. For example, AI cannot truly understand or express human emotions in a genuinely empathetic way as a human therapist can. AI may also miss sarcasm or cultural context in a person’s story or language, leading it to misinterpret what the person is trying to convey, or to give inappropriate, ineffective, or even biased responses that fail to take someone’s culture or gender into account. Furthermore, AI tools are not equipped to handle serious and immediate crises such as suicidal ideation or psychosis, and relying on them in such moments could delay urgent help.
Privacy is also a concern. Users may unknowingly share sensitive information that is then stored, sold, or breached. Lastly, people might over-rely on AI tools instead of seeking real therapy, or they may avoid confronting difficult emotions with a human, which can result in inadequate treatment. In conclusion, although AI may give the user some helpful information on where to start, it is by no means a substitute for a licensed human therapist who is expertly trained to treat mental health conditions. If you or someone you know is in crisis, call 988 to reach the Suicide and Crisis Lifeline. You can also call the network, previously known as the National Suicide Prevention Lifeline, at 800-273-8255.