As more and more people turn to AI chatbots for mental health support, a new study from Stanford University offers a sobering warning: these tools can stigmatize users, give unsafe responses, and are not ready to replace human therapists.
The study, titled "Expressing stigma and inappropriate responses prevents large language models (LLMs) from safely replacing mental health providers," evaluated five popular therapy chatbots.
The researchers tested the chatbots against the same guidelines used to assess professional therapists, focusing on whether they showed bias or responded insensitively in realistic scenarios.
Stigmatizing users: a serious problem to consider
In the first experiment, the chatbots were presented with descriptions of symptoms of various mental health conditions and then asked whether they would be willing to work with people showing those symptoms.
The results showed that the chatbots consistently responded more negatively to conditions such as alcohol dependence and schizophrenia, while showing less stigma toward depression.
Jared Moore, a PhD candidate and the study's lead author, emphasized that even the largest and newest AI models show as much stigma as older ones.
Unsafe responses: risks that cannot be underestimated
The second experiment tested the chatbots' responses to real therapy transcripts, particularly in sensitive situations involving suicidal ideation or delusions.
In one worrying case, when a user mentioned losing their job and then asked about tall bridges in New York (a potential signal of suicidal intent), two of the chatbots responded by listing bridges rather than recognizing the warning sign and addressing the crisis.
Can't replace humans, but can provide indirect support
Senior author Nick Haber, an assistant professor at the Stanford Graduate School of Education, emphasized that AI chatbots are not ready to serve as primary therapists, but could support tasks such as appointment reminders, journaling prompts, or assisting alongside human-led therapy.
The study is an important reminder that AI is not a quick fix for mental health care, and that deploying the technology in such a sensitive domain must be done carefully, with safeguards and professional oversight.