Can you come out to an AI chatbot?
Mental health chatbots are increasingly being used to provide support for a variety of issues. Still, new research indicates that these tools often fail to address the specific needs of LGBTQ+ individuals.
Coming Out to an AI Chatbot
Researchers from Harvard, Emory University, Vanderbilt University, and the University of California, Irvine found that while chatbots can offer quick support, they tend to fall short in understanding and addressing the unique challenges faced by the LGBTQ+ community, sometimes resulting in unhelpful or harmful advice.
The researchers interviewed 31 participants to assess the impact of large language model (LLM)-based chatbots on mental health support; 18 of them identified as members of the LGBTQ+ community and 13 as non-LGBTQ+.
Many participants acknowledged that chatbots offered a sense of solidarity and a safe space for self-expression. They reported using the tools to practice conversations such as coming out or asking someone out for the first time.
However, they also highlighted the significant limitations of these programs.
Generic and Emotionally Detached Chatbots
Participants noted that chatbots frequently provided generic and emotionally detached responses.
One participant mentioned that the chatbot would offer sympathy but rarely provided constructive solutions, especially when dealing with instances of homophobia, the study found.
Zilin Ma, a Ph.D. student at Harvard's John A. Paulson School of Engineering and Applied Sciences (SEAS) and co-first author of the paper, emphasized that chatbots cannot effectively handle hostile interactions, making them unsuitable for delicate conversations like coming out.
Ma stressed that while improvements are possible through targeted fine-tuning of LLMs, technology alone cannot address all aspects of LGBTQ+ mental health.
The researcher suggested using chatbots to train human counselors rather than for direct crisis intervention. According to Ma, this approach could empower counselors with technology while preserving a human connection.
"Holistic Support System"
Krzysztof Gajos, the Gordon McKay Professor of Computer Science, emphasized the need for broader societal changes to address discrimination, bullying, and the stress of coming out.
He highlighted the importance of training counselors and fostering supportive online communities to create a holistic support system for LGBTQ+ individuals.
"We can optimize all these LLMs all we want, but there are aspects of LGBTQ+ mental health that cannot be solved with LLM chatbots - such as discrimination, bullying, the stress of coming out, or the lack of representation. For that, we need a holistic support system for LGBTQ+ people," Ma said in an official statement.
The experts ultimately advocate for socio-technical solutions that combine technology with human intervention to improve mental health support for vulnerable communities.
"Research in public health suggests that interventions that directly target the affected individuals—like the chatbots for improving individual well-being—risk leaving the most vulnerable people behind," said Gajos.
"It is harder but potentially more impactful to change the communities themselves through training counselors or online community moderators."