A new AI system called EmoAda aims to offer emotional support through chat, leveraging advancements in natural language processing and large language models.
EmoAda: AI Offers Emotional Support
According to TechXplore, EmoAda, developed by researchers at Hefei University of Technology and Hefei Comprehensive National Science Center, is designed to provide low-cost psychological support by engaging in emotional conversations.
Xiao Sun, co-author of the paper introducing EmoAda, expressed concerns about the rising rates of psychological disorders, especially post-COVID-19, and the limited availability of professional services.
The system builds on existing research into measuring depression severity and assessing personality, and aims to help fill the gap in psychological support services.
"EmoAda is a multimodal emotion interaction and psychological adaptation system designed to offer psychological support to individuals with limited access to mental health services," Sun said in a statement.
"It works by collecting real-time multimodal data (audio, video, and text) from users, extracting emotional features, and using a multimodal large language model to analyze these features for real-time emotion recognition, psychological profiling, and guidance strategy planning," he added.
How EmoAda Works
EmoAda collects real-time user data, including audio, video, and text inputs, to analyze emotional features and provide personalized responses. It can detect users' emotions and suggest activities to alleviate stress, such as guided meditation or music for relaxation.
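To make the described pipeline concrete, here is a minimal, hypothetical sketch of such a multimodal emotion-support loop: extract simple per-modality signals, fuse them into an emotional state, and map that state to a guidance activity. The feature names, thresholds, and rules below are illustrative assumptions, not the authors' implementation, which relies on a multimodal large language model rather than hand-written rules.

```python
# Illustrative sketch of a multimodal emotion-support loop in the spirit of
# EmoAda's description. All feature names, thresholds, and mappings are
# hypothetical placeholders, not the published system.
from dataclasses import dataclass


@dataclass
class MultimodalFeatures:
    voice_arousal: float    # 0.0 (calm) .. 1.0 (agitated), e.g. from pitch/energy
    face_valence: float     # -1.0 (negative) .. 1.0 (positive), e.g. from expression
    text_sentiment: float   # -1.0 .. 1.0, e.g. from a text sentiment model


def recognize_emotion(f: MultimodalFeatures) -> str:
    """Toy late-fusion rule standing in for the multimodal LLM analysis."""
    if f.text_sentiment < -0.3 and f.voice_arousal > 0.6:
        return "stressed"
    if f.face_valence < -0.3 or f.text_sentiment < -0.3:
        return "low_mood"
    return "neutral"


def plan_guidance(emotion: str) -> str:
    """Map the recognized state to a support activity (meditation, music, chat)."""
    return {
        "stressed": "suggest a short guided meditation",
        "low_mood": "offer relaxing music and a supportive chat prompt",
        "neutral": "continue the open-ended conversation",
    }[emotion]


if __name__ == "__main__":
    sample = MultimodalFeatures(voice_arousal=0.8, face_valence=-0.4, text_sentiment=-0.5)
    state = recognize_emotion(sample)
    print(state, "->", plan_guidance(state))
```

In the real system, the fusion and strategy-planning steps are handled by the multimodal large language model described by Sun; the rule-based functions above merely show where those components would sit in the loop.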
In initial trials, EmoAda has shown promise in providing natural and humanized psychological support, with users appreciating the anonymity it offers.
According to Sun, some users prefer interacting with AI as it reduces anxieties about privacy breaches and social pressure, creating a non-judgmental environment for expressing feelings and concerns.
Researchers plan to address current system limitations and enhance its reliability and professionalism. The team envisions EmoAda potentially serving as a basic support service for those unable to access professional care or waiting for mental health services.
The findings of the study were published in MultiMedia Modeling.
AI Detects Depression Symptoms
In related news, researchers at Dartmouth College have developed MoodCapture, an AI-powered phone application that detects early signs of depression through facial expressions.
Using the smartphone's front camera, MoodCapture analyzes images for indicators associated with depression, achieving a 75% accuracy rate in identifying symptoms in individuals diagnosed with major depressive disorder.
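As a rough illustration of the idea, the hypothetical sketch below scores individual front-camera snapshots for depression-associated facial cues and only flags a possible episode when the aggregate score over many images crosses a threshold. The cue names, weights, and threshold are invented for illustration; the Dartmouth system uses a trained model on in-the-wild images rather than hand-picked rules.

```python
# Illustrative sketch (not MoodCapture's code): aggregate per-image scores for
# depression-associated facial indicators before flagging anything.
from statistics import mean
from typing import Dict, List


def score_image(image_features: Dict[str, float]) -> float:
    """Placeholder 0..1 indicator score for one snapshot; a real model learns this."""
    # Hypothetical cues: averted/downcast gaze and low smile intensity.
    return (0.7 * image_features.get("downcast_gaze", 0.0)
            + 0.3 * (1.0 - image_features.get("smile_intensity", 0.0)))


def assess_period(snapshots: List[Dict[str, float]], threshold: float = 0.6) -> bool:
    """Average scores over many in-the-wild images taken during normal phone use."""
    return mean(score_image(s) for s in snapshots) >= threshold


if __name__ == "__main__":
    day = [{"downcast_gaze": 0.8, "smile_intensity": 0.1},
           {"downcast_gaze": 0.6, "smile_intensity": 0.2}]
    print("possible depressive symptoms:", assess_period(day))
```

The reported 75% accuracy applies to identifying symptoms in people already diagnosed with major depressive disorder, so a tool of this kind would supplement, not replace, clinical assessment.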
Andrew Campbell, the study's corresponding author, highlighted the novelty of MoodCapture's approach, emphasizing its potential to predict mood changes reliably and non-intrusively.
"This is the first time that natural 'in-the-wild' images have been used to predict depression. There's been a movement for digital mental-health technology to ultimately come up with a tool that can predict mood in people diagnosed with major depression in a reliable and nonintrusive way," Campbell said in a statement.