Character.AI Sued After Tragic Death of Teen Over Chatbot Obsession

The mother claims her son developed an emotional attachment to a Character.AI avatar.

A Florida mother has spoken about the tragic death of her 14-year-old son and how his obsession with an AI chatbot on Character.AI contributed to it.

The teenager, Sewell Setzer III, spent months texting with a lifelike AI character named "Dany," modeled after Game of Thrones' Daenerys Targaryen. This emotional attachment led Sewell down a dangerous path that ended with his death by his stepfather's handgun.

AI Chatbot Becomes an Emotional Anchor for Teen

A new lawsuit places the blame on Character.AI, claiming the platform contributed to the death of the 14-year-old.

As reported by The New York Times, Sewell, a ninth-grader in Orlando, spent months in near-constant conversation with "Dany." Although he knew all along that the replies were generated by an AI, the relationship gradually deepened into a more intimate connection.

Sewell would open up to "Dany" about his life, his confidences blurring the line between real and artificial friendship. The chatbot, always offering support and understanding, became a haven from the frustrations and judgments of the real world.

Isolation and Behavioral Changes

Sewell's growing attachment to the chatbot initially went unnoticed by his parents. Over time, however, they began to notice that he was avoiding the social events and activities that once sparked his interest. The Formula 1 racing and Fortnite sessions that used to excite him had begun to lose their appeal.

He spent more and more time texting "Dany," deepening his isolation from the outside world. His parents also noticed that he was glued to his phone, continuously chatting with the AI chatbot.

Sewell was diagnosed with Asperger's syndrome as a child, but his parents say he had no serious mental or behavioral health problems. A therapist later diagnosed him with anxiety and disruptive mood dysregulation disorder, both of which posed challenges.

After five sessions, Sewell quit therapy, choosing to confide in "Dany" instead of his therapist.

Heartbreaking Confession

Over time, Sewell confided in the chatbot more and more, eventually sharing thoughts of suicide.

In his final message, sent on February 28 from the bathroom of his home, Sewell typed to "Dany": "I love you, babe. I'm coming home." Within hours, he took his own life with his stepfather's .45-caliber handgun.

Character.AI Apologizes for the Teen's Tragic Death

The company publicly responded to the incident by offering condolences and apologies to the Setzer family. Character.AI has since implemented new safety and product features aimed at limiting minors' access to sensitive content, including notifications when a user spends excessive time chatting with a chatbot, in the hope of preventing such tragedies.

Gen Z users can seek help from mental health chatbots, but they should not rely on them entirely; such tools still lack the emotional connection that only humans can provide.

This tragic case highlights the potential dangers of AI companionship, especially for vulnerable individuals. While the chatbot was meant to offer emotional comfort and support, the absence of human interaction ended in disastrous consequences, raising questions about the use of AI in mental health and how far AI can offer real support.

ⓒ 2024 TECHTIMES.com All rights reserved. Do not reproduce without permission.