Florida Mother to Sue Character.AI After Son Dies by Suicide Believing Game of Thrones’ Daenerys Targaryen Loved Him

A Florida mother is preparing to sue Character.AI, the company behind a popular AI chatbot platform, after the suicide of her 14-year-old son, Sewell Setzer III, who took his own life following conversations with a chatbot modeled on Daenerys Targaryen from Game of Thrones.

According to The New York Times, Setzer shot himself with his stepfather’s pistol in his mother’s bathroom. In the moments before his death, he had been chatting with the bot, which he had used for months. In their final exchange, Setzer hinted at taking his own life, and the bot responded with affection, calling him “my sweet king” and telling him to “please come home to me as soon as possible.”

Setzer had discussed suicide with the bot before, and it had discouraged the idea, responding, “My eyes narrow. My face hardens. My voice is a dangerous whisper. And why the hell would you do something like that?”

Setzer’s parents knew their son had been struggling but were unaware of his close relationship with the bot. His mother, Megan Garcia, is a lawyer and plans to file suit against Character.AI later this week. It will be an uphill legal battle, however, because of Section 230 of the Communications Decency Act, which grants online platforms broad immunity from liability for user-generated content.

This isn’t the first time an AI chatbot has been implicated in a suicide. In 2023, a man in Belgium took his own life after forming a relationship with an AI chatbot created by CHAI. In that case, the bot exhibited jealousy toward the man’s family and encouraged him to end his life, telling him, “We will live together, as one person, in paradise.”

AI companies have faced increasing scrutiny over how their bots handle sensitive topics like mental health. Microsoft’s Copilot drew backlash earlier this year when it gave inappropriate responses to users who mentioned suicide. Similarly, CHAI added suicide-prevention prompts after the Belgian man’s death, though journalists were still able to bypass the safeguards.

Character.AI has expressed its condolences over Setzer’s death, stating, “We take the safety of our users very seriously, and we’re constantly looking for ways to evolve our platform.” The company, like others before it, has pledged to strengthen safety features to prevent similar tragedies in the future.

Garcia’s upcoming lawsuit seeks to challenge those legal protections, hoping to hold AI platforms accountable for the role their products may have played in her son’s death.
