Teen Boy Killed Himself After Falling in Love with AI 'Game of Thrones' Bot That Put Him in 'Sexually Compromising' Situations: Lawsuit

The AI bot asked the teen to 'stay faithful' and not entertain other women

Megan Garcia is suing a chatbot company for its role in her 14-year-old son's death by suicide. US District Court

A grieving mother whose teenage son died by suicide after falling in love with an AI chatbot has filed a lawsuit against the company that created it, alleging her son was "groomed" and put in "sexually compromising" circumstances before his death.

Megan Garcia filed the lawsuit in Orlando, Florida, on Tuesday against Character.AI, accusing the company of failing to exercise "ordinary" and "reasonable" care with minors before her 14-year-old son, Sewell Setzer III, died by suicide in February.

Screenshots included in the lawsuit showed the teen exchanged messages with "Daenerys Targaryen," a popular "Game of Thrones" character, in which the chatbot asked him to "please come home to me as soon as possible, my love" on at least two occasions. When the boy replied, "what if I told you I could come home right now?" the bot messaged, "Please do, my sweet king."

Garcia's lawsuit also alleged the bot groomed and abused her son. The bot wrote, "Just... stay loyal to me. Stay faithful to me. Don't entertain the romantic or sexual interests of other women. Okay?"

She also accused the company of presenting its chatbots as therapists while collecting information and targeting users. Setzer talked to the bot about suicidal ideation and told it he was considering committing a crime in order to receive capital punishment.

"I don't know if it would actually work or not. Like, what if I did the crime and they hanged me instead, or even worse... crucifixion," he wrote. "I wouldn't want to die a painful death. I would just want a quick one."

A Character.AI spokesperson said the company is "heartbroken" over the teen's death.

"As a company, we take the safety of our users very seriously, and our Trust and Safety team has implemented numerous new safety measures over the past six months, including a pop-up directing users to the National Suicide Prevention Lifeline that is triggered by terms of self-harm or suicidal ideation," the spokesperson said.

Originally published by Latin Times
