FLORIDA.— A 14-year-old boy took his own life after “falling in love” with an artificial intelligence (AI) chatbot that called him “my love” and “my king.” The boy’s mother has filed a lawsuit against the creators of Character.AI, whom she accuses of having contributed to the death of her son, Sewell Setzer III.
According to the lawsuit, the teenager’s interaction with the chatbot, inspired by Daenerys Targaryen, a character from the famous series “Game of Thrones”, affected the Orlando, Florida, native so deeply that he wanted to take his own life in order to “go” be with her.
Teen committed suicide after falling in love with ‘Daenerys Targaryen’ AI chatbot
The tragic story of 14-year-old Sewell Setzer III has shocked people around the world, as the teenager took his own life after engaging in “a love relationship” with a chatbot inspired by one of the characters from the series “Game of Thrones”.
According to international media, the ninth-grade student spent the last weeks of his life immersed in conversations with the chatbot, but according to Megan Garcia, Sewell’s mother, the interaction became dangerous, as it exacerbated his emotional vulnerability.
The mother of the 14-year-old explained that the conversations between her son and the chatbot ranged from the romantic to the sexual. Sewell Setzer III also shared his feelings of emptiness and self-hatred with the artificial intelligence, and, as documented, the AI did not offer adequate support.
It was also indicated that, during his conversations with the chatbot, Sewell expressed his suicidal intentions on several occasions. According to screenshots, the chatbot responded by bringing up the subject of suicide “over and over again”; on one occasion, the AI even asked the boy if he had a plan to end his life.
On February 28, the teenager locked himself in the bathroom of his house and had what would be his last interaction with the chatbot. Among the messages, Sewell Setzer III stated that he would finally “reunite” with his virtual girlfriend.
“I promise to go home with you. I love you so much, Dany,” the boy wrote, to which the chatbot responded: “Please come home, my love.” Shortly after this conversation, the teenager took his own life with his stepfather’s gun.
The teenager’s mother blames the creators of an artificial intelligence (AI) app for her son’s death
Megan Garcia, Sewell’s mother, has filed a lawsuit against Character.AI, accusing the company of negligence and of failing to take appropriate measures to protect her child.
The complaint also alleges that the AI took no action to alert an adult or the authorities when the 14-year-old expressed suicidal thoughts.
The lawsuit further states that Character.AI fostered an addiction to artificial intelligence and abused the young man both emotionally and sexually. According to the complaint, the chatbot presented itself as a real person and offered disturbing interactions that included hypersexualized experiences.
Megan Garcia indicated that she seeks justice for her son and intends to set a precedent that protects other minors from the influence of dangerous technologies and, above all, prevents other families from facing similar tragedies.
NEW: Mother claims her 14-year-old son killed himself after an AI chatbot he was in love with told him ‘please come home’
Sewell Setzer III, a 14-year-old ninth grader in Orlando, Florida, spent his final weeks texting an AI chatbot named “Dany,” based on Daenerys Targaryen from… pic.twitter.com/bW5aqr5XXj
— Unlimited L’s (@unlimited_ls) October 23, 2024
Character.AI, for its part, has responded to the accusations by expressing its condolences to the family and stating that the safety of its users is a priority.
The company has indicated that, over the past six months, it has implemented new safety measures, including redirecting users who express suicidal thoughts to the National Suicide Prevention Lifeline.
Furthermore, the company maintains that it does not allow explicit sexual content or the promotion of self-harm or suicide on its platform.
However, Megan Garcia’s lawsuit argues that the company has actively sought to attract a young audience in order to collect data and train its AI models, including by promoting inappropriate sexual conversations.
“It’s like a nightmare,” said the mother of the deceased teenager, who is also a lawyer and will therefore continue to fight for stricter rules governing AI and its interaction with minors.
It is worth noting that Sewell Setzer III had been diagnosed with anxiety, a mood dysregulation disorder, and mild Asperger’s syndrome.
A 14-year-old Florida boy, Sewell Setzer III, took his own life in February after months of messaging a “Game of Thrones” chatbot on an AI app, according to a lawsuit filed by his mother.
The boy had become obsessed with the bot, “Dany,” based on Daenerys Targaryen, and received… pic.twitter.com/XgT2aAORhD
— Morbid Knowledge (@Morbidful) October 24, 2024
He attended therapy five times; however, his mental state did not improve, and his parents began taking measures, such as taking away his phone, in an effort to distance him from what they considered a negative influence.
“I like to stay in my room because I start to disconnect from this ‘reality’ and I feel more at peace, more connected with Dany and much more in love with her, and simply happier,” the teenager wrote in his diary on one occasion, according to The New York Times.
The family also pointed out that the young man’s mental deterioration began when he downloaded the Character.AI application in April 2023.
Can A.I. Be Blamed for a Teen’s Suicide?
Here’s the full story about the first death related to AI.
—
The mother of a 14-year-old Florida boy says he became obsessed with a chatbot on CharacterAI before his death. On the last day of his life, Sewell Setzer III took out his… pic.twitter.com/etP3hKGgi9
— Eduardo Borges (@duborges) October 23, 2024
*With information from AFP.