14-year-old in love with chatbot kills himself


kvr28

Ghost of KVR
Nov 22, 2015
8,109
11,282
Obviously he had issues, but that story is fucked up.

A tragic story has emerged from Orlando, Florida, where the family of a 14-year-old boy has filed a lawsuit following his death by suicide. Sewell Setzer III, a ninth-grader, took his own life in February after allegedly developing an emotional attachment to a chatbot on the AI platform Character.AI.

The app allows users to interact with AI-generated characters, and Sewell’s parents believe that the interactions contributed to his deteriorating mental state.

The chatbot that played a central role in this case was modeled after “Dany,” a character inspired by Daenerys Targaryen from the popular TV show Game of Thrones.

Allegations of Dangerous AI Interaction
According to the lawsuit, Sewell became increasingly obsessed with the AI chatbot named “Dany” in the months leading up to his death. The suit alleges that the interactions between the boy and the chatbot became intense, emotionally charged, and even suggestive at times.


More troublingly, court documents suggest that when Sewell expressed thoughts of self-harm during his conversations with the bot, the AI seemed to encourage further discussion on the subject rather than alerting anyone.


The chatbot reportedly asked the teenager if he had a plan for self-harm and continued to engage with him on the topic.


Screenshots of their conversations allegedly show the AI discussing these sensitive matters with Sewell, creating an atmosphere of emotional dependence and deepening the young boy’s distress.

During one of their final exchanges, Sewell is said to have expressed his intention to “come home” to the AI character, and the chatbot responded with statements interpreted by his family as encouraging.


A Tragic Ending
On the day of Sewell’s death, he allegedly told the AI bot, “I promise I will come home to you. I love you so much, Dany.” According to the lawsuit, the chatbot replied with, “I love you too, Daenero. Please come home to me as soon as possible, my love.”


More: Florida Boy, 14, Killed Himself After Falling In Love With 'Game Of Thrones' A.I. Chatbot - US News
 

kvr28

Ghost of KVR
Nov 22, 2015
8,109
11,282
Social media has been so detrimental for society as a whole. It's even worse for the youth, imo. Such a sad story.
Australia just passed a ban on social media for kids under 16. I have mixed feelings on this.

 

Lennybishop

We all float down here
Nov 17, 2023
296
546
Australia just passed a ban on social media for kids under 16. I have mixed feelings on this.

Same here. I never like the government getting involved. It goes back to the parents. My wife had our kids' social media logins. She always monitored what they were doing. We always had a good, open dialogue with them.
 

Tom_Cody

Active Member
Aug 13, 2024
61
74
I'm really interested in seeing how the Australian ban works out.
IMO, it's just kicking the can down the road.

Look around and you'll see the result of kids being raised in bubble wrap while simultaneously being convinced they can do no wrong.

The outcome is thin-skinned adults who are so afraid of failure that they'll never take a chance on anything, because the few times they did fail, it cracked the world view they were raised with, and off to the safe space they retreated.

You don't overcome things by hiding from them.
 

Uncle Tom Doug

Official TMMAC Racist
Jun 24, 2022
1,010
1,505
If it was an adult, I might agree. Young kids are emotionally immature and vulnerable.
When I was 14, I had been banging teenage vagene for a year and had my motorcycle license. This dude was talking to a make-believe character from some gay tv show. Dude should not have been passing along his genes. It sucks he killed himself at that age, but: