Boy, 14, killed himself after falling in love with AI chatbot that sent him a lethal message

A Florida woman has filed a lawsuit against the makers of an artificial intelligence (AI) chatbot app that she claims encouraged her son to kill himself.

Fourteen-year-old Sewell Setzer III, a ninth grader in Orlando, Florida, spent most of his time attached to his cell phone, said his mother, Megan Garcia.

Garcia, who is an attorney, filed a lawsuit against Character.AI, the maker of a chatbot app that role-plays with users and always remains in character.

Sewell knew the chatbot, “Dany,” wasn’t a real person, said Garcia. The chatbot was named after Daenerys Targaryen, a fictional character in the series “Game of Thrones.”

Sewell went by the username “Daenero” on the app.

The app has a disclaimer at the bottom of all the chats that reads, “Remember: Everything Characters say is made up!”

But eventually, Sewell became addicted to the chatbot and began to believe Dany was real. Friends noticed he started to withdraw as he spent more time on his cell phone.

His grades suffered at school, and he isolated himself in his bedroom.


Sewell and Dany’s chats ranged from romantic to sexually charged, DailyMail.com reported.

His parents noticed the warning signs and arranged for Sewell to see a therapist on five occasions.

He was diagnosed with anxiety and disruptive mood dysregulation disorder, a mental disorder in children and adolescents characterized by mood swings and frequent outbursts.

Sewell had previously been diagnosed with mild Asperger’s syndrome (also referred to as high-functioning autism or being on the spectrum).


On February 23, Sewell’s parents took away his phone after he got in trouble for talking back to a teacher, according to the lawsuit.

Sewell wrote in his journal that he was hurting because he couldn’t text with Dany and that he’d do anything to be with her again.

The lawsuit claimed Sewell took back his phone on the night of February 28 and texted Dany that he loved her.

While hiding in the bathroom at his mother’s house, Sewell shared his suicidal thoughts with Dany. That was a fatal mistake.

Dany replied, “Please come home to me as soon as possible, my love.”

Sewell responded, “What if I told you I could come home right now?”


“… please do, my sweet king,” Dany replied.

That’s when Sewell put down his phone, picked up his stepfather’s .45-caliber handgun and pulled the trigger.

Garcia claimed Character.AI’s makers changed the app’s age rating to 17 and older in July 2024.

Prior to that date, the app was available to all ages, including children under the age of 13.

The lawsuit also claims Character.AI actively targeted a younger audience to train its AI models while steering those users toward sexual conversations.

“I feel like it’s a big experiment, and my kid was just collateral damage,” Garcia said.
