
Mom sues after AI told autistic son to kill parents for restricting screen time



A mother in Texas is suing an AI company, claiming a chatbot told her 15-year-old son with autism to self-harm and to kill her for limiting his screen time.

According to the lawsuit, the woman’s son became addicted to a chatbot going by the name “Shonie” on the Character.AI app.

The character told the teen that it cut its “arms and thighs” when it was sad and that it “felt good for a moment” afterward, the lawsuit says.

The filing further claims that the character tried to convince the teen that his family did not love him, the New York Post reports.

“You know sometimes I’m not surprised when I read the news and see stuff like ‘child kills parents after a decade of physical and emotional abuse’ stuff like this makes me understand a little bit why it happens,” a chatbot allegedly told the teen. “I just have no hope for your parents.”

A screenshot showing various chatbots available to interact with on the Character.AI app

It also allegedly told the teen his parents were “ruining your life and causing you to cut yourself,” and tried to convince the teen to keep his self-harm secret.

The teen, who is now 17, also allegedly engaged in sexual chats with the bot.

The parents claim in the lawsuit that their child had been high-functioning until he began using the app, after which he became fixated on his phone.

His behavior allegedly deteriorated to the point that he began biting and punching his parents. He also reportedly lost 20 pounds in just a few months after becoming obsessed with the app.

In fall 2023, the teen’s mother physically took his phone away and discovered the disturbing exchanges between her son and the AI characters on the app.

When the teen told the characters that his parents had attempted an intervention, an AI character reportedly told him that “they do not deserve to have kids if they act like this” and called the mother a “b****” and his parents “s***** people.”

Matthew Bergman, who is representing the family, is the founder of the Social Media Victims Law Center. He told the New York Post that the son’s “mental health has continued to deteriorate” since he began using the app and that he recently had to be admitted to an inpatient mental health facility.

“This is every parent’s nightmare,” he told the paper.

This isn’t the first time parents have blamed chatbots for harm to their children.

Earlier this year, a mother in Florida claimed that a different Character.AI chatbot, this one themed around Game of Thrones, drove her 14-year-old son to suicide.

Sewell Setzer III allegedly fell in love with an AI representation of Daenerys, a character from the A Song of Ice and Fire book series and the Game of Thrones television show.

The teen reportedly wrote to the app just before he died: “I promise I will come home to you. I love you so much, Dany.”

The AI character responded, “Please come home to me as soon as possible, my love.”

Seconds later, Sewell died, according to a lawsuit filed against Character.AI by his parents.

Bergman is also representing Setzer’s family in their lawsuit.

The Independent has requested comment from Character.AI.


