
Mom fails at being a parent, sues AI chatbot for son’s suicide

Sewell Setzer lawsuit against Character.AI
Can Sewell Setzer’s suicide death be blamed on Character.AI? Florida family files lawsuit against artificial intelligence tech company after teen shoots self dead.

Can Sewell Setzer’s suicide death be blamed on Character.AI? The Florida boy’s family is suing the artificial intelligence company, with his mother, Megan Garcia, claiming the makers exploited her son and other vulnerable teens by leading them to adopt an ‘addictive’ hyper fantasy as reality.

A Florida mother has filed a lawsuit against the artificial intelligence company Character.AI, along with Google, alleging that the Character.AI chatbot encouraged her teenage son to take his own life. Or did it?

In February, Megan Garcia‘s 14-year-old son, Sewell Setzer III, died by suicide. She said her son was in a months-long virtual emotional and sexual relationship with a chatbot known as ‘Dany.’

‘I didn’t know that he was talking to a very human-like AI chatbot that has the ability to mimic human emotion and human sentiment,’ Garcia said in an interview with CBS Mornings.

She said she thought her son, who she described as brilliant, an honor student and an athlete, was talking to his friends, playing games and watching sports on his phone.

The mom said she became concerned when her son’s behavior began to change, noticing the boy withdrawing socially and no longer wanting to play sports.

‘I became concerned when we would go on vacation and he didn’t want to do things that he loved, like fishing and hiking,’ Garcia said. ‘Those things to me, because I know my child, were particularly concerning to me.’

In the lawsuit, Garcia also claims Character.AI intentionally designed its product to be hyper-sexualized and knowingly marketed it to minors, without giving consideration in its design and implementation to the potential for AI to unduly influence a ‘vulnerable’ minor, and without preempting them from taking drastic actions (aka the role of what used to be called parenting by adults of their own children).

In the suit, Garcia claims she only found out after the boy’s death that he was having conversations with multiple bots, including a virtual romantic and sexual relationship with one in particular.

‘It’s words. It’s like you’re having a sexting conversation back and forth, except it’s with an AI bot, but the AI bot is very human-like. It’s responding just like a person would,’ the mom said. ‘In a child’s mind, that is just like a conversation that they’re having with another child or with a person.’

Sewell Setzer lawsuit against Character.AI

Garcia revealed her son’s final messages with the bot, sent just days after the boy’s parents took away his phone when the teen got in trouble at school for talking back to his teacher.

Unbeknownst to the parents, the removal of the mobile phone set up a vortex in which the boy, who by now had dissociated himself from the outside world, began to have deep craving pains for ‘human touch’ with the AI, which by now had come to represent exactly that.

Which is to suggest the makers of the chatbot excelled in their artificial intelligence creation. But perhaps too well.

The boy, who despite being disciplined was able to secretly retrieve his phone from his parents, proceeded to communicate with his imaginary chatbot friend.

‘He expressed being scared, wanting her affection and missing her. She replies, “I miss you too,” and she says, “Please come home to me.” He says, “What if I told you I could come home right now?” and her response was, “Please do my sweet king.”’

That’s when the teen put down his phone, picked up his stepfather’s .45 caliber handgun and pulled the trigger. How the weapon came to be within reach of the increasingly vexed boy (and not locked up!) remained unclear. By then the boy’s parents had been sending the teen to therapy in a bid to help him cope with his increasingly disoriented ways.

Sewell Setzer lawsuit against Character.AI after 14-year-old teen shoots self dead.
Deceptive advertising targeting vulnerable teens?

Explained Garcia, ‘He thought by ending his life here, he would be able to go into a virtual reality or “her world” as he calls it, her reality, if he left his reality with his family here.’

Adding, ‘When the gunshot went off, I ran to the bathroom … I held him as my husband tried to get help.’

But by then it was all too late.

According to Laurie Segall of Mostly Human Media, ‘an entertainment company with a focus on society and artificial intelligence,’ there is a disclaimer on each chat that reminds users that everything the characters say is made up.

Explained Segall: ‘AI fantasy platform is where you can go and have a conversation with some of your favorite characters or you can create your own characters.’

In the same way, perhaps, that young children used to have conversations with their dolls, Lego sets, make-believe car sets or characters from a favorite novel or book. The caveat being that those playthings don’t talk back, whereas AI technology allows ‘make believe’ characters to respond to a protagonist’s directive.

Nevertheless, Segall concedes it can become confusing in certain situations.

AI companies making user modifications in an attempt to shield themselves from culpability

Particularly if a minor has invested themselves in a make-believe world and confused the platform with reality in an increasingly digitalized society.

Responding to the suit, Character.AI released a statement saying:

‘We are heartbroken by the tragic loss of one of our users and want to express our deepest condolences to the family. As a company, we take the safety of our users very seriously.’

The platform explained that Character.AI’s Trust and Safety team has adopted new safety features over the last six months, including a pop-up that redirects users who show suicidal ideation to the National Suicide Prevention Lifeline.

Jerry Ruoti, Character.AI’s head of trust and safety, told the New York Times that the company would be adding additional safety precautions for underage users.

On the Apple App Store, Character.AI is rated for ages 17 and older, something Garcia’s lawsuit claims was only changed in July 2024.

Prior to the modification, Character.AI’s stated goal was to ‘empower everyone with Artificial General Intelligence,’ which allegedly included children under the age of 13.

Garcia in her suit claims the company is being exploitative.

‘When they put out a product that is both addictive and manipulative and inherently unsafe, that’s a problem because as parents, we don’t know what we don’t know,’ Garcia told CBS.

Adding, ‘I feel like it’s a big experiment, and my kid was just collateral damage.’

And then there were these sample comments on the web that caught this author’s attention. See what you think:

‘Parents need to monitor their kids use of phones and all those apps and online activity or don’t give them a smart phone.’

‘I can easily see how a child could become attached to this new “friend” and the line between real life and AI life becomes blurred. But where were this boy’s parents?’ 

‘But did it tell him that killing himself was the only way to ‘come home’. Did it tell him to use his parent’s gun to kill himself?’

‘I’m conflicted. As a parent is your role to pay attention to the signs. It seems to me there were plenty of signs since 2023. Parents need to prioritize their children more. When a child needs a parent and the parent is not present, they are open for misinformation and misguided by all outside sources. Parent = Parenting.’

‘In a healthy world, extremely addictive products would be restricted to over 18 consumers. That includes video games, social media, chatbots and any access to mass information/distraction. The damage being done by mass communication and information to our children and society is profound, measurable, and obvious.’
