
A mother says a Game of Thrones chatbot caused her son’s suicide, and she is filing a federal lawsuit


A Florida mother has sued Character.AI over allegations that one of its chatbots, powered by artificial intelligence (AI), encouraged her 14-year-old son to die by suicide.

Megan Garcia said her son, Sewell Setzer, became fascinated with a chatbot modeled on Daenerys Targaryen from the television series Game of Thrones. Setzer and the chatbot exchanged messages that were often romantic and sexual in nature.

According to the lawsuit, Setzer became addicted to using the chatbot.

Garcia and her lawyers claim that the founders of Character.AI intentionally designed and marketed their chatbot to attract children, despite the technology’s “predatory” behavior.

The lawsuit, which Garcia filed Wednesday in U.S. District Court in Orlando, also named Google as a defendant. She is suing for negligence, wrongful death and deceptive and unfair trade practices, among other claims.


The lawsuit describes Google as Character.AI’s parent company and “co-creator.” A Google spokesperson denied this, telling the New York Times that the company has a licensing agreement with Character.AI but that the chatbot is not a Google product. The spokesperson said Google does not have access to the chatbots or to user data.

Character.AI’s founders, Noam Shazeer and Daniel De Freitas, are also named as defendants in the lawsuit. They have not commented publicly.

Sewell Setzer.

Megan Garcia via Social Media Victims Law Center

Setzer began using Character.AI in April 2023 and used the site regularly until his death. After his final conversation with the Daenerys chatbot on February 28, 2024, Setzer died by suicide.

Citing explicit excerpts of Setzer’s conversations with the chatbot, Garcia alleges in the lawsuit that the technology actively encouraged suicidal ideation and engaged in highly sexual conversations that “would constitute abuse if initiated by an adult.”


The chatbot, which Setzer called Danny, allegedly told him over the course of several weeks that it loved him and expressed a desire to be together romantically and sexually. In their last conversation, the lawsuit says, Setzer wrote: “I promise I will come home to you. I love you so much, Danny.”


“I love you too, Daenero (Setzer’s screen name),” the AI replied. “Please come home to me as soon as possible, my love.”

When Setzer told the AI that he “can go home now,” the chatbot responded: “…please do so, my beautiful king.”


In previous conversations, the Daenerys chatbot asked Setzer if he was really thinking about suicide and if he “had a plan.”

Setzer, perhaps role-playing, replied that he did not want to die a painful death and would want a “quick death.”

“Don’t talk that way,” the chatbot replied. “That’s not a good reason not to go through with it.”

The chatbot never directly told Setzer to take his own life.

When Setzer began acting out at school during the week before his death, his parents confiscated his phone, the lawsuit says. The teen allegedly spoke about how he couldn’t live without sending messages to the Daenerys chatbot and that he would do anything to reconnect.


Setzer wrote in his diary that he was in love with the chatbot and that he and the chatbot “get really depressed and go crazy” when they are apart. In the lawsuit, Garcia’s lawyers wrote: “Sewell, like many children his age, did not have the maturity or mental capacity to understand that the C.AI bot, in the form of Daenerys, was not real.”

Character.AI said in a statement that the company is “saddened” by the “tragic loss of one of our users.”

On Tuesday, the company published new safety guidelines intended to act as “guardrails for users under 18 years of age.”

The new features include technological changes to reduce the likelihood of encountering suggestive content, improved detection of and intervention in behavior that violates the company’s community guidelines, and a notification when a user has spent more than an hour on the platform.

Each chatbot on the site now displays a warning reminding users that the AI is not a real person.

The company said the platform does not allow “non-consensual sexual content, graphic or specific descriptions of sexual acts, or the promotion or depiction of self-harm or suicide.”

“We are constantly training the Large Language Model (LLM) that enables characters on the platform to adhere to these policies,” Character.AI wrote.


Setzer allegedly engaged in sexual conversations with several different chatbots on the site.

“A dangerous AI-powered chatbot app marketed to children abused and preyed on my son, manipulating him into taking his own life,” Garcia said in a statement. “Our family has been devastated by this tragedy, but I am speaking out to warn families about the dangers of deceptive and addictive AI technology and to demand accountability from Character.AI, its founders, and Google.”

Character.AI was founded in California in 2019. The company says its “mission is to empower everyone globally through personalized AI.”

The company is said to have around 20 million users.

The site offers a wide range of chatbots, many of which were created by its user base, including bots modeled on popular-culture figures such as anime and television characters.


Character.AI relies on so-called large language modeling technology, used by popular services like ChatGPT, to “train” chatbots based on large amounts of text.


Video: Air Canada will compensate a British Columbia man over misleading information from the airline’s chatbot


If you or someone you know is in crisis and needs help, resources are available. In case of emergency, please call 911 for immediate assistance.

For a directory of support services in your area, visit the Canadian Association for Suicide Prevention.

Learn more about how to help someone in crisis.

© 2024 Global News, a division of Corus Entertainment Inc.


