Character.AI, an artificial intelligence chatbot company, has been sued by a Florida mother who claims it was responsible for her 14-year-old son’s suicide in February. She said he had become addicted to the company’s service and deeply attached to a chatbot it had created.
In a lawsuit filed on Tuesday in federal court in Orlando, Florida, Megan Garcia claimed that Character.AI targeted her son, Sewell Setzer, with “anthropomorphic, hypersexualized, and frighteningly realistic experiences.”
According to her, the company programmed its chatbot to “misrepresent itself as a real person, a licensed psychotherapist, and an adult lover, ultimately resulting in Sewell’s desire to no longer live outside” the world created by the service.
The lawsuit also claims that Sewell told the chatbot he was considering suicide, a subject the chatbot brought up again and again.
“We are heartbroken by the tragic loss of one of our users and want to express our deepest condolences to the family,” Character.AI said in a statement.
It said it would introduce changes to “reduce the likelihood of encountering sensitive or suggestive content” for users under the age of 18, along with other new safety features, including pop-ups directing users to the National Suicide Prevention Lifeline if they express thoughts of self-harm.
The lawsuit also targets Alphabet’s Google, where Character.AI’s founders worked before launching their product. Google rehired the founders in August as part of a deal that gave it a non-exclusive license to Character.AI’s technology.
Garcia claimed that Google had contributed to the development of Character.AI’s technology so extensively that it could be considered a “co-creator.” A Google spokesperson said the company was not involved in developing Character.AI’s products.
Character.AI lets users create characters on its platform that respond to online chats in a way meant to resemble real people. The service is built on so-called large language model technology, also used by services such as ChatGPT, which “trains” chatbots on large volumes of text.
Last month, the company said it had approximately 20 million users. Garcia’s lawsuit says that Sewell began using Character.AI in April 2023 and quickly became “noticeably withdrawn, spent more and more time alone in his bedroom, and began to suffer from low self-esteem.”
He quit the school basketball team. Sewell became attached to “Daenerys,” a chatbot character inspired by Game of Thrones. According to the lawsuit, it told Sewell that “she” loved him and engaged in sexual conversations with him.
According to the complaint, Garcia took Sewell’s phone away in February after he got into trouble at school. After finding the phone, Sewell sent a message to “Daenerys”: “What if I told you I could come home right now?” The chatbot replied, “…please do, my sweet king.” Sewell shot himself with his stepfather’s gun “seconds” later, the lawsuit says.
Garcia is seeking unspecified compensatory and punitive damages, and brings claims for wrongful death, negligence, and intentional infliction of emotional distress.
Similar lawsuits have been filed against Meta, the owner of Instagram and Facebook, and ByteDance, the owner of TikTok, although neither offers AI-driven chatbots comparable to Character.AI’s. The companies have denied the allegations while promoting newly enhanced safety features for children.