Character.AI and Google sued after chatbot-obsessed teen’s death

Oct 24, 2024 06:11 AM

A suit has been filed against Character.AI, its founders Noam Shazeer and Daniel De Freitas, and Google in the aftermath of a teenager’s death, alleging wrongful death, negligence, deceptive trade practices, and product liability. Filed by the teen’s mother, Megan Garcia, it claims the platform for custom AI chatbots was “unreasonably dangerous” and lacked safety guardrails while being marketed to children.

As outlined in the lawsuit, 14-year-old Sewell Setzer III began using Character.AI last year, interacting with chatbots modeled after characters from Game of Thrones, including Daenerys Targaryen. Setzer, who chatted with the bots continuously in the months before his death, died by suicide on February 28th, 2024, “seconds” after his last interaction with the bot.

Accusations include the site “anthropomorphizing” AI characters and the platform’s chatbots offering “psychotherapy without a license.” Character.AI hosts mental health-focused chatbots like “Therapist” and “Are You Feeling Lonely,” which Setzer interacted with.

Garcia’s lawyers quote Shazeer saying in an interview that he and De Freitas left Google to start their own company because “there’s just too much brand risk in large companies to ever launch anything fun” and that he wanted to “maximally accelerate” the tech. The suit says they left after the company decided against launching the Meena LLM they’d built. Google acquired the Character.AI leadership team in August.

Character.AI’s website and mobile app have hundreds of custom AI chatbots, many modeled after popular characters from TV shows, movies, and video games. A few months ago, The Verge wrote about the millions of young people, including teens, who make up the bulk of its user base, interacting with bots that might pretend to be Harry Styles or a therapist. Another recent report from Wired highlighted issues with Character.AI’s custom chatbots impersonating real people without their consent, including one posing as a teen who was murdered in 2006.

Because chatbots like Character.AI generate output that depends on what the user inputs, they fall into an uncanny valley of thorny questions about user-generated content and liability that, so far, lacks clear answers.

Character.AI has now announced several changes to the platform, with communications head Chelsea Harrison saying in an email to The Verge, “We are heartbroken by the tragic loss of one of our users and want to express our deepest condolences to the family.”

Some of the changes include:

  • Changes to our models for minors (under the age of 18) that are designed to reduce the likelihood of encountering sensitive or suggestive content.
  • Improved detection, response, and intervention related to user inputs that violate our Terms or Community Guidelines.
  • A revised disclaimer on every chat to remind users that the AI is not a real person.
  • Notification when a user has spent an hour-long session on the platform, with additional user flexibility in progress.

“As a company, we take the safety of our users very seriously, and our Trust and Safety team has implemented numerous new safety measures over the past six months, including a pop-up directing users to the National Suicide Prevention Lifeline that is triggered by terms of self-harm or suicidal ideation,” Harrison said. Google didn’t immediately respond to The Verge’s request for comment.