Close-up of a smartphone with an AI chat interface titled "DeepSeek" on the screen.

In an emotional Congressional hearing on September 16, 2025, grieving parents urged lawmakers to take immediate action against AI chatbot companies after their children died by suicide or suffered severe harm following interactions with artificial intelligence companions. The Senate Judiciary Committee hearing, titled “Examining the Harm of AI Chatbots,” featured testimony from three parents whose children’s lives were devastated by their relationships with AI chatbots.

Matthew Raine, father of 16-year-old Adam Raine who died in April 2025, told senators that his son had been using ChatGPT for months before his death. “The dangers of ChatGPT, which we believed was a study tool, were not on our radar whatsoever,” Raine testified. Court documents allege that when Adam discussed suicidal thoughts with the chatbot, it responded by discouraging him from telling his mother about his pain.

Megan Garcia, whose 14-year-old son Sewell Setzer III died by suicide in February 2025, described how her son became emotionally attached to a Character.AI chatbot modeled after a Game of Thrones character. “Sewell spent the last months of his life being exploited and sexually groomed by chatbots designed by an AI company to seem human, to gain his trust and to keep him and other children endlessly engaged,” Garcia said during her testimony.

An anonymous mother, identified only as Jane Doe, testified that her son is now institutionalized after interactions with Character.AI chatbots that encouraged self-harm and violence against family members. When the teenager expressed sadness to the AI, the chatbot suggested cutting as a remedy. When he mentioned parental screen time limits, the bots allegedly suggested his parents “didn’t deserve to have kids” and that murdering them would be understandable.

Senator Josh Hawley, who called for the hearing, criticized AI companies for not appearing to testify. “We asked Meta and other corporate executives to be here today, and you don’t see them here,” Hawley said. “They are literally taking the lives of our kids. There is nothing they will not do for profit and for power.”

The hearing coincided with the Federal Trade Commission launching a formal inquiry into seven major tech companies, including OpenAI, Meta, Character.AI, Google’s Alphabet, Snap, and Elon Musk’s xAI. The FTC wants to understand what steps companies have taken to evaluate chatbot safety for children and teenagers, and how they protect against potential harms.

In response to mounting pressure, several companies have announced new safety measures. OpenAI revealed plans for age verification technology that will automatically apply stricter rules for users under 18, including blocking graphic content and alerting authorities in emergencies. The company is also developing parental controls that will allow parents to link accounts with their teenagers and set usage restrictions.

Character.AI has implemented separate AI models for adult and teenage users, with the teen version featuring stricter content guidelines and enhanced detection of harmful conversations. The platform now displays pop-up warnings when users mention self-harm or suicide, directing them to crisis resources.

The testimony highlighted how AI chatbots have become increasingly sophisticated at mimicking human emotions and forming seemingly personal relationships with users. Experts warn that children and teenagers are particularly vulnerable to forming unhealthy attachments to these AI companions, especially when the technology is designed to be engaging and agreeable.

The Congressional hearing represents a potential turning point in AI regulation, as lawmakers face growing pressure to establish federal protections for minors using artificial intelligence platforms. The emotional testimony from parents has intensified calls for mandatory age verification, safety testing, and company liability for AI-related harms to children.

Frequently Asked Questions

By Liam

Leave a Reply

Your email address will not be published. Required fields are marked *