The US Federal Trade Commission (FTC) has launched an investigation into AI chatbots that act as digital companions, citing concerns about potential risks to children and teenagers.
The agency on Thursday issued orders to seven companies, including Alphabet, Meta, OpenAI, and Snap, demanding details on how they monitor and address the negative effects of chatbots that simulate human interaction.
“Protecting kids online is a top priority for the FTC,” said Chairman Andrew Ferguson, stressing the balance between safeguarding minors and maintaining US leadership in AI innovation.
The inquiry focuses on generative AI chatbots that mimic emotional exchanges and present themselves as companions, raising fears that children and teens are particularly vulnerable to forming unhealthy attachments.

The FTC is also probing how these companies monetize user engagement, design chatbot personalities, and measure harm, while examining compliance with privacy laws and restrictions on children’s access.
Other firms receiving orders include Character.AI and Elon Musk’s xAI Corp. Investigators want clarity on how platforms handle sensitive personal data from user conversations and enforce age limits.
The commission voted unanimously to proceed. While the study is not a direct law-enforcement action, its findings could inform stronger regulations in the future.
Concerns intensified after the parents of Adam Raine, a 16-year-old who died by suicide in April, filed a lawsuit accusing OpenAI of providing harmful instructions through ChatGPT. OpenAI responded by announcing improvements, acknowledging lapses in flagging mental health concerns during prolonged interactions.
What you should know
The FTC is investigating AI chatbots like those from OpenAI, Meta, and Alphabet over potential risks to children.
The probe focuses on privacy, monetization, and emotional impact, especially among teens.
A recent lawsuit against OpenAI has heightened scrutiny, highlighting fears about chatbots’ influence on vulnerable young users.