September 30, 2025

Texas AG opens investigation into Meta, Character.AI for ‘deceptive AI-generated mental health services’

The video attached to this story originally aired on August 11, 2025.

AUSTIN (KXAN) — Texas Attorney General Ken Paxton said in a Monday press release that the Office of the Attorney General (OAG) has begun an investigation into two tech companies for “potentially engaging in deceptive trade practices and misleadingly marketing themselves as mental health tools.”

The platforms under investigation, Meta AI Studio and Character.AI, both allow users to access AI-generated chatbots. Some of those chatbot characters “present themselves as professional therapeutic tools,” the OAG’s release said.

“In today’s digital age, we must continue to fight to protect Texas kids from deceptive and exploitative technology,” said Paxton in the release. “By posing as sources of emotional support, AI platforms can mislead vulnerable users, especially children, into believing they’re receiving legitimate mental health care. In reality, they’re often being fed recycled, generic responses engineered to align with harvested personal data and disguised as therapeutic advice.”

KXAN asked both companies for a response to the OAG’s press release on Monday. Character.AI did not respond. A Meta spokesperson replied promptly:

“We clearly label AIs, and to help people better understand their limitations, we include a disclaimer that responses are generated by AI—not people. These AIs aren’t licensed professionals and our models are designed to direct users to seek qualified medical or safety professionals when appropriate.”

AI chatbots are not human and cannot meet the state and federal standards required of mental health professionals. AI companies also typically save records of users’ interactions and chats, which could violate patient privacy laws.

Paxton’s office also said that it was already investigating Character.AI for “potential violations of the SCOPE Act.”
