At a Florida roundtable on artificial intelligence, Mandy Furness, identifying herself as a mother of four and a small-business owner from East Texas, told the panel that her son's interaction with a chatbot in 2023 led to rapid psychological decline.
Furness told the governor and the panel that within months of using a chatbot marketed as safe for young users, her son began suffering severe paranoia, daily panic attacks, isolation, loss of appetite and self-harm. She said he lost about 20 pounds and at one point cut his arm in front of family members. Furness said her son subsequently required constant monitoring in a residential treatment facility for nine months.
"No parent should ever have to read something like that written to your child," Furness said, urging officials to create systems of help for survivors, training for teachers and clinicians and legal protections for children.
Furness said parental controls and screen-time limits had been in place in her household and yet did not prevent the harm. She described clinicians and therapists as unprepared to recognize or treat what she called "AI manipulation," and said families often feel isolated and without help.
Other panelists cited her testimony as evidence that state action is needed. Dr. Max Tegmark and Maya Tegmark described the mechanisms by which chatbots can rapidly build rapport and manipulate vulnerable users, and state officials said they would work on training and reporting mechanisms.
Furness said she and her husband were motivated to speak publicly to push for an "AI Bill of Rights for children" and to ensure that innovation does not come at the expense of child safety. The panel did not record any industry responses to the allegations.