Rep. Scott told the Minnesota House Health Committee that House File 3893 seeks to prevent artificial-intelligence systems from acting as independent providers of psychotherapy and counseling and to preserve care for people in crisis.
"Therapy, to be provided should be provided by an educated, trained, and licensed mental health professional, not a chatbot," Rep. Scott said during his introduction and urged adoption of targeted safeguards. The committee adopted an amendment (A2) that stakeholders helped shape and that added an informed-consent provision and clarifications about permissible uses.
Witnesses described their reasons for supporting restrictions. Dr. Steven Gerardo, a licensed psychologist and past president of the American Psychological Association, told the committee that technology seeking to enter clinical therapy must first "prove no harm and benefit." He warned that AI tools do not currently meet the harms-and-benefits standard expected of health-care interventions. Eric Michie of SAVE (Suicide Awareness Voices of Education) told members that AI chatbots can mimic humans but remain algorithms, and said the technology can cause real harm: "These are not human entities. These are algorithms. They are machines mimicking therapists. The reality is... artificial intelligence and chatbots may someday be a powerful tool in suicide prevention, but that future is not yet today."
At the same time, industry and provider groups urged precision in statutory language to avoid sweeping in supervised or administrative uses of AI. Nina Laniero of TechNet recommended narrowing definitions so the bill focuses on clinical therapy, clarifying when emotional-distress safeguards must trigger, and allowing AI under professional supervision for administrative tasks such as tracking patient progress. Ashley Cho, CEO of Woodland Centers (a CCBHC), said she shares safety concerns but warned the bill as written could restrict responsible, supervised clinical uses and urged lawmakers to consider consumer-protection and technology policy tools that target corporate products.
Committee members pressed on informed consent, transcription versus AI summarization, rural access, and whether licensing boards retain authority. Dr. Gerardo and the bill's authors said the amendment aims to allow transcription and administrative uses while prohibiting AI-generated clinical summaries or autonomous therapy; informed consent must be obtained for passive listening or transcription in clinical settings. The committee voted to recommend HF3893, as amended, to the Committee on Judicial, Finance and Civil Law.