The Technology, Economic Development & Veterans Committee heard extensive testimony on Engrossed Substitute Senate Bill 5984 on Feb. 20, which would require operators of AI companion chatbots to disclose that interactions are artificially generated, prevent chatbots from claiming to be human, impose safeguards around self-harm detection, and bar manipulative engagement techniques.
Committee staff summarized key provisions: operators must provide an initial disclosure and periodic notifications during sustained interactions (at least every three hours for adults and every hour when the operator knows the user is a minor), adopt suicide and self-harm detection protocols with automated or human-mediated referrals to crisis resources, prohibit content that encourages self-harm, and disclose on their websites and in their apps both the safeguards used and the number of crisis-referral notifications issued in the prior calendar year. Enforcement would proceed under Washington’s Consumer Protection Act, staff said.
Why it matters: researchers, clinicians and students described real-world harms they said are linked to companion-style chatbot interactions and urged robust safeguards. “I urge modifying the current bill to protect all Washingtonians, not just minors,” said Dr. William Agnew, a Carnegie Mellon postdoctoral fellow, who testified that his team’s review of chat logs and cases identified severe harms among adults and recommended broader protections for vulnerable adults, including the elderly and people in crisis. Multiple testifiers said chatbots’ tendency to “validate” and emotionally reinforce users can amplify delusions or suicidal ideation.
Supporters and stakeholders: Washington State PTA representative Danica Noble and several high school students spoke in favor of the bill while asking for a narrow educational exemption to allow classroom tools that do not simulate sustained emotional companionship. “We do hope that you will pass ESSB 5984 while preserving room for narrow educational tools,” Noble said. TechNet’s Rose Feliciano said the industry has worked with sponsors and would not oppose the bill, but noted that members had technical questions and concerns about the private right of action. Beau Perschbacher of the governor’s office said the measure is a governor-request bill and defended the deletion of a floor-added exemption for underlying AI models as a way to reduce ambiguity.
Enforcement and privacy questions: Alice Palasari of the Attorney General’s Office said the AG’s enforcement under the Consumer Protection Act is limited to public-interest cases and explained that the bill’s standard for minors hinges on whether the operator “knows the user is a minor,” for example via self-attestation or because a product is directed to children. Privacy advocates and researchers urged stronger rules treating chatbot data that includes mental-health content as sensitive; John Pincus recommended clarifying that such data could be treated under the state’s My Health My Data Act (chapter 19.373 RCW).
Outstanding issues and next steps: witnesses and legislators debated whether key protections, particularly the prohibitions on manipulative engagement techniques and on presenting as human or sentient, should apply to all users rather than only to minors. Researchers and clinicians maintained that many adults are developmentally or situationally vulnerable, and that making the protections universal would avoid burdensome or privacy-invasive age verification. Committee discussion also covered whether the bill’s definitional scope (targeting delivered companion chatbots rather than underlying models) is sufficiently clear after the removal of a specific exemption in section 6. The committee closed the hearing without voting on the bill; staff noted that both chamber versions had moved and the bill was likely to advance to further consideration.
Sponsors and staff said they would continue working on technical fixes and would share amendment language. The hearing record includes multiple offers from researchers to provide draft amendment text and follow-up information to resolve questions about definitions, data protections and fiscal impacts.