ANNAPOLIS — The House Criminal Law Subcommittee on March 28 advanced a Senate bill that would fold AI-generated “deepfake” images into the state’s existing revenge pornography statute while keeping a civil defamation-per-se remedy for fabricated images.
The chair opened the meeting by saying the panel would consider competing measures from Delegate Lopez, Delegate Pippe and Delegate Worman, along with a Senate bill from Senator Hester. Committee counsel presented a head-to-head comparison, and advocates urged a single definition that treats real and computer-generated images “on a continuum.” Lisa Jordan of the Maryland Coalition Against Sexual Assault said “there really should be no distinction” between actual and AI-created visual representations.
A central issue at the hearing was whether criminal liability should require an intent-to-harm element. Jeremy Zachar of the Office of the Public Defender told members that removing an intent requirement would create a strict-liability offense and could sweep in people who merely redistribute images online: “To remove criminal intent would create a strict liability offense where anyone who would distribute or redistribute this deep fake could run afoul of this law,” he said.
Members and counsel discussed technical points the committee must resolve in drafting a unified definition, including whether wholly fabricated images assembled from public photos would meet a “reasonable expectation of privacy” standard already embedded in the revenge pornography law. Counsel observed that current law criminalizes distribution rather than production, which shaped the committee’s focus on targeting redistributors.
On the civil side, sponsors and advocates favored the senate language creating a defamation-per-se cause of action for fake images so plaintiffs need not prove individual harm. Lisa Jordan and co-sponsors cited defamation as an appropriate civil remedy for AI-manufactured images that present a false representation of a real person.
After discussion, members moved a set of conceptual amendments into the Hester Senate bill. The committee will ask counsel to prepare a single, revised definition of “visual representation” that incorporates computer-generated images into the revenge pornography code, while retaining intent language for criminal charges and preserving the civil defamation-per-se remedy proposed in the Senate bill. A motion to advance the Senate bill was made and seconded, and the chair said the subcommittee had moved it forward. The chair also said work on the House bills would be revisited at the next meeting and that those bills were being “pinned” for further drafting.
Members also flagged the Worman bill, which would impose platform-level removal or content-moderation obligations, as a separate policy question the committee will take up later. Several members cited 47 U.S.C. § 230 and federal preemption concerns in debating whether to create additional liability for web hosts.
The subcommittee directed counsel to draft a combined definition and signaled it would adopt the civil language from the Senate Judicial Proceedings (JPR) Committee as its subcommittee position. It deferred further action on the platform and host rules to a future meeting.
The subcommittee did not produce a final reprinted combined bill during the session; instead, members directed staff to prepare draft amendments and scheduled further discussion for a subsequent meeting.