
Assembly privacy committee hears parents, researchers and tech companies on parental controls and child safety online

March 17, 2026 | California State Assembly


This article was created by AI summarizing key points discussed. AI makes mistakes, so for full details and context, please refer to the video of the full meeting.

The California State Assembly Privacy & Consumer Protection Committee convened a multi-panel hearing on online safety controls that brought survivor parents, academic researchers, child-advocacy groups and representatives from Meta, Google, OpenAI and Roblox to the Capitol to debate the limits of parental controls and next steps for law and product design.

The hearing began with testimony from Victoria Hinks, a mother who said: "We lost her to suicide 587 days ago," and told lawmakers that despite using device limits and parental settings her family could not prevent the harm that befell their daughter. Her husband, Paul Hinks, a former Silicon Valley software engineer, testified that the household had implemented multiple parental controls but still could not stop dangerous content from reaching their child, arguing the chain of product responsibility "leads nowhere" when platforms and app makers do not share liability.

Researchers summarized evidence that parental controls are inconsistent, difficult to set up and often easy to bypass. Sunny Lu of the Stanford Social Media Lab said parents and youth "are aligned on goals but misaligned on approaches," and cited four core barriers: the difficulty of digital parenting, technical complexity, constraints on controls, and accessibility and equity gaps. Lu and other researchers estimated the time burden for parents to set up and maintain protections can be large and pointed to studies showing many controls fail to operate as advertised.

Children Now senior policy analyst LaShawn Francis presented statewide polling and data linking digital experiences to worsening youth mental-health indicators and argued parental controls alone were insufficient: "This is a child safety problem, not a purely tech problem," she said, calling for corporate accountability and policy guardrails.

Tech-industry witnesses outlined steps their companies have taken. Nicole Lopez of Meta described "teen accounts" and parental supervision tools that default teens into stricter settings on Instagram, including limits on messaging and time-of-day sleep modes; Emily Cashman Kirstine of Google highlighted Family Link, default SafeSearch, YouTube-specific timers and Gemini safeguards; Lauren Haber Jonas of OpenAI described layered age estimation, default safety layers for minors and parental notifications for distressing prompts; and Eliza Jacobs of Roblox emphasized factory-default safety settings, automated moderation and a recent facial age-estimation rollout for communication features.

Despite product changes, lawmakers and advocates pressed the companies on persistent gaps: inconsistent age verification, the ability of teens to circumvent protections by falsifying ages or switching devices/apps, language and cost barriers to third-party tools, and limited transparency about what content moderation and reporting processes actually accomplish. Several witnesses and lawmakers urged independent testing and standardized, easily discoverable controls so parents need not hunt through dozens of menus across many apps.

Experts and advocates offered a set of concrete policy and product proposals: privacy-preserving device-level age signals so apps can apply appropriate defaults; standardized parental-control icons and flows across platforms; automatic "safe-by-default" settings for minors that are on without opt-in; independent audits and third-party verification of safety claims; and regular, easy-to-read "highlights" for parents summarizing a child's usage and exposure.

The hearing did not produce votes or formal committee action. The committee chair closed by thanking parents and witnesses and said lawmakers would continue to pursue bipartisan regulatory and consumer-facing tools; multiple members signaled intent to press for stronger disclosure and enforcement mechanisms to ensure safety features work in practice.

Why it matters: California lawmakers and child-safety advocates urged a shift away from placing the full burden of safety on parents toward a combination of product defaults, regulatory standards and independent verification. The committee's next steps could affect device manufacturers, app stores and major platforms used by millions of California children.

Next steps: Witnesses urged more rigorous, publicly reportable research into harms and interventions and recommended that the Legislature consider standards for age assurance, mandated safety-by-default settings and independent testing regimes so regulators and families can track whether safeguards actually reduce harm.
