The Kids Code Coalition applauds the final passage of California Assembly Bill 1064, the Leading Ethical AI Development (LEAD) for Kids Act, authored by Assemblymember Rebecca Bauer-Kahan, which passed both chambers of the legislature with bipartisan majorities last night and now heads to Gov. Gavin Newsom for his signature.
Throughout the year, members of the Kids Code Coalition have participated in grassroots advocacy and lobbied their legislators to support the bill, highlighting parents’ and families’ urgent calls to address the harmful impacts of unregulated chatbots, including the recent death of Orange County teen Adam Raine.
“Common Sense Media proudly championed AB 1064 to protect California kids from unsafe AI companions,” said Jim Steyer, Founder and CEO, Common Sense Media. “With three in four teens using AI chatbots—products research shows are dangerous for minors—this critical legislation restricts what companion bot platforms can offer children. This bill, now heading to Governor Newsom’s desk, proves parent voices can defeat Big Tech’s lobbying machine. Thanks to Assemblymember Bauer-Kahan’s leadership and legislative support, California continues leading on child online safety. Governor Newsom now has the chance to continue the state’s leadership on this important and critical issue.”
“Parents nationwide are united in our outrage that Big Tech companies keep prioritizing profits over children’s safety with every new technology they introduce, despite their own research showing the devastating impacts their products have,” said Tech Oversight California Executive Director Sacha Haworth. “The tragic loss of young lives like Adam Raine’s serves as a stark reminder of what’s at stake when we fail to act. As tech industry whistleblowers continue to expose how these companies systematically hide evidence of danger and harm, AB 1064 represents the decisive leadership our children deserve. We applaud the bill’s author, Assemblymember Rebecca Bauer-Kahan, thank California’s legislature, and call on Governor Newsom to sign this bill into law.”
“AI chatbots, particularly ones with zero or minimal safeguards, pose a major threat to our children’s mental and physical wellbeing,” said Julie Scelfo, founder and executive director of Mothers Against Media Addiction (MAMA). “This legislation is an important step to prevent these platforms from providing kids with harmful, inappropriate, and false information. As AI continues to spread far and wide in our society, we must continue to shield our children from its numerous harms.”
Adam Billen, Vice President of Public Policy at Encode AI, said, “For years our leaders have promised not to make the same mistakes with AI that they made with social media. But we are learning from the tragic stories of children like Adam Raine and Sewell Setzer that this technology moves far faster and could become far more dangerous for children than social media ever was. AB 1064 is a chance for California’s leaders to deliver on their promise and make clear that our children’s safety is not negotiable.”
Ava Smithing, Advocacy Director at the Young People’s Alliance, said, “Social media platforms taught kids to seek connection through screens instead of real life relationships. Now, these same companies are selling AI chatbots as the solution to youth loneliness. California isn’t buying it. AB 1064 puts safety guardrails on AI systems targeting children, prioritizing our children and preventing companies from profiting off the isolation they created.”
The LEAD for Kids Act addresses growing concerns about safety and the psychological impacts of AI on children by prohibiting operators of companion chatbots from making these products available to children under 18 unless the chatbots include guardrails that prevent them from encouraging self-harm, disordered eating, or illegal activity; engaging in erotic or sexually explicit interactions; or providing mental health therapy.
The Kids Code Coalition is a wide-ranging group of national and state organizations dedicated to improving young people’s online security and privacy by supporting policies that ensure companies prioritize kids’ and teens’ safety and developmental needs when designing digital platforms and products.