Chatbot Protections
The harms of chatbots are not theoretical for kids. Investigations and lawsuits document cases in which chatbots have encouraged, or failed to intervene in, self-harm, suicide, sexual exploitation, and other dangerous behaviors. Several of these lawsuits allege that prolonged interactions with AI companion chatbots contributed to the deaths of children in the U.S. Together, these harms demonstrate the urgent need for accountability and regulation of chatbot design, development, and deployment to ensure protections for kids and teens.
The Kids Code Coalition supports state legislative action that advances meaningful protections for kids on chatbots. This legislation can take a variety of forms, from AI companion chatbot safeguards to product liability standards, data protections, and transparency requirements, depending on each state's priorities. As Big Tech continues to prioritize profits over young people's wellbeing, states across the country are stepping in to fill the gap.