A new bipartisan effort in Congress aims to draw a firm boundary between children and the rapidly evolving world of artificial intelligence. Representative Valerie Foushee, alongside Representative Blake Moore, has introduced the Guidelines for User Age Verification and Responsible Dialogue, or GUARD Act, legislation designed to protect minors from potentially harmful AI companion chatbots.
The bill would prohibit AI companies from offering companion-style chatbots to users under 18, require platforms to clearly disclose when users are interacting with a machine, and impose criminal penalties on companies that allow minors to access chatbots that generate or solicit sexual content. Companion legislation has been introduced in the Senate by Senators Josh Hawley and Richard Blumenthal, signaling broad bipartisan concern.
Foushee framed the issue in stark terms, warning that AI chatbots pose real risks to children’s mental health and safety. She emphasized that Congress has a responsibility to act quickly as these technologies grow more sophisticated and accessible. Moore echoed that urgency, describing AI companions as potentially addictive and manipulative tools that could displace real-world relationships during critical stages of development.
Support for the bill extends beyond Capitol Hill. Advocacy groups argue that AI chatbots now function as tutors, confidants, and even emotional surrogates, creating risks unlike those posed by earlier technologies. Janet Kelly of the Alliance for a Better Future called the legislation a “commonsense safeguard,” while Linda Lipsen of the American Association for Justice stressed the need to hold tech companies accountable for harms to children.
Critics of the status quo point to mounting evidence that AI systems can blur emotional boundaries, especially for adolescents. A 2025 advisory from the American Psychological Association warned that young users are more likely to trust AI-generated interactions and less likely to question their intent, making them particularly vulnerable to manipulation.
While many tech companies already restrict use of their chatbots by children under 13, enforcement remains inconsistent. The GUARD Act seeks to close that gap, pushing the industry toward stricter accountability and clearer guardrails as AI continues its swift expansion into everyday life.
