New York Attorney General Letitia James has released draft regulations to enforce the Stop Addictive Feeds Exploitation (SAFE) for Kids Act. The proposed rules aim to limit the kinds of social media feeds that users under age 18 can see unless a parent gives consent. In particular, platforms like TikTok and Instagram would have to stop showing algorithmically personalized content to minors who lack verified parental approval. The rules would also bar social media companies from sending notifications tied to algorithmic or “addictive” content to minors between midnight and 6 a.m. without permission.
How Age and Parental Consent Would Be Handled
Under the draft rules, platforms would have to verify a user’s age and obtain parental consent when required. Acceptable verification methods might include confirmation via email or phone, or uploads of images or documents that establish age. Before a minor could receive algorithmic feeds or nighttime notifications, the platform would need verified consent from a parent or guardian. The rules also require companies to protect the user data collected while verifying age and consent. New York’s plan gives social media platforms 180 days after the rules are finalized to reach full compliance.
Why Supporters Back the Rules
Supporters say the proposed rules respond to concerns over youth mental health. They point out that addictive features on social media, such as endless scrolling and algorithmic content designed to keep engagement high, increase screen time and disrupt sleep. They argue that minors exposed to those features face higher risks of anxiety, depression, and other psychological strain. Beyond mental wellness, they believe the rules also uphold parental rights by giving guardians more control over what children see online.
Criticism and Legal Concerns
Technology companies, along with free speech and privacy advocacy groups, have raised objections. They argue the rules risk violating free speech, since algorithmic feeds can include legitimate expression and educational material. Some critics contend that verifying age and parental consent could collect more personal information than necessary, creating risks of misuse or privacy breaches. Others warn of high implementation costs, particularly for smaller platforms. As a result, legal challenges seem likely once the rules are finalized.
What the Rules Mean for Kids and Platforms
If the proposed regulations take effect, minors without parental consent would see only posts from accounts they already follow, rather than algorithmic recommendations. Platforms would need to redesign parts of their feed logic and adjust notification systems to suppress alerts during restricted hours. For many families, this could mean fewer distractions and more predictable content. For platforms, it means technical work and compliance planning to avoid penalties and legal risk.
What Happens Next
The rule proposal opens a 60-day public comment period, during which families, tech companies, developers, and civil rights groups can submit feedback. After that, the Attorney General’s office will finalize the rules, and companies will then have 180 days to implement them. Observers expect the process to involve negotiations and possible revisions. Meanwhile, similar efforts in states like California, along with discussions at the federal level, may influence how New York’s version evolves. If enforced well, these rules could reshape how social media works for minors across the state and serve as a model elsewhere.