Scrolling Into the Night: How New York’s SAFE for Kids Act Fights Social Media’s Invisible Toll on Young Minds
Every night, countless kids lie awake with glowing screens in their hands. What starts as “just one more video” turns into hours of scrolling, pushed by endless recommendation feeds and late-night pings. Parents see it. We worry about it. But until now, there haven’t been strong legal guardrails to help stop platforms from nudging our kids into addictive patterns.
New York’s Stop Addictive Feeds Exploitation for Kids Act (SAFE for Kids Act)
New York passed the act in 2024, and it is now moving into enforcement, with new rules proposed in September 2025. It represents one of the most sweeping attempts in the U.S. to protect children from algorithm-driven loops, all-night notifications, and the mental health harms tied to social media.
For moms, educators, and anyone raising kids in the digital age, the implications are serious. This law is a small first step, and it gives us a reason to be hopeful.
Some might see this as an invasion of privacy, but the reality is that our kids do not have privacy when they are online or using any app that connects to the internet.
One of my concerns is that submitting identification for age verification is very tricky. My biggest concern, though, is that parents will struggle to hold this boundary and will end up bypassing the age restriction for their own child. I don't say that lightly. I say it knowing that kids are relentless, and that tech solutions are already at parents' fingertips, but parents either don't know how to implement them or really struggle with their kids' mood swings and pleas of "I will be left out." I'm curious: would something like this help you feel more empowered to maintain the boundary, or would it still feel like a fight with your child?
What the SAFE for Kids Act Requires
Here’s what the rules will do once finalized:
Algorithmic Feeds Restricted
Platforms cannot show personalized, algorithmic feeds to minors without verifiable parental consent.
If no consent is given, feeds default to “following only” or chronological order, meaning fewer rabbit holes of suggested content like “you might also like…”
Nighttime Notifications Curtailed
Platforms may not send feed-based notifications to minors between 12:00 a.m. and 6:00 a.m., unless parents explicitly allow it.
Age Verification Standards
Companies must use “commercially reasonable” methods to confirm a user’s age and parental consent.
They must offer options beyond government ID and delete verification data after use.
Enforcement Timeline
Once the rules are finalized, platforms will have 180 days to comply.
The New York Attorney General’s office will oversee compliance, with civil penalties possible for violations.

Why This Matters for You and Your Kids
This isn’t just about annoying notifications. Research shows these features impact sleep, mood, and mental health:

Mental health risks: Youth exposed to addictive feeds are more likely to struggle with anxiety, depression, self-harm, and suicidal thoughts.
Sleep disruption: Late-night notifications keep kids awake, hurting focus, school performance, and emotional regulation.
Addictive design: Algorithms exploit the adolescent brain’s reward system, leading to compulsive checking and distress when offline.
Greater harm for younger kids: Studies show 13-year-olds' accounts encounter harmful or disturbing content much more quickly than older teens' do.
What We Know & What We Don’t
Strengths of the law:
First U.S. law to set clear legal limits on algorithmic feeds for minors.
Explicit rules for nighttime notifications.
Acknowledges the youth mental health crisis as urgent.
Challenges ahead:
Will age verification work without compromising privacy?
Will kids outside New York benefit, or will other states need to follow?
Could legal challenges slow down implementation?
Will platforms create “workarounds” that sidestep the spirit of the law?
What If You Don’t Live in New York?
Even if you’re not a New York parent, this law is still a wake-up call. It shows that states are finally recognizing what moms like you have known for years: our kids need stronger boundaries online. But you don’t have to wait for legislation to protect your family.
The SAFE for Kids Act mirrors the same principles I teach in my CPR Framework:
Connect: A following-only feed keeps kids closer to real friends instead of endless strangers.
Protect: Nighttime curfews for notifications shield kids during their most vulnerable hours.
Regulate: Verified parental consent puts you back in the driver’s seat, instead of letting platforms decide.
You can start now, right in your own home, with the CPR Framework:
Connect: Keep conversations open. Talk with your kids about how apps are designed to hook them—and remind them you’re their safe place when they feel overwhelmed.
Protect: Use the tools already available to silence notifications at night, limit endless feeds, and block risky content.
Regulate: Set family rules around screen-free time, healthy sleep, and financial safeguards so your child learns balance and resilience.
No matter where you live, these steps give you the same kind of backup the SAFE for Kids Act promises, because strong parenting and proactive choices travel further than any state law.
And if you’re ready to go deeper, our Screen Smart Moms community is opening up soon, where you’ll find guidance, coaching, and other moms walking this same journey alongside you. Sometimes it just takes knowledge and accountability to transform how you manage screen time in your home.
What Parents Can Do Right Now
Even before these rules take effect, you can act today:
Audit devices: Use parental controls on iOS/Android to silence nighttime notifications and enforce “downtime.”
Talk about feeds: Explain how algorithms are designed to keep kids scrolling. Encourage them to try chronological or “following only” views.
Prioritize sleep hygiene: No screens 30–60 minutes before bed. Use physical alarms instead of phones.
Model balance: Let your kids see you set limits for yourself, too.
Stay informed & advocate: If you’re in New York, join public comment periods on these rules. Anywhere else? Bring these issues to your PTA or school district.
Statistics That Underscore the Urgency

Over 40% of kids ages 10–14 experience compulsive use and distress when not on social media.
In test accounts, 13-year-olds encountered harmful content in ~15% of recommendation feeds—nearly double the rate of 18-year-olds.
Platforms could lose an estimated $100–150 million annually in New York ad revenue if minors default to non-algorithmic feeds.
Final Word, Mom-to-Mom
As a mom who raised seven kids through the rise of smartphones and social media, I know the late-night battles over glowing screens all too well. The SAFE for Kids Act is not a magic shield. Our kids are smart, and loopholes always exist. But it's a major step toward holding platforms accountable, and at least we know that someone is paying attention.
That means we finally might be getting some backup. We’re not in this alone. If you feel alone and don't have support or the tech knowledge, please do not hesitate to reach out to me.
If you want to stay one step ahead, learning not just how to parent through screen time but how to manage the technology itself, I invite you to join me on a call and keep an eye out for our Screen Smart Moms community that is coming soon. Together, we’ll protect our kids, build confidence, and give them the boundaries they need to thrive offline.
Ready to protect your kids online with confidence?
Book a call and follow me on Facebook today.
Comment below and share this post. Let's spark a conversation about protecting kids from endless late-night scrolling!
References
New York Attorney General, Proposed Rules for SAFE for Kids Act, September 2025.
“New York’s ban on addictive social media feeds for kids takes shape with proposed rules,” AP News, September 15, 2025.
“Age checks, curfew alerts, parental consent: Inside NY’s proposed social media rulebook,” Times Union, September 2025.
“Here’s how NY plans to regulate kids’ use of social media,” City & State NY, September 2025.
De et al., "Social Media Algorithms and Teen Addiction," 2025.
"Protecting Young Users on Social Media: Evaluating the Effectiveness …" (accounts age 13 vs. 18), 2025.
