
COPPA updates

  • Writer: Katarzyna Celińska
  • May 5
  • 2 min read

Finally, a positive step at the U.S. federal level in the area of children’s privacy.

The FTC has finalized long-awaited updates to COPPA, the first major revision since 2013. The amended rule took effect on June 23, 2025, and most regulated entities must comply by April 22, 2026.

This matters because protecting children online based on rules largely shaped in the early 2010s is simply not enough anymore.

The FTC’s update introduces several important changes. Operators covered by COPPA must obtain separate verifiable parental consent before disclosing children’s personal information to third parties for targeted advertising or other purposes. The rule also strengthens limits on data retention, making clear that children’s personal information cannot be retained indefinitely and must be kept only as long as reasonably necessary for the specific purpose for which it was collected.

The definition of personal information is also expanded to include, among other things, biometric identifiers and government-issued identifiers. This is particularly important in a world where voiceprints, faceprints, gait patterns, genetic data and other biometric signals can be used to identify or profile children in ways that were not realistic when COPPA was last updated.

The update also increases transparency requirements for FTC-approved COPPA Safe Harbor programs, including public membership lists and additional reporting to the FTC.

From my perspective, this is a good direction, but not enough.

In the EU, children benefit from stronger baseline protection under the GDPR, although large social media platforms and digital services still create very serious practical problems. Legal rights on paper are one thing; real protection is another, once enforcement gaps, dark patterns, addictive design, adtech and weak age assurance come into play.

In my view, the biggest remaining issue is still this:

How do we reliably know whether the user is a child?

Without adequate age assurance / age verification mechanisms, many obligations remain difficult to enforce in practice. Privacy policies and terms of service are not enough. A checkbox asking “Are you over 13?” is not enough. A self-declared date of birth is often not enough. If a service is likely to be used by children, then the design must reflect that risk.

California enforcement already shows the direction.

Cases involving mobile games, streaming services, school-related digital platforms, and children’s data have demonstrated that regulators look not only at privacy policies and terms of use, but also at whether real age-related controls exist in practice.

