Children's Online Privacy Protection Rule
The Federal Trade Commission amends the Children's Online Privacy Protection Rule (the "Rule"), consistent with the requirements of the Children's Online Privacy Protection Act. The amendments to the Rule, which are based on the FTC's review of public comments and its enforcement experience, include one new definition and modifications to several others, as well as updates to key provisions to respond to changes in technology and online practices. The amendments are intended to strengthen protection of personal information collected from children and, where appropriate, to clarify and streamline the Rule, which was last amended in January 2013.
What this rule actually says
The FTC updated COPPA to protect kids' personal information online more strictly. The rule now covers newer tech practices—like behavioral tracking, AI profiling, and voice data—that the old 2013 version missed. If a product collects or uses data from kids under 13, COPPA requires verifiable parental consent before that happens, plus stronger safeguards on how that data is stored and shared.
Who it applies to
- If your product is directed at children under 13 (or knowingly collects data from them): COPPA applies. This includes apps, websites, and AI assistants marketed to kids or with UI/marketing clearly designed for kids.
- If you collect any "personal information": name, email, location, device IDs, browsing history, voice recordings, biometric data, or inferred profiles (e.g., interests or behavioral patterns learned by your AI). Even anonymous data tied to a device counts.
- Geography: This is a U.S. federal rule. If your product is accessible to U.S. users and collects data from kids under 13, it applies—regardless of where you're based.
- AI use cases that trigger it: Medical scribes recording kids' sessions, hiring assistants screening young applicants, support chatbots used in schools or marketed to parents of young children, any AI learning from kids' data to build profiles.
- Data scope in: Nearly everything. Age, location, device IDs, audio/video, behavioral data, AI-generated inferences about the child.
- Data scope out: Aggregate, truly anonymous data (not tied to a device or identifier); data collected but not retained; limited contact information collected solely to obtain parental consent.
What founders need to do
- Audit your user base (1-2 days): Determine if anyone under 13 uses your product. Check your analytics, user signups, and marketing materials. If unclear, assume they do.
- Design for parental consent (3-5 days): Build a system to verify parental identity and get documented consent before collecting any data from kids. This is non-negotiable and requires real verification—an email checkbox doesn't cut it.
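One way to make "documented consent before collection" concrete is to gate every collection path on a stored consent record. A rough sketch, assuming a hypothetical in-memory consent log (field names and verification methods are illustrative; the FTC's approved verification methods are defined in the rule itself):

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class ParentalConsent:
    child_user_id: str
    parent_email: str
    verification_method: str    # e.g. "credit_card", "signed_form", "video_call"
    data_categories: list[str]  # the specific collection the parent approved
    granted_at: datetime

# In production this would be a durable, auditable store, not a dict.
CONSENT_LOG: dict[str, ParentalConsent] = {}

def may_collect(child_user_id: str, category: str) -> bool:
    """Allow collection only if a verified parent approved this category."""
    consent = CONSENT_LOG.get(child_user_id)
    return consent is not None and category in consent.data_categories
```

The design point is that consent is scoped per data category, so adding a new feature that collects new data forces a new consent decision rather than silently reusing the old one.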
- Document your data practices (2-3 days): Write down exactly what personal info you collect, how long you keep it, who you share it with, and how your AI uses it. The FTC expects this to be public-facing.
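Keeping that documentation machine-readable makes it easy to keep the public-facing policy in sync with what the product actually collects. A minimal sketch, assuming a hypothetical inventory format (all field names are illustrative):

```python
# Hypothetical data inventory: one entry per category of personal info collected.
DATA_INVENTORY = [
    {
        "category": "voice_recordings",
        "purpose": "transcription",
        "retention_days": 30,
        "shared_with": [],               # third parties, if any
        "used_for_ai_training": False,
    },
    {
        "category": "device_id",
        "purpose": "session continuity",
        "retention_days": 90,
        "shared_with": ["analytics_vendor"],
        "used_for_ai_training": False,
    },
]

REQUIRED_FIELDS = {"category", "purpose", "retention_days",
                   "shared_with", "used_for_ai_training"}

def validate_inventory(inventory: list[dict]) -> bool:
    """Every entry must answer: what, why, how long, who sees it, and AI use."""
    return all(REQUIRED_FIELDS <= entry.keys() for entry in inventory)
```

Running the validator in CI catches the common failure mode where engineering adds a new data field and the privacy policy never hears about it.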
- Implement safeguards (1-2 weeks): Encrypt stored data, limit retention, restrict internal access, and avoid selling or sharing kids' data with third parties unless required by law.
- Monitor ongoing compliance (ongoing): When you update your AI model or add features, re-audit for new data collection. Document changes.
Bottom line
If your product could reach kids under 13, assume COPPA applies and start with the parental consent system immediately—everything else follows from that.