


US Children’s Data Laws and Consent: What Businesses Need to Know in 2026

Evolving age assurance requirements and renewed federal attention are reshaping expectations around children’s consent, product design, and youth data protection.

Harry Chambers
Regulatory Content Strategist
April 2, 2026


While children’s online experiences have continued to expand, so has regulatory scrutiny over how organizations collect, use, and protect minors’ personal data. In 2025 and early 2026, legislators and regulators have accelerated efforts to strengthen protections for children and teens, with a growing emphasis on age assurance, parental consent, and safety-by-design obligations.

In 2026 alone, the governor of South Carolina signed the South Carolina Age Appropriate Design Code, and the governor of Alabama signed House Bill 161, which regulates app store providers and developers. These laws join children's data protection legislation that is already in force or set to take effect soon, such as Vermont's Age Appropriate Design Code on January 1, 2027.

From state-level social media regulation to app store accountability laws and renewed federal momentum behind the Kids Online Safety Act (KOSA), the compliance landscape for children’s data is becoming more complex and harder to ignore.

For example, a social media platform used by teenagers may need to disable targeted advertising by default, introduce parental supervision tools, and limit data collection for features such as location sharing or personalized recommendations.

 

Why Children’s Consent Is Back in Focus

Children’s data privacy laws in the U.S. have traditionally centered on the Children’s Online Privacy Protection Act (COPPA), which requires verifiable parental consent for online data collection from children under 13. However, recent legislative developments reflect a broader shift: expanding protections to teens, regulating platform design, and embedding duty-of-care standards into law.

States are increasingly moving beyond notice-and-consent frameworks, focusing instead on how digital products are designed, how defaults are set, and how minors are protected by default, even when parental consent is not the primary mechanism. The scope of services subject to children's data laws is also broadening: under the Vermont Age Appropriate Design Code, a service is considered "reasonably likely to be accessed by a minor" if at least 2% of its users are between 2 and 17 years old.

In practice, this means a video streaming service, gaming platform, or educational app may fall under youth data rules even if it does not explicitly target children, as long as a measurable portion of its audience includes minors.
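The 2% audience test above can be expressed as a simple calculation. The sketch below is an illustrative simplification, not legal logic: the statute's actual scoping analysis also weighs other evidence that minors access a service, and the function name and signature are hypothetical.

```python
def likely_accessed_by_minors(total_users: int, users_2_to_17: int,
                              threshold: float = 0.02) -> bool:
    """Illustrative scope check modeled on the Vermont AADC's 2% threshold.

    Returns True when at least `threshold` (default 2%) of users are aged
    2-17. This is a simplification: the statute's test also considers other
    evidence that a service is likely to be accessed by minors.
    """
    if total_users == 0:
        return False
    return users_2_to_17 / total_users >= threshold

# A streaming service with 5 million users, 150,000 of whom are minors (3%)
print(likely_accessed_by_minors(5_000_000, 150_000))  # True: 3% >= 2%
```

Even a service that never markets to children could cross this line through organic usage, which is why audience measurement itself becomes a compliance activity.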

State Attorneys General in California and Connecticut have also outlined that privacy protections given to children's data will remain a regulatory focus, noting ongoing investigations into privacy and safety risks.

 

South Carolina’s Social Media Regulation Act

In February 2026, the South Carolina Age Appropriate Design Code took effect immediately upon passage, imposing sweeping obligations on online services reasonably likely to be accessed by minors. The law requires covered services to exercise reasonable care in the use of minors’ personal data, limit data collection, prohibit certain targeted advertising practices, and provide parental tools and transparency measures.

Notably, the law reflects an age-appropriate design approach rather than a consent-first model, signaling a broader regulatory trend toward default protections and product-level safeguards for children and teens.

For instance, a social platform may be required to disable algorithmic content recommendations that promote compulsive use patterns for younger users or ensure privacy settings default to the most protective configuration.

Other state legislation in the US, initially inspired by the UK Age Appropriate Design Code passed in 2020, has started to move beyond risk-based protections for minors' personal data. Proposed laws in state legislatures such as Arizona, New Mexico, Kentucky, and Virginia now go beyond earlier laws by:

  • Giving children greater agency over their online experiences
  • Prohibiting certain types of processing activities outright
  • Requiring entities to evaluate design features for risks of compulsive use
  • Centralizing user control over privacy and autonomy in settings

 

Alabama’s App Store Law

Alabama’s newly enacted app store law also reflects recent trends, adding another compliance layer by shifting responsibility upstream to app stores and developers. Signed in February 2026, the law requires age category verification and, for minors, confirmation that verifiable parental consent has been obtained before app downloads or significant app changes occur.

Developers are also restricted in how age data can be used and must apply the lowest applicable age category when implementing restrictions or defaults. App store providers, meanwhile, must request and verify age categories using compliant verification methods.

A gaming developer releasing an update that introduces social chat or in-app purchases may need to verify the user’s age category and ensure parental consent is obtained before enabling those features for minors.
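The "lowest applicable age category" rule and the consent gate described above can be sketched as follows. The category names and gating logic are hypothetical illustrations; the Alabama law leaves the exact categories and verification methods to compliant implementations.

```python
# Ordered from most protective (lowest) to least protective.
AGE_CATEGORIES = ["under_13", "13_to_15", "16_to_17", "adult"]

def lowest_applicable_category(signals: list[str]) -> str:
    """When age signals conflict, default to the most protective category."""
    return min(signals, key=AGE_CATEGORIES.index)

def can_enable_social_features(category: str, parental_consent: bool) -> bool:
    """Gate chat / in-app purchases on verified parental consent for minors."""
    if category == "adult":
        return True
    return parental_consent

# Conflicting signals: self-declared adult, but app store reports 13-15.
category = lowest_applicable_category(["adult", "13_to_15"])
print(category)                                                       # 13_to_15
print(can_enable_social_features(category, parental_consent=False))   # False
```

Resolving conflicting signals toward the most protective category mirrors the statute's default-to-restriction approach: the developer only relaxes restrictions once a higher age category is verified.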

Alabama has joined other states such as Texas, Utah, Louisiana, and California in passing legislation that requires age verification and age gating of content accessible to minors, alongside parental disclosures by app developers. These laws go beyond the consent requirements common in comprehensive privacy legislation, such as the California CCPA or Colorado CPA, by mandating verifiable consent based on industry standards.

 

Federal Momentum: The Kids Online Safety Act

At the federal level, children’s data protection remains fragmented, but pressure is mounting. In February 2026, a bipartisan coalition of 40 state Attorneys General urged Congress to pass the Senate version of the Kids Online Safety Act (KOSA), citing concerns that the House version would preempt state laws already protecting minors.

The Senate version of KOSA emphasizes duty-of-care obligations for platforms and seeks to preserve states’ ability to respond to evolving online harms affecting children’s mental health and safety. This federal-state tension highlights a key compliance challenge: organizations must track both current state laws and potential federal changes without assuming preemption will simplify requirements.

Under duty-of-care expectations, a platform hosting user-generated content may need to assess whether recommendation algorithms amplify harmful material to minors and implement safeguards to limit exposure.


 

What This Means for Children’s Consent and Age Assurance

Across these developments, a few consistent themes are emerging:

  • Consent is no longer enough on its own. Laws increasingly focus on design, defaults, and data minimization rather than relying solely on parental permission.
  • Age verification and categorization are becoming foundational. App-focused legislation demonstrates how age assurance is moving into infrastructure layers like app stores, not just individual services.
  • Protections increasingly extend beyond under-13 users. Many state laws now apply to minors under 18, reshaping how companies define “children” in their compliance programs.
  • Disclosures must cater to minor users. Disclosures about the use of minors’ personal data must be given in a way that minors can understand, before or at the time services are provided.
  • Prohibited purposes. Some state laws now prohibit outright the processing of minors’ personal data for specific purposes, including its sale or sharing.

 

Preparing for a More Demanding Compliance Landscape

For organizations that design, market, or distribute digital services, children’s data compliance is no longer a niche issue. The combination of state-level enforcement, expanding age scopes, and renewed federal attention means companies must:

  • Reassess how they identify and classify minor users
  • Evaluate whether product design and defaults align with emerging duty-of-care expectations
  • Ensure consent and age assurance workflows scale across jurisdictions

As lawmakers continue to experiment with new models for protecting minors online, organizations should expect continued regulatory evolution rather than consolidation.

 

How OneTrust Can Help

OneTrust helps organizations operationalize parental consent by enabling parent–child identity relationships within Collection Points. By configuring OneTrust Hosted Web Forms or API-based Collection Points to capture a parent identifier, organizations can link parent and child identities into a single data subject group and manage those relationships centrally. This setup ensures consent is correctly attributed and that Double Opt-In and preference communications are routed appropriately, typically to the parent, based on the identifiers provided, supporting consistent and auditable consent management for children’s data.
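The parent–child linkage described above can be pictured with a minimal data-model sketch. This is a conceptual illustration only; the class and field names are hypothetical and are not the OneTrust Collection Points API.

```python
# Conceptual sketch of parent-child consent linkage. Names are illustrative
# only; this is NOT the OneTrust API.
from dataclasses import dataclass, field

@dataclass
class ConsentRecord:
    purpose: str
    granted_by: str  # identifier of the person who actually gave consent

@dataclass
class DataSubjectGroup:
    parent_id: str   # e.g. the parent's verified email, captured at collection
    child_id: str    # the child's account identifier
    consents: list[ConsentRecord] = field(default_factory=list)

    def record_parental_consent(self, purpose: str) -> ConsentRecord:
        """Attribute the consent to the parent, not the child."""
        record = ConsentRecord(purpose=purpose, granted_by=self.parent_id)
        self.consents.append(record)
        return record

    def double_opt_in_recipient(self) -> str:
        """Confirmation and preference emails route to the parent identifier."""
        return self.parent_id

group = DataSubjectGroup(parent_id="parent@example.com", child_id="child-001")
group.record_parental_consent("personalized_recommendations")
print(group.double_opt_in_recipient())  # parent@example.com
```

The key design point is that the child's profile never becomes the consent holder: attribution and confirmation both flow through the linked parent identifier, which is what makes the record auditable.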

Beyond consent collection, OneTrust supports identity verification through integrations with external identity providers using OpenID Connect (OIDC) and the OneTrust ID Verification API. These options allow organizations to verify identities using approaches such as score-based authentication, one-time passcodes, and knowledge-based authentication, while maintaining control over verification workflows and limiting unnecessary exposure of personal data. Together, these capabilities help marketers and digital service providers collect children’s data responsibly, verify identities when needed, and manage parental consent in line with evolving regulatory expectations.

 

Key Takeaways

Children’s data laws are entering a new phase defined less by static consent checkboxes and more by ongoing responsibility for how digital experiences impact minors. With states like South Carolina and Alabama setting benchmarks and federal debates over KOSA intensifying, organizations that proactively align privacy, product, and governance strategies will be better positioned to adapt as expectations continue to rise.

 

Frequent Questions About Children’s Data Laws

 

Do children’s data laws apply only to services that target children?

No. Many laws apply to services that are reasonably likely to be accessed by minors. This means platforms such as gaming services, streaming platforms, educational tools, and social networks may fall within scope even if they do not explicitly target children.

How can organizations prepare for these requirements?

Organizations should assess how minors interact with their digital services, implement age assurance or age-gating mechanisms where appropriate, and ensure parental consent workflows and product design choices align with emerging duty-of-care expectations.

