Written by Grace Sargent
Opportunity on the rise: why the 2026 Legislative Session could make further strides to protect children from explicit content through App Store age verification.
Scrolling through the App Store is already overwhelming. Games, social media, practical tools, reading, music, and video apps all sit in one place, and we’ve chosen to hand that digital bazaar over to our children. A 2025 Fortune report found that fewer than half of parents use the safety features available to them, leaving children with unsupervised access to explicit content, risky links, and potential data sharing, despite widespread parental concern about online dangers. When kids are handed an iPad, they are handed the controls to things that could negatively affect them for life.
Two states, Texas and Utah, have decided this is a major problem and have passed legislation requiring parental consent before minors can download apps.
But this debate shouldn’t be just about parental vigilance — it’s not the responsibility of mothers and fathers to ensure that tech giants are following the law. The industry should be held responsible for following the rules, rather than shrugging its shoulders and pointing to this or that band-aid that parents could theoretically use.
From parental control gaps to systemic responsibility
Right now, much of children’s safety online depends on whether parents know about, and properly use, the safety tools built into devices and apps. By mandating age verification at the app store level, these laws create a baseline safeguard: before a minor can download an app, the store must obtain parental consent.
This is not a “parental convenience” issue; it is an industry-wide realignment of responsibility. It acknowledges that expecting every individual parent to police every app is unrealistic; instead, it makes the companies whose systems grant access to content responsible for guarding that gate.
Why this isn’t just “another content law” — it’s a contract and consumer-protection law
The App Store Accountability Act (recently passed in states like Utah and Texas) isn’t a “content law” — it’s a matter of “contract law.” That distinction matters. Children cannot sign contracts. Under these laws:
- When a user, adult or minor, sets up an account, the store must ask for age information and verify it using “commercially reasonable methods.”
- If the user is a minor, their account must link to a parent or guardian account, and verifiable parental consent is required for downloads, purchases, and in-app purchases.
- No company may enforce terms of service or contracts against a minor without that consent.
Agreeing to the “terms of service” when downloading an application is essentially signing a contract, and minors cannot legally enter into contracts without a guardian’s consent. Age verification simply enforces that baseline principle in the digital world.
Real parental control — with the responsibility in the right place
This is a powerful legal framing: it doesn’t revolve around policing “bad content,” but around ensuring that minors don’t legally bind themselves to contracts or access potentially harmful apps without consent.
This ultimately gives parents real control and offers an accountability mechanism when companies violate those protections (for example, by allowing a minor to keep using an app after consent is revoked). Indeed, the legislation includes a “private right of action” that allows parents to sue when those protections are ignored.
Moreover, by adopting standardized age categories (child, younger teen, older teen, adult) and requiring “commercially reasonable” verification, such as a credit card, an ID, or another standard method, states can protect kids without overburdening families with invasive surveillance. In fact, many devices would not need to collect any new information.
Why 2026 is the moment for other states to act, including South Dakota
In 2026, South Dakota lawmakers have a real chance to realign one aspect of the digital world. They can help ensure that the digital marketplace operates ethically and place the burden where it belongs: on the industry. Children cannot sign contracts, and parents should not have to pick up the slack for an industry that ignores a basic principle of contract law: that a binding agreement requires a consenting adult.
The message is clear: don’t wait. The model already exists, and it is already working. Let’s build a safer, more responsible digital environment for children before the next generation binds itself to a lifetime’s worth of apps, data sharing, and the consequences that come with them.
Protecting childhood online shouldn’t be optional or a matter of luck. It should be the default.