10/2/2026

Children's Access to Online Services: a Governance Framework for Online Safety Compliance

Across global digital regulation, a deceptively simple question is doing an increasing amount of legal and regulatory work: can children access this service?

In the UK, services must conduct a “Children’s Access Assessment” under the Online Safety Act to determine whether child-specific duties apply. In Brazil, the ECA Digital framework applies where children are present on a platform. In Australia and other jurisdictions, a growing range of companies are obliged to demonstrate that children cannot access their services. Despite significant differences, the underlying regulatory move is the same: children’s access is being used as a gateway condition for enhanced protection.

Yet “access” is rarely defined with precision. Legislators tend to rely on phrases such as “likely to be accessed by children”, leaving platforms to translate ambiguous, probabilistic concepts into binary compliance outcomes. This risks an over-reliance on formal age limits or age gates, an under-examination of real-world use, and limited documentation of how conclusions were reached.

In practice, children’s access to online platforms is difficult to treat as a yes-or-no fact. Instead, service providers must seek to establish a defensible position reached through structured judgement under uncertainty. Treating it as such allows services to move beyond checkbox compliance and towards governance that regulators can interrogate in good faith.

A useful way to frame this judgement is to distinguish between two analytically separate questions.

Question one is formal access: does the service, by design and policy, permit children to access it? This includes terms of service age limits, stated eligibility rules, app store ratings, and the presence of age gates or identity checks. Formal access establishes a service’s declared posture. It answers what the service intends, asserts to regulators, and communicates to its users.

Question two is practical access: regardless of intent, can children realistically access and use the service in the real world? This requires examining the operational effectiveness of controls, including the strength and coverage of age assurance, ease of circumvention, device and account-sharing practices, onboarding flows, payment gateways, and distribution channels. Practical access is empirical and probabilistic. It asks not what the system says it does, but what happens when it meets real users.

The same control may appear in both analyses. For example, age assurance can be a formal rule and an operational safeguard. The distinction lies not in the technology, but in the claim being evaluated: policy intent versus real-world performance. Keeping questions one and two separate allows services to acknowledge residual risk without undermining their stated position, and to show improvement over time as controls mature.

Importantly, many regulators are now also concerned with a third question: attraction and reach. Even where access is formally restricted, services whose features, cultural presence, or distribution channels strongly appeal to children may still be considered likely to be accessed by them.

Taken together, these three dimensions can be used to inform high-quality, cross-jurisdictional children’s access assessments: a repeatable methodology that evaluates formal access, practical access, and attraction, documents assumptions and evidence, and maps the resulting judgement onto jurisdiction-specific legal thresholds.

Above all, regulators expect to see credible reasoning, high-quality evidence, and a clear decision trail. Services that can demonstrate how they reached their conclusions, and how they would revisit them as their product or user base changes, will be better placed to meet both current and emerging expectations.

Children’s access is becoming a central organising concept of online regulation, and taking a binary approach will not satisfy regulators. What is now required is demonstrable governance: a clear theory of access, evidence proportionate to risk, and a defensible account of how conclusions are reached and reviewed over time.

If you need support developing a defensible position on children’s access and understanding how it shapes your approach to compliance across jurisdictions, the team at Illuminate Tech can help.
