Regulators are trying to close a glaring safety gap on the open web: preventing children from seeing porn. But the first big enforcement test under the UK’s Online Safety Act is exposing how messy that goal is in practice – legally, technically and politically.

Ofcom has fined a company called 8579 LLC £1.35m for failing to introduce "highly effective" age verification across its adult sites. The regulator says the firm did not implement robust checks for UK visitors on most of its porn sites between 25 July and at least 19 November 2025, and it added a further £50,000 penalty for failing to respond to information requests.

"Those that fail to do this – or ignore legally binding requests from us – should expect to face fines," said George Lusty, director of enforcement at Ofcom.


The decision is the largest fine Ofcom has levied so far under the Online Safety Act. The regulator has also ordered 8579 LLC to hand over a complete list of the sites it runs, and threatened additional daily penalties if the company does not implement checks on its remaining site and supply the information requested.

Why this matters beyond one fine

This isn’t only about one operator failing a compliance checklist. It’s an early stress test of a broader public policy trade-off: how do you stop minors seeing age-restricted content online without creating new privacy and security problems for everyone else?

Britain’s OSA requires platforms to prevent under-18s encountering pornography and other harmful content. That goal has wide public support. The tricky part is the mechanics. Age checks can range from simple credit-card and mobile-operator verification to identity-document uploads and biometric scans. Each approach brings different costs and risks: user friction, data collection, potential breaches, and exclusion of people without mainstream ID or banking access.

Industry players have warned – and Pornhub’s parent company, Aylo, has said publicly – that the law risks diverting users to less regulated corners of the internet. That’s a valid concern, though it also conveniently serves the business case of large platforms that can afford strict, centralized systems.

The Online Safety Act "has not achieved its goal of protecting minors" and "has diverted traffic to darker, unregulated corners of the internet," Aylo said in response to enforcement activity.


History and how we got here

This is not the first time the UK has tried to tackle online adult content. An earlier attempt under the Digital Economy Act 2017 foundered amid technical and political pushback. The OSA, with Ofcom as regulator, revived the effort with clearer duties and enforcement teeth – including fines and information orders.

Other jurisdictions have wrestled with the same tension. The EU’s regulatory focus has largely been on illegal content and platform transparency, while age-gating strategies have varied widely in scope and enforcement. The result is a patchwork of approaches worldwide, and the UK is now providing one of the first major real-world enforcement case studies.

What enforcement reveals about industry options

Practical realities will shape how quickly and how well sites comply. Smaller operators – the kind regulators found here – often lack the engineering and compliance budgets of big platforms, making rapid rollouts of privacy-preserving age checks hard. Larger vendors can deploy third-party identity services, but those raise data-handling and surveillance worries.

There are technical workarounds on the horizon. Privacy-preserving attestations, cryptographic proofs of age and token-based systems promise to confirm someone is over 18 without exposing their identity. Those methods are not yet widely deployed at scale, and regulators and firms will have to agree on standards before they can become mainstream.
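The core idea behind token-based age checks can be sketched in a few lines. In this illustrative example (all names and the shared-secret design are hypothetical, not any real provider’s API), a trusted verification provider signs a short-lived token asserting only "over 18: yes/no" – no name, date of birth, or document details – and the site checks the signature before granting access. A real deployment would likely use asymmetric signatures so the site never holds the signing key; an HMAC keeps this sketch dependency-free.

```python
import base64
import hashlib
import hmac
import json
import time

# Hypothetical shared secret between the age-check provider and the site.
# Production systems would use public-key signatures (e.g. Ed25519) instead.
SECRET = b"demo-signing-key"

def issue_age_token(over_18: bool, ttl_seconds: int = 600) -> str:
    """Provider side: sign a minimal claim containing only the age result
    and an expiry time -- no identity attributes are included."""
    claim = {"over_18": over_18, "exp": int(time.time()) + ttl_seconds}
    payload = base64.urlsafe_b64encode(json.dumps(claim).encode()).decode()
    sig = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
    return payload + "." + sig

def verify_age_token(token: str) -> bool:
    """Site side: admit the visitor only if the signature is valid,
    the token has not expired, and the claim says over 18."""
    try:
        payload, sig = token.rsplit(".", 1)
        expected = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
        if not hmac.compare_digest(sig, expected):
            return False  # tampered or forged token
        claim = json.loads(base64.urlsafe_b64decode(payload))
        return bool(claim["over_18"]) and claim["exp"] > time.time()
    except (ValueError, KeyError):
        return False

token = issue_age_token(over_18=True)
print(verify_age_token(token))        # → True
print(verify_age_token(token + "x"))  # → False (signature no longer matches)
```

The point of the design is data minimisation: the site learns a single yes/no bit, and the provider – which does see identity documents – never learns which site the token was presented to.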

Who wins, who loses

Winners: regulators that want to show the law has bite, and large platforms that can absorb compliance costs and thus gain market advantage. Losers: small operators who will face fines or be pushed out, and users who may be nudged toward unregulated services if mainstream options become too burdensome or invasive.

Children may ultimately be safer if effective, privacy-conscious checks take hold. But that outcome depends on the government, regulators and industry agreeing on methods that actually work without creating new harms.

The likely road ahead

Expect more enforcement notices and fines while Ofcom establishes precedent. Some operators will comply; others will try to evade detection or move infrastructure offshore. Legal challenges are possible, especially if firms argue that enforcement methods violate privacy or are disproportionate.

Policy-makers and technical communities should accelerate work on standards for privacy-preserving age verification. If they fail to do so, the UK risks a cycle of punitive headlines and shallow compliance that satisfies no one: not parents, not privacy advocates, and not responsible operators.

For now, the Ofcom fine sends a clear message: the regulator will use the OSA’s teeth. Whether that message produces a safe, fair system – or simply pushes problems out of sight – is the bigger question still to be answered.

Contact and comment requests from companies and regulators will shape the next chapter. Watch for further decisions from Ofcom and for emerging standards from privacy-focused technologists attempting to bridge the gap between protection and privacy.
