404 content not found: Illegal harms and the Online Safety Act

As Ofcom begins enforcing the Online Safety Act, we review the Illegal Harms Codes of Practice and their impact on service providers.
by A&L Goodbody LLP
02 May 2025

Background

The Online Safety Act (the Act), passed in October 2023, is the UK’s most ambitious attempt to regulate digital platforms. Aiming to make the UK “the safest place in the world to be online”, its focus is threefold:

  1. removing illegal content
  2. enforcing in-scope platforms’ terms of service
  3. empowering users to exclude potentially harmful content

Our first Online Safety Act in Focus article outlined key challenges arising from the Act for online services, search engines and tech companies. These included free speech concerns, questions over the Act’s long-term viability and divergence from regulatory regimes including the EU’s Digital Services Act. Since then, we’ve seen several significant milestones including, from 17 March 2025, the implementation of Ofcom’s Illegal Harms Codes of Practice (the Codes).

Illegal Harms Codes of Practice

Both search and user-to-user (U2U) services must now comply with Ofcom’s safety measures or use other effective measures to protect users from illegal content and activity. The Information Commissioner has stressed that providers must also take a “data protection by design and default” approach when implementing safety systems under the Act.

In this update, we outline the scope of illegal content under the Codes, key requirements on platforms and Ofcom’s approach to enforcement. We also provide a breakdown of the Act’s notice and takedown regime, focusing on the Section 10(3) duty on U2U services to swiftly remove illegal online content.

What is illegal content?

The Act introduces a new legal concept of illegal content, defined at section 59 as “content that amounts to a relevant offence”. For these purposes, both relevant offences and content are given broad interpretations. Relevant offences include over 130 terrorism, child sexual abuse material (CSAM) and other priority offences, while content means certain words, images, speech or sounds whose use, possession, viewing, access, publication or dissemination constitutes an offence.

The Act requires platforms to collect and assess all relevant information to determine whether online content is illegal. The threshold here is reasonable grounds to infer a relevant offence, though Ofcom has clarified that providers are not required to make findings to a criminal standard or to determine whether there has in fact been a breach of UK criminal law.

Ofcom’s Illegal Content Judgements Guidance highlights the following key points:

  • Third-party reports: Third-party reports alone cannot demonstrate reasonable grounds to infer a relevant offence. Only judgments of the UK courts (and, in limited circumstances, Financial Conduct Authority opinions) will be binding on platforms reviewing potentially unlawful content.
  • State of mind: Providers must now infer the conduct, state of mind and defences open to users interacting with potentially illegal content. For example, even where one user may lawfully upload content, that content remains illegal if its use, viewing or access would be a relevant offence under the Act.
  • Jurisdiction: In determining if content is illegal, it is not relevant whether anything done in relation to the content takes place in any part of the United Kingdom. Content could be illegal even where the service provider, uploader, viewer and accessor are all based outside the UK.

The regulator also highlights the need for providers to ensure a sufficient understanding of UK law including, where necessary, the distinctions between its three legal jurisdictions.

The onus is now on providers to detect and remove illegal content, as well as to reduce the likelihood of priority criminal content being made available online. Proposed measures include:

  • Nominating an individual who is accountable to the platform’s most senior governance body for compliance with the illegal content safety, reporting and complaints duties.
  • Introducing systems and processes designed to review and assess content that the provider has reason (whether by third-party report or otherwise) to suspect may be illegal.
  • Ensuring that providers’ systems and processes are designed to swiftly take down illegal content and/or illegal content proxy of which the provider is aware, unless it is not technically feasible to do so.
  • Prioritising reviewable content with regard to the severity of its potential harm, the likelihood of illegality and the desirability of minimising the number of affected UK users.

Ofcom also recommends that large or multi-risk platforms track and regularly report any evidence of new and increasing illegal harms online, including increased user complaints or referrals from law enforcement.

Notice and takedown

While Section 10 of the Act creates a duty on service providers to use proportionate systems and processes to swiftly remove illegal content once alerted to it by a person or made aware of it in some other way, its practical outworkings remain unclear. Neither the Act nor its associated guidance touch upon the existing regime under the Electronic Commerce (EC Directive) Regulations 2002 (the Regulations). In particular, Regulation 19 provides that an information society service provider (ISSP) cannot be found liable for storing unlawful data where it lacked actual knowledge that the data was illegal or, upon learning of the illegal nature of the data, acted expeditiously to remove or disable access to it.

In determining whether an ISSP has actual knowledge for these purposes, Regulation 22 requires the Court to take into account all matters which appear to it in the particular circumstances to be relevant and, among other things, to have regard to:

(a) whether the ISSP received a notice through a specified email address and

(b) the extent to which any such notice included its sender’s full name and address, details of the location of the information and details of the unlawful nature of the activity or information in question.

The focus here is on facts or circumstances of which the host is actually aware, as opposed to facts of which the host could, or ought to, have been aware.

It’s also unclear what constitutes swift removal under the Act. Like expeditious removal under Regulation 19, however, this is likely to remain a fact-specific exercise which will vary on a case-by-case basis. This is reinforced by Ofcom’s recommendation that large and multi-risk services setting performance targets on content moderation should “balance the need to take relevant content moderation action swiftly against the importance of making accurate moderation decisions.” It remains to be seen how (and to what extent) this will impact upon enforcement action under the Act.

Outlook

Ofcom is expected to make additional announcements on formal enforcement action in the coming weeks. It has said that its first priority remains child protection, as well as ensuring providers introduce the backend protections required to fulfil their legal duties under the Act. Early enforcement will also focus on sites and apps that Ofcom considers may present particular risks of harm from illegal content because of their size and nature, including the UK’s largest search and U2U service providers.

By Ciaran O’Shiel, Partner, and Niamh Flanagan, Solicitor, A&L Goodbody LLP.