March 19, 2024

The UK Online Safety Regime: Five Months On

When the UK Online Safety Act (the "Act") became law on 26 October 2023, it established one of the most comprehensive online safety regulatory frameworks in the world. The Act is intended to make the use of online services safer for individuals in the United Kingdom, especially children. It introduces a long list of new duties on providers of online services. For example, providers will have to conduct appropriate risk assessments and implement "proportionate systems and processes" in relation to illegal content and content that is harmful to children. However, the exact scope of these duties will depend on the size of the service, its risk category, and the likelihood of the service being accessed by children.

The Act also brings significant enforcement powers for the UK's communications regulator – Ofcom – which has now been designated as the UK online safety regulator. Among other things, Ofcom has received powers to request information from and inspect online service providers, impose fines of up to £18 million or 10% of worldwide annual revenue (whichever is greater), and apply to the courts for service restriction orders. Senior managers of in-scope organisations may also be criminally liable for a company's failure to comply with Ofcom's information requests.
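To put the fine cap in concrete terms, here is a minimal sketch of the "whichever is greater" calculation. The revenue figure in the example is purely hypothetical, and the Act and Ofcom's guidance govern how qualifying worldwide revenue is actually determined.

```python
# Minimal sketch only: the Act caps fines at the greater of £18 million
# or 10% of qualifying worldwide annual revenue. The example revenue
# figure below is hypothetical.

def maximum_fine(worldwide_annual_revenue_gbp: float) -> float:
    """Return the statutory cap on fines: the greater of £18m or 10% of revenue."""
    return max(18_000_000, 0.10 * worldwide_annual_revenue_gbp)

# Example: a provider with a hypothetical £500m worldwide annual revenue
print(maximum_fine(500_000_000))  # 50000000.0 – the 10% limb exceeds the £18m floor
```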

While some provisions of the Act are already in force (including Ofcom's information gathering powers), the core duties for online service providers will start applying only after the relevant statutory codes of practice come into force following a period of public consultation and approval by the UK Parliament. Ofcom's roadmap for implementing the Act anticipates that the codes of practice will come into force gradually between Spring 2025 and Spring 2026.

Who does the Act apply to?

The Act applies to:

  1. user-to-user (U2U) services – e.g., social media, marketplaces, audio and video-sharing services, messaging services and information sharing services,
  2. search services – i.e., search engines, and
  3. pornography platforms.

The Act has extraterritorial application and can apply to businesses established outside the United Kingdom. It applies to U2U and search services if the service has "links" with the United Kingdom, for example, because the service has a "significant number" of UK users or if UK users form a "target market" for the service. Ofcom published a short questionnaire to help businesses determine if the Act applies to them.
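As a rough, non-authoritative illustration of the "links with the United Kingdom" test, the sketch below expresses the two limbs mentioned above as a simple check. The "significant number" threshold used is a hypothetical placeholder – the Act does not specify a figure – and Ofcom's questionnaire remains the proper starting point for any assessment.

```python
# Rough illustration of the simplified "UK links" test described above.
# The "significant number" threshold is a purely hypothetical placeholder,
# not a statutory figure.

HYPOTHETICAL_SIGNIFICANT_UK_USERS = 100_000  # placeholder only

def has_uk_links(uk_users: int, uk_is_target_market: bool) -> bool:
    """True if either limb of the (simplified) UK links test is met."""
    return uk_users >= HYPOTHETICAL_SIGNIFICANT_UK_USERS or uk_is_target_market

print(has_uk_links(uk_users=250_000, uk_is_target_market=False))  # True
print(has_uk_links(uk_users=5_000, uk_is_target_market=True))     # True
```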

Ofcom estimates that more than 100,000 online services, ranging from very large businesses to micro businesses, will be within scope of the new rules. Many of these businesses will be subject to the regulator's jurisdiction for the first time and will need to assess what steps they need to take to comply with the new rules.

What obligations do service providers have under the Act?

The Act imposes various new duties on service providers depending on the size of the service, its risk category, and its likelihood of being accessed by children.

While some duties will apply to all service providers (e.g., to risk assess harm from illegal content and content harmful to children, to implement content / search moderation for illegal content, or to keep records), some duties will apply only to certain "categorised" services.

The Act distinguishes between three categories of categorised services, based on user numbers and functionalities:

  • Category 1: Covering the largest U2U services, which will be subject to the strictest duties under the Act.
  • Category 2A: Covering high-risk and high-reach search services and combined U2U and search services.
  • Category 2B: Covering high-risk and high-reach U2U services that do not meet the Category 1 threshold.

The UK Government will set out specific thresholds for categorisation of online services in secondary legislation (expected in Autumn 2024). Ofcom will then decide which services meet the thresholds and add them to a public register. If a service is not on the register, it will not need to comply with the additional obligations for categorised services. Ofcom expects that most of the 100,000 online services that are in scope of the Act will not be categorised.
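Purely for illustration, the sketch below shows how this three-tier categorisation might be expressed once the thresholds are known. Every figure and functionality check in it is a hypothetical placeholder: the real thresholds will only appear in the forthcoming secondary legislation and will be applied by Ofcom when it builds the public register.

```python
from typing import Optional

# Illustrative sketch of the three-tier categorisation described above.
# All thresholds and functionality checks are hypothetical placeholders.

def categorise(service_type: str, uk_users: int, has_recommender_system: bool) -> Optional[str]:
    """Return "1", "2A", "2B" or None for an uncategorised service (simplified)."""
    if service_type == "u2u" and uk_users >= 30_000_000 and has_recommender_system:
        return "1"   # largest U2U services, strictest duties
    if service_type in ("search", "combined") and uk_users >= 5_000_000:
        return "2A"  # high-risk, high-reach search / combined services
    if service_type == "u2u" and uk_users >= 5_000_000:
        return "2B"  # high-reach U2U services below the Category 1 threshold
    return None      # most in-scope services are expected to be uncategorised

print(categorise("u2u", 40_000_000, True))      # "1"
print(categorise("search", 10_000_000, False))  # "2A"
print(categorise("u2u", 200_000, False))        # None
```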

In categorising online services based on user numbers, the Act takes a similar approach to that adopted under the EU's Digital Services Act in relation to very large online platforms (VLOPs) and very large online search engines (VLOSEs). In light of this, and given the number of challenges already brought before the Court of Justice of the European Union against the European Commission's designation of VLOPs and VLOSEs, we can expect a similar number and scale of objections to the services ultimately included on Ofcom's public register.

In addition to categorised services, companies meeting certain worldwide annual turnover thresholds (to be announced) will have to notify Ofcom and pay fees to Ofcom, which are intended to fund the UK online safety regulatory framework. These obligations are not expected to come into force until Spring / Summer 2025, following a consultation on fees by Ofcom.

When will the obligations under the Act become enforceable?

While some duties under the Act are already in effect (e.g., in relation to responding to Ofcom's information requests and disclosures to be included in an online service's terms of service), most of the duties under the Act will not take effect, and will not be enforced by Ofcom, until Parliament approves the codes of practice covering the relevant duties.

The Act is being implemented in three phases:

1. Phase one – illegal harms duties:

Ofcom published draft codes of practice and guidance on protecting people from illegal harms (e.g., child sexual exploitation and abuse, terrorism, fraud and hate speech) for consultation, which closed on 23 February 2024.

Following review of the responses, Ofcom plans to publish final versions of the documents by the end of 2024. The code of practice will then be submitted for Parliamentary approval (expected in Spring 2025) after which the duties will come into force.

2. Phase two – child safety and pornography:

Ofcom published draft guidance on age assurance for providers publishing online pornographic content for consultation, which closed on 5 March 2024. Ofcom anticipates that it will publish the final guidance by Summer 2025. Ofcom expects Parliamentary approval to follow by the end of 2025, after which the duties will come into force.

In Spring / Summer 2024, Ofcom expects to consult on guidance and codes for the protection of children before submitting them for Parliamentary approval. Among other things, providers of online services will have to complete a Children's Access Assessment to determine whether their service is likely to be accessed by children. Ofcom plans to publish final guidance on this in Spring 2025, after which service providers will be required to complete the assessment ahead of the entry into force of the relevant duties relating to the protection of children (expected Autumn 2025).

3. Phase three – transparency, user empowerment, and other duties on categorised services:

This phase focuses on additional duties which will apply to services meeting the thresholds for Category 1, Category 2A and Category 2B services, for example, in relation to transparency reporting, allowing users to control what content they see, preventing fraudulent advertising, and offering identity verification.

Ofcom plans to publish categorisation advice and guidance on transparency reporting in Spring 2024. The UK Government will then set out the relevant thresholds in secondary legislation, after which Ofcom expects to publish a register of categorised services by the end of 2024. Ofcom anticipates publishing final codes of practice and guidance towards the end of 2025, with Parliamentary approval expected in Spring 2026.

Data protection and online safety crossover

There are significant crossovers between the emerging UK online safety regime and existing data protection legislation. For example, the Children's Code published by the Information Commissioner's Office (the "ICO") contains 15 standards that providers of information society services likely to be accessed by children must follow to comply with the data protection legislation.

The Act also requires Ofcom to consult with the ICO in the course of preparing draft codes of practice. To support Ofcom's efforts, the ICO has published guidance on content moderation. The guidance outlines how data protection obligations apply to content moderation and how this process can affect people's information rights. For instance, moderation decisions based on incorrect information could result in a user's content mistakenly being identified as illegal, or in users being banned from online platforms without reason. Under the UK General Data Protection Regulation, people have the right to have inaccurate data rectified.

International dimension

The Act is the second piece of prominent online safety regulation in Europe, after the EU's Digital Services Act. Both frameworks seek to regulate a wide range of online services, imposing new duties on providers with the aim of ensuring that users are able to use online services safely.

Outside of Europe, regulators in Australia (eSafety Commissioner), South Africa (Film and Publication Board), South Korea (Korea Communications Standards Commission) and Fiji (Online Safety Commission) have also been working on online safety regulation. Together with Ireland's Coimisiún na Meán, France's Arcom and the UK's Ofcom, these regulators have established the Global Online Safety Regulators Network (GOSRN) to discuss solutions to global online safety challenges, and to share information and best practice.

International businesses are likely to welcome any harmonisation among the emerging online safety regimes as they navigate different duties and reporting obligations. However, given how new these regulatory regimes are, we are unlikely to see coordinated enforcement among online safety regulators in the near future.

Commentary

A successful online safety regime needs to strike the right balance between protecting people from illegal and harmful content and not encroaching on their privacy and freedom of speech. This is not an easy task, and the creation of the UK online safety regime almost from scratch is a big undertaking.

In the first five months of the new regime, Ofcom's online safety team has grown to over 350 people (with more recruitment expected over the coming months) and has published consultation documents running to over 1,800 pages. Ofcom estimates that its costs of preparing for and implementing the new regime will reach £169 million by the end of 2025.

Despite what many would expect from a 300-page piece of legislation, the Act does not itself set out the exact thresholds and scope of the duties. These will be set out in forthcoming secondary legislation, codes of practice and guidance, so the exact burden on businesses is yet to be seen. With Ofcom expecting to publish more than 50 regulatory documents over the next two years, many businesses without a dedicated online safety team may find compliance with the new rules challenging, especially if they also need to grapple with similar (but sufficiently different) emerging online safety regimes in the European Union and other jurisdictions.

While Ofcom has set out an ambitious timeframe for implementing the online safety regime over the next two years, its success will depend, at least in part, on the willingness of online service providers to engage constructively with Ofcom and help it understand the technical aspects of their online services. Moreover, the looming General Election in the United Kingdom might delay Ofcom's roadmap for implementing the online safety regime due to the need for UK Governmental and Parliamentary approval of various aspects of the emerging regime.
