Sigurður Ragnarsson

Ofcom’s Role in Managing the UK’s Online Safety Regulatory Regime

Updated: Oct 16, 2023

The UK’s Online Safety Bill (OSB) is projected to come into force as early as spring 2023, introducing new legal requirements for social media platforms with users in the UK. Ofcom, the UK’s regulator of communications services, has been appointed to manage the new regulatory regime established by the Bill.

Designated user-to-user services and search services will be regulated by Ofcom through periodic risk assessments and mandatory reporting requirements. In this article, you’ll learn about Ofcom’s role under the Online Safety Bill, its responsibilities and its priorities.

 


Included in this article:

  • What is Ofcom responsible for?

  • Ofcom’s preparations for the OSB - regulating video-sharing platforms (VSPs)

  • Introduction to the online safety regime

  • Ofcom’s approach

  • What are platforms’ new duties?

  • Ofcom’s powers of enforcement

What is Ofcom responsible for?

As the UK’s regulator of communications services, Ofcom’s duty is to ensure that the public benefits from high-quality telecoms, radio, broadcasting and postal services. Ofcom’s duties and powers come from Parliament, and it operates with day-to-day independence from government. Its responsibilities currently include:

  • Ensuring people can access and use communications services, including broadband

  • Making sure the universal postal service covers all UK addresses six days a week for letters and five days a week for parcels

  • Protecting viewers and listeners from harmful or offensive material on TV, radio and on-demand services, and from invasion of privacy and unfair treatment in programmes

  • Regulating a range of companies to ensure they provide quality television and radio content for a variety of audiences

(Ofcom, What is Ofcom?)

Should the Online Safety Bill (OSB) come into force, Ofcom will assume regulatory responsibility for certain online services with a focus on protecting the public from harm.


Ofcom’s preparations for the OSB - regulating video-sharing platforms (VSPs)


Since November 2020, Ofcom has gained experience of online regulation through transposed European legislation that obliges UK-established video-sharing platforms to follow new rules protecting users from harmful content. Ofcom has published its guidance for these VSPs, outlining the five following regulatory priorities:

  1. Reducing the risk of child sexual abuse material on adult sites

  2. Laying the foundations for age verification on those sites

  3. Tackling online hate and terror

  4. Ensuring an age-appropriate experience on platforms popular with under-18s

  5. Ensuring VSPs’ processes for reporting harmful content are effective.


Regulating VSPs under these published guidelines has given Ofcom a strong understanding of how to oversee online services and of the challenges platforms face in implementing online safety, positioning it to hit the ground running when the new regime comes into force in the spring.

Introduction to the online safety regime

Introducing a new duty of care

The OSB’s overarching, long-term intention is to build a culture of online safety and to embed better platform management practices. It does this by introducing, for the first time, a specific duty of care owed by platforms to their users and to other members of the public who could be harmed.

Much as safety cultures exist in the physical world, for example the conduct and precautions expected of the restaurant and service industry, the OSB aims to establish an equivalent culture in digital spaces. Building a stronger culture and practice of risk assessment and management for online services is a long-term project and will not solve the problems of online harm overnight.

Ofcom’s approach


A risk-based approach


Ofcom recognizes that online services of different types and sizes pose varying levels of risk, and it will enforce the OSB’s regulatory standards in proportion to each service’s risk level and the resources that service has available to address those risks.

Online services that pose a higher risk to users will face more stringent obligations, such as transparency reporting and enhanced safeguards for users (especially children), whereas obligations for lower-risk services will be correspondingly lighter. This approach is intended to produce a balanced regulatory framework and fair protection for users, while avoiding burdening industry or impinging upon freedom of expression.


Defining platforms in scope

Ofcom will regulate designated “user-to-user services” defined in the OSB as “an internet service by means of which content that is generated directly on the service by a user of the service, or uploaded to or shared on the service by a user of the service, may be encountered by another user, or other users, of the service.” (Part 2, Key Definitions, Online Safety Bill)

Ofcom will maintain a register of regulated platforms. Platforms will need to use Ofcom guidance to ascertain whether they fall within scope of the regulations; if they do, they are legally required to notify Ofcom.

So long as a platform has a meaningful UK user base, the regulator can take appropriate action against it, no matter where it is based.

Categorised versus non-categorised platforms


In-scope services will further be categorised by Ofcom according to the number of users a service has, its risk of disseminating harmful content, and its functionalities.

It should be noted that, of the roughly 25,000 services in scope, the Government estimates that only 30 to 40 will actually qualify for the most stringent of these special obligations. Ofcom has developed a series of categories for these higher-risk services:

• Category 1: the highest reach user-to-user services with the highest risk functionalities, with transparency requirements, a duty to assess risks to adults of legal but harmful content, requirements relating to fraudulent advertising and a variety of other duties.

• Category 2a services: the highest reach search services, with transparency and fraudulent advertising requirements.

• Category 2b services: other services with potentially risky functionalities or other factors, with transparency requirements, but no other additional duties.

Categorised platforms have more stringent duties under the legislation, and will generally be required to engage with Ofcom on an ongoing basis – via annual transparency reports, for example. Ofcom will perform an initial risk assessment of all categorised platforms to gauge potential harm and what changes are needed.


What are platforms’ new duties?


Risk assessment and proactive mitigation

Risk assessment plays a foundational role in the online safety regime; this is where all platforms need to start in discharging their duty of care. If platform executives are not aware of their riskiest features and users, they will be unable to take effective and proactive action to remain compliant.

The OSB will require all in-scope services to evaluate their own level of risk as it relates to illegal content on their site(s) and to put certain protections in place to mitigate potential harms. These obligations will vary in scope and scale depending on the type of service offered.

According to Ofcom’s Roadmap to Regulation document, it expects platforms to:

● Show how they respond to risks of harm

● Consider how they prioritize user protection

● Incorporate safety considerations into product and engineering design decisions

● Continually consider needs and rights of all users

● Consider tradeoffs for the sake of platform safety

● Regularly discuss safety issues between senior decision makers.
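
As a rough illustration of what such a risk assessment might look like in practice, here is a minimal sketch of a risk register that scores hypothetical platform features by likelihood and impact of harm. The field names, scoring scale and example entries are illustrative assumptions only; they are not drawn from the Bill or from Ofcom guidance.

```python
from dataclasses import dataclass

@dataclass
class RiskEntry:
    """One row of a hypothetical risk register for a platform feature."""
    feature: str      # e.g. "direct messaging", "live streaming"
    harm: str         # the type of harm being assessed
    likelihood: int   # 1 (rare) to 5 (almost certain) - illustrative scale
    impact: int       # 1 (minor) to 5 (severe) - illustrative scale
    mitigation: str   # the control the platform puts in place

    @property
    def score(self) -> int:
        # Simple likelihood-times-impact scoring, common to many risk frameworks.
        return self.likelihood * self.impact


register = [
    RiskEntry("direct messaging", "grooming of child users", 4, 5,
              "default-off DMs for under-18s; behavioural signals"),
    RiskEntry("image upload", "sharing of known CSAM", 3, 5,
              "hash matching against known-content databases"),
]

# Review the riskiest features first when deciding where to invest in safety work.
for entry in sorted(register, key=lambda e: e.score, reverse=True):
    print(f"{entry.feature}: score {entry.score} -> {entry.mitigation}")
```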

Control and report illegal content

The bill prioritizes specific types of material that pose the largest risks of online harm. Ofcom will focus on these categories of illegal material to help halt their distribution, re-uploading and, ultimately, consumption. Platforms must take proactive steps to prevent the distribution and consumption of this content; when it is found, they must remove it and promptly notify the National Crime Agency (NCA).

The Online Safety Bill specifies “priority illegal content” as:

  • Child sexual abuse material (CSAM) and child sexual exploitation and abuse (CSEA) - visual evidence of child sexual abuse in the form of videos or images.

  • Terrorist and Violent Extremist Content (TVEC) - any content, whether text or visual imagery, that depicts or incites violence, terrorism or extremist ideologies.

  • Content that “amounts to a relevant offence” – these include existing offences such as violent threats, harassment and incitement; supply of drugs and firearms; facilitating illegal immigration, sexual exploitation and fraud; and also some new offences, such as cyberflashing.


Previous versions of the Bill extended the same duties to cover so-called “legal but harmful” content, such as self-harm content, but this proved highly controversial and was removed. However, platforms are still required to prevent children from seeing this content, and must offer adults tools to manage what they see, giving them the choice to view or block certain legal content they consider harmful to themselves. Doing this will require platforms to categorize content so that users can choose whether or not to view it, as the sketch below illustrates.
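
As a minimal sketch of the kind of user-controlled filtering this implies, the example below labels posts with content categories, hides sensitive categories from children outright, and lets adults block categories they have chosen not to see. The category names and data structures are hypothetical, not taken from the Bill or from Ofcom guidance.

```python
from dataclasses import dataclass, field

# Hypothetical sensitive categories a platform might label content with.
SENSITIVE_CATEGORIES = {"self_harm", "eating_disorders"}

@dataclass
class Post:
    text: str
    categories: set = field(default_factory=set)   # labels applied by the platform

@dataclass
class UserSettings:
    is_child: bool = False
    blocked_categories: set = field(default_factory=set)  # an adult's own choices

def visible_to(user: UserSettings, post: Post) -> bool:
    """Children never see sensitive categories; adults see whatever they have not blocked."""
    if user.is_child and post.categories & SENSITIVE_CATEGORIES:
        return False
    return not (post.categories & user.blocked_categories)

# Example: an adult who has opted out of self-harm content will not see such posts.
adult = UserSettings(blocked_categories={"self_harm"})
post = Post("example post text", categories={"self_harm"})
print(visible_to(adult, post))  # False
```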

Implementing highly advanced digital detection tools

The nature of illegal and harmful material, such as CSAM, terror material and revenge porn, is that it will likely never be completely eradicated from online spaces. Victims depicted in this material are revictimized for years as it is downloaded, shared and re-uploaded onto online platforms. However, by better regulating the digital space and implementing stronger safety measures, there is a real chance of thwarting abusers’ efforts to keep circulating this kind of material and of reducing the frequency with which it appears online.

The OSB gives Ofcom the power to require platforms to use specific technologies to proactively detect and remove content.

Hash matching technology in particular can help to reduce the amount of time harmful material stays live on a given platform. Continuously and automatically checking uploads against databases of known harmful content (such as the hash lists maintained by the National Center for Missing and Exploited Children) means known material can be caught automatically, leaving content moderation teams free to focus on visually unique, previously unseen material.
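
As a rough sketch of the hash-matching approach described above, the example below compares the hash of an uploaded file against a set of known-harmful hashes and routes unmatched uploads to human review. Real deployments typically use perceptual hashes supplied through industry hash-sharing programmes (robust to re-encoding and cropping) rather than the plain SHA-256 used here, and the names and placeholder hash are purely illustrative.

```python
import hashlib

# Hypothetical set of hashes of known harmful material, e.g. obtained through an
# industry hash-sharing programme. The value below is a placeholder, not real data.
KNOWN_HARMFUL_HASHES = {
    "0000000000000000000000000000000000000000000000000000000000000000",
}

def sha256_of_file(path: str) -> str:
    """Compute the SHA-256 digest of a file, reading it in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def screen_upload(path: str) -> str:
    """Return a moderation decision for an uploaded file.

    Known material can be blocked automatically; anything unmatched is
    routed to human moderators for review.
    """
    if sha256_of_file(path) in KNOWN_HARMFUL_HASHES:
        return "block_and_report"        # e.g. remove and notify the relevant authority
    return "queue_for_human_review"
```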


Focus on protecting children online


A key component of the OSB is safeguarding children and their best interests online. Children’s risk assessments will need to be completed and regularly reviewed by all service providers. This assessment must include:

  • An assessment of the user base: how many children use the service, and in which age groups

  • The level of risk to children using the service, specified by content type, with primary priority content and non-designated content assessed separately

  • The risks associated with functionalities such as adults being able to search for other users (with children appearing in search results) and children’s ability to find potentially harmful or age-inappropriate content

  • An assessment of the service’s design and operation, including how the service promotes media literacy and safety tips, as well as its business model and governance.

A primary focus of Ofcom’s expectations is for in-scope platforms to implement preventative measures that help stop the spread of child sexual abuse material (CSAM), child grooming, and child sexual exploitation and abuse (CSEA) generally. While Ofcom is aware that many online services already have automated detection and moderation tools in place, the expectation reflects the aim of building a stronger culture of online safety, with the highest priority being measures that effectively curtail the dissemination of CSAM online.

In addition, legal but harmful content that was previously to be restricted for all users – such as material promoting self-harm or eating disorders – is now to be controlled for children only. The OSB gives the UK government significant latitude to define and update these categories.

Accountability and freedom of expression

Platforms must publish clear terms of service, enforce them consistently, and respond appropriately to complaints.

Ofcom’s job is to ensure that platforms operate within the law, enforce agreed standards of safety, and are accountable to users. It will not have jurisdiction over individual users who post or disseminate content, nor will it censor online content. The OSB was amended to include a specific duty for platforms to protect users’ freedom of expression, as well as to protect journalistic content.

However, there have been, and continue to be, concerns that the OSB could usher in censorship by the back door by encouraging platforms to be overly risk-averse in the content they block or allow. Larger online platforms have become a primary space for the exchange of ideas and communication, so this issue will no doubt continue to generate discussion and amendments to the regulations over time.

Transparency (for categorised services)

Categorised services are required to collect transparency data – to be specified by Ofcom – indicating the scale and nature of regulated activity on their platforms. They will need to submit annual transparency reports to Ofcom.

Ofcom’s powers of enforcement


Monitoring and engagement come first

Ofcom’s strong preference is to engage with platforms and support them to comply with the regulations, rather than reacting when something goes wrong. To that end, a large part of Ofcom’s work involves producing detailed codes of practice, guidance and research to help platforms understand their duties and take appropriate action.

Categorised services, particularly the largest Category 1 platforms, should expect to have an ongoing relationship with Ofcom – ideally a positive one, with two-way communication that also helps Ofcom make its approach more effective. Ofcom will proactively monitor these high-profile platforms, drawing on transparency data, user surveys and third-party complaints.

That said, when Ofcom is made aware that a platform may not be compliant with its duties, it has a wide range of enforcement powers, with stringent consequences for the worst offenders.

Powers of information gathering and investigation

Ofcom may require a platform to provide specific, detailed information on any relevant matter and to name a senior manager responsible for ensuring compliance. It can also commission specialists to investigate, open a statutory investigation, require interviews, and even enter and inspect premises to carry out audits. Organisations and senior managers can be held liable for failure to cooperate at this stage.

Consequences of non-compliance

Following an investigation, Ofcom can require a platform to make good on its failure to fulfil its duties – for example, by carrying out a proper risk assessment, tightening or enforcing policies, or implementing new technology to proactively detect and remove illegal content.

In the worst cases, platforms that are unable or unwilling to comply could face fines of up to £18 million or 10% of the company’s global annual turnover, whichever is higher. Some UK parliamentarians have also indicated their desire to introduce a power to hold platform executives criminally liable. Beyond fines, Ofcom would be able to seek court orders to have non-compliant platforms blocked from being accessed in the UK.

