By Einar Sigurdsson

UK Online Safety Bill: Prepare for Compliance with Upcoming Regulations

The UK Online Safety Bill could receive Royal Assent as early as autumn 2023. Once in force, it will require online platforms that operate in the UK (regardless of where they are based) to take a more proactive approach to harmful content posted on their sites. In this article, you’ll learn what you need to know to ensure your platform complies with the bill’s requirements.

 




Since their advent, online platforms and search engines have supported the open exchange of ideas and media by allowing user-generated content (UGC), that is, content created and uploaded by users, to be posted on their sites. While most users take advantage of these services in a safe and secure manner, some have abused this freedom by posting illegal material despite government laws and services’ terms of use. The result is a consistent stream of harmful content, such as child sexual abuse material (CSAM) and terrorist and violent extremist content (TVEC), making its way online every day.


While platforms operating within the UK are already familiar with various legal duties around removing harmful content, the processes currently in place to catch it (typically user reports handled by understaffed, overworked content moderation teams) have proven inconsistent.


In an effort to address the epidemic of harmful content across the web and protect users (especially children) from being presented with it, the UK’s aptly named Online Safety Bill (OSB) may soon require online platforms to take a refreshed approach toward ensuring the safety of users online.


In anticipation of this bill, and as thought leaders in the detection of harmful content, we’ve put together an overview of the key touch points your platform will need to be aware of as you prepare for compliance.


Has the Online Safety Bill been passed?


No. The bill has completed the Committee stage in the House of Lords and is now moving into the Report stage.


What is the bill and why was it introduced?


With the intention of making the UK the safest place in the world to use the internet, the government has been refining the bill since it was first drafted in May 2021, with a firm emphasis on protecting freedom of expression while managing harmful content.


The bill is primarily concerned with protecting children from harmful content (such as pornography and content that promotes suicide, self-harm, or eating disorders). A large part of this focuses on holding pornography service providers to a high standard with regard to age verification and site gating.



UK Online Safety Bill basics: Regulatory body, repercussions of non-compliance, and timeframe.


Who regulates the new law and who will the new law affect?


The UK government will enforce the OSB through the communications regulator Ofcom, which will oversee around 25,000 tech companies that currently host UGC. These platforms include forums, messaging apps, cloud storage services, and the most popular pornography sites. Since search engines give users access to this content, Ofcom will regulate them as well.


Additionally, though the OSB is UK law, the internet is global: the regulator will take appropriate action against a company no matter where it is based, so long as its platform has users in the UK.


What are the repercussions of non-compliance?


Non-compliant platforms will risk fines of up to £18m or 10% of their global annual turnover, whichever is higher, and in extreme cases risk having their platform blocked.


When will the Online Safety Bill go into effect?


The bill is still pending but could be in effect as early as autumn 2023. With this in mind, tech companies should begin investigating and implementing efficient, effective means of self-regulation for their platforms now; acting early reduces the risk of reputational damage and lost revenue. Under the OSB, users will be able to report a company for non-compliance if harmful content appears on its platform, so companies must be quick to take such content down or risk being fined or blocked.



Primary objectives and focuses


Protecting children


Given that easy access to the internet has made children more susceptible than ever to encountering harmful content, securing children’s safety is the primary focus of this bill. Platforms will be required to put protections in place that prevent children from accessing content that is harmful to them, such as pornography and content promoting self-harm.


Platforms will be required to report child sexual abuse material (CSAM) to the National Crime Agency (NCA), which will aid law enforcement in their initiatives to locate perpetrators and rescue victims.


Protecting users’ rights while tackling violence and racism


The OSB intends to protect users’ freedom to debate and share opinions that some may find offensive by requiring platforms to uphold this right. Ofcom will further ensure that platforms do not discriminate against any particular political viewpoint.


Protecting women from harassment


The bill aims to protect women and girls against content-related illegal interactions and abuse, such as non-consensual intimate imagery (NCII), harassment, and cyberstalking. In addition, cyberflashing, the act of sending explicit images or videos without the recipient’s consent, will be made illegal by the OSB.


What platforms will need to do to comply with the Online Safety Bill


1. Protect children from harmful content and abuse


Alongside removing illegal user-generated content, platforms will need to create restrictions that prevent children from accessing legal but harmful content such as pornography and material promoting self-harm or eating disorders.


Platforms intended for children will also be responsible for creating protections from abuse that isn’t clearly defined as illegal but can still be extremely harmful, such as cyberbullying. In the event of this kind of abuse, platforms will be required to offer easy ways for children (and their parents) to report the problem as well as take action against it.


2. Provide clear terms and conditions and an easy content reporting system


The OSB will not allow the arbitrary removal of content; platforms must clarify the boundaries of permissible UGC in their terms and conditions.


Both children and adults must have an easy way to report harmful content, whether it appears as comments, photos, videos, or emojis. Plainly written terms and conditions and an accessible reporting system are critical to maintaining compliance as a platform.
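To make this concrete, here is a minimal sketch of what a report-intake endpoint might look like. It is not prescribed by the bill; the Flask service, route name, and in-memory queue are all hypothetical stand-ins for a real moderation pipeline.

```python
# Minimal sketch of a content-report intake endpoint.
# The app, route, and queue are hypothetical; the OSB does not
# prescribe any particular implementation.
from datetime import datetime, timezone

from flask import Flask, jsonify, request

app = Flask(__name__)
report_queue: list[dict] = []  # stand-in for a persistent moderation queue

# Content types users should be able to report
REPORTABLE_TYPES = {"comment", "photo", "video", "emoji"}

@app.post("/reports")
def submit_report():
    payload = request.get_json(force=True)
    if payload.get("content_type") not in REPORTABLE_TYPES:
        return jsonify(error="unsupported content_type"), 400
    report_queue.append({
        "content_id": payload.get("content_id"),
        "content_type": payload["content_type"],
        "reason": payload.get("reason", "unspecified"),
        "received_at": datetime.now(timezone.utc).isoformat(),
    })
    # 202: accepted for review, not yet acted upon
    return jsonify(status="received", reports_pending=len(report_queue)), 202
```

In practice, reports would be persisted, deduplicated, and routed to moderators according to the platform’s published terms, but the point stands: reporting should be a single, simple request away.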


3. Moderate harmful misinformation and disinformation


The OSB will require companies to remove posts that could indirectly cause harm to children, such as misinformation and disinformation that incites violence or spreads vaccine falsehoods. Ofcom will attend to this by obtaining transparency reports from companies, ensuring not only that they comply with the OSB but also that they equip users with ways to identify such falsehoods.


4. Address racist abuse and hate speech


The OSB emphasizes that platforms allowing racist UGC on their sites will face heavy scrutiny. Platforms must respond quickly whenever a user posts racist content or risk being fined or blocked. The OSB urges platforms to have processes in place specifically for identifying racist content, whether text, photos, videos, or emojis.


5. Report illegal content to the National Crime Agency


Each platform must report illegal and abusive content found in both public and private channels to the National Crime Agency (NCA), which will be responsible for taking further steps to resolve the crimes. For CSAM in particular, the regulator will enforce companies’ duty to report as quickly as possible in order to stop any active abuse of the children depicted.


6. Adopt advanced technology to automate and optimize scanning methods


The OSB urges affected companies to adopt advanced technology to help manage and automate the removal of harmful videos and images. Where necessary, Ofcom will be able to require platforms to implement technology that scans private channels, such as private messaging, for harmful content without compromising user privacy. To preserve that privacy, platforms must use a highly accurate tool that extracts and identifies such material while leaving legal content untouched.


Video identification tools: Helping ensure compliance with the Online Safety Bill


Video- and image-based identification tools made possible by visual fingerprinting technology can give online platforms the ability to automate the level of content moderation the OSB requires. By using an advanced API to match newly posted videos and images against a reference database, these tools can help platforms build visibility into the content that ends up on their site and provide a thorough, effective method of scanning for illegal and harmful content.
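As a rough illustration of the matching step, the sketch below uses the open-source Pillow and imagehash libraries as a stand-in for a production visual-fingerprinting API. The reference database, threshold, and item IDs are hypothetical, and a perceptual hash is a much simpler fingerprint than a purpose-built system would use.

```python
# Sketch of fingerprint-style matching using perceptual hashes.
# Pillow and imagehash are real open-source libraries; the reference
# database, threshold, and IDs below are illustrative only.
from PIL import Image
import imagehash

# Hypothetical reference database: hashes of known harmful images
reference_hashes = {
    "known_item_001": imagehash.hex_to_hash("f0e4c2a18b3d5f07"),
}

MATCH_THRESHOLD = 8  # max Hamming distance to count as a match (tunable)

def check_upload(path: str) -> list[str]:
    """Return IDs of reference items the uploaded image matches."""
    upload_hash = imagehash.phash(Image.open(path))
    # Subtracting two ImageHash objects yields their Hamming distance
    return [
        item_id
        for item_id, ref_hash in reference_hashes.items()
        if upload_hash - ref_hash <= MATCH_THRESHOLD
    ]

matches = check_upload("new_upload.jpg")
if matches:
    print(f"Flag for review; matched reference items: {matches}")
```

A production system would rely on far more robust fingerprints and an indexed database rather than a linear scan, but the flag-on-match flow at upload time is the same.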


While larger platforms most likely already have some form of AI detection in place to identify harmful content, these technologies tend to catch only a small percentage of it. That leaves the brunt of the work to human moderators, which is both harmful to the people tasked with viewing such material and a slower, less efficient method of video and image identification. Automating this work with new approaches to content identification can fill those gaps and help platforms account for the UGC that appears on their sites.


Using an advanced video identification system would assist platforms in:

  • Detecting harmful content

  • Finding harmful content repeatedly posted to online platforms

  • Reducing the need for an extensive content moderation department

  • Moderating and curtailing abuse

  • Assisting law enforcement in finding criminals and rescuing victims

  • Helping hotlines track distribution of CSAM


Contact us to discuss how we can help address your platform's particular video identification needs.


