Big Tech Regulation: Contrasting UK and EU approaches to content moderation

Blog post

The Internet is a global, cross-border system. Historically, it was designed around open protocols, and anyone could connect to it without technical or legal barriers. As a result, it grew exponentially and spurred innovation. Content, however, has always been technically separate from the network: it sits on top of the network infrastructure and has no effect on it, and the system operates in the same way regardless of geography. Now, measures to tackle prohibited content risk impacting the Internet ecosystem as a whole.

The differing approaches to the regulation of big tech in the UK and EU highlight some crucial questions for law-makers in both jurisdictions. The EU’s Digital Services Act and the UK’s Online Safety Bill both tackle the need to restrict certain content online and create a framework for regulating online platforms.

The measures affect the underlying fundamentals of Internet services in very different ways, which could impact their operation across borders. This blog post considers how current policy choices in these two laws may influence future policy directions and ultimately the whole Internet ecosystem.

Policy background

In policy terms, the genesis of the two laws has played a significant role in shaping their outcomes.

The Digital Services Act (DSA) arises out of EU Treaty obligations, notably the Single Market and the Charter of Fundamental Rights. However, enforcement and protection of digital rights date back to the eCommerce Directive and, more recently, the separate 2009 Telecoms Package. At the time, a core issue in the debate concerned copyright infringement and a proposal to suspend or terminate the Internet accounts of infringing users. It was resolved in the European Parliament with an amendment that called for a ‘prior, fair and impartial’ hearing before such action could be taken. This was followed by more political battles over the Anti-Counterfeiting Trade Agreement (2012), the net neutrality regulation, and the Copyright Directive (2019).

Since 2009, the central question has been whether online services could be allowed to filter, block, or prioritise content at their own discretion, and how to protect lawful content.

In 2022, this question remains, but rapid developments in technology have changed both the nature of the content to be restricted (which could now include text, images, video, music, or animation, for example) and the types of intermediaries that are affected.

UK law is now separate from EU law and outside the jurisdiction of the Court of Justice of the European Union (CJEU). The UK is also considering leaving the European Convention on Human Rights, and there is concern that this could undermine the protection of human rights in the UK.

The Online Safety Bill is part-way through its legislative journey, with an anticipated conclusion in 2023. It arose out of a policy discourse around protecting children online that began around 2008. Over the years, that discourse has expanded to address other areas, including terrorism, violence against women, and online abuse. During that period, European standards, such as the prohibition on ‘general monitoring’ of content, were frequently regarded by UK policymakers as barriers to possible interventions.

Key comparisons

The DSA wraps in the key provisions of the E-Commerce Directive (2000/31/EU), which governed the activities of online services for over 20 years. These provisions ensure the neutrality of intermediary services, meaning that a provider’s “conduct is merely technical, automatic and passive, pointing to a lack of knowledge or control of the data which it stores.” The DSA also carries over the prohibition on general monitoring, so service providers may not be required to systematically monitor content, and the so-called liability shield for hosting platforms, which protects them from lawsuits that could arise from harmful content posted to their sites by third parties.

Content moderation

The DSA includes provisions for a notice-and-action procedure to create a consistent cross-border framework for content moderation. Key elements of this are:

  • Notices will require a “reasoned justification”
  • Online platforms are not required to make an assessment of the content
  • Online platforms are expected to act according to a uniform set of rules in order to protect the right to freedom of expression
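
The DSA prescribes the elements a valid notice must contain rather than any particular data format. Purely as an illustration, and with entirely hypothetical field and function names, a notice payload and a first-pass validity check might look something like this Python sketch:

```python
from dataclasses import dataclass

@dataclass
class Notice:
    """Hypothetical shape of a DSA-style notice (illustrative only;
    the DSA lists required elements, not a wire format)."""
    content_url: str              # exact electronic location of the content
    reasoned_justification: str   # why the notifier considers it illegal
    notifier_name: str            # contact details of the notifier
    notifier_email: str
    good_faith_statement: bool    # bona fide belief the notice is accurate

def is_actionable(notice: Notice) -> bool:
    """A platform could reject notices lacking a reasoned justification,
    without having to assess the content itself."""
    return bool(notice.reasoned_justification.strip()) and notice.good_faith_statement
```

The point of a uniform structure of this kind is that the same notice can be handled in the same way by any platform in any Member State.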

By contrast, the Online Safety Bill attempts a much more ambitious and expansive set of requirements, which the UK government argues will result in greater safety for users. The Bill:

  • Introduces a general monitoring requirement
  • Overturns the liability shield provisions
  • Requires Internet companies to police their platforms for illegal content, defined by reference to 28 criminal offences. This involves both identifying and removing content that is reported to them and proactively seeking out illegal content for removal (see the sketch after this list)
  • Creates a new concept of ‘legal but harmful’ content, granting Internet companies wide discretion to assess what they would reasonably consider harmful.
The criteria for harmfulness will be established by government Ministers after the Bill has been adopted into law. According to an indicative list in a Ministerial Statement issued in July, harmful content will include content promoting eating disorders, legal suicide content, the promotion of self-harm, and online abuse and harassment. For children, it also includes content promoting violence, as well as pornography.

Categorising online services

The categorisation of online services (grouping services to impose different obligations) is handled differently in the two laws.

The DSA takes a horizontal approach, categorising services as intermediary services, hosting services, online platforms, and very large online platforms (VLOPs; those serving more than 45 million monthly active users). It requires online platforms to operate more sophisticated reporting mechanisms, and sets out transparency requirements for their recommender systems (used to predict the rating or preference a user might give to an item).
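
To make that parenthetical concrete: in the simplest embedding-based designs, a recommender scores each item as the dot product of a learned user vector and item vector, then ranks items by that score. A toy Python sketch, with all numbers invented:

```python
# Hypothetical learned vectors; a real system holds millions of each.
user_vector = [0.9, 0.1, 0.4]
item_vectors = {
    "video_a": [0.8, 0.0, 0.5],
    "video_b": [0.1, 0.9, 0.2],
}

def predicted_preference(user: list[float], item: list[float]) -> float:
    """Dot product: a higher score predicts stronger user interest."""
    return sum(u * i for u, i in zip(user, item))

# Rank items by predicted preference, highest first.
ranking = sorted(
    item_vectors,
    key=lambda name: predicted_preference(user_vector, item_vectors[name]),
    reverse=True,
)
print(ranking)  # ['video_a', 'video_b'] for this toy data
```

The DSA’s transparency requirements are aimed at explaining the main parameters behind rankings like this, which in production systems are far more complex than a single dot product.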

The Online Safety Bill creates three categories of online service. These are loosely defined on the face of the Bill, and the final criteria – and hence which services fall into each category – will be determined by Ministers after the Bill has been adopted.

Freedom of expression

One key area that is handled differently in the two laws is freedom of expression.

The DSA explicitly recognises the right of European citizens to exercise their freedom of expression, guaranteed in the Charter of Fundamental Rights. As a result, online service providers are asked to act in a non-arbitrary manner that is diligent and proportionate, even when restricting content under their own terms and conditions. There are procedural safeguards, such as a requirement to notify users with a reasoned statement of justification when removing content. The law establishes principles to govern complaints and redress processes. 

By contrast, the Online Safety Bill states that providers must ‘have regard to’ freedom of expression, but at present it contains little in the way of procedural safeguards to guarantee this. Users who feel that a restriction is unjust may access a complaints procedure [Clause 18], but the requirements for that procedure are unspecified, and there is no right of appeal to a court.

Questions raised by these different approaches

Cross-border effects will raise questions about how online platforms deal with potentially conflicting mandates in different jurisdictions. The concerns relate to the underlying fundamentals of the laws and the levers they seek to use over the platforms.

General monitoring of online content, for example, is forbidden in the EU, but will be expressly mandated in the UK. Platforms that operate in both jurisdictions will face conflicting obligations, creating risks for their systems and their users’ content; it may even mean the withdrawal of services from one jurisdiction.

Another important area is notice-and-action. Until now, the US Digital Millennium Copyright Act (DMCA) has been the de facto route for issuing complaints to online platforms about copyright infringement. Could the EU’s notice-and-action procedures become the de facto standard for other types of complaints, such as those about wrongfully removed content or accounts? What happens when content is lawful in one country and unlawful in another? UK users may either have less access to redress than EU users, or may enjoy some level of procedural protection as a result of de facto European standards, though without the right of appeal to the courts.

An area that will be particularly tricky for regulators and companies is ‘legal but harmful’ content, which falls outside the scope of current content-identification techniques. Here, the EU expects companies to carry out assessments and mitigate risks according to criteria set out by the European Commission. In the UK, Ministers will direct the regulator to ensure companies take action over certain content or behaviour. In practice, global companies want a uniform set of rules.

Although they might seem like purely technical or legal matters, these decisions ultimately affect the millions of people across Europe and the UK who use these platforms for professional and personal reasons, and who rely on the State to guarantee their basic rights whilst doing so.

Today’s policy choices will have ramifications for many years to come. The UK and EU have different scope for action and potentially conflicting approaches, but ultimately share the same goals: both want major platforms properly regulated, such that illegal and harmful content is effectively managed. Both the UK and EU should focus on these desired results in order to ensure divergence does not get in the way of what they both wish to achieve.