UK data protection reform and the future of the European data protection framework

Policy paper

The UK Government has indicated its desire to diverge from the European data protection regime. This brief considers some of the main areas of divergence in the Data Protection and Digital Information Bill (DPDI Bill), the latest iteration of an effort to reform data protection to "free up the use of data" to "unleash its value across the economy" and for the UK to "operate as the world's data hub."


Read the full dossier "Digital rights post-Brexit - Regulatory divergence in the UK and the EU"

 

  1. Introduction
  2. Structure of UK data protection law
  3. UK-EU context
  4. Data subject rights
  5. AI and automated decision rights
  6. Cookies
  7. Removal of data protection impact assessment and oversight mechanisms
  8. The Information Commissioner’s Office (ICO)
  9. Conclusion

 

1. Introduction

The UK Government has indicated its desire to diverge from the European data protection regime. The General Data Protection Regulation 2016[1] and the Law Enforcement Directive 2016[2] are the two main instruments of that regime, regulating the processing of personal data generally and its processing by competent authorities for policing and criminal justice purposes, respectively.

This brief considers some of the main areas of divergence in the Data Protection and Digital Information Bill (DPDI Bill),[3] which was introduced to the UK Parliament on 18 July 2022. The DPDI Bill is the latest iteration of an effort to reform data protection to "free up the use of data" to "unleash its value across the economy" and for the UK to "operate as the world's data hub."[4] At the Conservative Party Conference 2022, the new UK Secretary of State for Digital, Culture, Media and Sport announced plans to replace the GDPR with a "truly bespoke, British system of data protection."[5] This suggests that any legislation underpinning the UK's data protection reform agenda may now be subject to change. However, the DPDI Bill remains a useful overview of the trends and issues that are likely to re-emerge in the next proposal.

2. Structure of UK data protection law

UK data protection law is spread across multiple legal instruments. The core instrument is the General Data Protection Regulation 2016, as retained by the European Union (Withdrawal) Act 2018[6] and amended by various pieces of secondary legislation. This instrument is informally known as the “UK GDPR”. At the same time, the UK also passed the Data Protection Act 2018.[7] This repealed existing UK data protection law; made choices regarding the leeway the UK, then an EU Member State, was granted in implementing the EU GDPR; provided for the continuation of the Information Commissioner’s Office; articulated additional data protection offences; and put in place data protection regimes for policing and law enforcement (transposing the Law Enforcement Directive 2016) and for the intelligence and security services (a version of the modernised Council of Europe Convention 108).[8] Further data protection-relevant law is found in the Privacy and Electronic Communications Regulations (PECR),[9] which contain a provision often known as the “cookie banner law”,[10] but in reality cover a much wider array of situations in which information on users’ devices is accessed or stored.

It is in this context that the Data Protection and Digital Information Bill was introduced on 18 July 2022. The Bill amends the existing instruments, adding to a patchwork of legal instruments that is becoming increasingly difficult to navigate.

3. UK-EU context

Data-reliant businesses in the UK often rely on data flowing across borders. Where that data is personal, moving it from the EU to the UK requires an authorising legal basis. While many options are available for this, only an adequacy decision offers reliability and low administrative cost. Adequacy decisions are granted by the European Commission where a country has an “essentially equivalent” level of data protection to the European Union, such that individuals can rely on the foreign legal system to protect their rights. Only Japan, South Korea and the United Kingdom currently hold adequacy decisions granted under the GDPR standard.[11] Countries such as New Zealand[12] and Israel[13] hold decisions under the less stringent 1995 Data Protection Directive,[14] and may need to strengthen their provisions when those decisions are reviewed. Transferring data without an adequacy decision is estimated to cost UK businesses between £1 billion and £1.6 billion[15] as a result of additional compliance obligations.

Other options for cross-border data transfer, such as standard contractual clauses (SCCs), rely on contractually binding the recipient of the data in the UK to adhere to GDPR-style legal standards. These incur legal costs, and may be subject to legal challenge when they involve countries that operate sophisticated electronic government surveillance, such as the Five Eyes countries.[16] For example, the Court of Justice of the European Union (CJEU) has struck down two successive EU-US data transfer arrangements, Safe Harbor (Schrems I[17]) and Privacy Shield (Schrems II[18]), in what are among the most significant judgments on global data flows. The Schrems II judgment in particular sets out criteria that the European Commission will use to judge whether a particular regulatory environment is adequate, with a particular focus on state surveillance powers and the lack of legal redress.

The Irish Data Protection Commissioner is currently seeking to suspend data flows conducted on the basis of SCCs between Facebook Ireland and Facebook Inc. in the United States, on the grounds that contracts between two private entities do not protect against the siphoning of data by the United States government.[19]

The UK adequacy decision is the only one the EU has ever made to include a specific sunset clause (27 June 2025),[20] rather than just a review obligation. This limits the decision to four years, instead of it being open-ended and subject to regular review by the European Commission. In announcing the decision, Věra Jourová, European Commission Vice-President for Values and Transparency, noted that “we have listened very carefully to the concerns expressed by the European Parliament, EU Member States and the European Data Protection Board, in particular on the possibility of future divergence from our standards in the UK's privacy framework.”[21] As a result, there is real concern that changes to the UK regime, including with respect to intelligence co-operation with the USA and data transfers between UK and US intelligence agencies, may threaten this decision and impose significant costs on the UK.

The UK is advancing its own “adequacy”-style regime, which the DPDI Bill calls “transfers by regulations”. The key difference from the EU approach is that UK law would exclude consideration of foreign intelligence surveillance regimes. This issue was central to the CJEU ruling on EU-US data transfers under Privacy Shield, in which the Court elaborated on the criteria by which the European Commission should make future adequacy decisions.

In choosing to exclude foreign intelligence considerations from its adequacy decisions, the UK adopts a materially lower standard than the EU. A significant concern this raises is that data might enter the UK from the EU under the current adequacy decision, and then flow onward to a country that has not been granted EU adequacy because of its intelligence regime.

The UK government has proposed that adequacy regulations may be made for groups of countries, regions and multilateral frameworks. In contrast to the EU adequacy system, the DPDI Bill also proposes to abolish the four-year review requirement, instead giving the UK Government discretion as to when and why to initiate a review. It would also expand redress requirements to include non-judicial redress, such as administrative redress. This appears to depart from the CJEU ruling in Schrems I, which found the availability of independent and impartial judicial remedies to be a key consideration for EU adequacy status.

4. Data subject rights

Data subjects have a variety of rights, such as the right to object at any time to an organisation processing (using) their personal data, to have their data erased, or to request a copy of their data. The DPDI Bill introduces new limitations on these rights.

While under the GDPR requests can be refused if “manifestly unfounded or excessive”, the DPDI Bill changes this language to “vexatious or excessive”. The factors a data controller may consider when deciding whether a request meets this threshold are vague, including the “nature of the request” and the “resources available” to the controller. The impact could be a lower threshold for organisations to refuse data subject access requests by individuals.

Many controllers hold a great deal of data on individuals without necessarily having the resources to analyse it, partly because of the low costs of modern computing. This nonetheless poses a risk of leakage to the data subject. Bankrupt data brokers, for example, are liable to sell their data assets to other actors to recoup funds. Could such organisations refuse an erasure request by claiming low resources, at precisely the time when data subjects are most likely to have their data transferred to a different actor without oversight?

5. AI and automated decision rights

The UK Government proposes a significant weakening of the automated decision-making provisions in the EU GDPR. In short, the existing provisions prevent a significant, solely automated decision from being made about an individual without a legal basis, such as explicit prior consent, a contract, or a legal obligation to make such a decision. The DPDI Bill removes this protection, limiting it to decisions based on sensitive data, such as ethnicity or religious affiliation. Decisions not based on such data would be subject only to safeguards that apply after the decision has been made, rather than requiring prior authorisation. As a result, decisions to block an online worker or content creator in real time could be made without human review, with immediate impact on individuals.

Organisations tend to find it difficult to assess bias in computer systems, especially those using artificial intelligence, and the sensitive data needed to do so can be legally onerous to collect and store. On this point, the DPDI Bill aligns with the European Commission's approach in the proposed AI Act. The DPDI Bill could allow organisations to use personal data more freely for training and testing AI. However, concerns remain regarding legal safeguards, especially in the context of a Bill that would weaken the automated decision-making provisions of the EU GDPR. The EU AI Act is still under negotiation, and any approach to bias mitigation may change going forward.[22]

6. Cookies

Cookie banners are a visible element of data protection law. They are widely perceived as a regulatory failure, exploited by advertising technology (adtech) and the surveillance advertising industry.[23] The DPDI Bill seeks to address them through an “opt-out model of consent”, presented as a quick win. However, the Bill does not grapple with the sources of this failure.

PECR, the UK’s transposition of the EU’s e-Privacy Directive,[24] includes a provision requiring consent for accessing or storing information on a user’s terminal device, like a mobile phone or Web browser, when doing so is not necessary to provide a service the user has requested. As advertising and tracking are not technically necessary to, for example, access a shopping website, consent is required in these situations. The GDPR, when it became enforceable in 2018, explicitly strengthened the requirements for consent, to which the definition in PECR is linked (although, according to the CJEU, the previous standards were roughly as strong). Perhaps more importantly, it strengthened enforcement, with fines and further regulatory powers for Supervisory Authorities.

This led to an array of companies and industry groups selling solutions that would allow websites to continue tracking as before. However, the vast majority of these solutions, as deployed on websites, obtain consent that is not freely given, specific, informed or unambiguous. For example, they make accepting easier than rejecting, or ask individuals to consent to hundreds of trackers at once; it would be impossible for users to know the identities of, or consider the implications of, all these trackers. Regulators have found these practices to be illegal, indicating that the status quo of online tracking that the DPDI Bill seeks to address is not what the GDPR mandates, but a practice that the GDPR outlaws.

The DPDI Bill adds exceptions under which an opt-out regime would apply to:

  • statistical analysis of a service’s use for the purpose of its improvement
  • security updates
  • enhancing the appearance or functionality of the service
  • fraud and fault detection
  • identity verification, where it can be considered “necessary” for the provision of a service.

The DPDI Bill specifies that “statistical purposes” should not include measures concerning particular individuals, and should result only in aggregate data, but it makes this amendment to the UK GDPR rather than to PECR, creating legal uncertainty as to whether the PECR amendments create an opt-out regime for tracking or not.

7. Removal of data protection impact assessment and oversight mechanisms

The DPDI Bill proposes removing the obligation on some controllers to have a data protection officer, as well as the obligation to carry out data protection impact assessments (DPIAs). This may be partly because these assessments have caused the UK Government great embarrassment: thanks to the oversight of government data processing operations enabled by the Freedom of Information Act 2000,[25] it was revealed to the public that some assessments contained poor or incomplete analysis. A prominent example involved the initial version of the centralised COVID-19 Bluetooth contact tracing application in England and Wales, where criticism based on the released DPIA contributed to its replacement with an alternative system.[26] Relatedly, the DPDI Bill proposes that law enforcement agencies no longer have to record their justification for accessing data in their systems, which could allow them to claim a different legal justification after they have accessed the data.

8. The Information Commissioner’s Office (ICO)

The ICO is the primary enforcement body for data protection law in the United Kingdom.[27] The DPDI Bill restructures the organisation and creates a series of new duties for its regulatory function, which could overburden the regulator or actively discourage it from exercising that function. It sets an objective for the ICO to achieve only an “appropriate level” of protection for personal data, places new obligations on the Commissioner to promote innovation and competition, and expects the Commissioner to recognise the importance of preventing and detecting crime and safeguarding public security. This is likely a response to the ICO’s work in areas such as advertising technology (adtech),[28] which has involved interaction with the Competition and Markets Authority, and to some of the ICO’s vocal statements on facial recognition,[29] a technology for which the Government has indicated support.

The Commissioner is also under a new obligation to encourage “expert public bodies” to produce their own guidance on how to interpret and comply with data protection law within their sectors. While the Commissioner must review these codes before they are published, there is no clear division between the data controller and the “expert public body”. In areas such as health or policing, it seems highly possible that the organisations writing the codes of conduct will be carrying out the data processing themselves, creating concerns about organisations “marking their own homework”.

9. Conclusion

The divergence between UK and EU data protection standards is significant, and enough to destabilise the UK adequacy decision. In particular, the proposals on onward data flows from the UK are unlikely to be acceptable to the EU. This would damage digital trade between the UK and the EU.

The global trend is towards tighter rather than looser data protection. This puts the UK in an awkward position, and is likely to place additional burdens on UK businesses, which will need to operate to multiple standards in order to trade with the EU or beyond. Enforcement also becomes more complicated should the UK wish to access or co-operate with EU regulators.

For the EU, the international dimension of the UK’s data protection reform plans could cause significant problems for the operation of adequacy decisions, especially if the UK’s approach is tolerated. It may also create difficulties for the upward trend in data protection standards, as a key market, the UK, moves against that trend and attempts to establish or reinforce different, lower standards for data transfers.

The changes to data protection in the UK are likely to fuel the abuse of data by companies, including online platforms. This could further exacerbate problems with online content prioritisation and continue to drive the attention market that is closely linked to online advertising and profiling. Such outcomes run counter to the aims of the UK’s Online Safety Bill and Europe’s Digital Services Act.

Strong data protection and consumer trust are closely linked. Several of the Bill’s measures, which would limit individuals’ access to data and make its usage unpredictable, seem set to fuel distrust in digital decision-making and in the use of data. This will have a negative economic and social impact, while a small number of companies benefit from the freedoms the Bill provides.

The UK has been discussing greater powers for competition authorities to introduce measures allowing customer data to move between digital businesses, as has already been done for the banking sector, as we discuss in our competition paper. Undermining consumer trust in the rules governing the use of personal data is likely to make such competition measures harder to implement and less effective, and may at times create significant dangers for customers.

 

Acknowledgments: We would like to thank Dr Michael Veale for his significant contribution to this briefing.

 

[5] Michelle Donelan, ‘Our plan for digital infrastructure, culture, media and sport’, https://www.conservatives.com/news/2022/our-plan-for-digital-infrastructure--culture--media-and-sport

[15] McCann D, Patel O and Ruiz J (2020), The Cost of Data Inadequacy: The economic impacts of the UK failing to secure an EU data adequacy decision, UCL European Institute report with the New Economics Foundation, https://neweconomics.org/uploads/files/NEF_DATA-INADEQUACY.pdf

[16] The Five Eyes (FVEY) is an intelligence alliance comprising Australia, Canada, New Zealand, the United Kingdom and the United States.

[19] Manancourt, Vincent (2022), ‘Europe faces Facebook blackout’, Politico, 7 July 2022, accessed 22 August 2022, https://www.politico.eu/article/europe-faces-facebook-blackout-instagram-meta-data-protection/