Key takeaways - Visions for a Digital Europe 2025


Unleashing Data Treasures: European Data Spaces & Gaia-X

  • How can a European Data Space be achieved? 
  • We need significant datasets and open data spaces to act independently and in compliance with European values in this digital age. Such spaces need to be open to civil society actors and cannot be limited to commercial and political interests. 
  • The question of which data can be used was – with regard to personal data – answered by the General Data Protection Regulation (GDPR). However, a lack of practical licensing schemes and standards for such data is holding back the potential of digital spaces.
  • We are still relying on hardware and software from non-EU companies. We need to invest in infrastructure and personnel to complement the strong regulatory basis with EU innovations that build upon it.
  • Economic stakeholders criticise the fact that the framework of action is much broader for companies in the US than in the EU. Regulators retort that the regulations are designed intentionally to obligate and nudge companies towards business models that are more aligned with the values of the EU.

Digital Markets Act: New Rules for Fair Competition

  • Discussion about the necessity of the Digital Markets Act (DMA) and the rules it will introduce.
  • Organising and influencing markets are two of the essential functions of platforms. Until now, they have mostly relied on self-regulation. Due to their sheer size, these private actors are able to control and regulate large parts of the internet. Their power is not only economic in nature but also reality-defining.
  • The Digital Markets Act is an instrument of legal harmonisation, not of market regulation. This will, however, make it harder to anchor the DMA in the EU treaties, as it goes beyond trade. The current design leaves enforcement of the law to the Member States.
  • The discussions around the DMA entail a level of complexity that makes them accessible only to people with great expertise. The “normal citizen” cannot participate in this process of deliberation. As a result, there is no broad support among the population for stricter regulation.
  • There are tense discussions about how far-reaching the regulations should be. In particular, the platforms themselves argue that too many boundaries would slow down innovation and result in stricter restrictions than in other market segments. Interoperability is also seen as a security risk by most companies, while many regulators highlight its importance for research and market competition.
  • The Digital Markets Act is, together with the Digital Services Act (DSA), only a first step towards regaining public regulatory power in the digital realm. These acts are designed not only to regulate platforms but also to enforce basic values and change public perception.

Regulation of Data Trusts

  • There is a multitude of terms that aim to describe a technological concept that allows trustworthy and safe data sharing among different entities (data trust, data trust service, Datentreuhand, data stewardship). The proposal of a Data Governance Act (DGA) by the European Commission now establishes the legal term of “data intermediary”.
  • There are manifold use cases for the service of data intermediaries - including public sector use in health care, shared use of private and public entities in the field of mobility in Smart Cities, commercial Personal Information Management Systems (PIMS), and shared data spaces by civil society organisations.
  • The aim of establishing good conditions for trustworthy data intermediaries on the European market is multi-faceted - many of the challenges cannot be tackled through a regulatory framework, but need to be brought forward by public funding, private investment and market innovation.
  • DGA also provides a legal framework for data altruism and public service use of data intermediaries.
  • The regulatory approach of the DGA is contested - some fear overregulation through the requirement of an ex-ante notification and instead suggest a “soft law” approach with certification mechanisms, while others expect the regulation to have only little effect on the market due to its narrow scope of application.
  • It is yet unclear how the DGA will overlap and interact with other legal acts, such as the Data Act, GDPR, Digital Services Act, Digital Markets Act and so on.

Data Protection & Data Sovereignty

  • Data protection supports innovation; the GDPR has encouraged creativity in finding ways to keep running data-driven applications in a data-protection-compliant manner.
  • The GDPR is a foundation of principles; authorities have the task of implementing and enforcing them.
  • On the relationship between the two concepts: (digital) sovereignty is often seen as the opposite of data protection, which is wrong. Sovereignty is about protecting fundamental rights, and data protection is part of this. ‘Technosolutionism’ must be viewed critically.
  • Sovereignty has no fixed definition, and everyone understands it differently. It can only be interpreted within the framework of existing data protection laws. Sovereignty also requires being able to deal with technology in an informed way. In the area of data protection, there is a large gap between what is actually happening and what people know.
  • Strong civil society actors are needed to bring knowledge about data protection to the public.

AI regulation in Europe - a critical look at the proposal

  • Regulation of dynamic technologies remains difficult; there are no perfect solutions, perhaps not even very good ones. The EU AI Regulation was mostly seen as a step in the right direction.
  • The panel mostly agreed that a risk-based approach is the right legal regulatory instrument. Whether transparency, civil society involvement or institutional certification is the right tool is an open question. The extent of the EU's regulatory competences was disputed.

International Perspectives on European AI Regulation

  • Europe's strong market power ensures that foreign countries carefully observe and partly adopt European technology regulation.
  • The planned AI regulation is considered technology-neutral and thus fundamentally future-proof, perhaps even eventually becoming a globally accepted standard.
  • Whether the draft AI regulation is sufficient to actually make human-centred AI the standard in Europe, and whether it leaves enough space for innovation, both remained controversial in the discussion.

Disinformation: a Threat for Public Discourse and Freedom of Expression

  • Disinformation is a problem because it seems credible. Underpaid moderators and algorithms cannot counteract disinformation sufficiently.
  • The business model of online platforms must change. Polarising content generates more attention, which translates into more time spent on the platform. This is a danger for democracy.
  • The Facebook leaks confirm what had previously only been speculated.
  • The Digital Services Act cannot regulate disinformation in Europe by itself. The problem is too big; people have to be made resilient against disinformation.
  • Technology companies often try to lure regulators into complexity traps. Platforms must be oriented towards society's values.
  • International cooperation to fight disinformation is crucial, but it takes a long time. The role of civil society organisations must be strengthened, as must the media education of society.

Functioning Data Protection Supervision - Lessons for the Future

  • How well is the GDPR working from an enforcement perspective?
  • While enforcement varies between the national data protection authorities, there is a common pattern across the EU of too few qualified staff and too little expertise in investigating incidents. Most national authorities have a large backlog of cases, and many companies simply hope to stay under the radar. Authorities hardly ever carry out unannounced privacy audits, so from an economic perspective there is little incentive to comply with the rules.
  • The public needs to develop a higher privacy awareness. A general shift in culture would motivate companies to comply with the GDPR on their own. 
  • The main problem is the procedural roadblocks in enforcing the GDPR. We need better and faster instruments. The regulation itself currently sets a worldwide gold standard.
  • Some recommend moving enforcement to the EU level, hoping for a harder grip on Chinese and US companies in particular. Others argue that this would make the procedure for individuals even more complicated and would not help with the overload in cases.
  • There are still many open questions regarding compliance with the GDPR. Consumers and companies need certainty on the interpretation of the articles.
  • The mandate given to the authorities to advise stakeholders on data protection law and, additionally, to monitor compliance with it cannot currently be fulfilled due to a lack of resources. The GDPR is not the last chapter of EU privacy law; it is a continuously developing system striving to improve. One of the latest examples (of a sectoral complement to the GDPR) is the AI Act.

Sustainability and Global Justice of Digitalisation

  • Digitalisation enables many sustainable applications. Digital policy and climate protection must be integrated more closely. So far, this has only happened at the level of discourse, not at the working level. Society must be involved transparently in order to strengthen the understanding of both.
  • AI is neither good nor bad for the climate. For now, it is just a tool, a group of methods, that we use in various areas and that can have an impact on the climate: AI can be used to advance climate protection, but it can also lead to more emissions, higher resource consumption and increased energy use.
  • AI measures: there are many ideas, but most of them are rather academic or theoretical and not yet applied in practice. What is needed is mandatory measurement of the climate impact of AI.
  • The high-risk classification of AI systems introduced in the EU AI Act should also cover risks to the environment.
  • Without justice, there is no environmental protection, and without environmental protection, there is no social justice. Just reducing a few tonnes of CO2 does not help. Digitalisation must be applied for the common good in order to create more fundamental change.

Rethinking Power, Justice and Participation in the Digital Era

  • Discussion about the means necessary to make the digital area more inclusive and fair for all participants.
  • Basic human rights and democratic values need to be embedded in technology to produce a fair digital era. Because private companies will always have a for-profit perspective, they can never be left unregulated, and public sector alternatives must be offered. Citizens must be empowered to choose their own path.
  • The public sector is still only marginally involved in the digitalisation of its respective country. Solutions in this area are usually created by buying knowledge or software from abroad. However, to dissolve these dependencies, the sector needs to develop its own solutions and approaches.
  • Because digitalisation concerns all areas of life, established structures of regulatory thinking must be broken up and rebuilt. No single department is able to regulate the digital space on its own.