Trump vs. Europe: The role of the Digital Services Act

Analysis

The European Digital Services Act (DSA) lays down new rules for online platforms, seeks to strengthen user rights and intends to hold tech companies to account. But what specific measures does the DSA include and why is it under so much pressure from the new US government and big tech companies?



With Donald Trump, not only Elon Musk but the entire American tech elite seems to have moved into the White House. Silicon Valley's tech billionaires sent a clear signal with their presence at Trump's inauguration: no resistance is to be expected from them. On the contrary, they are aligning themselves with Trump for economic and financial benefits. Meta's recent changes to its hate speech policies, which now permit speech describing trans people as "mentally ill" and women as household objects, can only be described as an act of genuflection to Trump's agenda. Elon Musk's platform X has also been under fire for months for doing little to tackle illegal and harmful content, while offering a platform to the right-wing party AfD weeks ahead of the German federal election. These demonstrations of corporate power, and the close ties between tech CEOs and the US president, are increasing the pressure on the European regulations designed to curb the power of online platforms. First and foremost among them: the Digital Services Act.

What is the Digital Services Act?

In short, the Digital Services Act (DSA) is the new legal framework for online platforms in the European Union. These include social networks like TikTok, X and Facebook as well as search engines such as Google Search, marketplaces like Amazon and Zalando, and app stores such as the Google Play Store. 

The DSA pursues a risk-based approach: the more control a provider has over the content that is distributed via its service, the higher the requirements it must meet. Following the same logic, the DSA imposes additional rules on the very largest platforms, whose services have the greatest societal impact. If platforms fail to adhere to the rules set out in the DSA, they face fines of up to 6% of their annual global turnover.

What rules do platforms now have to follow?

Put simply, the DSA comprises four pillars. The first chapter defines liability rules, i.e. it answers the question of when platforms can be held liable for illegal content shared by their users. Service providers that merely process information, without taking an active role in creating and distributing content, benefit from exemptions from liability, known as liability privileges. Depending on whether a service only transmits content, stores it temporarily or makes it available to others, as social media platforms do, different conditions must be met in order to enjoy these liability privileges.

Stricter rules therefore apply to online platforms such as social networks that make user-generated content available to a broad audience. For example, they have to provide mechanisms for users to report potentially illegal content. Platforms must then assess users’ reports and take action to disable access to illegal content. In addition to regular users, so-called trusted flaggers can also report potentially infringing content. These are organisations that have special expertise in identifying specific categories of illegal content and are certified by the regulatory oversight bodies in the Member States. In responding to user reports, platforms must give priority to content reported by trusted flaggers. 

Beyond illegal content, online platforms can also voluntarily remove content that violates their own rules, their so-called terms of service. Previously, users whose content was removed or whose accounts were disabled were at the mercy of the platform if they wanted to challenge its content moderation decisions. The second pillar of the DSA sets out clear requirements for dealing with user content: platforms must communicate clearly in their terms of service what content is permitted on their platform and what is not, while respecting fundamental European rights such as the freedom of expression. Users must be informed about changes to these rules. Service providers must also explain to their users why specific content has been removed, publish transparency reports on their content moderation, and record all moderation decisions in a public database.

However, platforms cannot be forced to proactively search their users' posts for potentially illegal content. In view of the massive amount of content on the Internet and the often complex legal considerations involved in assessing a statement, this rule is crucial to avoid the mass censorship of content.

New rights of appeal and transparency obligations

The new rights of appeal for users are also key: platforms must provide complaints mechanisms that allow users to contest moderation decisions. A new feature is out-of-court dispute resolution: if a platform refuses to reinstate blocked content after a complaint, the affected user can turn to alternative dispute settlement bodies, which then mediate between the platform and the user. Although the outcomes of these procedures are not legally binding, they are free of charge for users. However, users can also turn to civil society organisations to assert their rights – in the context of the platforms' complaints mechanisms, in out-of-court dispute resolution or in court.

In addition, platforms must ensure basic transparency about how their algorithmic recommendation systems work. These systems determine which content and posts are displayed to users. Platforms may also not use sensitive private information to target online advertising, and they must take measures to protect minors. Dark patterns are also prohibited: these are design patterns intended to trick users into behaving in a certain way or making decisions they would not otherwise have made. A common example is the cookie banner that makes it easier to consent to data tracking than to reject it.

From user rights to systemic risks

The DSA thus creates a Europe-wide minimum standard for user rights. Beyond these rights, the third pillar of the DSA sets out further obligations for the largest services. These concern very large online platforms (VLOPs) and very large online search engines (VLOSEs), i.e. services that reach more than 45 million users per month in the EU. To date, the European Commission has designated almost 25 services as VLOPs and VLOSEs. These include social media platforms such as Instagram, TikTok and X, Google and Apple's app stores, Google Maps, a number of marketplaces, Google Search and Bing, as well as several pornography platforms.

These platforms must publish reports analysing what "systemic risks" emanate from their services. What exactly such systemic risks are remains unclear: the DSA lists broad categories, such as negative impacts on fundamental rights, civic discourse and electoral processes, public safety and health, as well as adverse impacts relating to gender-based violence or a person's physical and mental well-being. However, it provides no definitions or concrete examples of these risks. It is therefore up to the most powerful platforms themselves to decide how much responsibility they take for these societal risks and how they mitigate them.

Reading the first, long-awaited risk reports is, however, sobering. A good example is TikTok. In the context of the Romanian election, serious accusations were made against the Chinese platform: following the publication of reports by local civil society groups, in which TikTok was accused of amplifying the content of the ultra-nationalist presidential candidate Calin Georgescu, and a report by the Romanian secret service alleging cyber attacks and foreign influence, the Romanian Constitutional Court annulled the election results. Shortly afterwards, the European Commission initiated proceedings against TikTok for failing to manage systemic risks related to the integrity of the Romanian elections. TikTok's own report on systemic risks also addresses election-related misinformation; however, it focuses on the dissemination of fake news and ignores factors such as the misuse of TikTok's recommendation algorithms or the manipulation of the platform to spread content via inauthentic accounts in a coordinated manner. This illustrates the significant discretion that platforms have in defining systemic risks and implementing their own risk mitigation strategies. 

In its attempt to address not only the distribution of illegal content, but also the more complex social impacts of online platforms, the DSA is breaking ambitious new ground. Whether this approach will work, however, will depend largely on how consistently platforms are held accountable. 

The DSA is only as good as its enforcement 

This brings us to the last pillar of the DSA, its enforcement. Here, too, the DSA is breaking new ground. The European Commission itself is responsible for overseeing the very largest platforms, the VLOPs and VLOSEs. This is meant to prevent the Irish supervisory authority, in whose country the vast majority of large platforms have their European headquarters, from being overwhelmed by the scope of its tasks or compromised by its proximity to the platforms (as seems to be the case with the General Data Protection Regulation).

In the Member States, the Commission is supported by national regulatory authorities, so-called Digital Services Coordinators (DSCs). They are the first point of contact for complaints and perform a range of specific tasks, such as certifying out-of-court dispute resolution bodies, trusted flaggers, and researchers who wish to exercise the newly introduced right to access platform data for research. In Germany, a number of authorities, including the state media authorities, are involved in enforcing the DSA. The Federal Network Agency has taken on the task of coordinating the relevant authorities as the German DSC. A special feature in Germany is an advisory board consisting of representatives from civil society, academia and business, which advises and supports the DSC in its work.

As far as enforcing the DSA in Germany is concerned, the lack of staff allocated to the DSC presents a significant problem. With around 20 positions currently filled, staffing falls far short of the 70.6 positions stipulated by law, a figure that was already considered insufficient. And in view of this year's early parliamentary elections and the associated challenges, ranging from ensuring information quality to combating disinformation, it is clear that the DSC will face more rather than fewer challenges.

Transatlantic pushback

At the European level, more than 60 enforcement actions were launched in the first year after the DSA's entry into force, of which 13 concerned TikTok, eight Meta and five X. Does this mean that fines against tech companies for failing to comply with the DSA are imminent? Not if the Silicon Valley CEOs get their way. For them, the second Trump presidency is an opportunity to undermine the new European rules.

More than anyone else, Elon Musk, owner of X, Tesla and Starlink, stands for the new "broligarchy", the group of super-rich men who seem to be buying Trump's favour. Musk, who supported Trump's election campaign with a whopping 277 million dollars and priceless clout on X, and who now holds a prominent role in the Trump administration, will surely receive the president's support in his attempts to skirt European law. Already during the American election campaign, JD Vance, now Vice-President, threatened to make American support for NATO conditional on the EU not taking action against X.

Meta's CEO Mark Zuckerberg and Apple CEO Tim Cook (who also supported Trump's campaign financially) have already complained to Trump about potential European fines. It is no coincidence that Zuckerberg has framed the EU's fines for non-compliance with its tech rule book as tariffs. Trump has already proven that he will not shy away from entering another trade war in order to assert American interests. Holding tech companies accountable and enforcing European user rights could thus become a question of geopolitics.

Don't deal with it

Amidst concerns about Europe's competitiveness, the anti-regulation posture of Musk and co. also plays into the hands of European industry actors, who have joined the dangerous chorus calling for a new era of radical deregulation. However, there is more at stake than the announced dismantling of environmental regulations or the enforcement of the DSA: anyone who deals with Trump is gambling away Europe's credibility.

Strict enforcement of the DSA and European competition law has the potential to counter the financial, structural and political concentration of power in the hands of big tech companies. Instead of allowing itself to be blackmailed with tariffs and threats to NATO support, and delaying enforcement as a result, the EU must now wield the most potent weapons at its disposal. Private power must be curbed, and the democratisation of the economy expedited, in the interests of European citizens. It remains to be seen whether the DSA and the structural measures set out in its sister law, the Digital Markets Act, such as unbundling, will be sufficient to achieve this. As things stand today, however, they are among the best antidotes for protecting Europe and our democracies from the influence of the new tech oligarchy.

 

The views and opinions in this article do not necessarily reflect those of Heinrich-Böll-Stiftung European Union.