Blog
16.06.2023

How will the European Union govern social media platforms under the Digital Services Act?

Prof. Daniela Stockmann unpacks how the European Union will govern social media platforms under the Digital Services Act, and how what she calls the bloc’s process-based approach will differ from comparable efforts to oversee big tech platforms around the world.

The European Union is currently in the process of making key decisions that will shape the development of a European model for digital governance. One prominent initiative is the Digital Services Act (DSA), which is part of a package with the Digital Markets Act. Its principal aim is to address growing concerns about the power of particularly large platforms. It does so via a policy innovation that departs from past approaches, and from the way platforms are still governed in the United States and most other countries.

To explain this, I will begin by addressing what the DSA is not. Significantly, the DSA does not follow the currently dominant content moderation approach to addressing problems such as misinformation, hate speech, distrust, and manipulation of elections by third parties. Platform firms have practiced content moderation from the start, since they had to separate spam from content that has value to users; some form of content moderation is therefore unavoidable.

Over time, however, platforms have discovered that they cannot get content moderation right. That is because 90 percent of content moderation is done via automated curation tools that block, filter, and delete information, and one important insight on the side of the platforms has been that these tools are either too restrictive, removing legitimate content (so-called false positives), or too lax, letting harmful content through (so-called false negatives). Content moderation can never be 100 percent right, and whoever is in charge will face criticism. This has led to a healthy discussion across the globe about the boundaries of freedom of speech, and about who should be put in charge of deciding these boundaries – the state, tech companies, society, or a combination of all of them.
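
To see why this tradeoff is hard to escape, consider a minimal sketch in Python – with invented scores and labels, not any platform’s actual system – of how a single confidence threshold trades false positives against false negatives:

```python
# Hypothetical posts scored by an imagined toxicity classifier:
# (classifier_score, actually_harmful)
posts = [
    (0.95, True),   # clearly harmful, scored high
    (0.80, True),   # harmful, flagged confidently
    (0.75, False),  # satire the model misreads as harmful
    (0.60, True),   # harmful but subtle, scored lower
    (0.40, False),  # benign
    (0.30, True),   # harmful content the model largely misses
    (0.10, False),  # benign
]

def moderate(threshold):
    """Block every post scoring at or above the threshold; count both error types."""
    false_positives = sum(1 for score, harmful in posts
                          if score >= threshold and not harmful)  # benign but blocked
    false_negatives = sum(1 for score, harmful in posts
                          if score < threshold and harmful)       # harmful but kept
    return false_positives, false_negatives

for t in (0.2, 0.5, 0.8):
    fp, fn = moderate(t)
    print(f"threshold={t}: {fp} benign posts blocked, {fn} harmful posts missed")
```

Lowering the threshold blocks more benign posts (too restrictive); raising it lets more harmful posts through (too lax). In this toy example no threshold avoids both error types at once, and the same dilemma, at the scale of billions of posts, is what platforms face.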

While these are important questions that should be discussed, the European Union decided to go a different route than content moderation.

The DSA represents what I call a process-based approach to addressing concerns about platform power and platforms’ lack of responsibility. There is a knowledge gap between what policymakers and society know, on the one hand, and what tech companies know, on the other. To close this gap, the DSA aims to constrain platform power by regulating procedures. It is an attempt to place greater emphasis on the public interest on the side of the platforms, stressing the need for more transparency, public responsibility, and accountability.

The DSA differentiates between Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs), defined as those with more than 45 million monthly active users in the European Union – roughly 10 percent of the EU’s population as of 2023. Facebook, Instagram, Twitter, and TikTok are well above 45 million monthly users, but companies like Booking.com and Zalando also qualify as VLOPs. The distinction matters because, under the DSA, VLOPs and VLOSEs face greater obligations – especially transparency obligations – than smaller platforms and search engines.

One of the most important obligations for VLOPs is to prevent abuse of their systems and to reduce risks to society. This is done via the following oversight instruments:

First, platforms are required to establish an internal compliance department in charge of risk assessment. The platform itself carries out the risk assessment, outlining the potential risks it poses for society, including:

  • dissemination of illegal content;
  • negative effects on fundamental rights, including freedom of expression;
  • negative effects on civic discourse or electoral processes;
  • negative effects related to the protection of minors, gender-based violence, and mental or physical well-being.

Second, compliance with the platform’s own risk mitigation procedures is ensured via annual external, platform-independent audits. Audits will likely be carried out by consulting companies that receive access to data and algorithmic systems. Their objective is limited to checking whether tech companies have actually put the announced risk assessment procedures in place.

The final – and potentially most powerful – oversight instrument is access to internal data for researchers via a vetting process; researchers who qualify for access to such information – in particular the most sensitive personal data – are granted it in accordance with the GDPR.

If these oversight instruments uncover non-compliance, the European Commission has the right to impose fines of up to 6 percent of the company’s global annual turnover.

The DSA outlines a new system of governance to create oversight over social media platforms and to ensure stronger enforcement of regulations. Each of the 27 member states of the European Union will designate one independent body as a Digital Services Coordinator (DSC), empowered to represent that country’s interests. Currently, one group of countries – including France and Germany – is in the process of designating national media and telecommunications regulatory authorities as DSCs, while another group – such as Luxembourg and Denmark – is considering national consumer protection and competition authorities. (A third group of countries has not yet taken action.)

Regardless of which type of authority serves as DSC, each will send one member to the European Board for Digital Services. The Board will vote by simple majority and will have 48 hours to deliver its recommendations to the Commission. The Board will advise the Commission, which can initiate proceedings, request information from VLOPs and VLOSEs, conduct inspections, adopt interim measures, and impose fines. VLOPs and VLOSEs will have the right to be heard and to access the files.

The EU’s process-based approach is a new and innovative method that will strengthen the capacity of the European public administration to hold platforms accountable to their own risk assessments and internal procedures. In doing so, Europe differentiates itself from prior approaches to social media governance focused on content moderation. Europe’s approach strengthens the participation of all players, including companies, society, and the state. Yet its success depends strongly on the ability of European member states to build agile institutions that support collaboration with societal actors in overseeing big tech. To make sure that social media promotes democracy during the next European elections in 2024, member states have to act now and equip their DSCs with sufficient resources.
