Over the last two years, the European Union has been working towards establishing sensible and sustainable solutions for regulating online platforms and digital services through the Digital Services Act (DSA). This cornerstone initiative has the power to shape how online platforms safeguard our fundamental rights, such as privacy and freedom of expression. This huge joint effort is no mean feat, but it is striking that EU Member States establishing their own national solutions could prove to be the biggest obstacle to the DSA’s success.
Countries looking for national solutions to EU issues should be seen as positive, proactive initiatives, right? Not really in this case. It’s actually unusual for European states to regulate an issue at the same time that it is being discussed at EU level. It can lead to contradicting approaches across the region, heightened tensions between countries, and increased difficulties in finding common ground in Brussels. And that’s exactly what we’re seeing today.
So, why are EU countries seemingly working against the EU? What decisions are they taking? Does it fit with the EU approach to regulating online platforms? What are the consequences of these state-run initiatives?
Where the EU’s at
Let’s start with a recap of the EU’s version of the story. Despite the various shortcomings of the draft DSA, it represents a novel approach to platform governance. It focuses on the regulation of processes deployed by online platforms, rather than on rules limited to specific categories of illegal content or illegal conduct. For years, we have been calling on legislators to advance this type of regulatory response, instead of short-sighted legislation that relies on swift content removals under unduly short time frames. Even though the EU has taken on an extremely challenging task to harmonise the fragmented regulatory landscape in content governance, this step towards a new generation of platform regulation is a significant move for the protection of human rights in the digital sphere.
However, some EU member states have decided not to follow the EU’s approach, instead choosing to undermine the harmonisation efforts by legislating independently. Since September 2020, several national draft laws have been introduced across the EU in anticipation of the DSA proposal, with even more announced since the proposal’s launch. The most recent ones are the French proposal on “reinforcing republican principles”, and the Polish draft law on the protection of freedom of speech on online social networking sites. Similarly, Hungary had planned to introduce a draft law to combat “systemic abuse” of power by platforms, concerned about Facebook possibly limiting the visibility of Christian, conservative, right-wing opinions. However, the Hungarian government eventually backtracked on this idea.
France to the EU: “It’s my way or the highway”
In response to a series of deadly terror attacks in recent years, France proposed a bill seeking to “do something” about the spread of illegal or potentially harmful content online. It does not provide a comprehensive approach to regulating online content; instead, it focuses on hate speech. Alongside proposals such as the 2020 Austrian Anti-Hate Speech law and the German NetzDG, it places new obligations on all platforms, whether established in France or abroad, which list, rank, or share content uploaded by third parties — ultimately making platforms responsible for their users’ actions. As a nation marred by vicious attacks, France’s concerns over the spread of illegal and potentially harmful content are certainly understandable, but rushing into well-intended but short-sighted law reform in the aftermath of tragedy leads to significant human rights violations that especially impact vulnerable groups.
In some sections, the French proposal tries to follow the EU DSA’s approach by including transparency provisions, such as mandatory yearly reporting on the mechanisms platforms implement to fight against harmful content, or precise information in the terms of use about the company’s content moderation process. In the same vein, it also requires very large online platforms to implement yearly external risk-auditing mechanisms in relation to harmful content and the effective protection of users’ fundamental rights. These measures suggest that France has moved away from old regulatory rhetoric represented by the now defunct Avia Law — a previous attempt to regulate hate speech that was struck down by the French Constitutional Council in May 2020. The new bill could extend the definition of illegal content to include a new criminal offense of the “revelation, dissemination or transmission of information relating to the private, family or professional life of a person that enables their identification or location and thus exposes them to a direct risk of harm to person or property.” This amendment was proposed following the assassination of the school teacher Samuel Paty, whose name and place of work were made public on social media.
France has, however, made clear that it wants its vision, whether or not it aligns with the EU’s, to be followed in the DSA. For instance, it has announced that it will try to modify the DSA proposal to make sure it covers “potentially harmful” content in its scope — a vague concept that can lead to legal content being removed online.
France is also suggesting a new enforcement model. The new French proposal may also require platforms to appoint a single point of contact to “streamline” communication with the French broadcasting authority (CSA), which could be tasked with overseeing some of these rules. In practice, this could change the country of establishment or origin principle enshrined in the eCommerce Directive, and it could lead to fragmentation of the Digital Single Market, instead of its harmonisation. According to this principle, where an action or service is performed in one country but received in another, the applicable law is that of the country where the action or service is performed. The European Union has expressed interest in following a similar approach in the DSA to the one suggested by the French in order to better empower people all over the EU. In fact, this could allow people to file complaints in their own country without being dependent on where an online platform is established. But this approach also has pitfalls, as it could lead to different enforcement of rules across the EU and transform it into 27 small islands equipped with their own enforcement measures.
The Polish way: a hidden agenda dressed up to protect “traditional values”
Conservative governments in Hungary and Poland announced their intent to regulate online platforms, citing the need to prevent arbitrary “deplatforming”, like that of Trump in the U.S., as their main justification. Poland has unveiled its regulatory proposal, while Hungary initially sat on its draft, then completely withdrew its announced plan to regulate. Hungarian authorities did, however, set up the Hungarian Digital Freedom Committee, which is tasked with monitoring the online space for “violations of democratic principles”. The mandate of this committee strongly resembles the Polish Freedom of Speech Council, a new public institution founded by the Polish legislative proposal with the potential to curb free expression.
Ultimately, the Polish and French proposals are not significantly different in their content, even if the underlying reasoning and government agendas are. Both proposals establish new obligations for online platforms in the field of content moderation. The Polish version also obliges all service providers to submit yearly transparency reports on their methods for countering disinformation and the dissemination of illegal content. Notably, providers that receive over 100 complaints in a calendar year about the distribution of and access to illegal content would have to publish a report every six months about how exactly they handle these complaints. The reporting templates are to be prepared directly by the Ministry of Justice, which, coupled with mandatory legal representatives as government go-betweens, shines a spotlight on the government’s underlying goal to centralise power and exercise control over online platforms.
The so-called Freedom of Speech Council is the final piece of the puzzle to centralise state power over online platforms in Poland. The Council is directly appointed by the parliament’s lower house, and would act as an appeal body for individuals who are not satisfied with the outcome of online platforms’ internal complaint procedures. Behind closed doors, five members of the Council would deliberate, and could order the restoration of restricted content or access to users’ profiles, giving platforms a mere 24 hours to comply. This enforcement mechanism is completely incompatible with the harmonised oversight structure discussed at the EU level, which would also guarantee the application of the EU Charter of Fundamental Rights — which is often ignored by the current Polish government.
How can the EU’s DSA prevail?
As the EU process enters its next phase in both the European Parliament and Council, national proposals continue to create overbroad and disproportionate obligations for online platforms. They undermine the EU’s ability to reach a harmonised standard for platform governance.
The EU motto, “united in diversity”, should be the foundation of all processes, but Member States are suddenly prioritising contextual and cultural differences in order to push their own hidden — or downright obvious — populist agendas. When announcing their legislative proposals, Poland and France cited the need to seriously “consider national specificities” and to take “onboard all the cultural differences” between nations and the EU when tackling platforms’ content governance models. Different words, different goals, but the same outcome.
We call on EU Member States to actively participate in the negotiation process of the DSA rather than creating further division in Europe.