The future of the EU content moderation playbook, the Digital Services Act (DSA), was decided last Friday, April 22, when EU co-legislators reached an agreement on the final text, which is not yet public. While Access Now welcomes the human rights-centric framework, the rushed inclusion of the Crisis Response Mechanism, together with the exclusion of safeguards for encryption and against legally-mandated automated decision making, leaves gaps that must be addressed.
Access Now has provided a series of recommendations to EU co-legislators and closely monitored the process of finalising the law since 2020. Without a doubt, the final deal on the DSA is a step in the right direction.
“After a lot of back and forth, decision-makers have agreed on a DSA that puts people first. The human rights-centric framework will provide a clearly defined set of due diligence responsibilities for companies — placing the responsibility to create safe spaces to communicate on the shoulders of those profiting, not on everyday people simply using these platforms,” said Eliška Pírková, Europe Policy Analyst and Global Freedom of Expression Lead at Access Now. “The final text of the DSA could be more ambitious. Many progressive measures, such as clear safeguards for end-to-end encryption in communications, were either ignored or weakened during negotiations.”
Access Now’s ongoing analysis of the outcome is preliminary, and based on observations of the negotiations. It may evolve once the final text is made public.
The good:
- A last-minute proposal that would have required search engines to de-list illegal content and websites from search results, putting freedom of expression at risk, did not make it into the final text.
- An obligation of mandatory cellphone registration for those who disseminate pornographic content on platforms hosting such content — a measure that would have inadequately addressed image-based sexual abuse and non-consensual nudity while reducing anonymity online — was scrapped.
- A ban on the use of sensitive personal data for the presentation of ads online was a positive addition where the DSA complements the General Data Protection Regulation (GDPR). The GDPR has strict rules concerning the processing of sensitive data and, although it is unclear how sensitive data could have ever been legally used for ads, the ban agreed under the DSA would put an end to this legal debate and this invasive practice.
- A measure against the use of so-called dark patterns is a very positive outcome, but the open-ended list of platforms’ deceptive design and other practices originally proposed by the European Parliament has been significantly shortened.
The missed opportunities:
- An agreement on safeguards that would have prevented legally-mandated automated decision making and strengthened the protection of encrypted online communication did not make it into the final text. Negotiators missed an opportunity to take a strong stance in favour of privacy and the confidentiality of communications.
The last-minute addition:
- The addition of a Crisis Response Mechanism (CRM) to the DSA proposal in response to the war in Ukraine, which will allow the Commission to require Very Large Online Platforms and Very Large Search Engines to take rapid action when a crisis occurs, is incredibly alarming. Based on the confirmed outcome of Friday’s negotiations, the Commission will be able to act only upon recommendation of the Board representing national regulators. A sunset clause of three months is included in the CRM measure.
Yet the hardest part for the DSA likely lies ahead: its successful enforcement. The coming years will test the strength of the EU content moderation rulebook and whether it can live up to the task. Access Now will continue to monitor the DSA enforcement mechanism, and will push for a human rights-centric approach to content moderation across Europe.