Dear Commissioner Breton,

We, the undersigned civil society organisations, are writing to request clarification concerning your recent comments suggesting that arbitrary blocking of online platforms could be an enforceable and justified measure under the Digital Services Act (DSA). These comments were made in response to remarks by French President Emmanuel Macron, who raised the possibility of blocking access to social media platforms in relation to the ongoing civil unrest in the country. They risk reinforcing the weaponisation of internet shutdowns, including the arbitrary blocking of online platforms, by governments around the world.

Under international human rights law, internet shutdowns, including the arbitrary blocking of online platforms in disregard of procedural safeguards, violate human rights. Their impact is especially severe in contexts where people are most at risk of violence. In 2022, Access Now recorded 62 shutdowns during protests across the globe. The data show a spike in their use to shroud violence and serious human rights abuses, such as brutal crackdowns on protesters. Research has also shown that network disruptions exacerbate the spread of misinformation, as people are cut off from alternative channels of verification.

Arbitrary blocking of online platforms and other forms of internet shutdowns are never a proportionate measure and have disastrous consequences for people’s safety. The European Union fully recognises that internet shutdowns severely hinder the enjoyment of economic, social, and cultural rights, as well as civil and political rights. By no means should the arbitrary blocking of Instagram, TikTok, or other social media platforms be viewed as a solution to any event or perceived crisis in a Member State or across the EU.

We therefore ask you to confirm that the DSA does not, in fact, provide for the possibility of shutting down online platforms as a sanction for failing to remove “hateful content,” as your comments implied. While the DSA allows for some temporary restrictions on access to services, these are last-resort measures that may only be considered after repeated non-cooperation and non-compliance with the Regulation. In addition, they must be backed by significant procedural safeguards to comply with international human rights standards. According to the jurisprudence of the European Court of Human Rights (ECtHR), such safeguards include providing advance notification of the blocking measures to affected parties and conducting an impact assessment of the measures in order to avoid arbitrary or excessive effects. Moreover, a blocking order has to be issued by an independent and impartial judicial body.

We also urge the European Commission to ensure that national implementation and enforcement of the DSA by Member States does not lead to an overly broad interpretation of DSA measures, which would undermine the regulatory objectives of the law and violate the EU Charter of Fundamental Rights. In particular, the draft law to secure and regulate the digital space proposed in France (Projet de loi “Sécuriser et réguler l’espace numérique”) carries such risks. For instance, the proposal establishes a 24-hour deadline for content removal. It would also require browser-based website blocking, an unprecedented government censorship tool. These and several other proposed measures go far beyond the DSA’s requirements and contradict its goals by creating risks of censorship of legal content.
We look forward to hearing back from you and we remain at your disposal for any questions you may have.

Sincerely,

Signatories: