In a recent statement to European Commissioner Thierry Breton, ARTICLE 19 and Access Now welcome the Commissioner’s reassurance that enforcement of the Digital Services Act (DSA) framework will fully respect fundamental rights. That said, we wish to reiterate that measures taken by platforms must not only be ‘effective’ in dealing with illegal and harmful content, but must also be necessary and proportionate, strictly adhering to international freedom of expression standards. A precise, rights-based interpretation of the DSA is key, especially in times of conflict, to ensuring that lawful content is not removed. The statement follows.
Statement: Response to Commissioner Breton on civil society calls for a precise interpretation of the Digital Services Act (DSA) during times of conflict.
On 28 October 2023, ARTICLE 19, Access Now and 28 partner organisations wrote to European Commissioner Thierry Breton in response to his communications to X, Meta, TikTok and YouTube regarding the spread of disinformation and illegal content on their respective platforms following the escalation of hostilities in Gaza.
We highlighted that Commissioner Breton conflated the DSA’s treatment of illegal content and disinformation, although the latter is not automatically illegal; that the DSA imposes due diligence obligations on very large online platforms (VLOPs) that tackle systemic risks stemming from such content rather than prioritising swift content removals; and that the DSA does not include any mention of a 24-hour timeframe for VLOPs to respond to notifications of alleged illegal content, despite Commissioner Breton’s demands.
We welcome the Commissioner’s response confirming that the DSA:
- sets out carefully balanced due diligence obligations for platforms to cater for the specific nature of the risks and content at stake;
- distinguishes between illegal and harmful content; and
- ensures, at the same time, full respect for fundamental rights, such as freedom of expression and information.
We remind the Commissioner that measures taken by the platforms must not only be ‘effective’ in dealing with illegal and harmful content, but must also be necessary and proportionate, strictly adhering to international freedom of expression standards. The significance of upholding freedom of expression during armed conflicts cannot be overstated. Yet the over-removal of content protected under freedom of expression standards on certain online platforms, particularly during the current conflict in Israel and Palestine, remains a serious concern.
In this context, we also wish to highlight the distinction between ‘normal administrative practice’ and demanding a response from platforms within 24 hours on how they tackle problematic content while referencing the DSA’s sanction regime. It is the latter that risks incentivising platforms to adopt a heavy-handed approach to content moderation.
We are pleased that the Commissioner confirms that the DSA entrusts civil society with an active role in ensuring platform accountability and supporting the Commission in the exercise of supervisory and investigatory powers. While we welcome the Commissioner’s invitation to a roundtable, we reiterate our previous calls for structured and formalised cooperation mechanisms between civil society organisations, independent experts and the Commission.
Furthermore, the potential impact of the DSA – and the decisions made by its enforcers – beyond the EU has been highlighted once again by the current Israel-Palestine conflict. We reiterate our calls to EU regulators to include non-EU civil society voices in the DSA implementation process.
We will continue to closely monitor the enforcement of the DSA to ensure that it is in line with fundamental rights and the provisions of the DSA itself. This includes the ongoing formal proceedings against X, a process we will follow closely and one that will set an important marker for how the Commission approaches its duties as a regulator. We urge the Commission to enforce the DSA consistently, equally and impartially across all platforms, based on evidence and in a way that furthers, rather than restricts, freedom of expression and fundamental rights online.