EU: The draft Digital Services Act and the Digital Markets Act must protect freedom of expression

Today the European Commission published the draft Digital Services Act (DSA) and the Digital Markets Act (DMA), two pieces of legislation with major implications for freedom of expression online.

The DSA places new transparency and due diligence obligations on very large online platforms. The DMA imposes dos and don’ts on gatekeepers that provide ‘core platform services’. Both Acts have the potential to be a blueprint for regulation and competition across online platforms.

As the EU legislative process gets underway, ARTICLE 19 calls on the European institutions to ensure the highest levels of protection for freedom of expression and other human rights.

Commenting on the digital rights package, Barbora Bukovska, Senior Director for Law and Policy for ARTICLE 19, said: 

“The Digital Services Act and the Digital Markets Act are a once-in-a-generation opportunity to get regulation right in this complex area. To be a success, the protection of human rights, including freedom of speech, and user choice must be at their heart. It is crucial that the European Commission ensures that both pieces of legislation are people-centred, placing the accountability and transparency of platforms at the centre of EU digital markets and services.”

Digital Services Act: few surprises, some good news and several concerns

ARTICLE 19’s preliminary analysis indicates that the DSA is looking to maintain the cornerstones of the E-Commerce Directive, including conditional immunity from liability and the prohibition on general monitoring. This is positive and corresponds to our recommendations in the consultations organised by the European Commission.

In addition, ARTICLE 19 is glad to see that the DSA includes several transparency measures and requirements for internal complaints mechanisms in line with our recommendations in the Missing Voices campaign. Whilst external redress mechanisms such as out-of-court settlements are a positive development, we hope that EU legislators will bring those provisions more closely in line with our proposal about Social Media Councils. 

Notwithstanding the above, we are disappointed that the Commission is effectively proposing a notice and takedown mechanism for allegedly illegal content, which outsources the determination of what content is ‘illegal’ to private companies. Companies then appear to be tasked with reviewing their own decisions on illegality and resolving disputes through their internal complaints mechanisms. This is highly problematic from a rule of law perspective: it is nothing less than governments delegating censorship powers to private companies. The design of the notice and action mechanism strongly suggests that the overwhelming incentive for companies will be to take down content lest they lose immunity from liability.

There are several other provisions in the DSA, particularly on the assessment of systemic risks and mitigation measures, that require more careful examination. Our very preliminary assessment is that they lack clear definitions or criteria as to what constitutes sufficient risk and what kinds of obligations companies would be required to put in place to mitigate those risks. It also remains to be seen how the intermediary liability provisions will interact with the obligations relating to systemic risks, and to what extent any limited Good Samaritan rule will encourage increased reliance on upload filters and other technologies that restrict access to content. The role of the Commission as enforcer of the regulation against very large online platforms also requires further scrutiny.

Digital Markets Act: keeping gatekeepers under control but some doubts about how

With regard to the DMA, ARTICLE 19 is pleased to see the Commission stressing that platforms in a gatekeeping position have the power to act as private rule-makers and to function as bottlenecks between businesses and consumers, as well as the power to lock in users and leave them with limited options to switch to alternative services.

However, we warn that in communications markets, and especially in social media markets, a few large platforms act not only as “economic” gatekeepers but also as “human rights” gatekeepers: they shape how people exercise their rights in the digital ecosystem, in particular the right to freedom of expression and information and the right to privacy. Furthermore, at a community level, social media gatekeepers can exert decisive influence on public debate, which raises issues of diversity and pluralism in the online environment. It is of utmost importance that media freedom, pluralism and diversity are guaranteed online as they are offline, and we hope that EU legislators will take this into account when discussing the Act.

We also welcome that the DMA aims to tackle high market concentration and to lower barriers to entry. The Commission states that the Act will impose on gatekeepers an extra responsibility to conduct themselves in a way that ensures an open online environment that is fair for businesses and consumers, and open to innovation by all, by complying with specific obligations laid down in the draft legislation. We encourage the EU legislators to consider that fairness for consumers will not be achieved unless their fundamental rights are duly protected in the market.

Finally, we are pleased that the proposed Act will require gatekeepers that meet the three cumulative criteria listed in the DMA, potentially including Facebook, to make their services interoperable with competitors in certain specific cases. However, we are convinced that, in the case of social media gatekeepers, interoperability obligations should be accompanied by a requirement to unbundle hosting and content moderation activities. Requiring gatekeepers to unbundle hosting from content moderation, and to allow third parties to access their platforms through interoperability requirements, would reduce the imbalance of power in social media markets and have a positive impact for business users, the wider industry, and individuals. Users would be able to choose which type of content moderation and which content rules they would like to be subject to, and which company hosts and accesses their data. The final Act, which will likely take several years to come into force, should ensure this process is as streamlined as possible.

Background

ARTICLE 19 will publish a more detailed analysis of the EU’s DSA and DMA proposals in due course and will continue to engage constructively in this debate.

ARTICLE 19 has worked closely with the European Commission on the drafting of both the DSA and the DMA. In September 2020, we responded to the EU consultation on the future of the DSA and the DMA, calling for greater transparency and accountability, for the protection of consumers’ fundamental rights, and for pro-competitive measures that open markets to new players, providing better-quality services and more choice for consumers.

—————————————————————————————————————

For more information or to request an interview, please contact Nicola Kelly: [email protected]