Twenty years after the adoption of the E-Commerce Directive, a cornerstone of Internet freedom in Europe, the EU institutions are set to review whether it is still fit for purpose and adopt a new set of rules governing online platforms as part of a new Digital Services Act (DSA). As the EU is poised to launch its consultation on the DSA, ARTICLE 19 proposes ten key recommendations for the regulation of digital services, especially social media platforms.
ARTICLE 19 has previously set out our position on intermediary liability and the human rights responsibilities of platforms in our Internet intermediaries: Dilemma of liability and Sidestepping Rights: Regulating speech by contract policy briefs, respectively. We have further developed a proposal for content moderation oversight by Social Media Councils and launched our Missing Voices campaign to urge big social media platforms to be more transparent and have proper appeals mechanisms in place. We believe that the principles we have championed remain true today.
At the same time, we recognise that Big Tech companies have all too often proved themselves unwilling or too slow to address challenges to the protection of freedom of expression and other rights over the last 10 years. In countries such as France and Germany, this has already led to rights-infringing new models of regulation that delegate censorship powers to private companies, require the deployment of filters, and mandate the removal of content within unduly short time frames. We firmly oppose these models. By contrast, the DSA is an opportunity to get regulation right.
For the DSA to be successful, however, it must be built on the successes of the E-Commerce Directive. Conditional immunity from liability and the prohibition of general monitoring are foundational principles of Internet freedom and innovation. Abandoning them would be a profound mistake. We therefore recommend that they should be maintained in any future regulation of digital services. At the same time, notice and action procedures dealing with illegal content should be clarified along the lines that we proposed in our Dilemma of liability briefing. Provision must be made for access to effective remedies, whether for victims of hate speech or for those whose content has been wrongfully removed.
Beyond illegal content, we believe that concerns over the content moderation practices of dominant social media platforms should be overseen by an independent multistakeholder body such as a Social Media Council. If, however, more traditional regulators – such as broadcasting or telecoms regulators – are brought to bear in this area, it is crucial that their remit be limited to overseeing the implementation by ‘dominant’ platforms of transparency and due process obligations. Regulators should not be involved in decision-making about content. Transparency, accountability, and the protection of human rights should be the guiding principles of the regulatory framework. In practice, this means that the independence of any regulator in this area must be guaranteed and that the proportionality principle must be upheld throughout, from the scope of the Digital Services Act to the sanctions applied when digital services providers fail to comply with their obligations under the Act.
Concerns over the power and dominance of certain social media platforms are unlikely to be resolved by focusing on online content regulation alone. For this reason, ARTICLE 19 also recommends that large platforms be required to unbundle their hosting and content moderation functions and to ensure that they are interoperable with other services. Data collection in the provision of digital services and digital advertising must also be more strictly limited.
The DSA is a once-in-a-generation opportunity to get regulation right in this complex area. To succeed, it must put the protection of human rights and user choice at its heart. We hope that our recommendations will help lawmakers to make the right decisions as they embark on this new reform, which could become a blueprint for the regulation of digital services worldwide.