Recommender systems have a significant impact on how people access and share information online and what information they get to see. As Members of the European Parliament consider amendments to the Digital Services Act (DSA), ARTICLE 19 calls on them to ensure transparency, exposure diversity, more user choice in recommender systems, and unbundling of hosting and content curation.
The EU’s draft Digital Services Act primarily addresses so-called ‘recommender systems’ – fully or partially automated systems used by platforms to determine which content to promote or demote to different users – in its Article 29. The provision requires very large online platforms (VLOPs) to provide information on the ‘main parameters’ used in their recommender systems, to offer easily accessible means to modify those parameters where options to do so are available, and to provide at least one option not based on profiling.
Although this is a welcome step forward, ARTICLE 19 is concerned that the provision does not go far enough in addressing the impact of recommender systems on media and information diversity, or in requiring transparency from platforms, ultimately leaving too much power over online expression in the hands of large platforms.
The problem with recommender systems
Less exposure diversity
VLOPs’ business model is based on massive data gathering – often in violation of data protection rules – which is used to profile users and offer them personalised content. This can have a damaging effect on users’ exposure to diverse content and access to information. Recommender systems select content based on parameters whose main purpose is the maximisation of user engagement, and thus the maximisation of profit. Companies therefore have no incentive to expose users to all content, or to a diversity of content; their incentive is to show the content that engages users the most. This means, among other things, that smaller outlets and minority voices tend to bear the brunt of downgrading. Efforts to promote ‘authoritative’ voices, while often intended to improve the accuracy of information available online, can drown out minority voices, having a discriminatory impact and increasing the dominance of a limited set of information, ideas and media actors.
Lack of transparency and access to data
Recommender systems shape what people see online, with an alarming lack of transparency as to how and why. Concerns have repeatedly been raised about these systems’ tendency to promote clickbait, sensationalist, false or ‘extremist’ content, often pushing users down ‘rabbit holes’ of this type of content without their knowledge or consent. These systems therefore have the power to shape ideas and discourse, although little data exists to confirm the extent of this influence. Platforms provide very little information on the automated systems they use or how they work in practice. As a result, it is impossible to properly examine the influence these recommender systems have over users’ access to different types of content, their potential to promote certain types of problematic content to certain users, or their capacity to hide entirely legitimate content or conversations.
Recommendations on regulating recommender systems
ARTICLE 19 is concerned that draft Article 29 of the proposed DSA does not go far enough in tackling these problems and sets an unacceptably low bar for platforms on transparency and information diversity. Strengthening transparency obligations on all platforms and giving users greater choice over how they access content online is a crucial first step towards addressing concerns about the impact of recommender systems.
We urge the European Parliament to protect users’ exposure to diverse content and their right to free expression and access to information online by ensuring the following recommendations are implemented in amendments to the DSA:
Unbundling of hosting and content curation
Unbundling content curation from hosting is one of the best options to give users back control and choice over what they get to see online, and therefore to protect their rights. VLOPs should be required to unbundle hosting from content curation and to allow third parties to provide alternative recommender systems on their platforms. This measure would open the market for content curation to a number of alternative players, which could adopt different business models for their recommender systems with regard to, for example, the criteria used for the promotion or demotion of content, or the data collected from users. In turn, users would have the choice to pick the recommender system they prefer among those available. This diversity of players and options on the market would create the conditions to improve users’ exposure to diverse content and provide them with greater choice over what they see and share online.
Transparency and user choice

Transparency requirements should be at the core of the EU’s approach to platform regulation, and in particular to recommender systems.
- How recommender systems work: All platforms, not just VLOPs, should be required to provide clear and accessible information on how their recommender systems are used to present, rank, promote or demote content. This should include a requirement to undergo transparency audits of their automated systems and to provide relevant data to independent researchers, so that how these systems operate in practice can be understood.
- How recommender systems are used: Platforms should clearly mark content that has been promoted and provide clear information on how and on what basis users are targeted with such content. Where these systems are used to demonetise or downgrade content, platforms should make clear the circumstances under which this type of content moderation takes place and offer users the option to appeal. Transparency reports should include information on downgrading decisions and the outcomes of appeals.
- More choice for users: All platforms should provide options to users to modify or influence the parameters under which content is shown to them, with recommender systems which are not based on profiling set as the default option.