Blog: I am an Irish citizen, so why is the Capitol Hill siege of such concern to me?

Blog by Patrick Hever, Keep it Real campaign Ambassador, age 23, Dublin

The last ten months have not been normal, and the recent events in Washington, D.C. have made this frustrating period in our lives even harder. Many Irish people seemed surprised, perhaps annoyed, that so many of their fellow countrymen and women were so heavily invested in what was happening in the US. Some may say the Irish reactions on social media to the surreal scenes at Capitol Hill were unwarranted: why can’t people concentrate on the problems of their own government or their own country?

However, if we look beneath the surface of those recent events, a decisive factor in lives being lost and in a mob preventing elected representatives from doing their job was the spread of conspiracy theories on Facebook, Twitter, YouTube and other platforms. It would be ignorant to say “this will never happen in Ireland”, because we humans are a fallible species, regardless of nationality, ethnicity, or race. Disinformation and propaganda can have serious consequences, but is it up to social media companies to unilaterally decide what can and cannot be said in online public debate?

Social media platforms enable the dissemination of content, including disinformation, on a massive scale, thereby amplifying the impact of potentially harmful messages. Content moderation – that is, deciding which messages will be made visible in the feed of individual users – is in the hands of social media companies. Many voices have pointed to a number of concerns with the current state of content moderation. The use of AI, which is a necessary part of content moderation because of the sheer scale involved, is problematic, for instance because of biases built into the algorithms. Content moderation is based on Terms of Service and Community Standards that companies can change at any moment.

The reasons for content moderation decisions are not transparent enough, and there is little opportunity for individuals to appeal them – even though these decisions may have a serious impact on their lives or businesses.

The boundaries of free speech on social media apparently differ for politicians and ordinary citizens, since posts by political personalities are considered newsworthy or in the public interest. Facebook, for example, has allowed some politicians to be exempt from its ordinary fact-checking and from its hate speech rules, but isn’t rhetoric from politicians and leaders more likely to trigger violence than messages from an ordinary user? Furthermore, are the decisions of platforms consistent from country to country?

There may be grounds for optimism. For instance, as of October 2020, Facebook, Twitter, Pinterest, and TikTok have modified their practices to deal with election-related content that delegitimizes election results on the basis of false claims.

But are we satisfied – as democrats – with a situation where private companies based in the US have the power to make decisions on such important matters on their own? Even well-intentioned initiatives by social media platforms to curtail disinformation can have an adverse impact on political expression and elections. Ideally, platforms should ensure that any restrictions on content are compatible with fundamental rights, which at a minimum would mean providing a detailed explanation for every decision and giving users the opportunity to discuss those decisions.

In Ireland, the new Online Safety and Media Regulation (OSMR) Bill, which is currently being examined by the Oireachtas, could mark a decisive moment in the regulation of social media platforms. The bill will create a new regulator, a multi-person Media Commission, to replace the Broadcasting Authority of Ireland. Most importantly, in my view, it will create a regulatory framework for online safety to tackle the spread and amplification of certain defined categories of harmful online content. As such, the bill could be a step in the right direction, but will it be enough to deal with disinformation while also protecting freedom of expression?

A monumental change in Ireland would be the introduction of a Social Media Council (SMC), something ARTICLE 19 staunchly advocates for. The SMC would have two main objectives: providing individual users with a complaints mechanism when their content has been taken down, and elaborating general guidelines for social media platforms. Its work would be based on international human rights standards. This new mechanism would be created and operated by those in society with a key interest in content moderation, such as social media platforms, the media, civil society organisations, and academics.

The SMC would sit at an intermediate level between pure self-regulation and state regulation: a place for flexible, open, and transparent conversations among all these stakeholders. The idea is that, if the tech industry and society’s other stakeholders are involved in developing the rules, those rules will percolate more easily into everyday practice. The SMC’s efficacy would rely on voluntary compliance by platforms, while the respective roles of the public regulator and the SMC can be organised to be complementary.

We are in the midst of a content moderation, disinformation, and freedom of speech crisis. Decisions that our society needs to make today can pave the way for a better future.