Oireachtas Joint and Select Committees

Wednesday, 7 July 2021

Joint Committee on Tourism, Culture, Arts, Sport And Media

General Scheme of the Online Safety and Media Regulation Bill 2020: Discussion (Resumed)

Dr. Pierre François Docquir:

That is a very broad question that I did not address in my written submission, so I would be very happy to follow up and share some documents by my colleagues in the law and policy team at Article 19, particularly with regard to developments relating to the Digital Services Act. There are three key issues relating to content moderation by platforms from a free speech perspective and a human rights standards perspective. The first is transparency. There has been progress. Transparency reports have been developed and we know more about what is taking place, but there is a general consensus that they could be improved, for example, by disaggregating data per country so that we know exactly how many content moderation problems exist on each platform in Ireland. The second is due process. This involves making sure people are informed of the reasons their content has been moderated, giving them a chance to have a fair conversation with the platform and providing a remedy to contest the takedown decision. The final issue is making sure that freedom of expression and other fundamental rights are part of the debate on whether a specific piece of content should be allowed to circulate on the platform.

In terms of the question about disinformation, of course, I acknowledge that some seriously problematic content is circulated online. There is no doubt about that. However, the key thing here is to remember that content moderation is not binary, or at least is no longer binary. There are many options and ways of dealing with content. Content can be demonetised, which can be a very strong incentive for people to stop sharing stupid or harmful content. It can be downranked or flagged. There are ways to display a message that points to authoritative sources of information on, for example, health and the pandemic. Something like a social media council would be an appropriate forum for the discussion and fine-tuning of those non-binary approaches to content moderation. In such a forum, all stakeholders could share their views on what is needed, platforms could explain what is possible technically and some sort of societal agreement could be reached on the best approach to content moderation for each category of problematic content.
