Oireachtas Joint and Select Committees

Wednesday, 12 May 2021

Joint Oireachtas Committee on Jobs, Enterprise and Innovation

Online Content Moderation: Discussion

Ms Cori Crider:

Thank you very much for being the first Parliament in the world to address the vital issue of social media content moderators, on Facebook and beyond. I am Cori Crider, the co-founder of Foxglove, a non-profit organisation working to ensure justice in technology. For two years now, we have assisted moderators in seeking a fairer workplace. As Ms Ní Bhrógáin said, social media content moderation is one of the most vital and under-regulated jobs in the modern economy. A third of the planet uses Facebook now, including millions in Ireland. These days, it is our public square. As Facebook's Oversight Board said in a recent decision about the ban on former President Donald Trump, Facebook has become a virtually indispensable medium for political discourse, especially in election periods.

Content moderation, a new form of work, is here to stay, so it is crucial that this committee understands what it really takes to keep Facebook running and what it costs people like Ms Plunkett who do it.

The work Ms Plunkett and thousands of her colleagues do is for us. It guards our public square. When a terrorist tries to livestream a beheading, a content moderator steps in. When an anti-vax group spreads misinformation about the Covid vaccine, a moderator deals with it. When a depressed teen tries to hurt herself and broadcast it to the world, a moderator is there. I can put it no better than an engineer at Facebook did: "Content moderators are the people literally holding this platform. They are the ones keeping the platform safe."

Facebook depends on these moderators to keep the company attractive to users and therefore profitable, but Ms Plunkett and other critical workers like her feel treated like second-class citizens. Light-touch regulation of social media has failed. Foxglove therefore has three recommendations.

First, end the culture of fear and excessive secrecy at Facebook and its outsourcing companies. No NDA should stop moderators from raising legitimate concerns about their working conditions. The firms should be made to clarify this, and the full extent of moderators' rights, in public.

Second, regulate content moderation, just as other hazardous businesses in Ireland are regulated. One could draw, for instance, on examples from policing and, indeed, on recommendations which Facebook itself has endorsed and which we have submitted as evidence to the committee. A regulatory floor for moderation should allow opt-outs from toxic content for people who cannot sustain it, it should allow independent experts to set safe limits on exposure to toxic content, and it should offer real psychiatric support: medical, clinical care, not mere wellness coaching.

Third, social media firms of a certain size should be legally required to bring content moderation in-house, because moderation is core to their business, not adjunct to it.

We were pleased to meet the Tánaiste in January to discuss these issues. However, we only received a response to the issues raised at that meeting at 7 p.m. last night, which, I am afraid, raises more questions than it answers. Because it came in at 7 p.m., we could not submit it in evidence, but I am happy to discuss the response we received.

I will now hand over to Ms Plunkett for the remainder of the time.
