Oireachtas Joint and Select Committees

Wednesday, 12 May 2021

Joint Oireachtas Committee on Jobs, Enterprise and Innovation

Online Content Moderation: Discussion

Ms Cori Crider:

I thank Deputy Stanton for the question. A class action, in which I was not involved, was brought on behalf of content moderators from several American states where there are content moderation facilities. That was settled out of court, with Facebook paying $52 million to settle for the whole class of thousands of workers, although all of the people in the class action were outsourced in exactly the way we have been discussing. It amounted to $5,000 to $10,000 maximum, depending on the assessed level of PTSD, per person who opted in to claim the settlement. I hope it goes without saying, but serious clinical PTSD is difficult to treat. It is not clear, particularly in the expensive American healthcare system, how far $5,000 to $10,000 will go. My honest opinion is that it was a bargain basement settlement to avoid dealing with a systemic issue. As I said, an ounce of prevention is worth a pound of cure. We do not want to burden the courts system with these matters to sweep up as best it can after the fact. We want, as best we can, to stop people getting sick in the first instance.

To my knowledge, there are no direct employees at Facebook or Google doing this work, but there are people who are exposed to the content because the quality assurance, that is, checking compliance with Facebook's rules and making sure that the outsourced moderators are getting it right, is done by staff. There are staff who, it has been estimated to me, are exposed to ten to 12 hours per week of content. In other words, only a fraction of their work is content review. The people who are doing it in volume, in the way Ms Plunkett has described, are all outsourced.

There is ongoing research. New York University published a report last year on who watches the social media giants, which we should have submitted as evidence but can send to the committee. It calls explicitly for an end to outsourcing, states that this practice is core to the business of the tech giants, not just Facebook but others, and should be brought in-house, and it is based on interviews with moderators internationally. It also states that this is a growing problem that requires a regulatory response. I would commend the NYU report to the committee. An academic at University of California, Los Angeles, UCLA, Sarah Roberts, has written a book about this entitled Behind the Screen, which, again, is based on research with dozens of content moderators.

As rightly pointed out, this is a global issue and a new form of work. We are pleased that the committee has taken the initiative to investigate the work because it is not going away.
