Oireachtas Joint and Select Committees
Wednesday, 12 May 2021
Joint Oireachtas Committee on Jobs, Enterprise and Innovation
Online Content Moderation: Discussion
I thank members and witnesses for their attendance at this meeting in line with the exceptional circumstances and measures required due to Covid-19. I remind members who are participating remotely that they should do so from within the Leinster House complex.
Today we will be discussing issues relating to the employment of content moderators. A content moderator is an employee of an online platform responsible for reviewing user-generated content submitted to the platform. The content moderator's job includes making sure the material posted to the platform is placed in the right category, is free from scams and does not include any illegal matter.
At the outset I wish to explain the purpose of today's discussion, which is to enable the committee to inform itself about matters of policy and possible legislative changes it could raise with the Minister for Enterprise, Trade and Employment and officials of his Department, who will be before us next week. As is normal, we cannot involve ourselves in general industrial relations issues. To assist the committee in its consideration of some of the issues that arise relating to the employment of persons as content moderators, I am pleased to welcome Ms Fionnuala Ní Bhrógáin from the Communications Workers Union, CWU, and Ms Cori Crider from Foxglove, a London-based group that advocates on behalf of the users of digital technology. Ms Crider is accompanied by Ms Isabella Plunkett, a content moderator.
Before we begin I wish to explain some of the limitations on parliamentary privilege in the practice of the Houses in the context of references witnesses may make to other persons in their evidence. The evidence of witnesses who are physically present or who give evidence from within the parliamentary precincts is protected, pursuant to the Constitution and statute, by absolute privilege. However, today's witnesses are giving their evidence remotely from outside the parliamentary precincts and, as such, may not benefit from the same level of immunity from legal proceedings as witnesses who are physically present.
Witnesses are reminded of the long-standing parliamentary practice to the effect they should not criticise or make charges against any person or entity by name or in such a way as to make him, her or it identifiable or otherwise engage in speech that might be regarded as damaging to the good name of the person or entity. If their statements are potentially defamatory in relation to an identifiable person or entity, they will be directed to discontinue their remarks and it is imperative that they comply with any such direction. Opening statements and other relevant documentation have already been circulated to members. I invite Ms Ní Bhrógáin to make her opening remarks.
Ms Fionnuala Ní Bhrógáin:
I am the head of organising with the Communications Workers Union, CWU, in Ireland. The CWU represents workers in the postal, telecoms, call centre and digital sectors. The increasing footprint of global digital companies in Ireland has brought changes to the existing landscape and new challenges for workers in these areas. We are pleased to appear before the committee today along with Foxglove and, in particular, with Isabella Plunkett, who has shown great courage by speaking out. I thank committee members for giving us the opportunity to speak to them today and for giving voice to a group of workers who have been protecting us and our families, quietly and in the background, but without the same protections their directly employed colleagues and other workers enjoy.
Despite the critical service content moderators provide, they are led to believe they cannot voice concerns regarding that work and its impact on them, their colleagues and their families. It is our information that content moderators are required to sign non-disclosure agreements at the start of their employment. They are not given a copy of these agreements to retain for reference. This has a chilling effect from the start and creates an atmosphere in which workers fear retaliation if they try to vindicate their right to raise legitimate issues and concerns in their workplace. We have been advised that workers are frequently reminded of the existence of these agreements and that they are prohibited from discussing any and all aspects of their work, whether with a trade union representative or even with their own families.
Social media platforms and their outsourcing partners exploit moderators' lack of legal training to make demands for secrecy that are possibly unlawful. In advance of this hearing, two moderators received messages from one of the two outsourcing firms seeking to reassert this secrecy. This climate of fear has chilled workers' participation in legitimate democratic processes and is unacceptable. Trade unions have long experience of workers having a fear of speaking out or of talking to a trade union, let alone joining one. Coupled with the atmosphere created by non-disclosure agreements, this fear is clearly amplified. The right to join a union is enshrined in our Constitution and yet workers are led to believe this right cannot apply to them. Furthermore, the European Parliament is currently in discussions on a proposal for a directive on an adequate minimum wage in the EU. This debate has expanded to include collective bargaining rights and access to workplaces for the purposes of organising, thus recognising the need for these protections. How can such rights be achieved when workers feel silenced by their own employer?
It is worth noting that a large proportion of content moderators working in Ireland are non-nationals, working on various streams and in multiple languages. Employers rely not only on their language abilities but also on their unique understanding of their respective cultures to make judgments on what is and is not appropriate material. This valuable cultural and linguistic insight is also part of these workers' vulnerability due to their possible lack of knowledge of their rights and entitlements here in Ireland. It cannot be said strongly enough that the content these workers are exposed to can be grotesque and traumatising and includes violence, extremism, child exploitation and more. During the past year, society has come to recognise the vital role played by front-line workers. In terms of this critical service to society, surely we owe it to these workers to value their work as it should be valued.
Artificial intelligence and other technologies are nowhere near being able to review and make the judgment calls taken by thousands of moderators here in Ireland, across Europe and the rest of the world. Social media companies are fully aware of this dynamic but continue to outsource this work, meaning moderators do not enjoy the same level of support, benefits and protections as direct employees. Without the moderators' expertise, these platforms would simply cease to function. When attempting to engage with employers, the outsourcing company can point to the requirements of the client company and the client company can point to the fact these workers are not, in fact, their employees. Where does the buck stop? By directly employing content moderators, these global giants would be recognising the immense importance of their work to society while also recognising the impact of this work on their mental health, their families and their relationships.
In order for moderators to exercise their rights without fear of reprisal, we ask the committee to engage with these employers, prohibit all acts which undermine workers' rights or discourage their active participation in a union, and ensure proper access for all workers to the necessary information on their rights. Our society is made better by the work of content moderators. They are suffering so that we do not have to, and the least they deserve is to be rewarded and protected accordingly.
Ms Cori Crider:
Thank you very much for being the first Parliament in the world to address the vital issue of social media content moderators on Facebook and beyond. I am Cori Crider, the co-founder of Foxglove, a non-profit organisation working to ensure justice in technology. For two years now we have assisted moderators to seek a fairer workplace. As Ms Ní Bhrógáin said, social media content moderation is one of the most vital and under-regulated jobs in the modern economy. A third of the planet uses Facebook now, including millions in Ireland. These days, it is our public square. As Facebook's oversight board said in a recent decision about the ban on former President Donald Trump, Facebook has become a virtually indispensable medium for political discourse, especially in election periods.
Content moderation, a new form of work, is here to stay, so it is crucial that this committee understands what it really takes to keep Facebook running and what it costs people like Ms Plunkett who do it.
The work Ms Plunkett and thousands of her colleagues do is for us. It guards our public square. When a terrorist tries to livestream a beheading, a content moderator steps in. When an anti-vax group spreads misinformation about the Covid vaccine, a moderator deals with it. When a depressed teen tries to hurt herself and broadcast it to the world, a moderator is there. I can put it no better than an engineer at Facebook did: "Content moderators are the people literally holding this platform. They are the ones keeping the platform safe."
Facebook depends on these moderators to keep the company attractive to users and therefore profitable, but Ms Plunkett and other critical workers like her feel treated like second-class citizens. Light-touch regulation of social media has failed. Thus, Foxglove has three recommendations. First, end the culture of fear and excessive secrecy at Facebook and its outsourcing companies. No NDA should stop moderators raising legitimate concerns about working conditions, and the firms should be made to clarify this, and the extent of moderators' rights, in public. Second, regulate content moderation, just as other hazardous businesses in Ireland are regulated. One could draw, for instance, on examples from the police and, indeed, on recommendations which Facebook itself has endorsed and which we have submitted as evidence to the committee. A regulatory floor for moderation should allow opt-outs from toxic content for people who are not able to sustain it, it should allow independent experts to set safe limits on exposure to toxic content and it should offer real psychiatric support - medical, clinical support, not mere wellness coaching. Third, social media firms of a certain size should be required by law to bring content moderation in-house, because it is core to their business - it is not adjunct to their business.
We were pleased to meet with the Tánaiste in January to discuss these issues. However, we only received a response to the issues raised in that meeting at 7 p.m. last night, which, I am afraid, raises more questions than it answers. Also, because it came in at 7 p.m., we could not submit it in evidence, but I am happy to discuss the response that we received.
I will now hand over to Ms Plunkett for the remainder of the time.
Ms Isabella Plunkett:
Good morning. I thank the committee for inviting me today. I have worked with Facebook as a content moderator, through one of its outsourcing firms, Covalen, for almost two years. Today I speak on behalf of hundreds of colleagues who would have come to the meeting but have been intimidated by a pervasive climate of fear at work. I speak out today to make a difference.
I was excited to work for Facebook. You would think that it is a smart career move and that you would be part of the new economy, until you realise what the costs will be to you personally. I will talk about three problems that we face.
The first is the climate of fear. It has been very hard to come here today. My lawyers insist that it is my right and I believe that, but in my opinion, Facebook and Covalen have confused us and undermined our belief in the right to speak out. Facebook presents itself as a progressive company. There are slogans in our office that read: "Be bold and be brave". How can we be brave when our right to speak out is consistently undermined?
The second point I wish to raise is the mental health aspect. The content that is moderated is awful. It would affect anyone. It has finally started to get to me. I have horrible lucid dreams about all the things I have seen, and for months I have been taking antidepressants because of this content. My job is to train the algorithm; Facebook's fantasy is that one day human content moderators will no longer be required. That means I get all kinds of content - hate speech, bullying, graphic violence, suicides, abuse, child exploitation - and the list goes on. Some of my colleagues have it even worse. They are working on child abuse and self-harm queues all day. A manager tells them that they should limit their exposure to a maximum of two hours per day, but this is not happening. To help, they offer us wellness coaches. These people mean really well, but they are not doctors. They suggest karaoke and painting but, frankly, one does not always feel like singing after having seen someone battered to bits. I was referred to the company doctor once and was supposed to hear about a follow-up appointment, but I have not heard anything since.
Finally, we are tired of the second-class citizenship. We know that it would be different if we were Facebook staff, because we have seen how Facebook staff are treated. The Facebook staff who review identical content to me can work from home and stay safe during the Covid period. My mother has survived cancer twice and I support her. When I asked to work from home to keep her safe, I was told that it was not possible. I was told that I had to come into the office and I should stay away from her. The pay of Facebook staff is more than double mine. They get sick pay. I get seven days' unpaid sick leave per year. If our work is so important, why are we not Facebook staff?
Moderators want to see action taken in respect of the following. First, content moderation is Facebook's core business and it must be valued. We should not be treated as disposable; we are not disposable. If we are "core" enough to risk our lives and those of our loved ones to come into the office, we are "core" enough to be staff. I hope the committee will investigate the practice of outsourcing. Second, the mental health care is nowhere close to good enough. Unfortunately, we cannot opt out of toxic content and we are not given real psychiatric care. It is a workplace health and safety issue and must be dealt with. Finally, the climate of fear has to end. I should not have to stand here alone. Hundreds of my colleagues have the same experiences. We ask that Facebook and the outsourcing companies be made to openly clarify our rights around legitimate grievances. Fear stops us speaking out and makes us afraid to raise safety issues, and it stands in the way of our working with the union.
I thank the members for their time.
I invite the members to discuss the issues with the witnesses. I remind them to use the hand-raising function on Teams, and more importantly, to lower their hand, as it were, when they have finished speaking. As members know, a rota is in place. The first member indicating is Deputy Louise O'Reilly, who has 14 minutes.
Time is tight. I would like to let the members and witnesses know that I am due to speak in the Dáil Chamber, so I will be stepping in and out of the meeting, but I will be following the proceedings.
I wish to address Ms Plunkett directly for a moment. I want to thank her and commend her on her bravery. It is never easy to do something like this, particularly in the circumstances that have been outlined and given the nature of the work and the manner in which her employer uses NDAs and other mechanisms within the workplace. I do not think anyone will disagree with me when I say that Ms Plunkett is a brave young woman and that she is doing her colleagues, and indeed her union, a great service today. As Ms Crider has pointed out, this is the first time any parliament has had the opportunity to discuss this issue, and it is absolutely necessary. I think every person participating in this meeting uses social media; I would be surprised if they did not. We all know the benefits of using it. However, we can all use these platforms safely only because of people like Ms Plunkett, who stand between us and the kind of disgusting, vile content that has been alluded to and not even described. I do not think we would even be able to listen to descriptions of it, much less have to experience it. The work these moderators do is therefore absolutely essential and core to the business of the platforms. The fact that it is outsourced is deeply worrying.
It is deeply worrying on a number of levels. It is deeply worrying from the point of view of workers' rights. It is also worrying that from a user perspective, the platforms feel that it is okay to have this essential front-line work conducted at a remove from the main companies. That is very wrong. We are talking about issues of health and safety, and the personal safety of those who are moderating the content. That is core to the business.
I do not have enough words with which to salute Ms Plunkett. She has been very brave today. As we all know, and as I say at many of our meetings, the strongest message that any worker can send to their employer and, indeed, to Government and politicians is to join a union and be active in that trade union. That gives the worker a voice and that is why Ms Plunkett is here today.
I ask whichever of the witnesses has the information to hand to address the following questions. What supports are available currently? I know that sometimes we are told that supports are available. I am referring specifically to supports available to content moderators who are not directly employed. What sort of supports should be available? There is no disputing that the work is dangerous. The health, safety and welfare at work legislation provides for checks, balances, safety equipment, education and other measures as part of dealing with a hazardous work environment, which is the kind of environment content moderators are dealing with.
One of the witnesses could describe what kind of supports could be put in place and what should be done.
Ms Cori Crider:
I will outline the general situation and Ms Plunkett, if there is sufficient time, could give her experience of engaging with the mental health services.
Ms Plunkett and other content moderators working for Facebook are allotted a short period of time each week with a so-called wellness coach. One might wonder at the purpose of such a slightly strange term, but it is specifically intended to convey that this is not a clinical role. Under the rules governing the wellness coaches, they are not psychiatrists and are forbidden from engaging in diagnosis and treatment. They are intended to provide a sort of soft counselling service, but it is not medical care. Compare that, for example, with what a police officer in Ireland investigating a child abuse case would be offered, which is proper clinical psychiatric support that recognises this is a medical issue. The committee heard Ms Plunkett talking about karaoke, which seems, frankly, quite an odd way of handling exposure to graphic and obscene content. Many people have said to us - in several countries, not only Ireland, since this is a systemic issue across the countries where Facebook has this work done - that it simply is not enough, because it is not medical care.
Does Ms Plunkett have anything she would want to add about her experience with the system?
Ms Isabella Plunkett:
I thank the Deputy again for the kind words. To outline it briefly, we can see a wellness coach for an hour a week. In total, we are entitled to only an hour and a half a week of wellness time - for example, if we see some very harsh content and need to walk away to get a cup of tea or go for a cigarette. An hour and a half is not enough. The services provided to us, although they involve amazing people who try their best, do not have the capability or the necessary scope for what we are dealing with. I would love to see that change in the future because mental health is so important. It is talked about so much these days and it should be a priority. That is all I have to add.
I thank Ms Plunkett. If I could address a few questions to Ms Ní Bhrógáin, greenfield organising is tough and I commend the Communications Workers Union on the work it is doing on this. This is important work to be done.
In terms of organising these workers, bringing them together and giving them a collective voice, what impact does the outsourcing have? It seems there is a bit of an "Upstairs, Downstairs" situation going on, with directly employed and outsourced workers. There is clearly a comparator - I would say one exists with An Garda Síochána, and even if it did not, there is clearly a comparator with directly employed personnel. Will Ms Ní Bhrógáin outline for us briefly the challenges she has faced as an organiser and how tough it is to organise this group of workers in particular? This is new work; it did not exist when I was growing up, and I would say it requires new regulation.
Ms Fionnuala Ní Bhrógáin:
I thank the Deputy. There are several challenges with organising content moderators in their current employment status, the primary one being, as the Deputy identified correctly, the outsourcing model used by these employers. As I mentioned briefly in my statement, when you are looking to raise or address an issue with an employer, you have to identify who that employer is - who is calling the shots, so to speak. When an individual worker raises an issue with his or her direct employer, the outsourced company, he or she is referred to the business requirements of the client, and if the person attempts to raise any issues with the client company, he or she is correctly advised that he or she is not employed by the client company.
The other main issue we have that is unique to content moderators and which I have not encountered in my 15 years' experience of organising until this point is this climate of fear Ms Plunkett outlined so clearly. They are required to sign these non-disclosure agreements, NDAs, they are given on day one. These are taken away. They do not have time to read them. They certainly do not have time to refer to them. Regularly and frequently they are reminded they exist. As difficult as it is to organise a new workplace and give those workers the assurance they have the right and entitlement to be a member of a trade union and raise legitimate issues, to convince a worker he or she has that right when he or she is being bombarded on all sides by this climate of fear and secrecy Ms Plunkett described so well is an extra layer of challenge for those workers. Until the position is clarified that these workers have the right to raise issues in the same way as any other workers, it will remain a significant challenge for them.
If I could come to Ms Cori Crider briefly, is the NDA something that is used internationally? In my experience, these types of non-disclosure agreements do not exist to a huge extent in the State. I would be worried, the same as with all trends that batter workers, that it might catch on. Is this an international phenomenon right across the board or is this unique to us here?
Ms Cori Crider:
I would say it is an international phenomenon across the board because ultimately the entity driving the NDAs as they are distributed is Facebook. It is not simply one NDA. It is a series of documents people are made to sign and which they are not then generally given a copy of.
We engaged with the Tánaiste and Minister for Enterprise, Trade and Employment on this question of NDAs when we had the meeting in January. I can now see from the letter I received at 7.04 p.m. yesterday that the Tánaiste did then ask. Facebook wrote back in February stating that all the content moderators needed to do if they wanted a copy of their NDAs was simply to ask, but that is not consonant with our experience. What happened was that, in advance of the meeting of content moderators with the Tánaiste in January, we wrote to Facebook and to one of the outsourcing firms stating that these people were about to speak about their workplace conditions and needed some reassurance. We had said it was lawful, but we asked the companies to confirm it and to disclose all of the documentation they said applied to these people's right to speak. The outsourcing firm never replied in advance of that meeting. Facebook wrote back stating that the content moderators knew what their obligations were and that it would not give them anything. After the meeting, the companies told the content moderators they needed to put in a subject access request under the General Data Protection Regulation, GDPR. We put in such a request, there was an endless back and forth, and we have received some documentation. However, I have good reason to believe it is not all of the relevant documentation on confidentiality.
It may well be, by the way, that if the companies were required to publish this information, disclose it in the ordinary course and give it to an independent legal examiner, it would turn out to be something of a paper tiger - that a lawyer who read it would say it does not prevent a person from raising legitimate workplace grievances, and that there has been an effort to talk workers out of raising legitimate issues that goes much further than the law would allow. Content moderators are not lawyers, however, and they should not have to be. They should be able to get the documents without having someone like me shepherd a subject access request through to do it. That has been the problem - identifying what it is these companies even claim restricts these people.
Will Ms Ní Bhrógáin outline for us briefly the impact of these NDAs? I refer to the idea that you sign it, it gets taken off you, you cannot refer to it then but you are constantly being told there is something on a piece of paper that you have signed that means that you cannot do what you want to do. In terms of organising and giving the workers a voice, what is the impact of that NDA?
Ms Fionnuala Ní Bhrógáin:
The impact, in its most basic form, is on getting workers to talk to each other, getting workers to talk to a union and getting workers to talk to a lawyer. They have been, as I said, bombarded with this idea that they cannot speak. They are told they cannot speak to their own families and to their own friends. They cannot speak to their colleagues. In that climate, how would a person say that he or she will reach out and expose all this - what he or she believes to be confidential information - with no comfort whatsoever as to what the impact might be on his or her employment and what kind of retaliation he or she might face as a worker for raising these issues? Imagine, in that climate, even attempting to raise an internal grievance with your employer.
Is it true that you are not supposed to talk to the person you live with, notwithstanding the vile and traumatic content that you might see during the course of your work? When you go home in the evening, you are not allowed to tell the person you split the bills with and share the house with - the person who is there when you go in in the evening - who you work for, what you do or what you have seen that day, or to talk about it. Am I right in saying that?
Ms Fionnuala Ní Bhrógáin:
The Deputy is absolutely correct. They are not supposed to discuss the content they have seen or the impact it has on them. That is extraordinary given that, as Ms Plunkett has said, they do not have the psychiatric or mental health support we feel would be necessary given this is a workplace health and safety issue. Not even to be able to go home and have the comfort and support of your own family members as a release for that trauma is an extraordinary burden to place on workers given the nature of the content they are viewing.
I can only imagine the impact that has. Perhaps Ms Plunkett will outline for us the mental health impacts. In her evidence she described some of the impacts. Will she now share that with the committee? I will not get a chance to speak again but I must thank and salute Ms Plunkett. She is doing a very good job here on behalf of her union and on behalf of her colleagues.
Ms Isabella Plunkett:
I thank the Deputy. I appreciate the kind words. The mental health side of things is a big part of it. Not being able to speak to family and friends about it will, of course, affect a person. As I said earlier, in this day and age we should be looking after mental health. Issues are on the rise. As Ms Ní Bhrógáin has said, if we are not given that support in work, what are we supposed to do? Are we supposed to sit there by ourselves and think about these horrible things? Are we supposed to suffer in silence? I do not think it is right and I do not believe it is fair. As I outlined in my statement, I have now been on antidepressants for more than seven months. I have horrible dreams about the content and I can find myself, in day-to-day situations, thinking of yesterday's work. I may think of something I saw that was really horrible and I cannot forget it, although I would like to. Everybody should be able to separate work from life. They should be able to go home and have that separation, but with this job I feel it is not necessarily possible because your subconscious mind is always thinking about it. It is very difficult.
I thank our speakers, Ms Plunkett in particular. I had read about and researched the role of content moderator. It seems to be a most challenging and difficult one and, as we have heard from the contributions, very upsetting. This is unacceptable.
Will our guests speak about what they feel is the biggest factor in Facebook not making content moderators employees? Is it solely related to cost and financial issues or are there other factors?
Ms Cori Crider:
I cannot overly speculate as to Facebook's motives, but I would respectfully suggest to the committee that, just as it will speak to the Tánaiste shortly about these issues, it would be appropriate to invite the companies that are taking on this work and hosting the public square to come and give the committee their perspective. We would absolutely welcome that. There seem to us to be a number of factors at play. One is that, as Ms Plunkett has said, she makes an absolute fraction of what a Facebook employee makes, including a relatively entry-level Facebook employee doing quality assurance on the same content. Such an employee is just a couple of rungs up the proverbial totem pole yet would make an absolute multiple of what an outsourced content moderator earns. There is also the fact that outsourcing gives greater ability to shift the workforce relatively rapidly, in that the companies can open something up and then lay people off en masse with what they consider a greater degree of flexibility. The final concern is to do with potential liability for people who contract very serious clinical PTSD as a result of exposure.
One of the documents we submitted in evidence to the committee is an acknowledgment used by one of the outsourcing firms that says in its terms that the moderator will understand that exposure to this content "could even lead to Post Traumatic Stress Disorder", and that the moderator will engage in the mandatory wellness coaching session. The moderator is to understand that those [coaches] are not clinicians and this may not be sufficient to prevent him or her contracting PTSD. We can see there, in black and white in the document we have submitted to this committee, that the companies are essentially asking the workers themselves to shoulder the burden and the responsibility for their own mental health. This seems to us to be the functional equivalent of an industrial factory saying to a worker building a car, for example, that "we have not put a guard on but make sure you keep your arm out of the machine". The fact this is mental health and therefore harder to see and that we are socially behind in our conversation about this does not mean it is not a real medical workplace health and safety issue; it is.
I thank Ms Crider. It is hard for me to understand a couple of those points. No more than Deputy O'Reilly's points, will the witnesses give a broader sense of the supports that are available? Surely there has to be more.
Ms Fionnuala Ní Bhrógáin:
I thank the Senator for the question. "Surely there has to be more" is the human response to Ms Plunkett's story. Surely there has to be more, but the reality is there is not, or she would not be with us here today. You would have an expectation of a proper standard of dignity, respect and protection in the workplace for people handling this kind of material, but sadly that is not the reality.
I welcome our guests today. I thank them for what they have said and for the work they are doing. I wonder what these social media platforms would be like if this work was not going on. None of us can even imagine the kind of content the moderators have to look at. Reference was made to post-traumatic stress disorder. During the First World War this was not recognised and people who had it were shot as cowards. From what I can see and from what I know about this, post-traumatic stress disorder in cases of content moderators is very real. It must be dealt with and treated properly.
I will not go over what Deputy O'Reilly, Senator Crowe and others have already asked. Will the witnesses clarify if the pandemic has made things worse? We have heard that violence, domestic violence, and other forms of aggression have increased during the pandemic because people have been at home. Has this actually increased the awfulness, for want of a better word, of the content that moderators are looking at? Has anything been done to help with that?
The witnesses have spoken about the issue of non-disclosure. Will they tell us about working from home? Are moderators allowed to work from home or is this work deemed an essential service? If that is the case, does it make a difference if they are working from home?
I agree we should certainly invite the companies in as soon as possible. This is to be fair and to learn more about this. The issue is relatively new. When many of us here were children, social media did not exist and I would argue many people do not know about content moderation. In some ways it is a hidden employment, while being essential. We must consider what kind of society we are living in when the kind of stuff that is put up on social media is so awful that it causes post-traumatic stress disorder among the people who are charged with taking it down. There is a bigger question here about what kind of society we have.
I have heard of algorithms that are used to pick out certain issues and feed them to a certain kind of moderator, but I do not fully understand that. Perhaps the witnesses will tell us about it.
Are there legal actions before the courts at the moment on any of this? Are there legal actions on behalf of content moderators that are before the courts in Ireland or in any other jurisdictions, but especially in Ireland, in respect of this activity that is going on?
Ms Cori Crider:
I am aware of several existing cases. I am not sure of the precise figure, but it may be about 30 cases. This is a systemic problem, and it should not be. Prevention is of course much better than waiting until people's entire lives have gone off the rails and they are unable to relate to their family members or seek other gainful employment because this job has essentially scarred them. We would like to see much better preventative psychiatric measures in place that would stop people getting ill. That is what must happen as opposed to relying on after-the-fact interventions and burdening the court system with sweeping up the pieces. Such a situation is not ideal for anyone. There are some existing cases, however.
I thank Ms Crider for that. I just wanted to clarify that issue. As the Chair said, we must be careful about what we say if cases are before the courts. I asked several other questions about the pandemic and whether its impact had made the situation worse. Perhaps Ms Plunkett or Ms Ní Bhrógáin might be able to comment on that aspect. I refer to the working from home situation and what is involved in that regard.
We can talk about psychiatric help, but from what has been said and what I have been reading about this issue, it strikes me that some of the content is so awful and shocking that I wonder whether people could be damaged so much by seeing this material in the first place that no amount of psychiatric help could assist. Ms Plunkett said that people experience nightmares and it is impossible to get the content viewed out of their heads and that it is impossible to forget having seen something so awful and terrible.
This issue, therefore, comes back to the society we live in, how awful it is and whether anyone should be looking at this stuff. However, is there another way of dealing with it? Someone has to look at the content to judge it and then take it down, otherwise the whole system would collapse. It is a very serious, deep and, as Ms Crider said, systemic issue right across society. Prevention is better, but how would it be possible to prevent people from seeing something that is so awful that even seeing such content even once might leave them scarred for life? I thank the witnesses for being here and sharing their stories with us. I ask them to comment on those questions.
Ms Isabella Plunkett:
I thank Deputy Stanton. I can step in here and answer a couple of those questions. Regarding the pandemic and content, crises unfold around the world daily, many of which we do not see and are not covered by the media. However, I think the situation is the same in the pandemic as before it. We have awful content every day, and I cannot really differentiate between the time before the pandemic and now.
Turning to the algorithms, to address another part of the Deputy's question, part of my team is training a machine to be able to pick up a ticket that involves, for example, hate speech or bullying. That will remove the human aspect from doing the job, and the algorithm should be able to do it itself, almost like artificial intelligence. We are still a long way away from that in reality, and of course in the interim people will be needed to do this job and it is a difficult job to do.
Moving on to working from home, we were allowed to work from home on Christmas Day only in 2019. In 2020, we were told we were not allowed to do so. No clear reason was given as to why that was. We were again left in the dark. The communication is not good at all. Of course, we wonder about this aspect, because we see our counterparts in Facebook doing essentially the same job as us. They are viewing the same content at home as we do in the office. They are able to work from home, but we are not. We have to risk going into work every day and putting everyone else at risk as well. I spoke about my mother, and that alone caused huge anxiety about going to the office every day, returning home and then having that burden on my shoulders. I do not know exactly what can be done about that. I do not know why we are not allowed to work from home, while Facebook workers are allowed to do so. However, I really hope the situation changes in the future. I thank the Deputy again for his questions.
I will pick up on Ms Plunkett's point at the end when she said it was possible to work from home on Christmas Day in 2019. I assume the content to be seen on Christmas Day is not any lighter than at any other time. Therefore, if content moderators are able to work from home on Christmas Day, Facebook's argument regarding not being able to work from home for the other 364 days of the year does not stack up.
I thank Ms Crider, Ms Ní Bhrógáin and Ms Plunkett, in particular, for their contributions. It is incredibly tough to hear about their experiences. Several of the questions I was going to ask have been asked. The answers have given rise to more questions for Facebook and the service providers than for the witnesses. It is important in that regard that we bring representatives from Facebook and the service providers before the committee to be accountable for the decisions made by those companies.
I do a good deal of work on mental health and services for people in communities, and I cannot think of anyone I dealt with who would have to deal with challenges from a mental health perspective more than Ms Plunkett would regarding the content she must see. Services are available for doctors, nurses and gardaí to enable them to cope with the traumatic situations they have to deal with, not necessarily daily but certainly regularly. Ms Plunkett, however, knows she has to deal with such situations daily. I do not know how she does it, to be perfectly honest. I commend her on what she is doing. It is phenomenal.
The advice that seems to be coming from Facebook in respect of not speaking to family members and not reaching out to support groups flies in the face of the view of how to cope with mental health challenges, which is all about opening up, speaking out and sharing experiences, pressures and problems. Facebook must be accountable regarding the advice it is giving, because it goes against all the advice that anyone working in the area of mental health would give to someone who must deal with the kinds of pressures Ms Plunkett faces.
I will not ask the same questions as those that have been asked and answered already. On staffing levels, however, what is the timeframe for most people who last in the job? What is the turnover of staff like? I ask the question especially because it seems to be a profession where no support is forthcoming from the employer and, therefore, one that is very difficult to stay in for a long time. I sense that the only reason someone would stay in this job for a long time is for financial or personal reasons. That is the option as they see it from a personal perspective; however, that is probably not good from a long-term mental health perspective. I ask Ms Plunkett to comment on that aspect.
In addition, regarding reaching out and trying to talk to people, do moderators speak to each other about the pressures they are under and the content they must deal with? Wellness coaches are good, and I am sure there are fine people in those roles, but that is general chit-chat. It is not a constant weekly review of how moderators are coping with the challenges they face. If organisations and companies were taking this issue seriously, it would be addressed and dealt with specifically once every five days. I have gone on for a little bit, but I feel there is an accountability issue here. I would like to hear Ms Plunkett's view on the turnover of staff, the pressures they are under and how long people can last in such an environment when the requisite supports are clearly not in place.
Ms Isabella Plunkett:
I thank the Senator for his questions. Starting with turnover, it is quite high now and has been throughout the pandemic. A new employer in Ireland, a tech giant, has taken on many of these staff because it has better conditions and pay. Equally, the contract is directly with the company concerned and not with an outsourced contractor, as ours are. People obviously see that as a huge benefit. To put an average on the time people remain in the job, I would say two years, maximum. For want of a better term, you would be burnt out by then. I came into this job, as I said in my opening statement, with positive expectations in respect of joining a big tech company and the career progression that might be possible. It was all of those positive aspects. Our job as content moderators is also very fulfilling, in a sense, because we are keeping the platform user friendly. I am sure many of the committee members have kids and other family members who are on Facebook, and there is content there that people should not see. It should not be seen by me, by kids or by anyone, even by the older generation. It is challenging.
I will give the committee an example of what we do. For the moment there are three people on my team in the office. I think there are ten on my team in total. The rest are working from home due to personal issues, perhaps because they are at high risk themselves. The difference between working from home and working in the office is that they cannot do the sensitive queues from home. This burdens the three of us in the office who are doing all the queues, including the graphic violence, suicide and child exploitation queues. We are not necessarily picking up the slack, but we are doing the harder tickets because there is nobody else to do them. Of course that puts more pressure on me. I am speaking on behalf of my team-mates in work as well, and I am sure there are other people in the office in the same situation.
On mental health, I totally agree with Senator Ahearn. Then again, it was very hard for me to speak about my own mental health. I struggle to speak about it with my own family and friends so of course to come before the committee and say this was a big thing for me. You know yourself, a person might not feel comfortable talking to his or her colleagues about their mental health and would feel more comfortable with a friendly face, a mother, father, sibling or close friends. Not having the ability to do that is horrible. One feels limited, one feels alone in a sense. I do not have any better words for it really, it is just very upsetting and I hope for some kind of change not just for me but for all my colleagues.
I thank the witnesses for coming in. Well done to Ms Plunkett. She is a brave woman. I am so glad Foxglove is involved as well. I had a meeting with the organisation before. I have been engaged with this issue since January when other employees reached out to me. I had a meeting with Foxglove and then engaged with Facebook. I have a few questions and perhaps the witnesses will bear with me.
On paper, Ms Plunkett is employed by this other company, which we might call a middleman or buffer zone. Is any pressure being put on this other company, by Foxglove or anybody else, to ensure it takes responsibility for its employees' welfare? It is easy for Facebook to say it is not responsible because it does not employ them directly. Is there a way of putting pressure on the other company to see what it is doing, taking things at face value? It seems the other company is the one protecting the big giant. It is kind of the buffer zone, so the first step would be to look at it and what it is doing to protect its employees.
In my correspondence with Facebook, it made a clear distinction about working from home and not being allowed to work from home as a moderator, which was the initial problem brought to me in January. Facebook said there are certain categories of content review work which should only be carried out by reviewers in the office. According to Facebook, these include some of the most critical and highly sensitive work streams, including child safety and things like that. Facebook, therefore, sees a distinction. This being the case, it must be asked of Facebook what it is doing to give extra protections and training to the moderators who must deal with that content. It is saying it lets some moderators work from home but the ones who work with this content cannot, so it obviously sees a distinction. If Facebook does so, where then is the distinction in the supports these moderators need and get? I have it here in writing from Facebook that it sees a distinction. What then is it doing about that? It is something interesting to look at.
I think that was it. I thank the witnesses and hope we can finally get to this. It has been going on for quite a long time now and I really value Foxglove being involved. Ms Crider mentioned there were cases pending. Were any of them in Ireland?
A final query. When Ms Plunkett took the job, was she given any particular training on how to deal with moderating darker content or is every moderator given the same training? I ask because Facebook obviously distinguishes between them in the answer I got from it.
Ms Isabella Plunkett:
I can step in and answer that quickly. On the training, we all do the same onboarding. It does not matter what team a person is going to, everyone does the set procedure. Everything is just written down on a PowerPoint. It basically says it is confidential information and we cannot share it. Again, this includes not being able to speak to family and friends about issues we see in work and things like that. In short, there is no distinct training, it is just standard. It is the same for everybody. There is no special treatment, different procedures for different teams or anything like that. I hope that answers the Senator's question.
Ms Cori Crider:
I might take some of the Senator's other questions. First, on the number of cases, all of the cases I mentioned are Irish.
As to whether Foxglove is engaging both with Facebook and with the outsourcing firms, the answer is that we absolutely are. For example, in the run-up to the meeting with the Tánaiste in January we wrote to both of them about this question of secrecy.
On the mental health services, it seems to us that Facebook has set up this entire system of work and is really calling the shots. For example, the software Ms Plunkett and her colleagues use does not vary from company to company. Facebook designs the software. The production targets are set by Facebook. Even the requirements of counselling that are in the contract with the outsourcing firm are ultimately driven by Facebook. There absolutely needs to be proper medical care and psychiatric support but that is actually not what the terms of the relationship between the outsourcing firms and Facebook currently require. A Facebook employee with a considerably lesser level of exposure to toxic content could get PTSD, by the way. I have spoken to Facebook employees who have developed PTSD with comparatively lesser levels of exposure. Those people can still have sessions with a psychiatrist in a way that Isabella and her colleagues really struggle to access because it just is not a part of this system.
On the question of working from home, it sounds as if essentially the same response was given to the Senator as was given to the Tánaiste - which I have only been able to review as of 7 p.m. last night - that is, Facebook says the really, really bad stuff cannot be worked on from home. I would welcome some clarification from Ms Plunkett about this but I am not sure that was the case when the moderators were doing the work at home over Christmas. I am also not sure, from what I can see of Facebook's reply, whether the quality assurance people who are staffers - who check not all tickets but a selection, as part of their job - are not still seeing some of that content themselves. I would not have said there is a perfect firewall, whereby all graphic violence and other really objectionable content is only dealt with in the office. Putting that issue aside for a minute, the question we posed was not that no-one can ever come into the office. We were saying there must be proper protections for people who come in. If they are so important and vital to the safe functioning of the platform that they must take the risk of coming into the office while I and many others work in the comfort and safety of home, then why are they not staff? That is the question. There is a practical and moral question that should be looked at from a regulatory standpoint.
The question I was posing is that if Facebook clearly distinguishes the need to separate moderators between home and work, based on the level of darkness of the content, why is there no distinction in the training and supports? If Facebook can see it, where is the backup for the moderators? It is also interesting that Ms Plunkett has flagged that lots of people are leaving because another tech company has managed to do a much better job of employing people. So there is obviously precedent there of another tech giant doing it better. It is an interesting one to hold up to Facebook.
Ms Isabella Plunkett:
On the Senator's question about the different content and working from home or the office, the company can alter this through the system. If a person is working from home, he or she is doing queues like bullying, harassment and hate speech. They would be the lower-tier queues, priority-wise. If a person is in the office he or she will be doing queues such as graphic violence, suicide and self-injury and child exploitation. Through the pandemic, although we were working from home, for a period of I believe six months there were people who were asked to do these high-priority queues. Again, as I mentioned in my statement, they were meant to be doing, for example, a child exploitation queue for a maximum of two hours a day but were actually doing it for the whole eight-hour day. They needed to tell their team leader, TL, that they had done their two hours and that the TL needed to take them off that queue; it was not proactively looked at, with their TL going to them and saying they had done their time, that they should step off it now and go back to their standard queue. That is just an example.
I thank the three witnesses for the work they are doing and, in particular, Ms Plunkett for speaking out, which takes a lot of courage. The benefit is one of shining a light on the treatment of this invisible army of essential workers, who are undervalued and are part of a two-tier workforce. Ms Plunkett mentioned some of her workmates being scared or intimidated about speaking out about the conditions they faced. Could she elaborate on that? What are they worried about? Why would they be worried about speaking out at an event like this or to a union?
Ms Isabella Plunkett:
I thank the Deputy for his questions. Many people have the fear instilled in them from within the company. They have to sign an NDA, which is then taken away from them. I would not be able to remember exactly what was in a document from two years ago. Obviously, I do not want to cross a line and cause trouble either. I want to do my job and get on with it but if I am not given a document that I can reference and stay within the boundaries of, then it is easier to instil that fear. However, many people have families and kids and bills to pay. People are intimidated and do not want to say anything in case they have to go through a disciplinary procedure and potentially lose their jobs. There have been examples of that happening.

We have an internal workplace platform, like a Facebook for us. When people were leaving the company in the middle of the pandemic last year, they were writing their goodbyes on that platform. They were complaining about the treatment, what was going on and how they felt unsafe. Many other people in situations like mine live with someone at high risk and were worried or concerned and these worries and concerns were not being taken into account. It was clear that it was being censored because people's comments were being deleted, accounts were being disabled and people were put on disciplinary action and we never knew why. Of course that makes people fearful because they see this action being taken by the company and wonder why. Why do I not have the right to speak about such matters? Why can I not stand up for what is right and what is fair?

I am a very fair person. I believe fair is fair and I do not agree with what is happening but it took me a long time to come to this decision and agree with the Foxglove team to take part in this. I had such a feeling that I needed to do it and needed to speak out for the people who are too afraid or feel they have too many responsibilities and cannot afford to take any risks.
I thank Ms Plunkett for doing that. How did she feel when she experienced that culture of secrecy and fear, contrasted with the kind of self-image Facebook has as a progressive, socially responsible corporation, with the slogans in the office and so on? Is it fair to say Facebook is a hypocrite in how it treats its own workforce?
Ms Isabella Plunkett:
I would tend to agree. How can a company put a slogan up on its wall or say one of its pillars is "be brave" when none of its staff feel they can be brave or have the safety to speak? It is horrible because this should not happen in any company. People should be able to speak freely. There is a big difference between crossing a line and actually speaking for what is right. I did not come here today to cross that line; I came to be fair and speak about what is right.
The communication has always been really bad and that is part of the problem as well. Everyone just wants simple communication about why these things have been going on, why we were brought back to the office and why we could not stay working from home like our counterparts at Facebook. Again, we got no answer. We asked the company to define how we fit into the essential worker category and we got no answer for six months. The question was sent to HR and we heard nothing. It could have been even longer than six months. That is a clear example of how long we were waiting for a tiny bit of communication. It is not good in the slightest.
Ms Cori Crider:
I am very happy to do so. In January, Foxglove and two content moderators who are not here today, Ibrahim Halawa and Paria Moshfeghi, met with the Tánaiste and raised many of the same conditions and issues the committee is here to discuss today. We raised the problems that arise both for Facebook's millions of users and for workers with the outsourcing model. We raised the fact that workers are not permitted to retain their NDAs, we raised questions about mental health support and we also raised the matter of Covid safety.
On the Covid safety issues, we were directed to the HSA. We sought to engage with the HSA and wrote letters asking if the authority was interested in speaking to witnesses and gathering testimony from people about the live Covid cases that have happened on the floor since the reopening. We never received a reply from the HSA at any point, although I am given to understand that it later made contact with the outsourcing firm and it basically said everything was kosher and copacetic, which is perplexing given the extremely narrow and limited scope of its investigation into Covid safety and the fact that it did not talk to lots of relevant witnesses. That is what it is.
We do not have a copy of the letter the Tánaiste sent to Facebook. What we have instead, as I understand it, is Facebook's reply to him and a covering note, which we received at 7 o'clock last night. There are quotes in it from what the Tánaiste said but I do not have the full thing that was ultimately set out so I do not know. It seems there was a question about remote working and the engagement on that was the same as mentioned in the discussion with Senator Garvey, wherein Facebook said there are different kinds of content and that explains why the company was pulling people in. It did not engage at all with the question of outsourcing and there was no indication in the covering note as to whether the Tánaiste intends to take it up further. Facebook basically said to take it up with the HSA as this is its problem.
Regarding mental health support, Facebook responded by saying its partners have to give access to on-site counselling. It is essentially just reiterating the wellness coach provision but, as the committee has heard, those people are not doctors. Facebook has refused to engage with the point that a cup of tea and a chat, as one member referred to it, is not enough. This is a medical issue. Facebook does not engage with that point and I am not clear from the tenor of the Tánaiste's response to us, attached in a covering note, whether he intends to take that up.
It is the same with the issue of the NDAs. I will not go over again the endless chase we have engaged in, running around the houses to try to get everybody to give us the documentation, but suffice it to say we have asked in many ways and multiple times, including on solicitors' letterheads, for all relevant documents and they have not been disclosed. There is a job to do here and we at Foxglove and the moderators need the members' help as regulators to hold these companies to account. I thank the Deputy for his question.
I sincerely thank Ms Plunkett, Ms Ní Bhrógáin and Ms Crider. Ms Plunkett's testimony was extremely powerful and we owe a huge debt of gratitude to her for telling her story, as well as to the CWU and Foxglove for working with her and the other workers and bringing us to where we are today, which is allowing this story to be told. I thank them for all the work they have done with the other workers because it would be much easier not to do it. Another member referred to hidden employment and Ms Plunkett's job is very much away from the public eye. We enjoy a safer Facebook because of the work she does but she is coming here today at a potential personal cost to her and her work.
I thank her for that. To think she is a number of months on antidepressants and still working is just mind-blowing and a reflection of the work that must be undertaken.
Thinking about the outcomes from this meeting, we, as legislators, cannot have a situation where in five or ten years we look back and say that we did nothing about regulating this work, particularly as Ireland is a global and EU headquarters for so many major social media platforms. There must be very clear actions that we undertake as a committee arising from the testimony today.
I have four questions about this. A number of members asked about the working environment. I would like to hear more about the pay because we know it is a fraction of that earned by Facebook's direct employees. I am not talking about the specific pay of the witnesses but rather the average pay of these workers. There is also the question of qualifications that individuals may have coming into the role.
The second question may be for Ms Ní Bhrógáin and Ms Crider. It seems these NDAs and the taking back of those agreements after they are signed is a breach of the Terms of Employment (Information) Act 1994, which entitles any employee to be able to access his or her contract of employment. From what I understand of the NDA, it is effectively part of the contract of employment. Do the witnesses believe this is a breach of the legislation?
The next questions relate to regulation. I would like to know what is the position with TikTok, YouTube and other social media companies. How do they handle the content moderation in those companies, as Facebook is not the only social media platform having to deal with toxic, vile or challenging content? In terms of regulation and health and safety legislation in particular, what do we need to put on the Statute Book to ensure content moderation is brought in-house as a hazardous activity rather than being outsourced? I am throwing out all the questions but it is because I have very limited time. Ms Plunkett can start before we go to Ms Ní Bhrógáin and Ms Crider.
Ms Isabella Plunkett:
I thank Senator Sherlock for the kind and reassuring words. I very much appreciate what everybody has said. I can give some averages for the pay. For the job I do, a person starts on approximately €27,000 per year, with €1,000 of a pay rise after a year when an 11-month probation period has passed. I am not 100% sure of what a Facebook worker earns but I believe it is approximately €45,000 or €50,000.
Ms Isabella Plunkett:
Many people come to the company from different backgrounds. They include lawyers, people training to be doctors and hospitality professionals. There is huge scope for people with different qualifications to join the company. It is a really diverse team, which is very nice. There are many college students as well. I work evening shifts and there are people working with me who are currently in college and who use the job as a source of income while they study in the mornings.
Ms Fionnuala Ní Bhrógáin:
There was a question on NDAs and terms of employment. The question is whether NDAs form part of the contract of employment. If they do, the workers should be entitled to a copy of any document that relates to their employment. Beyond whether that legislation applies, in general, workers should have access to any documents that have an impact on their ability to interact with their employers, in particular when it comes to raising internal or external grievances, such as through the Workplace Relations Commission. If a worker decided that he or she wanted to take an individual case through the mechanisms of the State but felt this document prohibited him or her from doing so, it would be crucial to take into account how those NDAs affect the contract of employment. I might have to defer to Ms Crider on whether this breaches the legislation mentioned by the Senator.
Ms Cori Crider:
I will readily concede that I am an American lawyer and not an Irish lawyer. It is an interesting idea into which we will now look urgently. God knows the subject access requests do not come back in a full way. It seems obvious from a common-sense standpoint that a company should not be able to systematically hold people to agreements they are not given copies of. There is a wider cultural issue of how often this is repeated with periodic retraining or refreshers, with constant messages from management. It is almost Pavlovian conditioning about this question of secrecy.
The Senator had other questions about TikTok and YouTube. TikTok is the company that Ms Plunkett was referring to when she said people were leaving in droves to work for it because it is offering direct employment contracts. A newer kind of upstart social media firm has decided to employ people directly.
Everybody knows YouTube is one of Google's companies and it operates the same outsourcing model as Facebook. The outsourcing companies in Ireland do similar work for YouTube and the Senator is exactly right in that this raises the same questions relating to exposure to toxic content and precarity that Facebook puts people through. It is another matter calling out for a regulatory response because we want a consistent standard to be applied across all these platforms that really are the public square these days.
This relates to the Senator's final question, which is how to regulate this appropriately and what needs to be put on the Statute Book. With humility, I again say I am not a legislative draftsperson. It seems, nonetheless, that this is an appropriate subject of health and safety regulation and that questions such as limits on exposure to toxic content are appropriate for a regulator to address.
There is also the question of outsourcing a core business function. One does not need a particular ideological position about the wisdom or lack of wisdom of outsourcing in general. This is a special case and it is about outsourcing a business function that is core to the health of the public square. Stepping back for a moment, we are now more than 15 years into this experiment with social media and light-touch or no regulation has failed. It has hurt workers and led to a public square that is really pretty problematic in many ways. It ought to be regulated just like other forms of mass communication, such as broadcasting, both for workers and the health of the public square.
I thank the witnesses. I find myself reflecting on what Mr. David Thompson said in asking what kind of society or world we are developing. Some of what has been described is quite harrowing and all of us are exposed to it on different platforms at different times. Service companies are being used to insulate Facebook and others from full responsibility for the effects of the employment, which must be legally addressed.
I know it has been mentioned but I cannot understand how copies of NDAs are not being given to people. If I sign a legal document that will bind me in future, I want to know what I have signed and I want a copy of it. I cannot understand how NDAs have been skirted in a legal fashion. I do not understand how that was achieved, although I understand that people must accept it as a term of employment. I just do not know how it can stand.
What activity is Facebook doing to profile challenging contributors? I refer specifically to the blacklisting of IP addresses and trying to target people who are generating content, which may be inside but is probably outside the EU.
I have no doubt the question of post-traumatic stress disorder is very real. I have often been shocked by what I have seen online. It seems benign compared with some of what has been mentioned here.
It is obvious that the moderation is supporting the platforms. As has been pointed out, if the content was not moderated, the platforms probably could not work. Part of the problem is that Facebook has a way out by using moderators. Is there anything the employees can do as a group to put pressure on Facebook? As a parent, if I felt there was not proper moderation going on, I would not want my kids accessing Facebook. That is obviously a much wider discussion globally for moderators. I would welcome hearing any information the witnesses have about what Facebook is doing to try to identify and sanction those who are deliberately putting out content that goes beyond challenging, content that people should not see because it is very damaging to mental health and long-term well-being.
Ms Cori Crider:
Perhaps Ms Ní Bhrógáin could talk about what people could collectively do, but I can speak a little on the question of what Facebook is doing. The work Ms Plunkett described is essentially Facebook's response. Some material will be caught algorithmically, such as spam or some kinds of hate speech, but one of the things we observed over the course of the pandemic is that during the first wave, when Ms Plunkett and her colleagues were sent home, there was apparently an experiment in much more automated content moderation. They turned the knob up to 11, as it were, to see what would happen if the algorithm as they have it was allowed to catch more content in order, as I understand it, to stop moderators working from home having to see so much graphic material. That experiment appears to have failed. There was some good investigative reporting about it last year which showed, for example, that self-harm content stayed up and was not caught by the algorithm, that other problematic content stayed up because the algorithm just could not catch it, and that pretty anodyne material that should have stayed on the platform came down. It shows that we are a very long way from this job going away. If we are going to have a contemporary public square, then what Ms Plunkett does absolutely has to be a part of it, but as we said, at the moment, the work is treated as disposable. I would like to hear from Ms Ní Bhrógáin and Ms Plunkett on the question of what people can do by coming together because we have been trying to support people to come together but the obstacles are considerable.
Ms Fionnuala Ní Bhrógáin:
I will respond briefly first, if I may, and then we can hear from Ms Plunkett. In terms of what the workers can do collectively, we need to recognise that, as has been mentioned a couple of times by contributors, this is a new form of work in Ireland and globally, but much of the work the moderators are doing resembles regular office-based call centre work. In terms of it being a new form of work, there are workers out there, in particular in the United States, who have come together collectively and formed a trade union, and they have gone up against the global tech giants and made considerable gains for themselves and their colleagues. They have given each other the strength to come out collectively and raise those issues that affect them, not always directly in the context of their terms and conditions; in many cases in the United States they are raising issues of ethics around how platforms are being used and what contracts the employers are signing with various bodies. In my humble opinion, it is through coming together collectively as workers in the trade union that they can exercise their collective power. Fundamentally, what many contributors have identified is that there are no social media platforms without content moderators. They are the power behind these social media platforms. Without them, we do not have a safe public square and these companies do not have a product. Once content moderators come to realise the strength of their collective power, that is when they will start to see and achieve real improvements in their lives and their terms and conditions. I would welcome Ms Plunkett's thoughts.
Ms Isabella Plunkett:
I thank Ms Ní Bhrógáin and Ms Crider. From my perspective as a representative and a content moderator, we have collectively come together and we have talked about it. We are denied the right of a union, as the company does not recognise the union at all. We cannot do anything about that, but it is one objective I am here to fight for today. I am also here to bring up the fact that, as much as we try to communicate with the company, we are always given false hope. We had a lot of meetings last year with HR and there were at least 300 attendees at some of those meetings. HR might look at a pay rise, for example, sometime in the first quarter but we heard nothing about it. There was no further communication on it. That is an example of the false hope we are given. That is why we are here today before the committee to raise this issue and to make it a bit more public. We are hidden. We are the silent workers and we want to be treated equally like everyone else.
I would never want my son or daughter to do this work. I just cannot imagine how horrific it must be. Could Senator Garvey please mute her microphone? What is happening here is absolutely appalling. This is the dark underbelly of our shiny multinational social media companies. As someone who used to organise workers for a living, I would like to hear from Ms Ní Bhrógáin. We have heard about a climate of fear. Let us think about the work these people do. It is the most horrific work, and it ensures the rest of us do not have to see that content. Deputy Shanahan mentioned that we all see it, but clearly we do not see most of the worst of it, and that is only thanks to Ms Plunkett and her colleagues. It is quite clear to me from today's hearings that Facebook could not operate without these people, so fundamental is their work, and yet these people are denied the right to join a union. They are denied basic rights at work. I read in Ms Ní Bhrógáin's submission prior to this hearing that a couple of content moderators were contacted by the agency to reinforce its requirements on secrecy. What is happening here is truly Orwellian. It is that frightening. Could Ms Ní Bhrógáin expand on that challenge and the culture of secrecy?
I want to ask Ms Plunkett what the mood is like among her colleagues. Presumably she is here today because she wants to see change and to give hope. She contacted the Tánaiste. He has spoken out about how crucial the work is, but we need to see action. As we come to the end of the meeting, I would like to understand what the specific asks are from Ms Crider of the Government, what it is that it needs to do now to take action for these workers because this has gone on for far too long. You have my commitment, Chair, to bring the employers in, because I want to know, not just as a trade unionist, but as a father and a citizen how on earth Facebook can stand over the outsourcing of these jobs. From what I can see, it is just for reasons of cost and liability. I will leave it at that because I am conscious of leaving time for the answers.
Ms Fionnuala Ní Bhrógáin:
I will take the question. It is important for all content moderators and all workers in Ireland to understand that whether or not their employer chooses to recognise a trade union, they have a right to join one. That is enshrined in the Constitution whether their employer likes it or not. They have the right to be individually represented by the trade union and to seek advice and guidance from it. However, as was correctly stated, because of our voluntarist system, the employer is not required to engage. I know that discussions taking place in Europe at the moment are looking to address the voluntarist system. I see Deputy O'Reilly has joined us again. Members who are trade union organisers themselves understand that the bread and butter of organising workers is meeting them, sitting down with them, getting them to tell their stories, listening to them and understanding what their issues are so that the organiser can empower them to address those issues and lift them up in their workplace as a collective. If organisers cannot get content moderators to meet with them or speak to them because they are so afraid of the consequences of being caught, that highlights the difficulty of organising what are clearly some of the most unprotected workers currently in Ireland.
It has felt nearly insurmountable sometimes as a trade union when we meet a couple of workers. It speaks to a question one of the other contributors asked earlier in terms of the length of service and attrition. By the time we get somebody to trust us so that he or she can speak to us in a confidential manner and will not be exposed, the damage that has been done to that person means he or she can no longer stay in that employment, and that person is gone and we start all over again.
It is, therefore, enormously challenging. It would be nice to go back to the old days of the old telecoms and logistics companies where there is a tradition of respecting the workers' collective voice. As I mentioned earlier, however, this is a new form of work and these are global giants, which I believe requires a legislative response. I hope that answers the Senator's question.
It certainly does. I want to emphasise that no matter how important the mental health supports are, and they are clearly absolutely crucial, this issue will not be resolved unless these workers are made direct employees to recognise the core work they do and are given rights as members of trade unions.
I want to hear from both Ms Plunkett and Ms Crider before I finish up, please.
Ms Isabella Plunkett:
I thank the Senator for the question. I can quickly jump in and answer that. We need action. The reason I am here today and why I have plucked up the courage to speak out about this is that the morale is dreadful, as the Senator can imagine. I do not think the company is really doing anything positive to change things or even to give us some kind of hope that there will be change. People come to me and say there will be no change and that I am mad. That is a terrible attitude to have because at the end of the day, if one is not going to have a positive mindset from the very beginning then nothing good is ever going to happen. If one is going to have a negative mindset then, of course, one is going to wallow in the negative. I firmly believe that action will be taken and I firmly hope for better days ahead.
Ms Cori Crider:
The Senator is absolutely right that this is overripe for legislation at this point. In terms of our recommendations, it seems that it would be appropriate to have a legislative requirement for social media platforms over a certain size to in-house core content moderation staff. One can set a threshold so that some blog with 15 posters obviously does not have to go through this. We are, however, talking about some of the world's largest companies here, the Googles, YouTubes and Facebooks of the world. That is one issue.
Another issue is the health and safety point. There should be a requirement for psychiatric support as a basic workplace safety protection, and not simply wellness coaches who tell people to do deep breathing and yoga to deal with the medical risk of having to be exposed to this content.
Then, I believe we must do something, in a regulatory sense, about this culture of secrecy. I do not know whether it is possible for the committee, for example, to write to all of the companies and ask them formally to clarify or disclose to the committee what they say about secrecy, or to clarify that, in fact, the workers have the right to raise legitimate workplace grievances. We would welcome any action the committee could take in helping us in that respect. Ms Plunkett has done much here in cracking that wall of secrecy and climate of fear, but actually, it is the companies that have to go on record and admit that Ms Plunkett and her colleagues have the perfect right to raise these legitimate workplace issues.
I thank the Chairman. First of all, my apologies. I was in the Dáil speaking about mental health and health issues, so I missed Ms Plunkett's presentation, which I will certainly play back. A few questions struck me while listening to the debate. Is this a worldwide model that these platforms apply? Are there better practices in other countries from which we should be learning?
I was disturbed to hear that the Health and Safety Authority, HSA, which is an enforcement agency, had not taken some complaints seriously. We should probably hear from the HSA so it can account for that. I would like to understand a little bit more clearly what the defects were in that enforcement. It would be a serious matter because the HSA is an independent body. From listening to some of the contributions, it seems to me that there may be breaches of health and safety legislation without any new legislation. I am not an expert in this field, however.
Has the Workplace Relations Commission, WRC, addressed any of these issues? I know cases are pending, which are obviously at the very serious end of damage. In terms of the WRC responding to this concern, have there been any cases or are there evolving codes of practice that we might seek to accelerate?
I was involved in the preparation of legislation on online safety from the perspective of the user as opposed to the platform. One of the things we proposed, however, was that there would be a digital safety commissioner who would oversee codes of practice so that one would be obliged to have those codes of practice meet certain expectations and principles and so forth. Is that a potential route to go along? That legislation is comparatively advanced, although it is not published. Would that be a potential vehicle for meeting some of Ms Plunkett's concerns?
Ms Isabella Plunkett:
I thank the Deputy for his question. Definitely. I do not know how or in which way this will be approached but I believe there needs to be some form of legislation. The way I feel is that a person just does what he or she is told and cannot speak or is unable to say how he or she feels or anything like that. I believe some kind of concrete legislation is needed so that content moderators, who would not be able to speak today or have the confidence to do so, can at least refer to that. It would be in writing so, of course, that would always be helpful. I thank the Deputy again. I hope that answers his question.
Ms Fionnuala Ní Bhrógáin:
If I may respond to the Deputy's question regarding whether this is a global model; it is. Generally, the larger companies, such as Facebook, Google and YouTube, employ the same model and, in fact, regularly use the same outsourcing companies. In Ireland, there are a handful of outsourcing companies that work interchangeably for this grouping of social media companies.
Although it is the global norm, however, new and emerging tech giants are coming out and directly employing, as was mentioned earlier. I am not sure if the Deputy was with us at the time. TikTok, for example, is directly employing people and scooping up content moderators in their droves for the very fact that it recognises them as direct employees and treats them as such.
The Health and Safety Authority has a duty to inspect and see that workers are protected. I do not have its legislation in front of me but I would have thought some of the things we are hearing about sound very close to legal breaches. Ms Crider expressed the concern that the HSA had not responded or listened to people who should have been heard. That sounds serious to me. To clarify, what is the status of that investigation?
Ms Cori Crider:
I thank the Deputy for his question. I am very happy to clarify. It is important to explain to the best of my understanding what the investigation did and did not cover. My understanding is basically that just as we were about to meet with the Tánaiste in January, somebody asked the HSA whether it was looking into the question of Covid safety at the outsourcing offices. It was, therefore, only looking at the question of Covid safety, compliance with Ireland's Covid safety protocols and whether people should be working from home. It did not address any of what we believe are workplace safety issues around mental health. None of that was within scope.
Nonetheless, we engaged with the process. We wrote to the HSA and said that we had a number of people who were willing to engage. We did not hear anything back about the investigation into Covid. I could not honestly say to the Deputy what the HSA did other than that it apparently looked into it and said that, from a Covid standpoint, what was happening in the office was, to its mind, acceptable. That does not really answer the question about who should have to come in and who should be allowed to work from home. I suppose it is more to do with spacing and ventilation.
We were concerned that there did not seem to be an interest in speaking to people who had something to say, even about those issues that were squarely within its remit and what it said it was investigating. We are happy to send the committee the letter we sent to them, if it would be useful for its records and for it to see what we offered. Basically, we did not hear anything back.
I think we should hear from the Health and Safety Authority on this matter. Apart from the Covid issues which may be all that it looked at, it seems to me there are wider issues here that it should be investigating. Those questions are not for the witnesses here today, but for the HSA. I thank the witnesses for their time.
The following questions are for Ms Crider. I understand a case was settled in the United States for $52 million. Was that case heard in court, with a court adjudication, or was it settled outside of court? I understand that during those proceedings Facebook acknowledged that damage was being caused to the workers. I am not sure whether the workers were directly employed by Facebook or outsourced. I am interested to hear more about that case. I know we do not have the type of class action in Ireland that was mentioned earlier, but I am interested in hearing more on the situation internationally. Are any of the content moderators directly employed by Facebook?
The following question is for Ms Plunkett. Is there a screening process for the job of content moderator? It appears to me that not everybody would be suitable for this type of work. As mentioned by Deputy Bruton, it is dangerous work and it damages people. We need to be honest about that. There should be some level of screening to ensure that a person is suitable for the type of work involved. Does that happen and what training is provided? Was Ms Plunkett told that the work could be damaging? Is it possible that people are being asked to sign a disclaimer with respect to the damage that the work could cause? Is it likely that people know it could cause them to suffer post-traumatic stress disorder, but they choose to do it anyway?
I agree that this work is part of the critical infrastructure. As stated by other members, Facebook and others cannot operate without the work of content moderators. It is important work. Is there ongoing research in this area?
Ms Isabella Plunkett:
On the question regarding screening, there is an interview process in which people are asked to identify the issues that might trigger a reaction in them or that they would prefer not to see. For example, I said that content showing children suffering would bother me. Nobody wants to see children suffering. That would destroy me and, I think, many other people. The response was that I could have to deal with such content and I was asked how I would handle it. I gave my answer and that was it. I was successful and I got the job. If people knew at the outset that the work could impact on their mental health or cause post-traumatic stress disorder, I do not think they would apply for the job. That would be a huge turn-off for a company but, again, that is just my opinion.
Ms Cori Crider:
I thank Deputy Stanton for the question. A class action, in which I was not involved, was brought on behalf of content moderators from several American states where there are content moderation facilities. It was settled out of court, with Facebook paying $52 million for the whole class of thousands of workers, and all of the people in the class action were outsourced in exactly the way we have been discussing. It amounted to a maximum of $5,000 to $10,000, depending on the assessed level of PTSD, per person who opted in to claim under the settlement. I hope it goes without saying, but serious clinical PTSD is difficult to treat. It is not clear, particularly in the expensive American healthcare system, how far $5,000 to $10,000 will go. My honest opinion is that it was a bargain-basement settlement to avoid dealing with a systemic issue. As I said, an ounce of prevention is worth a pound of cure. We do not want to burden the courts with sweeping up these matters as best they can after the fact. We want, as best we can, to stop people getting sick in the first instance.
To my knowledge, there are no direct employees at Facebook or Google doing this work, but there are staff who are exposed to the content because the quality assurance work, checking compliance with Facebook's rules and making sure that the outsourced moderators are getting it right, is done by staff. It has been estimated to me that those staff are exposed to ten to 12 hours of content per week. In other words, only a fraction of their work is content review. The people who are doing it in volume, in the way Ms Plunkett has described, are all outsourced.
There is ongoing research. New York University published a report last year on who watches the social media giants, which we should have submitted as evidence but can send to the committee. Based on interviews with moderators internationally, it calls explicitly for an end to outsourcing and states that this practice is core to the business of the tech giants, not just Facebook but others, and should be brought in-house. It also states that this is a growing problem that requires a regulatory response. I commend the NYU report to the committee. An academic at University of California, Los Angeles, UCLA, Sarah Roberts, has written a book about this entitled Behind the Screen, which, again, is based on research with dozens of content moderators.
As rightly pointed out, this is a global issue and a new form of work. We are pleased that the committee has taken the initiative to investigate the work because it is not going away.
I will not take seven minutes. It has been a lengthy meeting and I am sure the witnesses have other things to be doing. This is a complete mess. There is no point reiterating all that has been said; we need to move towards action now. I have two questions for the witnesses. As they will be aware, the committee will be meeting representatives of Facebook. I think we need to bring in the recruitment agencies as well. They need to be stood up in front of us because they are doing the dirty work. They are a protection for the big guys. I do not think we should let them off the hook; they need to be brought in as well. I would welcome clarity on whether it is proposed to do that or to bring in only Facebook. Facebook can pawn off the issues to the others involved, so we need them in the room too.
Deputy Bruton made an interesting point regarding the Health and Safety Authority. I acknowledge its work in regard to physical health and safety in terms of office space and so on. What is its role around mental health and training and the treatment of staff with regard to mental health and safety? If that is part of its role, it would be important to get its take on all of this.
Ms Cori Crider:
My hope is that there will be an outcome which leads to mental health being properly protected, properly assessed and properly regulated. It is for the Legislature rather than us to say whether the HSA is equipped to do that. Our experience did not build a huge amount of confidence. I was interested in the idea raised previously regarding pending legislation on online safety that it is, perhaps, not too late to amend to incorporate at least some of the issues we are raising. That sounds like a productive way forward.
I had to step out earlier to make a contribution on legislation.
I have been keeping up, however, and people have been live tweeting this session as well, which was helpful. I was able to keep an eye on what was going on. I have a question on health and safety in respect of these roles. This is a key issue we must focus on. A culture of secrecy has clearly been outlined, and it is guarded by these non-disclosure agreements. In many ways, however, it is not just that aspect; much more is going on that keeps this state of affairs secret and behind closed doors. There are serious psychological implications for the workers doing this work. My questions are probably mainly for Ms Ní Bhrógáin because they focus on human resource, HR, issues, but they are also for Ms Plunkett or Ms Crider, if they wish to contribute as well.
I am going to hazard a guess that risk assessments are not routinely carried out, given these workers are not screened before they start. Therefore, is it possible to conduct risk assessments and to mitigate the risks that exist? I ask that because this is dangerous work, but so is climbing up the side of a building. In that case, however, we put in place measures to ensure that such workers are kept safe in their workplace. Does Ms Ní Bhrógáin believe it is possible to legislate or to ensure regulations are put in place that will make this work safe? In an ideal world, by the way, and no one will disagree with this, we would love if we did not need this work to be done and that everything on social media consisted of kittens on skateboards. It does not, however, so we need some kind of robust way to deal with this situation. Is it possible to conduct risk assessments? I doubt if they are conducted, but I would be delighted if that was the case.
Is there an analogous grade category? Ms Ní Bhrógáin, and indeed Senator Gavan, will be familiar with this term. There is such a category in the main employers, namely, Facebook and the other platforms. However, is there any grade in the private or public sectors that Ms Ní Bhrógáin would consider to be an analogous grade, one we could point to as having the correct supports in place? My worry is that we will hear from the employers and others, and they will simply throw their hands up in the air and say this area just cannot be regulated because it is just awful and there is nothing we can do. I believe, though, that we can do something and that we can make this workplace a safe one to work in and also one where workers are rewarded with decent pay and fair terms.
Ms Fionnuala Ní Bhrógáin:
I thank Deputy O'Reilly and will take a stab at answering those questions. Regarding the HSA, I share Ms Crider's perspective in that I do not know whether this area is currently within its remit. If it is not, then it ought to be. Mental health is a workplace health issue. Turning to inspections, it is our position that any employment that puts the health and safety of workers at risk should be subject to regular inspection and regulation. Forgive me, could the Deputy remind me of her second question?
Ms Fionnuala Ní Bhrógáin:
I think Ms Crider mentioned this earlier when she referred to An Garda Síochána. Members of An Garda Síochána and other public bodies are required to deal with traumatic situations and content, and are provided with the appropriate level of medical assistance. Another example concerns operators of the 999 emergency line, who were previously outsourced while conducting the same work as those employed directly within the emergency ambulance and fire services. Our union's position was that those directly employed workers were an analogous grade. The 999 emergency line operators have now been taken back in-house and are dealing with 999 emergency calls as direct employees. I suggest, therefore, that those two instances could be considered analogous grades.
I thank Ms Ní Bhrógáin. It might be worthwhile for the committee to investigate the analogous grades and how they are treated. In the little time remaining to me, I once again thank Ms Plunkett for the work she has done and for the light she is shining on this issue, because it is very important. I will leave my last few minutes to Ms Plunkett, should she wish to say some parting words to the committee.
Ms Isabella Plunkett:
I am thankful for the kind words from the members of the committee. I appreciate all of them. I did not expect this. I was so nervous coming here to speak. I hope for great things and I am positive that there will be great things to come. Hopefully, we can soon make a change, not just for me but for all the content moderators in Ireland, and stand up for what is right. I again thank the members of the committee for their time and kind words. I appreciate that very much.
That concludes our consideration of this matter today. I thank the delegation from the Communications Workers Union, CWU, and Foxglove for assisting the committee with this matter. In particular, I thank Ms Plunkett for her contribution. It was very powerful, and I really appreciate what she has done. The committee will reflect on today's proceedings and take forward those aspects of the contributions that it believes it can progress. I endorse what was said by members and, where we can, we will invite the employers to appear before the committee. We will look at doing that. I thank everyone for their contributions today.