Oireachtas Joint and Select Committees

Wednesday, 1 August 2018

Joint Oireachtas Committee on Communications, Climate Action and Environment

Moderation of Violent and Harmful Content on the Facebook Platform: Discussion

12:00 pm

Bríd Smith (Dublin South Central, People Before Profit Alliance)

I thank the witnesses for their attendance. I had intended asking a series of questions but I was intrigued to hear Ms Cummiskey state that Facebook does not allow hate speech and will not allow people to call for violence against anyone else on its platform. One of the things nobody else mentioned regarding the "Dispatches" programme was the special status given to Mr. Tommy Robinson and the Britain First party for a long time before that. It was not until the party was banned and some of its leaders were sent to prison that its Facebook pages were taken down. Nobody can deny that there was a plethora of hate messages on the Facebook pages of Tommy Robinson and Britain First, an openly fascist party which continually attacks migrants and migrant communities online, physically and politically. I ask the witnesses to explain to me the reason Facebook applies light-touch regulation to political parties until they reach a certain level of popularity - I believe Tommy Robinson's page had approximately 900,000 followers - and are then elevated to a different category where Facebook deals directly with the posts, rather than their being dealt with by other, less important moderators. According to the policy in some of Facebook's documents, if a page appears to have five or more hate messages, it violates the company's rules. Nevertheless, the more popular pages, such as the far-right page I cited, are protected from these rules. If that has happened in this instance, will it happen again or is it happening repeatedly? That issue needs to be discussed.

The other thing that emerges from this is that there are different strokes for different folks across the globe in Facebook's policies. I will take this one example from a piece on propublica.org, which was written well over a year ago, when much of its content was drawn to Facebook's attention. It is good the witnesses have come in with their hands up today but it raises the question as to why this did not happen before. What is really going on beneath this? Mark Zuckerberg talked about helping people to understand the world in a better and more inclusive way. I agree with others who say that Facebook is a wonderful tool for the population of the planet.

However, we must look at what drives it. Is it Mark Zuckerberg's original idea or is it revenue and profits, 98% of which are generated from advertising?

I invite the witnesses to consider two decisions by Facebook. Following a terrorist attack in 2017 in London, a US Congressman wrote a post in which he called for the slaughter of radicalised Muslims and wrote, "Hunt them, identify them, and kill them. Kill them all. For the sake of all that is good and righteous". A month earlier, a #BlackLivesMatter activist and poet named Didi Delgado wrote a post stating, "All white people are racist. Start from this reference point, or you've already failed". That was all that she said. Her post was removed immediately, while the Congressman's was not and remained up.

Another comparison worth looking at concerns Palestine and Palestinians. While Facebook states it protects people from extremism, 70% of Palestinian Facebook pages are removed and Palestinian activists use a hashtag, #CensorPalestine, to express their frustration at this. In the case of hateful Israeli Facebook posts, however, something like one anti-Palestinian post goes up every 46 seconds and they are not removed. Is this related to the fact that an official meeting took place in September 2017 between Facebook and the Israeli Government, which threatened to ban Facebook from the country unless it did something to curtail the activity of Palestinians? There is no curtailment of hate against Arabs and Palestinians from Israeli Facebook posts. This shows that Facebook policies appear to favour the powerful and the elite over those who are oppressed. That is also the case with migrants. Facebook has a policy, which Ms Cummiskey repeated, that it does not allow hate speech. However, there are many incidences of posts containing hate speech against migrants. We quoted several that were exposed by the programme. Ms Cummiskey said there is a quasi-protected category for migrants. They are sort of protected, but not really. For instance, Facebook protects against calls for violence and dehumanisation in general but, in one document, allowed migrants to be referred to as filthy but not as filth, because Facebook distinguishes between the adjective and the noun when used in a Facebook post. That does not deal with this being one of the most vulnerable groups on the planet. We have seen them drown in the Mediterranean, held behind barbed wire and so on, but Facebook has a quasi-category in which it places migrants for protection. These are questions relating to how Facebook monitors things.

I return to the related question of training. Deputy Dooley may have asked this already but were the training materials used for moderators created by Facebook or by CPL? Will that continue to be the case? It appears as though Facebook has created a big headache for itself by outsourcing moderation. The Facebook representatives here have stated that the moderators earn about 20% above the national minimum wage but outsourcing is obviously being used to save money for the company. There is no other reason to use outsourcing. If that were not the case, Facebook would directly employ the moderators and make the company directly accountable for moderation, rather than being able to say that CPL made all the mistakes. If that is the case, this committee should invite CPL before it to question it on how it does its business. Ms Sweeney and Ms Cummiskey seem to be saying they did not know about things that happened and that CPL changed the criteria for training. That is a real problem. If Facebook had not outsourced this and was directly responsible for moderation, this might not have happened. Maybe it should reconsider the question of outsourcing. These concerns have been raised previously.

I have some questions on Facebook's submission to the committee. It stated "Ongoing training will now continue with twice weekly sessions to be delivered by content policy experts from Ms Siobhán Cummiskey's team". From this, I assume this has not happened in the past but will happen now. Is that the case?

What sort of training has Facebook offered its moderators up to now? Is this a new type of training Facebook is offering or is the company engaging in a review? Ms Sweeney stated that all content reviewers will continue to receive regular coaching sessions and updated training on Facebook policies as they evolve. Has that not happened before? Will Ms Sweeney and Ms Cummiskey please explain how these wonderful ways of monitoring things have suddenly dawned on Facebook? Ms Sweeney also stated that Facebook will continue to deploy spot-testing at its review centres. How often does Facebook carry out such tests? Is enough consideration given to spot-testing? Why is Facebook suddenly stating that it has to increase the frequency of this testing?

The other question I want to ask about moderators relates to how they are paid. Deputy Eamon Ryan asked a similar question. There are reports of moderators in the Philippines being paid $350 per month and those in the United States being paid $15 per hour. Ms Sweeney represents Facebook in Dublin. The company is based in Dublin and it pays its taxes through Dublin. It gets off scot-free from paying a huge amount of tax across the globe on the basis of the advantage it enjoys because it is based in Dublin. Last year, its total revenue was $40 billion, 98% of which was from advertising. When one looks at those figures, it is extremely difficult to believe the statement that the company is not driven by financial interest. We are talking about a huge amount of revenue from advertising. It can become like a contagion when the attitude is that the company is making loads of money so it will keep going because that is what delivers. The latter is different from the attitude in Mark Zuckerberg's statement in the context of helping people to understand the world around them. As Facebook pays so little tax because it is based in Dublin, would it consider co-operating with the introduction of a levy that would ring-fence millions, if not billions, to help address the negatives associated with the operation of Facebook, in other words, to help with, for example, mental health services or with the negative impact on minorities?
