Oireachtas Joint and Select Committees

Wednesday, 1 August 2018

Joint Oireachtas Committee on Communications, Climate Action and Environment

Moderation of Violent and Harmful Content on the Facebook Platform: Discussion

12:00 pm

Ms Siobhán Cummiskey:

I thank members of the committee for their questions. Deputy Lowry asked why a video had been left on the site when Facebook had known since 2012 that the child in it had been identified to law enforcement. That is a valid question. As soon as we became aware that the child and the perpetrator had been made known to law enforcement, we took action to delete that content and use image matching technology to stop it from being uploaded. However, our failing was that we did not know. We should have known and we absolutely accept that is our failing. We are doing a number of things to make sure such a failing does not happen again. We are reviewing this policy to determine whether there is another way to address this and whether there should be time limits. That consultation is being carried out with experts in this area. Any update to our policy is done in consultation with international experts, including academics, safety organisations and anti-racism organisations.

I will answer the questions asked by Senator O'Reilly in turn. I want to be clear and state that we do not allow hate speech on Facebook. We have set out our policy on that clearly. We do not allow hate speech against people on the basis of ethnicity, religion, race or sexuality. We divide hate speech into three tiers, tiers 1, 2 and 3, which can be viewed on our website, where our community standards are set out. We do not allow calls for violence against people on the basis of ethnicity, migrant status, etc.

We do not allow dehumanising speech and so on against those people either. I encourage members to have a look at our policies; we are quite clear on that. As Ms Sweeney mentioned, we have also been audited by the European Commission on our ability to remove content that is illegal or violates our policies on hate speech.

On the Senator's valid question as to whether we have enough people and are taking enough action, it might be interesting for the committee to know that we use a combination of human review and technology to tackle bad content. We issued our first transparency report on the removal of content in April. Moreover, in the first three months of 2018 we took action on 3.4 million pieces of graphic content on Facebook, using technology in the majority of circumstances. We also use real people who speak the language to review particular types of nuanced and contextual content, such as hate speech and bullying, and we are increasing our staff in that area. We accept we need more people to do this. We are increasing the number of our safety and security staff from 10,000 to 20,000 to make sure we do that well.

There was also a question about marking content as disturbing. It is important to know that sometimes, people want to use social media to bring attention to the terrible things that happen in the world. We want to allow people to bring attention to those things in a way that is appropriate and limited. In certain limited circumstances, therefore, someone can share content that otherwise might be disturbing if he or she is doing so to shed light on these terrible things. We want social media to have a role in bringing terrible things to light but we do that in a responsible way. Marking this content as disturbing makes sure that people under the age of 18 cannot view it and that a user must click through a warning screen.

It is also important to note that the vast majority of people who use Facebook never come across disturbing content. I have been on Facebook for the 11 years since the firm came to Ireland and I have never seen content of that type or anything with a warning screen on it in my news feed. I have never seen anything disturbing. That is reflective of most people who use the service. We are trying to take action against the people who are involved in the worst forms of sharing content.

We take a number of actions to protect those who are under 13. We accept that we were not doing this well enough, and since the Channel 4 "Dispatches" programme we have been making sure to put on hold any account that is brought to our attention in any way. I was interested in the Senator's background as a teacher. We have had teachers use our contact form to report their entire class as being under 13. We encourage people to do that. We do not want children under 13 on our service and do not allow it.

I will address the explicit question on what we are doing about training and then conclude, because I believe Ms Sweeney has answered the other questions. We are doing three main things. First, we are increasing oversight at the CPL site and all our outsourcing sites. We are seconding Facebook employees to the CPL site. I was asked about the timeline for this measure and it has already begun and will continue to take place for at least the next six months. We will then review and see if it needs to be in place permanently.

In addition, we have made sure to correct any errors in the training documentation and retrained all the trainers at CPL as soon as we became aware of the "Dispatches" programme. That is an ongoing process, however, not something that ended with one training session. It is going on constantly. There are twice-weekly meetings between my team and CPL to provide direct support and allow CPL's staff to ask questions. That is in addition to the other training processes that we have in place.
