Oireachtas Joint and Select Committees

Wednesday, 6 December 2017

Joint Oireachtas Committee on Children and Youth Affairs

Cybersecurity for Children and Young Adults: Discussion (Resumed)

9:30 am

Alan Farrell (Dublin Fingal, Fine Gael):

Good morning and welcome, everyone, to this meeting of the Joint Committee on Children and Youth Affairs. This morning in session one we will hear from representatives of Facebook on the topic of cybersecurity for children and young adults and in session two we will hear from the Ombudsman for Children on the annual report for 2016.

In session one representatives from Facebook will make a presentation to the committee on the topic of cybersecurity for children and young adults. I thank the representatives for their attendance. I welcome Ms Niamh Sweeney, head of public policy with Facebook Ireland, Ms Siobhán Cummiskey, head of content policy, and Ms Julie de Bailliencourt, head of safety for Europe, the Middle East and Africa at Facebook. I thank them for appearing before this committee this morning.

Before we commence, in accordance with the procedure I am required to draw the attention of witnesses to the fact that by virtue of section 17(2)(l) of the Defamation Act 2009, witnesses are protected by absolute privilege in respect of their evidence to this committee. However, if they are directed by the committee to cease giving evidence in relation to a particular matter and they continue to so do, they are entitled thereafter only to a qualified privilege in respect of their evidence. They are directed that only evidence connected with the subject matter of these proceedings is to be given and they are asked to respect the parliamentary practice to the effect that, where possible, they should not criticise or make charges against any person, persons or entity by name or in such a way as to make him or her identifiable.

Members are reminded of the long-standing parliamentary practice to the effect that members should not comment on, criticise or make charges against a person outside the House or an official, either by name or in such a way as to make him or her identifiable.

I remind members and witnesses to turn off their mobile phones or switch them to flight mode. They interfere with the sound system, make it difficult for the parliamentary reporters to report this meeting and they also interfere with our broadcasting equipment.

I advise that any submissions or opening statements that the witnesses are making to the committee will be published on the committee's website after this meeting.

I understand the witnesses wish to make a brief ten-minute presentation followed by questions from members of the committee. I call Ms Sweeney to make her opening statement.

Ms Niamh Sweeney:

I thank all the members of the committee for inviting us here today to talk about this very important topic. As the Chairman said, I am the head of public policy for Facebook Ireland. I am joined by my colleagues Ms Siobhán Cummiskey, who leads our Dublin-based content policy team, and by Ms Julie de Bailliencourt, who is our head of safety policy for Europe, the Middle East and Africa.

We welcome this opportunity to explain our tools, policies, extensive reporting infrastructure and general approach to keeping young people safe. We are always keen to hear feedback about any issues the members have had with Facebook because we always want to do better. This is a learning opportunity for us as well.

Ms Siobhán Cummiskey, Ms Julie de Bailliencourt and I are all based in Facebook's international headquarters at Grand Canal Square in Dublin, where we work alongside more than 2,200 other colleagues, several hundred of whom are focused on the security and safety of our users through their work. We have grown from just 30 staff when we first set up an Irish office in 2009 and our physical footprint in Ireland now extends to four locations, including our state-of-the-art data centre in Clonee, County Meath, which will soon start serving traffic, and our Oculus virtual reality research office in County Cork.

We have watched this committee's hearings on cybersecurity as it relates to children with great interest, particularly as we have good active relationships with many of the experts the committee has heard from on this topic. Most notably, we have a strong relationship with An Garda Síochána, particularly the cyber crime and the domestic violence and sexual assault investigation units which handle child safety. For several years we have worked with a designated contact within Garda Headquarters in the Phoenix Park. All requests relating to Facebook from around Ireland are channelled through this team which has allowed us to respond efficiently and effectively to those requests.

We have a dedicated team, again based out of our Dublin office, that handles these requests and prioritises safety issues. The team responds out of hours to situations involving real-world harm and proactively flags cases of child exploitation to the National Center for Missing and Exploited Children, also known as NCMEC, which liaises directly with the Garda.

We are also involved in all the European Commission's self-regulatory initiatives referenced by Professor Brian O'Neill during his recent contribution to the committee and have worked with him on this and other issues.

We take the issue of online safety very seriously. Facebook has a huge responsibility when it comes to the safety and security of the people who use our service to express themselves and to share with family and friends, and we fully recognise that.

One of the biggest questions we face relates to what is allowed on Facebook and we spend a lot of time trying to get that right. It is hard but it is of critical importance. Today we are going to spend some time explaining how we approach that, how we refine that approach over time and continue to try to get it right.

From the outset let me be clear that there is no place on Facebook for content that shames or exploits young people. We know our enforcement has not always been perfect. It is a very difficult thing to get right and that is why we have made and are continuing to make major investments both in human expertise and in technology to more quickly identify and remove content that violates our policies. This is a very complex area and as the committee has heard in its previous sessions on this topic, there are many different aspects to the issue, which would be difficult for us to cover in the ten minutes we have been allotted.

Ms Siobhán Cummiskey:

I thank Ms Niamh Sweeney for that and also the Chairman. As Ms Niamh Sweeney mentioned, I am the content policy manager for Europe, the Middle East and Africa at Facebook based at our international headquarters in Dublin. It is my team, the content policy team, who write the rules that govern what people can and cannot share on Facebook. I have been working at Facebook for five years and prior to that I was a human rights lawyer and a solicitor here in Ireland and abroad.

More than 1 billion people worldwide come to Facebook every day to share and to connect. When one looks at that number across a month-long period, it jumps to 2 billion people. That translates into an incredible amount of content on our site. People come to Facebook to share and to connect and we know they will not do that if they do not feel safe. We write our community standards with the objective of allowing people to share and to connect while also keeping them safe online.

Our community standards are publicly available on our website and there is a link to our community standards in our written submission to the committee, and we would invite the committee members to read them. Our standards govern everything that we believe can compromise the safety of our community. That means that our community standards govern everything from bullying to hate speech to spam to pornography to graphic violence.

We know that this committee has a particular interest in our bullying policies and I would like to briefly speak to them. Our bullying policy states that you cannot shame or degrade a private person on our site. We have a specific section of our community standards that is entirely dedicated just to bullying. When it comes to public figures, we allow more robust speech around them and matters of public interest. Having said that, we do not allow credible threats or hate speech against anyone who uses our platform.

We have a vast array of other policies that complement our bullying policy and I would like to speak for a moment to one of the most important of them, that is, our authentic identity policy at Facebook. If one uses Facebook, one must do so using the name that one uses in real life. We find that makes people much more responsible and much more accountable for what they do online. When one has to put one's real name beside what one says, one tends to be much more careful and much more responsible about it.

We provide a host of other tools and resources that complement our policies. I would like to hand over to my colleague, Ms Julie de Bailliencourt who will speak to those.

Ms Julie de Bailliencourt:

I thank Ms Siobhán Cummiskey and the Chairman. I am the head of safety policy for Facebook for Europe, the Middle East and Africa. As mentioned, on top of our strong policies we have built a sophisticated set of tools to address potentially abusive behaviour and to help people to manage their own experience on Facebook. They include tools and features related to privacy and security as well as conflict resolution, blocking and reporting. In particular, we have invested time and resources in building an industry-leading reporting system. Reporting a piece of content to Facebook is quick, easy and confidential. Every single piece of content on our platform can be reported to us via the reporting links which appear beside each piece of content. People who report content can let us know why they believe it should not be on Facebook, whether it is a nude picture or a bullying comment. Non-Facebook users can also report content to us via a dedicated contact form.

Our team receives tens of millions of reports every week from all over the world. We prioritise the most serious issues first. Many of the reports related to suicide, credible threats, child safety or bullying are reviewed ahead of any other topics. We work hard to ensure that those reports are reviewed by our team as quickly as possible. The vast majority of reports are reviewed within 24 hours and evaluated against our community standards.
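Facebook's internal triage system is not described in detail in this testimony; purely as an illustration of the kind of prioritisation Ms de Bailliencourt outlines, where reports about suicide, credible threats, child safety or bullying are reviewed ahead of other topics, the following is a minimal sketch in Python. The category names and priority values are assumptions for illustration only, not Facebook's actual categories.

```python
import heapq
import itertools

# Illustrative priority levels; the real categories and ordering are assumptions.
PRIORITY = {
    "suicide_or_self_harm": 0,
    "credible_threat": 0,
    "child_safety": 0,
    "bullying": 1,
    "hate_speech": 2,
    "spam": 3,
    "other": 4,
}

class ReportQueue:
    """Toy priority queue: the most serious reports are popped for review first."""

    def __init__(self):
        self._heap = []
        self._counter = itertools.count()  # tie-breaker preserves arrival order

    def submit(self, report_id, category):
        priority = PRIORITY.get(category, PRIORITY["other"])
        heapq.heappush(self._heap, (priority, next(self._counter), report_id, category))

    def next_for_review(self):
        if not self._heap:
            return None
        _, _, report_id, category = heapq.heappop(self._heap)
        return report_id, category

queue = ReportQueue()
queue.submit("r-1001", "spam")
queue.submit("r-1002", "suicide_or_self_harm")
queue.submit("r-1003", "bullying")
print(queue.next_for_review())  # ('r-1002', 'suicide_or_self_harm') is reviewed first
```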

Our team of experts includes native speakers of more than 50 languages working 24-7 throughout the globe. The team is close to 7,500 people, several hundred of whom are located in our Dublin headquarters.

If reported content is found to be against our community standards, it is immediately removed. We also close the loop with the person who reported the content to let him or her know what action we have taken. People who engage in abusive behaviour on Facebook face varying consequences, ranging from a warning to losing their accounts permanently. In the most severe cases, for example, where child exploitation is involved, such people can be referred to law enforcement. Our help centre includes a range of additional contact forms where people can report copyright violations, privacy rights violations, defamation and more.

I am keen to highlight today how we support people who may be victims of what is known as revenge porn. We have a specific contact form whereby people can get in touch with us and get immediate support from our team. Earlier this year, we announced the launch of a new tool to further prevent the resharing of such content on our service. When young people register on Facebook, they benefit from specific behind-the-scenes protections relating to safety and security. For example, they do not surface on public listings and their personal information is not visible to the public. We also have some built-in technology that may flag suspicious friending activity between teens and adults.

While we are committed to removing bullying content as quickly as possible, our philosophy on the reporting side is also to empower young people to speak to someone they trust about these issues to get resolution where bullying may take place, namely, in the classroom. As I mentioned, we have strong processes against child sexual exploitation. Since 2011, Facebook has been using an image recognition tool called PhotoDNA which scans every photograph uploaded on our platform. If the content is found to depict child sexual exploitation, it is immediately deleted and referred to the National Center for Missing and Exploited Children in the US which will, in turn, contact local law enforcement agencies.
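PhotoDNA itself is proprietary Microsoft technology and its perceptual-hashing algorithm is not reproduced here. The sketch below only illustrates the general hash-and-match workflow described above, using an ordinary cryptographic hash against a hypothetical database of known hashes; a real system uses a perceptual signature that survives resizing and re-encoding, which SHA-256 does not.

```python
import hashlib

# Hypothetical database of hashes of known illegal images, as would be supplied
# by a body such as NCMEC. The value below is only a placeholder.
KNOWN_HASHES = {
    "3f79bb7b435b05321651daefd374cdc681dc06faa65e374e38337b88ca046dea",
}

def image_hash(image_bytes: bytes) -> str:
    """Stand-in for a perceptual hash. PhotoDNA computes a robust visual signature;
    a plain SHA-256 digest is used here only to keep the sketch self-contained."""
    return hashlib.sha256(image_bytes).hexdigest()

def handle_upload(image_bytes: bytes) -> str:
    """Delete and refer matching uploads; allow everything else through."""
    if image_hash(image_bytes) in KNOWN_HASHES:
        # In the workflow described above, the content is removed and the case
        # is referred to the National Center for Missing and Exploited Children.
        return "deleted_and_referred"
    return "allowed"

print(handle_upload(b"example image bytes"))  # 'allowed' unless the hash is known
```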

We have been investing in suicide prevention since 2006 and we work with more than 70 suicide prevention experts throughout the world. When people report instances of self-harm to our team, we strive to review these reports within minutes and provide supportive resources to the reporting person and the friend at risk, further signposting to specialised local listening services.

Facebook has a wealth of easy-to-understand resources, such as our safety centre, help centre and bullying prevention hub as well as the parents' portal, which was launched last year. We continuously work with partners to launch educational programmes and resources that are available to download online. Examples include Do You Think Before You Share? with SpunOut.ie, Help a Friend in Need with Samaritans, and our anti-bullying activism guide with the Diana Award. Our safety partners throughout the world have a dedicated channel through which they can contact us to report sensitive situations. In Ireland, these organisations include hotline.ie and, more recently, the Crime Victims Helpline. We are also in the process of adding the Irish Society for the Prevention of Cruelty to Children, ISPCC. Our partners also provide us with invaluable ongoing feedback on where we can do better. We are all part of the same effort and we have found this multi-stakeholder approach effective.

Facebook has been involved in a number of self-regulatory initiatives, such as the ICT Coalition, since 2012. In February of this year we also signed up to the European Commission's Alliance to Better Protect Minors Online. We partner on an ongoing basis with Google, Twitter, Microsoft and many others to ensure we share knowledge and work together on topics such as counterterrorism and child sexual exploitation. For the past two years, Facebook and Google have joined forces to run what is now an annual safety summit in Dublin for safety organisations from across Europe, the Middle East and Africa. I am happy to take any questions on these topics.

Alan Farrell (Dublin Fingal, Fine Gael):

I thank the witnesses very much for their contributions. We will open up with Deputy Funchion.

Kathleen Funchion (Carlow-Kilkenny, Sinn Fein):

I thank the Facebook representatives for coming in and taking the time. It is an important issue. I have to be honest and say that the Facebook Ms de Bailliencourt is describing is not my experience of Facebook or the experience of most people I know. I do not believe it is a forum free from harassment or bullying in any sense. I would not under any circumstances allow my children to use it because of my experiences on it and those of many people I know.

I have several questions relating to people not using their real names. There are profiles where it is so obvious that people are not using their real names. There are numerous profiles where there are no pictures of the person and it is so obvious that it is a fake profile. Despite this, if I report such content to Facebook, the company replies by thanking me and saying that I have done the right thing but that the content does not violate its community standards. How many times does a page have to be reported to Facebook for the company to act on it? What does Facebook define as bullying? Some of the problems highlighted are very obvious, including images of child pornography. Such things can clearly be classed as bullying. However, bullying is subtle and that is the difficulty with bullying in every forum, whether school, work or social media. Let us suppose a given page consistently targets individuals, consistently lies about those individuals and is reported on numerous occasions to Facebook by numerous people but the company does not do anything about it. That is the responsibility of Facebook. Facebook is responsible for the consequences for people who are experiencing that bullying and for the impact it has on their mental health. I am unsure whether the Facebook representatives actually grasp the seriousness of that.

There has been a rise recently in the amount of mental health issues and anxiety. I put a great deal of that down to Facebook. It is not only a matter for me. I have been on Facebook for nearly ten years. From what I can see from the number of people who use it, anything that is reported comes back. The company says that the material in question does not violate Facebook community standards. What violates community standards? The examples given by the Facebook representatives are obvious things. Obviously, if there are images of child pornography, they would be removed. I am referring to subtle bullying that goes on for years against an individual. I do not accept that Facebook cannot see that something is a fake profile or fake page. It does not take much investigation to find out such things. Moreover, there is no forum for someone to contact the company. There is no contact number for Facebook. A person is very limited in what she can report. A person has to tick a certain box. An hour later – I have no idea what sort of investigation the company can undertake in an hour – there is a message to say the content does not violate the standards of the company. The person is asked to give a further explanation. That is the only opportunity a complainant has to explain exactly how the content amounts to bullying of or to harassing or damaging individuals. I say as much as an adult. Under no circumstances will I ever allow my children to go on Facebook or use Facebook until they are over 18 years of age.

Facebook needs to do far more in respect of monitoring and the impact of this material. We see it all the time. People put up pictures of someone. They put up degrading and derogatory material. Facebook statements claim the company does not accept anything degrading or derogatory. I could offer thousands of examples of how that goes on and how Facebook does absolutely nothing about it.

I am keen to hear what exactly is the last straw for Facebook. When will Facebook take action on certain pages? How many complaints does the company need to get? How many legal letters does Facebook need to get? What is the limit for people? The impact that this is having on people and on their mental health is real. It is going to go into the next generation. It is going to go on to our children unless people, parents and guardians take a strong view and isolate their children from most social media. The Facebook forum is not a safe forum for anyone, including children, unfortunately. I feel very strongly about this.

Alan Farrell (Dublin Fingal, Fine Gael):

Do our witnesses wish to respond?

Ms Niamh Sweeney:

I will address those issues one by one. My colleagues, who specialise in policy and enforcement, will probably complement my answers.

Reference was made to fake names. I am sorry that the experience of Deputy Funchion has been that she has not had the right response. I always tell people to report fake accounts. Often, behaviour that a person objects to will not necessarily violate our policies on speech, even if one finds it unpalatable. However, if an account is operating under a fake name, it is in clear violation of our policies.

Enforcement is difficult. Sometimes, we get it wrong. I encourage people to report issues again.

Kathleen Funchion (Carlow-Kilkenny, Sinn Fein):

I have reported content 50 or 60 times over four years. I am not talking about once or twice. Obviously, the company has a set form. It is only a tick-box exercise. An automatic response comes back. The software probably goes through the content and if it does not see something really obvious and damaging, then it is deemed to be fine. I do not see how a page could be reported consistently over several years and yet no one asks what is going on.

Ms Niamh Sweeney:

Obviously, there is a difference between a page and a profile. Fake profiles absolutely violate our rules. We do our best to get our enforcement to the right stage. I would encourage Deputy Funchion to do it again. I hope that in some of those instances the Deputy will have seen the right outcome.

We find that fake accounts are behind many of the issues that arise, whether spam, the spread of misinformation or behaviour that people find objectionable.

Alan Farrell (Dublin Fingal, Fine Gael):

What is Ms Sweeney's definition of a fake account? It is important for us all to understand her definition.

Ms Niamh Sweeney:

We refer to it as our authentic identity policy. A person must go by the name that his or her friends and family would recognise that person by in real life. If a person is operating under a name that is clearly fake or not related to what he or she would operate under in day-to-day life, somebody reports it and it looks suspicious to us, then we will check that account and ask that person to provide identification. If the person cannot provide it, he or she will lose access to the account. That is how fake names are handled. I am sorry that has been Deputy Funchion's experience. We work hard to remove those accounts because we find that they are linked to much of the behaviour that violates our policies. With respect to material that amounts to lying about individuals, all Irish citizens are protected by several statutes, including those that relate to defamation and harassment. If it goes into defamatory territory, then we have specified reporting tools for people to be able to take action on that and we will always work with the courts to ensure people have access to recourse in that sense.
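To make the flow Ms Sweeney describes concrete, here is a rough sketch of the decision logic for an account reported under the authentic identity policy: a suspicious-looking account is asked for identification and loses access if none is provided. The function and field names are hypothetical, and the real review involves human judgment rather than two boolean flags.

```python
from dataclasses import dataclass

@dataclass
class Account:
    name: str
    id_verified: bool = False
    locked: bool = False

def handle_fake_name_report(account: Account, looks_suspicious: bool, provided_id: bool) -> str:
    """Toy version of the reported-account flow described above."""
    if not looks_suspicious:
        return "no_action"          # report reviewed, nothing found
    if provided_id:
        account.id_verified = True
        return "access_restored"    # identity confirmed, account keeps its name
    account.locked = True
    return "access_removed"         # name cannot be verified, access is lost

acct = Account(name="Clearly Fake Name")
print(handle_fake_name_report(acct, looks_suspicious=True, provided_id=False))  # 'access_removed'
```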

When one talks about bullying that goes on for years, there is obviously recourse for individuals if it escalates to harassment, and these are captured by existing statutes. We rely on notice from our user community to take action on much of what has been described. When the Deputy talks about photographs that have been doctored to degrade individuals, we can take action once they are reported to us, but as Ms Cummiskey said, 2 billion people use the platform which generates a lot of content, so the legal framework for this, which has been set out by the EU, is called a notice and takedown system. Once we are notified about something, we have a responsibility to respond expeditiously, which is what we strive to do. In a situation in which we are not notified, there is no artificial intelligence which can reveal those kinds of examples. Perhaps Ms Cummiskey or Ms de Bailliencourt will add to that.

Ms Siobhán Cummiskey:

I will add to the matter of policies and then let Ms de Bailliencourt answer the questions on specific reporting. I will reiterate that I am sorry the Deputy had a poor experience. I would like to see the content she reported.

Kathleen Funchion (Carlow-Kilkenny, Sinn Fein):

I have everything and it is not just me. I come across people all the time.

Catherine Noone (Fine Gael):

It is a big problem.

Kathleen Funchion (Carlow-Kilkenny, Sinn Fein):

Facebook is great to stay connected with family and others but sometimes I wish it was banned because it is a negative thing for people.

Ms Siobhán Cummiskey:

I will speak briefly about our bullying policies. I will explain a little about what we mean by degrading and shaming. A good example would be a page set up specifically to degrade or shame an individual. That would violate our policies. Sharing any photos or videos of somebody being bullied with the intention of condoning the behaviour or further mocking or bullying the person is not allowed. We also do not allow any kind of text or image overlay on a private person's image as that would violate our policies. One really important thing is that context is key when it comes to bullying. While one person might look at something objectively and think it is bullying, the person receiving that information may not feel bullied by it. That is why it is so nuanced. People have been trying for years to solve the problem of bullying and to address it both offline and online. We have been looking at the issue of bullying for many years. We have been up and running with our reporting systems and community standards for over a decade and we also try very hard to understand and tackle this problem.

One really important thing about our bullying policy is that it is important who reports. If somebody set up a page called "Siobhán Cummiskey is really pretty" and I know from my offline world that that person has set up that page to degrade and shame me and that in the context of my world, in my school or whatever it is, that is a really common thing people say to degrade and shame me, if Siobhán Cummiskey reports that page in that circumstance, we will understand that she is reporting it as bullying. She understands it to be bullying and we would remove it on that basis. That is a particular nuance of our bullying policy that we do not have for other policies. That is one way that we try to tackle it but I am only speaking to our policy. Ms de Bailliencourt will answer some of the reporting questions the committee had and how we work with third parties on this issue.

Ms Julie de Bailliencourt:

I am sorry about the Deputy's experience. These are topics that we are working hard on and look to fix. Having the material from the Deputy at the end, after this session, would be useful. On reporting, it is very difficult for me to have specific input without having seen some of the instances that the committee is flagging. We often find that people may report a whole profile to alert us that there may be inappropriate behaviour or comments, but with the amount of information on Facebook on an ongoing basis, it can be difficult for us to identify where the abuse is actually taking place. Someone may have a profile that looks perfectly legitimate, like a real account, but that person may behave abusively in a group or post comments on a third party page. We encourage people to report the particular piece of content that may be abusive in order that we can form a better idea of where the abuse is really taking place and how repeated the behaviour is. The Deputy may get an answer very quickly. I can assure her that this is reviewed by a real person who looks at this report, but the person who is reviewing has a limited amount of information, so if one is reporting a whole profile, our team would not be privy to all the comments made by this person over the years, which can make the evaluation difficult. As Ms Cummiskey and Ms Sweeney mentioned, context is key. Sometimes we do not have the context.

We have been working hard to liaise with vulnerable groups and communities to understand how abuse manifests itself for these communities and how we can be more helpful. What next generation of tools can we develop? Where are we weaker in our enforcement? What can we do better? In particular, women have been targeted in ways that men have not, including women in the public eye, women in politics and activists. Our role over recent years has been to engage with all of these groups across the globe to understand the specific nature of the harassment and abuse with a view to creating guides, programmes and training as well as understanding where we can continue to develop our tools and processes.

I am sorry this happened to the Deputy. She has our commitment that we will look at this. I reassure her that we have been looking at these specific issues. We have some things in the pipeline. We continuously announce new tools and new processes and we understand that this pervasive, nuanced and horrific abuse is not something that we want to have on our platform, and we want to better wrap our arms around it, as it were, to support users better.

Alan Farrell (Dublin Fingal, Fine Gael):

Before I let Deputy Mitchell in, one of the comments the witnesses made sparked a thought on photographic imagery, particularly that which is taken by others. Ms Cummiskey mentioned that overlaying of text on a photograph is banned. Presumably, it has to be reported. I can think of dozens of examples of people in this room, in public life, who are inundated with that sort of stuff, not just Senator Noone.

(Interruptions).

Alan Farrell (Dublin Fingal, Fine Gael):

It includes myself and many others over years. It is problematic. We are politicians and we have thick skins, but if applied to a child or young person, it is clearly difficult. How quickly does that process take place? How many people does Facebook have assigned to it? Is it centralised, globalised or how is it done?

I have regularly seen videos on Facebook, including in my constituency, where kids put up images of fights, and I am sure other members have similar experiences. I am a father of two. I admit that, prior to my appointment to this committee, it never occurred to me when an image or video was brought to my attention that such material should not be on Facebook's platform according to its rules. I appreciate that there are billions of users and thousands working in the business, so it is almost impossible for people to handle.

Everything has to be relatively automated. I accept all of that but I would like to know whether there is an active group or programme within Facebook which is specifically looking at younger people who the company knows are a specific age, male and, under certain circumstances, more likely to share material like that. I am sure the type of profiling Facebook is capable of doing could target them. Perhaps that could be addressed and then Deputy Mitchell can ask her question.

Ms Siobhán Cummiskey:

I thank the Chairman for his question. I will start at the beginning on what we call and what is commonly referred to as "mocking photoshop" or "mocking images". That is something we do not permit in respect of private individuals. As I mentioned in my opening statement, when it comes to public people, there are different standards. The reason is that we allow more robust discussion and debate around public figures and matters which are of public interest.

Catherine Noone (Fine Gael):

How does Facebook define "public"?

Ms Siobhán Cummiskey:

People who are part of public life are public figures, for instance. People who are very frequently mentioned in the press are public figures. Those people do not receive the full protection of our bullying policy. The reason is that our bullying policy is so expansive that, if we were to give that protection, it would not be possible to make a meme about the president or prime minister of a country. One would not be able to engage in the type of freedom of expression we commonly see in newspapers with caricatures of people in public life. Having said that, we have a very hard line on things like hate speech. People are protected from hate speech no matter whether they are the president or prime minister of a country or an ordinary person. However, we draw a distinction between children and people who are private and people who are public. Credible threats are another example of something that is not permitted against anyone on our platform because they are dangerous.

We have thousands of people based all around the world reviewing content. They operate on a follow-the-sun model, which is to say the reporting system operates 24-7. We hire native speakers in order that they understand the context of the relevant country and the reports they are receiving, and they respond to most reports within approximately 24 hours. Last May, we announced that we would double our operations team, a huge number of the members of which are based in Dublin. We doubled the number from 4,500 to approximately 9,000 at the end of this year. We are also set to add thousands more reviewers in 2018 because we take this issue very seriously and we want to get to these reports very quickly.

I was asked about the issue of people sharing fighting videos. If someone shares a video of a fight and people underneath are leaving comments, and this may be something one sees in comments rather than in the video itself, mocking the person or degrading them, we would remove that. The Chairman asked if we profiled the people who might be more likely to do it.

Alan Farrell (Dublin Fingal, Fine Gael):

Ms Cummiskey said Facebook would remove mocking comments, but what about the video itself?

Ms Siobhán Cummiskey:

If somebody posted a video and mocked the person in the video or condoned what was happening, we would not allow that. We allow certain forms of violence in order for people to raise awareness around a particular issue. If the person depicted in the video reported it, we would remove it. In any situation where someone shared it to condone the behaviour or to mock the person, we would also remove that no matter who reported it.

Ms Niamh Sweeney:

We were asked how long it takes us to review things. Usually, everything is reviewed within 24 hours. As Ms de Bailliencourt highlighted in the opening statement, certain types of reports are prioritised because they may result in real world harm. If there is a risk of self-harm, that can often be reviewed within minutes and we would escalate it to the Garda. We have done that here in the past 12 months. We have 7,500 people working on the teams Ms Cummiskey mentioned and our CEO, Mark Zuckerberg, announced on the last earnings call he did at the end of the third quarter that we are adding an additional 10,000 people to the staff looking at the security and safety of users on the platform. The reason I mentioned that it was on the earnings call was because he made it very clear on that call that the security and safety of users takes precedence over the profitability of the company at this time. To put it in context, we have a global workforce currently of just over 21,000, which means we are adding an additional 50% just on the teams that look at security and safety. I hope that speaks to the commitment we have to addressing a lot of the issues that have arisen and continue to arise. These are evolving challenges.

I am sure the members have additional questions on public figures versus private individuals on the platform. It is not an easy conversation to have in a room like this because the members are obviously affected directly in a way the ordinary population is not. The difficulty we have is that freedom of speech is protected under the European Convention on Human Rights and by the Constitution. As such, we try to strike that balance. People do not often articulate themselves in a way others find polite or palatable. They use curse words. The difficulty we have is that the right to freedom of expression includes the right to offend, shock and disturb. One of the ways we have tried to address that is by creating controls for individuals like the members. While we have a higher tolerance for that kind of language, the members can control their own experience on the platform by blocking, hiding and deleting, and by ensuring such content does not impact on their pages or profiles. I understand that these are difficult issues for this group more so than for anyone else.

Denise Mitchell (Dublin Bay North, Sinn Fein):

I thank the witnesses for attending and their presentations. I have an opinion on all social media apps and it may be that I am a bit old school. I have not had a pleasant experience of them and some of the stuff I see on Facebook alarms me, to be perfectly honest. This is material that has got onto pages and been shared in the public domain. I understand that it is difficult for Facebook to police it if it has, as the witnesses said, 1 billion users a day online. The Chairman raised the issue of the Facebook live feature. I welcome it because it is good for debate and campaigning groups, but there is a darker side to the feature. In May, Facebook pledged it would bring 3,000 extra staff on board after a video was streamed in Thailand of a murder. It was viewed approximately 370 times before it was removed. I find that alarming. It took 24 hours for it to be taken offline. I ask the witnesses about that because there is a darker side which the Chairman touched on. I am a bit startled by the response of the witnesses when we were talking about young people fighting online. The witnesses said it has to be reported if someone is being bullied or if there are derogatory comments. Is it not enough to see that a young child is being attacked? One does not need to prove it is bullying when one can see a child being beaten up or fighting. It should not fall under those policy guidelines, it should be removed automatically because of the distress to the child, the family and the community in general. Facebook really needs to tighten up on that. I was not happy with the response that unless the child is mocked or jeered, it will not be taken down. I find that alarming.

Does Facebook have enough staff at present to deal with the workload it has? How many times does a video have to be reported before it is removed? I have a question on Facebook stories. We raised issues before about other social media apps which have been used to bully young people in particular. Once the thing is viewed, it goes. Does Facebook have policies? Does it retain the media that are there given how difficult it is to prove something if it is removed? It is distressing for parents and staff.

If I report something to Facebook, there is a list of questions to be answered and one ticks the relevant box. Will the delegates explain the process to be followed after a report has been made? How long does a member of staff take to process the complaint and what are the guidelines to be followed? As I am sure Facebook receives many complaints, responding within 24 hours seems rather quick, but I am concerned about whether Facebook goes into sufficient detail.

Ms Julie de Bailliencourt:

On the last question about the process involved, when someone reports a piece of content - I encourage people to do so; it is very quick and easy - we must capture several pieces of information to ensure we understand not only the issue involved but also who should deal with it. If someone here makes a report, it will most likely be examined by someone who is Irish and understands the language, culture and slang. It is a quick process, but the person viewing the content has limited visibility of the content which has been reported. If a photograph is reported, for instance, the person will see the caption and perhaps the top comments. He or she may also see some information on the person who posted the photograph, but he or she will not necessarily have a lot of additional information on what has been going on or where it may have been posted. We strive for a balance between being efficient and quick and showing our reviewers the correct amount of information without overwhelming them with data. We must also consider the issue of data privacy. We do not want to show more than is necessary in the review.

We strive to give a response within 24 hours. When someone receives a response in that time, the issue has certainly been looked at by a person. Sometimes people believe it is too quick, that it must have been an automated process, but I assure the Deputy that there are real people looking at reports. When they examine the content, they evaluate it against the community standards which Ms Cummiskey described. If they find that there is a credible threat, bullying or dehumanising content, they will remove the content and inform the complainant of the action taken. They will also inform the person who posted the content and perhaps give him or her a warning or block him or her from sharing more photographs for 24 hours. If it is a very severe case, we may remove the account entirely. The impact on the person depends on the content posted, whether it was the first or perhaps the tenth instance, and the nature of the issue being reported.

On staff numbers, in almost eight years at Facebook I have never felt our resources are limited. We have always measured our success by achieving a response within the critical 24-hour period for reports or a few minutes in cases involving very critical content. We will continue to hire people for our teams in Dublin and elsewhere to continue to meet the expectations we have set for ourselves. If we believed we were falling short in one particular country or on a particular topic, we would continue to look at how we should deal with the issue.

Denise Mitchell (Dublin Bay North, Sinn Fein):

Therefore, staff turnover is constantly monitored.

Ms Julie de Bailliencourt:

Yes, absolutely. Our teams continually examine the numbers of reports that remain for review and the topics involved. There might be a sports event, for instance, that might lead to an increase in the number of reports; therefore, they will monitor whether there are events that are driving up or down the numbers of reports. We constantly make sure we have the correct number of staff examining and reviewing content and taking action. It is not based on people but the time spent in dealing with the matter.

I am glad that the Deputy brought up the matter of Facebook live. The announcement Mark Zuckerberg made in May was also related to Facebook live. It is a fantastic tool which, as the Deputy noted, is used by politicians and musicians, but we have seen instances where people have used it in very distressing situations. We have adapted our tool on suicide prevention to address the challenge presented by Facebook live. We have worked with global experts such as Dr. Dan Reidenberg in the United States to work out how to provide support for people who may be in distress and broadcasting live, and for the individual's friends and family. My view on the matter was changed by his observation that as long as the person concerned was broadcasting, there was a chance of rescuing him or her, since the Garda, friends and family had a chance to intervene by making their way to his or her location. We have adapted many of our tools to fit Facebook live. There are also reviewers who monitor Facebook live content that might become viral or gather many views. It is correct that we rely on reports, but many of the tools, particularly on suicide prevention, allow us to get within minutes to potentially credible videos such as the one mentioned by the Deputy.

The Deputy may have read an article posted in the United States in recent days which highlights our commitment to using artificial intelligence to help us to understand where we can take additional steps. We are in the early stages of considering what the technology can do, but combining the investment in technology and in people - we are talking about substantial investment in tens of thousands of people - will allow us to stay on top of these issues.

We retain media and have procedures, depending on the content involved. If, for instance, the police contact us in the context of a criminal investigation, we have specific retention policies for such cases. It is not the case that the content which is no longer available on Facebook is not available to the Garda, if it wants it. That happens in cases where Facebook has removed content after judging it to be abusive, but different policies apply where someone deletes a photograph because he or she did not want it on Facebook. In that case, the content is gone.

Ms Cummiskey explained very well our policy on fight videos. From my side of the house, we sometimes see young people sharing content or videos that they have recorded to raise awareness of practices in an establishment or another location in the world or to highlight a problem. Facebook is trying to balance, for instance, pages dedicated to the sharing of fight videos for the purpose of bullying with children who may wish to draw attention to an issue. On removing or deleting material, we have an additional mechanism to prevent young people from seeing very distressing content while still allowing adults to share it to raise awareness. For instance, we have a number of graphic warnings for specific types of shocking content. If we apply this graphic warning, the content is rendered invisible to young people.

Denise Mitchell (Dublin Bay North, Sinn Fein):

This is not about raising awareness of self-defence techniques or anything like it. We are talking about kids being jumped on on their way home from school where someone pulls out a camera to video the incident and then puts it up on Facebook. I do not mean to be disrespectful to Ms de Bailliencourt, but not only is it happening, it is also very common. I have been shown numerous examples by constituents, as I am sure has the Chairman and the other members of the committee. That is what I am concerned about. Such content should be removed automatically from Facebook because of the distress it causes to the victims, primarily, and their families. I hear what Ms de Bailliencourt is saying about the efforts being made to raise awareness, but that is not what is happening in this instance. It involves bullying and assaults on children.

Ms Julie de Bailliencourt:

We have spoken about context and nuance which are an important part of how people share different pieces of content. If a page or a group has been set up for the specific purpose of bullying others by sharing fight videos recorded in schools with comments pushing people towards engaging in that behaviour, we will simply remove it. We regularly receive reports which facilitate our taking down of such content.

Maire Devine (Sinn Fein):

I thank the delegates for their presentation. This is the world in which we live, which is sometimes unfortunate.

Facebook has 1 billion users. What proportion of them are children? At what age should a child be allowed to sign his or her life away to Facebook?

In addition to Facebook, Instagram and WhatsApp, what are the other social media sites? Facebook and other social media sites generally make a positive contribution to communities. Children's lives are being played out through social media, which often makes life miserable for them. Just last week, we read media reports on an inquest into the death of an 11-year-old child in Dublin. Self-harm and suicidal ideation are observable in primary schools, whereas they tended to be linked with secondary schools in the past. I believe social media are one of the major reasons suicidal ideation and self-harm are now observable in primary schools. The reason I have asked a question on age limits is that children of seven, eight, nine, ten or 11 years are immature and must be protected from misery. Unfortunately, we will not be able to stop social media. I dislike them and find them very boring. I do not want to use them, but I have to do so.

Catherine Noone (Fine Gael):

Does the Senator not want to know when someone had a meal?

Maire Devine (Sinn Fein):

No; I find social media boring. However, they are also necessary, which means that we need to manage them for children. I am interested in what happens when a vulnerable person posts online and refers to self-harm or ending his or her life. What is the practical response of Facebook in such cases? Does it get the Garda or families involved? How does it find family members of such persons? Will the delegates outline what occurs in such circumstances in order that we can understand the immediacy involved?

On the circumstances to which Deputy Kathleen Funchion referred, in such cases, does Facebook allow users to seek details other than those that may be available on a person's public profile? Can one find out the identity of the person involved? Will Facebook provide an IP address? Are such requests allowed? Deputy Kathleen Funchion spoke about an awful and horrific live feed on Facebook.

Facebook does not operate for the good of our health. Its purpose is to generate profit and it owns many social media companies. Google recently announced plans to recruit 10,000 people to tackle abusive content and Facebook plans to double its workforce. The collection of data is very important. While I do not understand how this is done, I am aware that data are collected to target goods and services at people based on their lifestyle. On the collection of children's data, does Facebook apply different rules for those aged under 18 years? I know that it does not collect data for the good or health of the nation. The profits of social media companies appear to be tied with getting to know what people might need and enticing them to buy services by promising them that certain products will make them wonderful and that they will have a great life if they buy them. I hope I have not rambled on.

Ms Julie de Bailliencourt:

There is a great deal to unpack from the Senator's contribution. I will start with her question on the collection of data.

The service we provide is free and we do not allow anyone under the age of 13 years to register on our site. We very swiftly remove the account of anybody who may be under that age. To provide what we believe is a good and relevant experience, we use advertising. The Senator is correct that we would like users to see only the most relevant advertising. If a user has no interest in a specific topic, we do not want him or her to be shown anything related to it. We want the quality of the advertising that we show users to be relevant and of interest to them. We use the information in order that advertisers can create campaigns on Facebook. A woman living in a specific area may have an interest in a specific topic and she will be shown a particular advertisement. The purpose is to have high quality, relevant advertising shown to users.

Ms Cummiskey referred to our community standards for normal content. We have much stricter advertising policies and a subset of policies for children. Children cannot be shown any advertising on a number of topics on Facebook. They are related to alcohol, tobacco and a number of other things. It is in the interests of the brands that operate on Facebook - they are usually very well known big brands - to ensure their advertisements are not shown to this audience. All advertisements are also reviewed. We are very cognisant that we do not want to show people under the age of 18 years advertising that would be illegal or inappropriate.
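As a rough illustration of the age-based advertising restrictions Ms de Bailliencourt describes, the sketch below shows how an ad-eligibility check might gate restricted categories for under-18s and exclude under-13s altogether. The category list is an assumption rather than Facebook's actual policy set; alcohol and tobacco are the examples given in the testimony, and "gambling" is added purely for illustration.

```python
# Illustrative only: the real advertising policy categories and rules are not
# reproduced here in full.
RESTRICTED_FOR_MINORS = {"alcohol", "tobacco", "gambling"}  # 'gambling' is an assumed extra category
MINIMUM_ACCOUNT_AGE = 13  # under-13s cannot register on the service at all

def can_show_ad(user_age: int, ad_category: str) -> bool:
    """Return True if an ad in ad_category may be shown to a user of user_age."""
    if user_age < MINIMUM_ACCOUNT_AGE:
        return False  # no account, no ads
    if user_age < 18 and ad_category in RESTRICTED_FOR_MINORS:
        return False  # category restricted for under-18s
    return True

print(can_show_ad(15, "alcohol"))   # False
print(can_show_ad(15, "clothing"))  # True
```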

I will speak briefly about the self-harm process.

Ms Niamh Sweeney:

If I may, I will comment on the same issue before the conversation moves on in case we do not get a chance to return to it.

Everything Ms de Bailliencourt said is correct. I also stress that we have extensive controls in place for advertising. While an advertisement will be served to a user based on some of his or her preferences or what has been worked out as being most relevant to him or her, users can opt out of this feature, which is known as online behavioural advertising. We allow users to control the things they would like to be served or they can opt out of it altogether. Opting out does not mean that a user will not be served an advertisement but that the advertisement will not be based on any of the user's online activity or preferences. I want to make clear that this opt-out feature is available to everybody and that there are elements that are within each user's control.

Facebook is governed by the Data Protection Directive, which will be replaced by the General Data Protection Regulation in May 2018. We are working hard to ensure we will be ready for the changes. We are working very closely with the regulator on the issue.

Ms Julie de Bailliencourt:

On mental well-being and self-harm, if someone sees in his or her newsfeed a piece of content in which a friend says she is struggling and finding life difficult, a number of options are available. People regularly report this type of content to us. It is a small portion of the content we see, but we have been working hard for more than 11 years to make sure we have the right support system in place. We are not experts in this field, but we have been working with experts, more than 70 organisations across the world, to ensure we approach this issue in a sensitive manner and that we provide the friend who may see the content and the person who places it online with supportive resources and practical tips. For the person who reports the content, it may be something like encouraging him or her to connect with the person directly by message or telephone. We have carried out extensive research on this issue and heard from many families who have been affected by suicide and people with a lived experience. They have told us that the number one thing that could have made a significant difference to them at the moment they were struggling was hearing from a loved one. That is where Facebook can be helpful by suggesting to the person who sees the content that he or she start a private conversation with the person who wrote it, reach out to another friend to discuss the issue or call the emergency services if he or she believes something is imminent. People report content to us and, as I indicated, we have teams who will review it. This is the type of content that is extremely important for us to review as quickly as possible and we strive to deal with it within minutes. Our team will review it and show the person some educational supportive messaging. We work extensively with local organisations. In Ireland, for example, we will say, "Niamh, one of your friends is concerned about you; here are some things you may find useful." We also provide information on Jigsaw, Pieta House and Samaritans to try to jolt the person into action or realising there are local listening services that could help. In some cases, providing resources is good but not sufficient and we need to move even faster. There are instances where if our team evaluates that the risk of harm is imminent and credible, they will be escalated to An Garda Síochána.

We had a case in the past------

Ms Niamh Sweeney:

Within the past 12 months, it was reported to us that a young user of the platform had expressed a desire to self-harm. We contacted the Garda within minutes, which reached out and was able to intervene.

Ms Julie de Bailliencourt:

In such cases we are able to provide information to help locate the person and, it is to be hoped, facilitate a welfare check on him or her. I wish to highlight that we are very invested in the discussion on social media and well-being and the company is dedicated to contributing to it. Facebook, along with many other organisations, was part of the task force on youth mental well-being set up by the former Minister of State at the Department of Health, Deputy McEntee. It is heartwarming for us to see that young people who may be in vulnerable communities, possibly in rural areas or the LGBTIQ or Traveller communities or any community that may be targeted or vulnerable, have a unique way of connecting online. A landscape of support groups from which people can find support and like-minded people who may be able to help them at any time of the day is available through Facebook and other services. We know from talking to many such groups, including young people, that with additional resources, they can find immediate support and people to chat to about topics that are weighing on their minds. Parents are concerned and may have questions about what happens but there is a huge ecosystem of support groups at a local level discussing these very important topics. In particular, social media is a fantastic way for vulnerable and possibly lonely people in rural areas to connect at any time of the day with people who may support them.

Ms Niamh Sweeney:

Senator Devine asked about the details of people accused of defamation or harassment. That is a classic example of where we would work with the courts. A Norwich Pharmacal order may be issued in such a case. The Law Reform Commission has made some proposals on modifying how that works in order to streamline the process. Members may be aware of court cases reported in the newspapers regarding defamation that took place on our platform. In such cases, if the defendant was operating online under a name that does not represent his or her true identity, a court will issue an order for us to reveal that identity to the person taking the case. We must protect the privacy and personal information of our users unless official proceedings are in being and a direction is made by a court. We protect everybody's information until otherwise directed.

Photo of Maire DevineMaire Devine (Sinn Fein)
Link to this: Individually | In context | Oireachtas source

Facebook-free day started last year and I hope it grows.

Photo of Catherine NooneCatherine Noone (Fine Gael)
Link to this: Individually | In context | Oireachtas source

It is important to recognise that the headquarters of Facebook is in Ireland and we very much appreciate the employment it provides here. It is great to have three women here to represent the company. That is an unusual sight and I wish to acknowledge it. There is so much I could say about this issue that I am in danger of going on all day and so I will try to break it up into three sections.

Facebook, Twitter, Instagram and Snapchat have a huge responsibility to children in particular. Young people's feelings of self-worth are of great importance. There is a limit to what those companies can do to control that and I appreciate what the witnesses have said about balance and so on. Many recent studies, which I do not have with me but which I am familiar with and which are on my website, show that mental health difficulties as a result of a variety of things that happen on social media are real and very serious. Forgive me if the matter was addressed before I arrived but is there space in the Facebook budget, from its very extensive advertising revenue, to promote positive mental health initiatives? That would be a very positive step for the company to take to counter the negative effects of social media and its negative perception in that regard.

Ms Niamh Sweeney:

Sorry, I was expecting a follow-up question. Ms de Bailliencourt may also speak to this issue but there are a number of things we currently do. We often find it better to work with organisations that are better placed to roll out initiatives such as that described by Senator Noone.

Photo of Catherine NooneCatherine Noone (Fine Gael)
Link to this: Individually | In context | Oireachtas source

That is what I mean.

Ms Niamh Sweeney:

We support the Diana Award in the provision of its anti-bullying ambassador programme here. In recent years, it has trained over 1,500 students in 200 schools all over the island of Ireland. Ms de Bailliencourt mentioned that peer-to-peer programmes, which we support, work best in this area, as indicated by most relevant research. We do much work with many local organisations such as the company responsible for the SpunOut.ie website, with which we have partnered on the creation of a resource called "Think before you share". We provide most of the input for that. We have partnered with those behind the Webwise.ie website on initiatives such as Safer Internet Day. The focus for 2017 was on parents and resources directed at helping them were created. We have also created a parents' portal. We partner with Webwise because, given its remit under the Department of Education and Skills, it is better placed to liaise directly with educators. We work very closely with such organisations and sometimes the type of support we can give is------

Photo of Catherine NooneCatherine Noone (Fine Gael)
Link to this: Individually | In context | Oireachtas source

Does Ms Sweeney think Facebook could do more in that regard?

Ms Niamh Sweeney:

We can always do more.

Photo of Catherine NooneCatherine Noone (Fine Gael)
Link to this: Individually | In context | Oireachtas source

If the witnesses take anything from the meeting, let it be that that must be a primary focus for the company. It has a huge duty of care to people in general and children in particular.

I wanted to come back to the point about women. I am sorry to have to use foul language at the committee. Do the witnesses think it acceptable for a public or private figure to be sent a message saying the sender hopes the receiver is raped in a particular way that I do not want to say? Regardless of freedom of speech, of which I am acutely aware, do they think it acceptable for personal comments to be made that are viciously offensive to public representatives?

Ms Niamh Sweeney:

We have zero tolerance for any jokes, comments or threats relating to rape. Ms Cummiskey may wish to speak to that issue.

Ms Siobhán Cummiskey:

We do not think that is acceptable or okay. As Ms Sweeney mentioned, we do not allow any kind of jokes or threats relating to rape.

Photo of Catherine NooneCatherine Noone (Fine Gael)
Link to this: Individually | In context | Oireachtas source

Deputy Funchion articulated this very well and I will not go over ground she covered but do the witnesses accept that people in public life, such as the members here today, are busy people and are inundated with negativity on social media? If we do not get a response to reporting issues on several occasions, why would we bother doing so again? We have more things to do. It should be policed in a better manner. I do not think it is acceptable. We are leaders and it should start from the top down. It is not made acceptable by the fact that politicians are very unfairly and regularly maligned. I appreciate Facebook is grappling with freedom of speech but more has to be done on this issue. We cannot legislate for it because of freedom of speech. It is not reasonable for us to suggest we will protect ourselves in that way. The witnesses have spoken about mental health. What about the mental health of those who work in this space? Do the witnesses have anything to say in that regard?

Ms Niamh Sweeney:

I understand. As Ms de Bailliencourt highlighted, women receive more such abuse. Our difficulty is that people can curse online and use much vocabulary that others do not find palatable or polite. As I mentioned, we try to create controls. I am in a difficult position in terms of discussing the difficulties experienced by public figures as opposed to private individuals because the right to criticise Government policy is directly captured in the Constitution. People may not elevate that to a particularly erudite level.

Photo of Catherine NooneCatherine Noone (Fine Gael)
Link to this: Individually | In context | Oireachtas source

Must we have a society in which such comments are propagated on social media?

Quite simply, I find it completely unacceptable. We have lost the run of ourselves on social media given that we get away with using that kind of language. Facebook is the platform and it must do something about it.

Ms Niamh Sweeney:

It is very difficult to deal with the issue of language because people have the right to offend, shock and disturb. One of the interesting things that came out of previous hearings, and we would be very much aligned with most of the contributors in this sense, is that pre-emptive measures need to be taken. Professor Connell put it very well when he was discussing what the law can do in this space: there are limits to what the law can achieve. There has been a huge focus on education programmes, particularly in the discussion with those from Webwise, which has a particular remit here. Many of the proposals made by the Law Reform Commission and in the 2014 report by the Internet content governance advisory group, which was formed under the then Minister, Mr. Pat Rabbitte, have focused on the measures that need to be taken around digital citizenship and education in schools. We support all of that. The horse has bolted in many of the cases the Senator described but we are limited in what we can do because people can curse online. That is different.

Photo of Alan FarrellAlan Farrell (Dublin Fingal, Fine Gael)
Link to this: Individually | In context | Oireachtas source

Before the next witness comes in, I ask the committee to agree to Deputy Neville taking the Chair for a couple of minutes.

Deputy Tom Neville took the Chair.

Ms Julie de Bailliencourt:

To address Senator Noone's question on women's safety and to add to Ms Sweeney's comments, our policies are defined in such a way that there is a wealth of content we will not allow, even for public figures. As Ms Cummiskey mentioned, jokes or threats about rape or anything that would reveal-----

Photo of Catherine NooneCatherine Noone (Fine Gael)
Link to this: Individually | In context | Oireachtas source

As soon as they go up, do they disappear?

Ms Julie de Bailliencourt:

One would need to report it to us. Where we can do better-----

Photo of Catherine NooneCatherine Noone (Fine Gael)
Link to this: Individually | In context | Oireachtas source

I have to interject. Does Ms de Bailliencourt think that I have time to report everything on my feed? I accept it is mostly on Twitter and not Facebook but this is a social media issue. I am sorry if I am perhaps a little oversensitive about this at the moment, although I do not take it to heart as much as another person might as I have become so used to it. However, I hardly have time to wash myself at the moment, never mind time to report stuff that happens on social platforms. We need more protection.

Ms Julie de Bailliencourt:

I was getting to this point. In the course of the work we have been doing on women's safety, we have spoken to many different groups in many countries to understand the unique challenges they face. Women in politics and the public eye are in a unique situation which requires unique tools that we have yet to build. I guarantee the Senator, however, that we are working on these. We are trying to understand how we would allow someone in the Senator's position to have a presence online without having to report absolutely everything by providing the tools that would mitigate the experience.

Photo of Catherine NooneCatherine Noone (Fine Gael)
Link to this: Individually | In context | Oireachtas source

I do not want to be too much of a drama queen. I can mute someone who keeps at this and we have to be reasonable in our discourse. I ignore a lot of it. However, sometimes it does get to a person. The strongest person in the world would find it hard to have that kind of stuff written about them.

Ms Niamh Sweeney:

Ms Cummiskey mentioned that context is of utmost importance when it comes to certain types of bullying. I should add that no context is needed here, though. Another thing is that it is not on the Senator alone to report it. Anyone can report it because it would be an egregious and flagrant violation of our policies. Other people could be chiming in here and a conversation probably needs to be had about it. In terms of monitoring content, the notice-and-take-down legal regime is set out under the e-commerce directive. For various reasons, there is a prohibition on general monitoring. Some of those reasons concern protecting the privacy and personal information of individuals. Other parts of it are designed to let the Internet flourish. If there were an obligation to monitor everything that is put up, we would not see services such as this one.

Photo of Catherine NooneCatherine Noone (Fine Gael)
Link to this: Individually | In context | Oireachtas source

There should be buzzwords.

Ms Niamh Sweeney:

There was an effort to strike that balance. At the same time, the European Commission has spoken more recently about using artificial intelligence to tackle illegal content online, which should not undermine the prohibition on monitoring. Again, it is trying to strike the balance between those two things. I want to be absolutely clear that anyone can report threats, jokes or passing comments about rape. No additional context is needed and it will come down.

Photo of Catherine NooneCatherine Noone (Fine Gael)
Link to this: Individually | In context | Oireachtas source

I appreciate that. I wish to raise a few final things although I know I have taken up a bit of time, for which I apologise.

We know that Big Brother is watching us. I get a lot of fashion stuff on my feed, whether I am on Twitter or Facebook, and I enjoy seeing it. Unfortunately, often I am a consumer. I cannot remember which witness made the point that it is in the interest of many companies to ensure that the rules on advertising to children are observed. I do not buy that. The effect of targeting children at a young age, approximately ten to 13 or 15 years of age, is well proven in the tobacco and alcohol industries. The advertisers target them in a nuanced way and make their product attractive. Tiger Woods might be a bad example nowadays but they show a cool person, for instance, Ronaldo, having a Coke. That is more attractive. It is proven that money goes back to that individual for every like. This applies to alcohol and sugary food etc., which I go on about, and get slagged about, all the time. However, I just do not buy that they are not targeting children. In so far as it is technically possible, the same technology that is used to actively identify people online should be used to avoid targeting children online.

Ms Niamh Sweeney:

I will let Ms de Bailliencourt handle the alcohol and tobacco issue.

Ms Julie de Bailliencourt:

One would be surprised how many of these brands have an interest from a legal standpoint and a best business practice standpoint-----

Ms Julie de Bailliencourt:

-----to ensure that the advertisements are only seen by the right audience. As tight as our community standards are, our advertising policies are even tighter. When it comes to targeting children, they are tighter again. It does not make good business sense for us or them to show any unlawful or inappropriate advertising to children. We have a strict review process. Advertising campaigns can be reported in the same way as content is reported. Things will be reviewed and removed------

Photo of Catherine NooneCatherine Noone (Fine Gael)
Link to this: Individually | In context | Oireachtas source

I am sorry to cut off the witness but I will ask two tiny questions. Then I will be finished.

Deputy Alan Farrell resumed the Chair.

Ms Siobhán Cummiskey:

I was just going to say that we also build that into our functionality. This is one of the things we do very well. If a firm is going to advertise a product, it must state what it is going to advertise in a drop-down menu. We then make it technically impossible for the firm to target the advertisement at children. This demonstrates one way in which the online space is really good. A child can walk down the road and see a billboard with cigarettes or alcohol on it. However, in the online space, we can build it into our technical functionality that advertisers cannot target children once they have stated the product is tobacco or alcohol.
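To make the mechanism Ms Cummiskey describes concrete, the following is a minimal illustrative sketch, not Facebook's actual ad system: once an advertiser declares a restricted product category from a drop-down, the review step forces an age floor on the campaign's audience targeting. The category names, age thresholds and function names are assumptions for the example.

```python
# Hypothetical sketch of category-based ad targeting restrictions.
from dataclasses import dataclass

# Assumed restricted categories and minimum audience ages (illustration only).
AGE_RESTRICTED = {"alcohol": 18, "tobacco": 18, "gambling": 18}

@dataclass
class Campaign:
    declared_category: str   # value the advertiser chose from the drop-down
    min_audience_age: int    # youngest age the advertiser asked to target

def enforce_age_floor(campaign: Campaign) -> Campaign:
    """Raise the audience age floor so restricted ads cannot reach children."""
    floor = AGE_RESTRICTED.get(campaign.declared_category.lower())
    if floor is not None and campaign.min_audience_age < floor:
        # Targeting under-18s becomes technically impossible for this category.
        campaign.min_audience_age = floor
    return campaign

if __name__ == "__main__":
    ad = Campaign(declared_category="Alcohol", min_audience_age=13)
    print(enforce_age_floor(ad))  # min_audience_age is forced up to 18
```

The design choice illustrated is that the restriction is applied automatically at review time, rather than relying on the advertiser to set an appropriate audience.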

Photo of Catherine NooneCatherine Noone (Fine Gael)
Link to this: Individually | In context | Oireachtas source

Is there any legislation that policy makers should consider in this area? That is the first of the last two things. Second, do we have to move to a situation where the default position would be mandatory IP filtering? Do we have to do that to protect our citizens?

Ms Niamh Sweeney:

On the first point about legislation, last year the Law Reform Commission made several recommendations with respect to updating criminal offences in some of the areas upon which we have touched. We are supportive of those, particularly around the non-consensual sharing of intimate images, otherwise known as revenge porn, online harassment, upskirting, and all those things.

Photo of Catherine NooneCatherine Noone (Fine Gael)
Link to this: Individually | In context | Oireachtas source

I have done some work on it.

Ms Niamh Sweeney:

We think they are very positive proposals. As a private individual, I would like to see them advanced. I am not quite sure what the Senator meant by IP filtering.

Photo of Catherine NooneCatherine Noone (Fine Gael)
Link to this: Individually | In context | Oireachtas source

Basically, Cisco has a product called OpenDNS. I apologise to the Chair but this is definitely my last comment on this. Germany has a policy whereby all schools, and possibly all homes, have an opt-out system through the telecommunications companies. The default position is that everyone is filtered from offensive material. A person can go in, by all means, and opt out of whatever he or she wants. However, the default position is that users, and children in particular, are protected. Parents, for instance, may not have the time to focus on it. Does Ms Sweeney know what I mean?
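As a rough illustration of the default-on filtering model the Senator describes, here is a minimal sketch under assumed names. It is not OpenDNS's real API or Germany's actual scheme; it only shows the opt-out principle that everything is filtered by default unless a household explicitly opts out.

```python
# Hypothetical sketch: filtering is the default, opting out is explicit.
BLOCKLIST = {"adult-content.example", "gambling.example"}  # assumed filtered domains
opted_out_households: set[str] = set()  # household IDs that chose unfiltered access

def resolve(household_id: str, domain: str) -> str:
    """Block listed domains unless the household has explicitly opted out."""
    if household_id not in opted_out_households and domain in BLOCKLIST:
        return "BLOCKED"            # default position: users are protected
    return f"RESOLVED {domain}"     # normal DNS resolution would happen here

opted_out_households.add("household-42")                  # explicit opt-out
print(resolve("household-7", "adult-content.example"))    # BLOCKED (default)
print(resolve("household-42", "adult-content.example"))   # RESOLVED (opted out)
```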

Ms Niamh Sweeney:

I am not sure I have a meaningful opinion to offer on it or that my view would count for much. However, I understand the principle involved and the motivation behind it.

Photo of Catherine NooneCatherine Noone (Fine Gael)
Link to this: Individually | In context | Oireachtas source

It would be great. To be honest, it is the way forward.

Photo of Anne RabbitteAnne Rabbitte (Galway East, Fianna Fail)
Link to this: Individually | In context | Oireachtas source

I thank all who have come here this morning. A lot of my questions have been asked, including one on advertising and how it relates to children. I will focus on what has come up in recent weeks. We have had Dr. Geoffrey Shannon before the committee, and we have heard from various bodies such as Webwise, all of which Facebook said it is working with.

I note a key recommendation in the 2014 report of the Internet content governance advisory group suggested that Internet safety and digital literacy skills be taught as a core element of the curriculum at both primary and post-primary levels. These recommendations were echoed by Dr. Geoffrey Shannon. I would love to hear the witnesses' views on a digital safety commissioner or how one would go about establishing that office.

It is also important for those looking at this meeting to note that the witnesses did not come empty-handed. They came with their anti-bullying activism on Facebook. They also brought us Safety@Facebook. These are fantastic tools for the public, be it the mammies, daddies, teachers or whoever is watching, and for them to realise that there are tools to assist them.

I will ask all my questions together and then the witnesses can come back in. A few minutes ago, Ms Sweeney touched on the Diana award. I would have been a huge follower of Diana and I went straight in to see what that was about. It was to show us the 200 schools which participated. It is a peer-to-peer grouping. When I read through it, I saw Newbridge College and Sharnbrook Upper School there. I would love Ms Sweeney to explain how anti-bullying activism works. In one of the comments, the teacher in Newbridge stated:

I think some school staff can sometimes be a bit dubious about using Facebook in an educational setting as they are nervous about using Facebook to communicate with students. However using a peer-led scheme like this one is a great way to reach students, considering the amount of time they spend online these days.

I ask Ms Sweeney to explain what this is all about. Maybe there are more schools that could buy in to this process. The children are on social platforms but if we could educate them better, it would be fantastic.

In her detailed report, Ms Sweeney also speaks about everybody she works with in the teams on Facebook and the teams on advertising. Ms Sweeney has already referenced the ISPCC. She referenced SpunOut, Jigsaw and everything else. However, there was one glaring omission, namely, the Department of Education and Skills. I need to know whether Facebook is working with the Department of Education and Skills, whether it plans to do so in the future, and how Facebook feeds into its policy. Has Facebook met the Department in the past 12 months?

Another matter that arose from Dr. Geoffrey Shannon was the takedown policy. Perhaps Ms Sweeney could tell us about the takedown policy, particularly for teen photographs. Ms Sweeney might have addressed it already. She might tell us specifically how they can go about getting it taken down. Parents would also want to know that for their children. They need to be the support for the child who has found himself or herself in that position. It is so that everyone is aware and educated.

Believe it or not, Facebook is one of our best assets and I will tell the representatives why. Since I was elected to Dáil Éireann, something I give out about regularly is the poor ICT information available where I am gathering data. Facebook probably has all of that information on the age profile of people and what they are looking at. Facebook has a whole background team. It could work well with the various Departments, such as the Department of Children and Youth Affairs, the Department of Communications, Climate Action and Environment and the Department of Justice and Equality. Facebook can give us that information in order that we can develop policies and build on the legislation that is coming through, because we do not have all that information. That is where Facebook is one of our best assets. We need to access Facebook from that point of view. If Facebook is in a position to continue working with the various Departments, it would be appreciated.

Ms Niamh Sweeney:

We might split this between us. I will start with the Internet content governance advisory group's report and the more recent proposals from the Law Reform Commission. There is a great deal in the 2014 report that makes a lot of sense, especially around the creation of the multi-stakeholder forum. It was Professor O'Neill who spoke to the research that is being carried out at the European level. That really speaks to the success of the multi-stakeholder forum in addressing many of these issues, particularly because it brings all of the main people into the same room for an ongoing conversation rather than creating a static system that cannot adapt and evolve as the challenge moves on. Certainly, Ms de Bailliencourt is well placed to speak to our involvement in all of the European initiatives that have been spearheaded by the European Commission over the years, especially in the past six or seven years.

With respect to the Law Reform Commission's proposals around the digital safety commissioner, there is much that is positive in them as well. Obviously, we are fully supportive of the proposals around the new criminal offences. I have the report here. There is a huge amount in it around education, which is unusual for the Law Reform Commission. It usually has only one tool in its box, which is to make statutory proposals or not. The Law Reform Commission has very much focused on the creation of an office that would focus on supporting the preparation and publication of material by the Ombudsman for Children, co-ordination of the activities of Departments and other public bodies relating to digital safety, and supporting, encouraging, conducting and evaluating research about digital safety. There is a whole host of other points there. The LRC very much echoes what was contained in the 2014 report that was published by the Internet content governance advisory group.

With respect to the Department of Education and Skills, Webwise would be our main interlocutor there. Obviously, it is the professional development service for teachers under the auspices of the Department of Education and Skills. We would deal with it regularly. We held an event last week and Ms Jane McGarrigle kindly came and spoke at it for us. We co-ordinate most of our activities with it. We are open to invitations to come in and talk to anybody. We would love to be a resource for Departments in that sense. I suppose that would be reflected in our involvement in the mental health task force that was created by the Minister of State, Deputy McEntee, and that recently completed its work under the Minister of State, Deputy Jim Daly. Ms de Bailliencourt sat on the task force for Facebook Ireland.

Those are probably the main points that relate to my area. Did Ms de Bailliencourt want to talk about the Diana award?

Ms Julie de Bailliencourt:

The Diana award has been a fantastic partner. The peer-to-peer education is something that we identified early on as being effective to share best practices. Simply put, young people do not want to hear from us. What their friend will tell them about how to approach the online world works far better than us pushing out education messages, which we do anyway. However, we know that this peer-to-peer education works really well.

There are two sides. There is the training that the Diana Award does in schools, where it will train some staff members and some students to become those anti-bullying ambassadors. This applies to best practices, online and offline. It is not only about the online world. There are a lot of videos and items that we will share with the members later on to explain how it works. Some of those bring it to life far better than anything I could tell the Deputy. Basically, it is about empowering children to speak up, not be a bystander, and not be afraid of speaking to someone they trust, which would usually be a peer. This is hugely important.

We have seen amazing things coming out of the schools about the Diana award work, and it involves children from the age of five onwards. It is not a 13 years plus initiative. It can start very early. This is a project that we were involved with. Obviously, it uses Facebook as an educational tool and as a resource for teachers. The message is important. It is that young people love technology and will really latch onto it. Our goal here is to empower parents and teachers not to be afraid of the technology and to go where their children are and use this as a tool.

We believe that technology is neutral. One gets what one puts into it. In those cases, we are here to highlight how one can use a tool, such as Facebook, to achieve the same goal, which is to push a positive message and engage the community. I am touching on what Senator Noone mentioned earlier - positive messages around not being a bystander, speaking up when one faces issues, and speaking up when one faces mental well-being troubles. There is still huge stigma in the country and beyond on those topics, and having young people saying it is not cool to behave in a particular way and that they are going to reach out to a friend to help solve the situation is powerful. We are very much behind any initiative that would replicate this.

The Diana award is one of the partners we have worked with and those involved in it have been amazing.

Ms Siobhán Cummiskey:

On the issue of takedowns, every piece of information on Facebook can be reported to us. If a child or parent reports content he or she is concerned about where the child is depicted, including the videos we discussed earlier, we will remove it. We also work with trusted partners and have a trusted partner channel that is very important. It partners with local non-governmental organisations which give us a lot of information on content they are concerned about that may not have been removed or give input on policies they think could be better. We use that information to help us improve the policies and takedowns.

Photo of Anne RabbitteAnne Rabbitte (Galway East, Fianna Fail)
Link to this: Individually | In context | Oireachtas source

Two community schools, Coláiste an Chreagáin and Cistercian College Roscrea, on the brink of closure used Facebook to save the schools by positive messaging and getting people in behind them. There is a way to get a positive message out and that has to be done.

On women in politics, I have a simple philosophy. I block and I drop. If someone says something negative about me, they go. Facebook is for friends. A person sends a Facebook friend request. If that turns negative, the person is no longer a friend.

Photo of Tom NevilleTom Neville (Limerick County, Fine Gael)
Link to this: Individually | In context | Oireachtas source

I welcome the witnesses and acknowledge that a private organisation has given time to come in here. Most of the points I wanted to raise have been raised. I have met Facebook and been taken through its anti-bullying policy and protection. I want simply to make a statement. This is an evolving new industry in which everyone is learning, particularly the employees at Facebook. We do not have any benchmark because social media is only ten years old, has ballooned in a matter of 18 months or two years and we are trying to catch up with it. It influences behaviour, negative as well as positive.

I love social media. It has brought a phenomenal benefit to people. I have seen many community groups using Facebook for lip syncs, "Strictly Come Dancing" events and fair day notifications. I was at a fair last week, having got the notification on Facebook. It is a tool that people can use free of charge. There is advertising involved, but that is opt-in. To go back to basics, I am the director of a crèche. We have advertised all our fundraising activities and meetings on Facebook. It has many positive aspects. The negatives are magnified because of their results. As a culture and a country, we are grappling with how social media is starting to fragment as it matures. Some social media brands attract a certain cohort. Facebook is different because it has a broad cohort, but that will get narrower as competition grows.

It is for Facebook to take on board what has been said at this committee meeting. The witnesses mentioned artificial intelligence and the use of algorithms in trying to block the most extreme things that arise on Facebook live, through videos or in respect of self-harm or heinous crime. I understand complaints are sent to the team which has to monitor it, make a decision and act on it.

It takes more time to research the subtle material, and that probably needs more human interaction to decide whether something is bullying as opposed to free speech and to identify who the stakeholders and the victims are. Can the research and development team work on algorithms to catch the extreme cases Senator Noone mentioned the minute they go up, pop them into a filter and then adjudicate on them? It could use artificial intelligence, because the algorithm learns as it goes.

I am happy Facebook has chosen Ireland as its European headquarters because of the number of jobs it provides. It is a learning organisation and the average employee would be more focused on youth culture. I only know about the algorithms through talking to people who work there and are interested.

We invited the witnesses here not to discuss our personal situations but to talk about children and youth affairs. It might be helpful if the committee could produce a report based on this meeting to send action points to social media providers, including Snapchat, Twitter and other platforms. Young people are moving to other platforms because they do not want their grannies or their mothers seeing what they are doing.

Ms Niamh Sweeney:

We are investing heavily in artificial intelligence. There are limits to what it can achieve. We see great potential there, especially for tackling illegal content online. It is a bit tricky in some of the examples the Deputy gave, such as child exploitation imagery or images that were shared non-consensually. For that we use photograph matching technology which can prevent their re-upload. We would like to spread the message today that if somebody is threatening or blackmailing a person with sharing a photo, the person should contact us. If we bank that image, we can prevent it from being uploaded. We can do that repeatedly.

It is harder for us to identify those images using artificial intelligence. We are working on that but the distinction between an image of somebody sunbathing and pornography is difficult for a machine to draw. We are investing a lot of time in it. In respect of text, there has been heavy investment in artificial intelligence to combat terrorism. One of the challenges we face is trying to surface material that promotes terrorism as opposed to material highlighting the negative effects of terrorism or news content about terrorism. We would surface many false positives if we were to filter for certain words or phrases. In the past ten days we have released some figures on the content that we have managed to remove before it is ever flagged, because we have invested in this. There are, however, limits on what can be achieved. For example, a news item in which somebody condemns rape and a post in which somebody threatens it would all surface at the same time. That is why we are heavily dependent on user reports and we strongly encourage people to take action where they can.
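By way of illustration of the photograph-matching approach Ms Sweeney describes, the sketch below shows the general idea of "banking" a hash of a reported image and checking uploads against it. Real systems rely on perceptual hashes that survive resizing and cropping; the exact-match SHA-256 digest here is a deliberate simplification, and the function names are assumptions rather than any real Facebook API.

```python
# Hypothetical sketch of banking an image hash so re-uploads can be blocked.
import hashlib

banked_hashes: set[str] = set()  # digests of images reported to the platform

def bank_image(image_bytes: bytes) -> None:
    """Store the digest of a reported image; the image itself is not retained."""
    banked_hashes.add(hashlib.sha256(image_bytes).hexdigest())

def allow_upload(image_bytes: bytes) -> bool:
    """Reject any upload whose digest matches a previously banked image."""
    return hashlib.sha256(image_bytes).hexdigest() not in banked_hashes

reported = b"...raw bytes of the reported photo..."
bank_image(reported)
print(allow_upload(reported))            # False: re-upload of the same file is blocked
print(allow_upload(b"different photo"))  # True: unrelated content is unaffected
```

The point of the design is that only the digest needs to be stored, so matching can happen at upload time without keeping or re-examining the harmful image itself.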

Photo of Alan FarrellAlan Farrell (Dublin Fingal, Fine Gael)
Link to this: Individually | In context | Oireachtas source

I thank the witnesses. While I will do my best to focus on the committee's hearings on children, youth affairs and cybersecurity, fake news is a big issue for social media platforms, including the witnesses' own platform, although it is not as organised here as the much-written-about events of November 2016 in the United States.

I am coming at this from the angle of the sort of incendiary article which can pop up, written by individuals targeting younger people or particular groups within a community or a school. The biggest problem the witnesses have identified, and that I can clearly see, is understanding whether something is being put out there to highlight wrongdoing or to incite wrongdoing. Ms Cummiskey's example was a good means of illustrating the difficulty that arises when artificial intelligence, apps or bots present certain information as a page when in fact it has a duplicitous meaning and outcome. She mentioned attempts to tackle that across her platform. I have a question on the sort of resources being put into that issue, particularly given that Ms Cummiskey knows the demographics. Deputy Rabbitte raised an interesting point about the demographics using the platform. How many resource hours are being put into the development of schemes to prevent individuals being targeted, either by nefarious fake news providers or by a bully in a classroom, school or community?

Ms Niamh Sweeney:

Is the Chairman referring to hate speech? When he refers to incendiary articles does he mean incitement to violence and that kind of thing?

Photo of Alan FarrellAlan Farrell (Dublin Fingal, Fine Gael)
Link to this: Individually | In context | Oireachtas source

It is not necessarily hate speech, unfortunately. It can be more insidious than that. It can be poking fun at or mocking something or someone or it can be an issue of so-called false positives, as Ms Sweeney has mentioned a couple of times.

Ms Niamh Sweeney:

Is the Chairman asking about such things masquerading as news articles?

Photo of Alan FarrellAlan Farrell (Dublin Fingal, Fine Gael)
Link to this: Individually | In context | Oireachtas source

Not just news articles. I am sorry. Perhaps Ms Sweeney missed the nuance. I was referencing fake news, not specifically asking about it. I am referring to that sort of mimicry and the suggestion within a post on Facebook that the poster means something which they clearly do not, which is taken up as a negative and is therefore a form of bullying in itself. My question was about the sorts of resources Facebook is putting into developing algorithms, bots or human interaction to deal with such posts. Is it targeting the demographics? Is it actually looking at or focusing in on younger people in particular, given their vulnerabilities?

Ms Niamh Sweeney:

Most of our efforts in the area of artificial intelligence are focused on types of behaviour and types of accounts rather than a particular demographic. For example, as I said at the outset, fake accounts tend to be behind a lot of the kind of behaviour that violates our policies. We have tried to introduce more automated detection in the area of fake accounts because, generally speaking, in certain instances at least, they tend to display the same types of patterns. Once those patterns are detected, finding one fake account can lead to fanning out and finding others. There is no specific targeting of certain demographics or profiling based on nationality, location, age or things like that. It is more about the types of behavioural activity. It is hard for me to comment on the type of post the Chairman is referencing without seeing a specific example. If it is captured by our bullying policies which, as Ms Cummiskey has said, are really expansive, it would come down based on a report which would highlight one of those particular policies. Perhaps it would be best to follow up afterwards. I am kind of literal sometimes and need to see something to understand it.
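As an illustration of the fan-out idea Ms Sweeney mentions, the sketch below flags accounts that share behavioural signals with one confirmed fake account. The signal names and data are invented for the example, and flagged accounts would be candidates for human review, not automatic removal.

```python
# Hypothetical sketch: fan out from one confirmed fake account via shared signals.
from collections import defaultdict

# Assumed accounts and signals (signup IP, device fingerprint) for illustration.
accounts = {
    "acct_a": {"signup_ip": "10.0.0.1", "device": "dev1"},
    "acct_b": {"signup_ip": "10.0.0.1", "device": "dev2"},  # shares an IP with acct_a
    "acct_c": {"signup_ip": "10.0.0.9", "device": "dev1"},  # shares a device with acct_a
    "acct_d": {"signup_ip": "10.0.0.7", "device": "dev9"},  # shares nothing
}

def fan_out(seed: str) -> set[str]:
    """Return accounts sharing at least one signal value with the seed account."""
    index = defaultdict(set)
    for name, signals in accounts.items():
        for key, value in signals.items():
            index[(key, value)].add(name)
    suspects: set[str] = set()
    for key, value in accounts[seed].items():
        suspects |= index[(key, value)]
    suspects.discard(seed)
    return suspects  # candidates for review, not automatic removal

print(fan_out("acct_a"))  # {'acct_b', 'acct_c'} (order may vary)
```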

Ms Siobhán Cummiskey:

Ms Sweeney makes the point well that AI is very useful for certain types of content which clearly violate our policies. However when it comes to bullying, the situation is extremely nuanced, as the Chairman has referenced. That is really a situation in which AI will not be a silver bullet. It is certainly something we are using more and more. It can be very effective in certain spaces such as our counterterrorism efforts. When it comes to bullying, that is where real people come in. This is where a real person reviewing and understanding the content is so important. When the person affected by the post reports it, that context is also taken into account by the reviewer. They can see that Siobhán Cummiskey, for example, has reported this page and the page is called Siobhán Cummiskey. Those kinds of connections are really important. The subtlety underneath is something that native speakers can really get a good handle on. That is also why we work with NGOs. They can tell us when there is a new trend. We do not know every new trend in every new country unless or until we have partners who can tell us. They can tell us what new slang words and slur words people are using. All of that helps us to be really nimble and we can basically cascade information to our reviewers the same day. We can tell them that there is this new trend or this new piece of information in this market which they should be aware of when they get reports into their queues. That is where reporting and real people become so important.

Photo of Alan FarrellAlan Farrell (Dublin Fingal, Fine Gael)
Link to this: Individually | In context | Oireachtas source

I will just ask a series of quick questions if the witnesses would not mind answering. Does Facebook vet advertisements? I am talking about content. Are they vetted and at what level?

Ms Niamh Sweeney:

All advertisements are reviewed before they go online. In some cases it is automated and in some cases it is done by human review. The human review part is subject to further investment in the next 12 months. That is part of what was recently announced in respect of our advertising transparency efforts. The Chairman mentioned some of the things which took place on the platform in the US last year. Things happened which should not have happened. It is part of the constant effort to evolve to meet challenges as they arise, things that we did not necessarily anticipate. There is a huge amount happening in the area of advertising transparency. I can detail some of that if it would be helpful. One of the things that is happening is that we are adding to the team that conducts the human reviews. Some of the process will remain automated, but some advertisements, once flagged for certain risks, will go for human review. In all cases at least one of those two things happens.

Photo of Alan FarrellAlan Farrell (Dublin Fingal, Fine Gael)
Link to this: Individually | In context | Oireachtas source

I am aware that nudity is not permitted on the witnesses' platform, but I think it is very clear from certain cases that nudity does appear on it in private messages. That has been illustrated on many occasions, both in the public domain in respect of a Member of the House in recent months and also in respect of children. It is very problematic when the rule for public posts differs from the rule for messaging. Specifically in respect of children and young people, I would imagine that the exchange of photographs in private messaging is problematic, probably as much for the platform as for society in general. Will the witnesses comment briefly on that?

Ms Julie de Bailliencourt:

I can take that question. The rules that we have in place for Facebook apply to private messages. On the particular example the Chairman has mentioned, nude photos of young people are not allowed. They are added to our database of hashes, as mentioned, to prevent their resharing. Oftentimes if the young people, the parents, the safety NGOs we work with, or the police contact us, we are able to review the content and add it to our tool to prevent the resharing and further victimisation of the child. That said, an interesting debate has been going on for quite some time in respect of young people discovering their own sexuality and how to use technology. Our policies are strict and apply nonetheless, but we understand that different countries, different NGOs and parents themselves have been discussing this. The tools we have mentioned relate to the non-consensual sharing of intimate images. The tools we have in place with regard to child sexual exploitation imagery aim not only to limit but also to prevent this victimisation of children.

One thing which we have in place which is not necessarily broadly known by the public is that if, following on a report, we see that young people are engaging in this behaviour, we will tell them in no uncertain terms that what they are doing could be illegal and may lead to the termination of their accounts. If this behaviour is repeated based on further reports, the tone gradually gets sterner leading to the removal of the account.

We are also doing a lot of work on education. We have talked a lot about education today, but the resource we have called "Think before you share" and the work we are doing with partners to support their campaigns relating to sexting go in that direction, which is to highlight the potential risks and our position as a platform. It is a complex issue. We are keen to keep working on those topics and to see what else we can do on the technology and education sides.

Photo of Alan FarrellAlan Farrell (Dublin Fingal, Fine Gael)
Link to this: Individually | In context | Oireachtas source

I refer to a point I raised at the very beginning of the meeting about fighting online. I think Deputy Denise Mitchell covered the matter very well during her contribution. The particular fight I am referring to occurred in May 2015 in Balbriggan. It was an advertised event; in so far as I have been informed and from my recollection, an event was set up. It is no longer on Facebook but it is online as we speak because I have it in front of me on Reddit. The media at the time sold it as a racially motivated fight. It was not; it was just a group of teenage kids. The problem I have with the response is that assault and battery is a crime, whether it is consensual or not and whether charges are pressed or not. It is still a crime. It may not be prosecuted but it is still a crime. I have a slight difficulty with the concept that if somebody puts up incendiary language in the comments on a fight video, the platform will take that down but not the fight itself. I am making this point on behalf of the kids I am looking at in the video on Reddit, who unfortunately probably know no better than to be engaged in that kind of activity. I am pretty sure they knew they were being recorded, but I took the time to read the comments on Reddit. In response to a person who commented two years ago, I would be more concerned about the lad pulling the "Stanley" at 41 seconds in. A "Stanley" is a Stanley knife. Clearly this video was taken down so it is not on the Facebook platform, but the problem I have with social media platforms taking the approach that Facebook has taken is that this video was available for well over a month. I remember discussing it with a senior member of An Garda Síochána in Balbriggan in 2015. The item was taken down, there was an investigation and people were identified, in so far as I am aware. From that perspective it was looked after. The fact is that it was up for quite a while. I was able to go back and view it after I received emails and had a very distraught parent phone me about the fact that their child was in the video but was not participating in the fight.

Assault is a crime. It should not be on the Facebook platform because Facebook is subject to the rules of this jurisdiction. I want to highlight that before the witnesses withdraw.

Ms Siobhán Cummiskey:

I will respond to this question. Please allow me to clarify the points I made earlier. When one talks about fight videos, one is covering a vast array of types of content. Facebook has a graphic violence policy which does not allow the worst forms of graphic violence on our platform. However, when the Chairman refers to fight videos, assault or violence, that is very broad. The Facebook graphic violence policy is also broad and it states that there are instances where people can share videos or photographs of people being hit or assaulted in certain circumstances. If one thinks about places such as Syria, where there are no traditional journalists on the ground, the only way for information to get out about people who are being assaulted, injured or killed is through social media. When the Chairman refers to a fight video or violent video, that covers all manner of things as well. If one is talking about a specific instance - and I do not have the benefit of seeing the video the Chairman is talking about, so I am speaking to it blind - I can tell the Chairman that if somebody shares a video of somebody being physically bullied and simply shares it, we would remove it because it is a video of somebody being physically bullied. If somebody shares a video of a fight or an assault of some sort, that assault may well be illegal, but the content shared on our platform is not necessarily in and of itself illegal. If it is, we will remove it if we receive an order. We will also remove it if a person depicted in the video reports it to us. We will also remove any comments that are not there to raise awareness, but when it comes to an actual video of somebody being physically bullied with no other context, we would remove it.

We also remove pages that are dedicated solely to images and videos of bullying, for that reason. When the Chairman refers to a fight video, he is actually covering a wide range of things, but I hope that helps to clarify our policy.

Photo of Alan FarrellAlan Farrell (Dublin Fingal, Fine Gael)
Link to this: Individually | In context | Oireachtas source

It was a pre-organised brawl.

Ms Siobhán Cummiskey:

I am glad it has been removed from our platform.

Photo of Alan FarrellAlan Farrell (Dublin Fingal, Fine Gael)
Link to this: Individually | In context | Oireachtas source

As am I.

Ms Niamh Sweeney:

It might be useful to highlight that we publish a transparency report every six months, which includes a breakdown by country of the requests we have received from law enforcement and how those requests were responded to. That includes a breakdown of preservation requests from the Garda. In an instance like that, where, as Ms Cummiskey said, sharing the content might not be illegal but the act taking place would be, An Garda Síochána may take a case. It would reach out to us through the dedicated contact point in the Phoenix Park and ask us to preserve that content so that it can be availed of in any prosecution that may take place. In the six months to December 2016, we had 109 preservation requests covering 157 accounts. There are other types of requests that we facilitate around basic subscriber information that the Garda may require. We also have data on the percentage of requests, across the total number of accounts affected, and it is around the 80% mark. We try to be as transparent as we possibly can.

Photo of Alan FarrellAlan Farrell (Dublin Fingal, Fine Gael)
Link to this: Individually | In context | Oireachtas source

A number of comments were made on the service that Facebook provides. The platform Facebook provides to the world is very positive. As Deputy Rabbitte said, it is about friends and connections and not about the abuse and nonsense that goes on. Human nature is as it is. In general, social media is a very important tool, not just for keeping connections with loved ones around the world but also for professional communications and as a tool for many other industries, education and health care among other things. I would like to comment on the positive side. I also sincerely thank the witnesses from Facebook for coming before the committee, because they did not have to. We really appreciate that they have taken a large block of time out of their morning to give to us on this important subject matter, which will feed into the report we will publish sometime in the new year, once we complete our hearings.

Members, and Deputy Neville in particular, have suggested that there may be follow-up, and I certainly agree that there are a number of questions, particularly in terms of demographic data that are probably available. It might be helpful if such data were provided to us on the level of penetration in certain age brackets in this jurisdiction. That would be helpful in framing our report on cybersecurity as it applies to children and young people, and it would certainly be helpful in terms of interaction with other groups and organisations in the coming meeting.

On my behalf and on behalf of the committee I thank the witnesses sincerely for coming before us and being so robust and frank in their responses. It was a worthwhile engagement.

We will suspend the sitting to allow our next guest, the Ombudsman for Children, to come before us.

Sitting suspended at 11.50 a.m. and resumed at 11.55 a.m.