Oireachtas Joint and Select Committees
Thursday, 7 March 2013
Joint Oireachtas Committee on Transport and Communications
Social Media: Discussion (Resumed) with Twitter and Facebook
11:05 am
Ms Patricia Cartes:
As I was saying, the way in which we provide support is through our user operations team, which is based in four locations across the world. This enables us to provide 24-7 coverage. The centre that would be most relevant to Members is our Dublin office, in which we have representatives from more than 25 countries supporting 23 languages. This means that if a person in, say, France, has an issue and writes in French, a French specialist will respond to the query. We investigate reports manually when we receive them from the community, but we also build smart tools to automate our processes and support our robust community. If somebody decides to create a fake account impersonating somebody like Ronan O'Gara, who is a well-known public figure in Ireland, our Irish specialists will be able to work with our systems to block those keywords and, therefore, prevent the creation of any further imposter accounts.
We rely on the community to bring abuse to our attention, although we are proactive towards the more sensitive issues such as child protection or terrorism. One thing we realised early on was that in the offline world, when one faces an issue, one can talk to people in the community and share the background of the case. As there was no similar channel on Facebook, we decided to create one specifically for this purpose. Now, when one faces abuse on Facebook, whether or not it violates the Facebook standards, one is prompted to contact somebody in one's community or the person who has uploaded the content, either to ask them to remove it or to ask for advice. Therefore, if I am a student being bullied in the classroom, I can talk to my teacher through the social reporting mechanism - the teacher may or may not be my friend on Facebook - by entering his or her e-mail address within the system, and that person will have the background of the case and will be able to provide advice. We also give the person access to our safety centre, which contains much information on how to deal with abuse, in order that they can make an informed decision before providing advice. This has enabled people to solve situations face to face and, in turn, has contributed to the education of the community. We speak many times about bullying, but we see cases where bullying is not actually taking place; it is simply somebody not understanding the rules of engagement or not behaving responsibly. I could be a responsible user and upload a photograph of Mr. Simon Milner today that he does not like and by which he is offended. He is entitled to bring that to my attention in order that I can remove it, or to bring it to the attention of those who know us and eventually to Facebook if he wants to find a permanent solution.
When people report content to us we find that transparency is a very important element, so we let them know what has happened to their reports through a tool called the support dashboard. The support dashboard lives within one's account, and one is notified when one submits a report of abuse to Facebook and also when we have looked into it. At all stages we tell one how long it will take for us to review the content and what action we have taken. Furthermore, we explain the policies that apply to the case one has brought to our attention. Abuse comes in many forms, from spam campaigns to privacy violations to bullying. When one provides us with a report of a situation that is unpleasant, we will look into the specific type of abuse and inform one of our policies with regard to that abuse. At all stages our policies are public and can be visited online by going to Facebook.com/communitystandards. We provide users with education throughout the product so that they understand at all stages the rules to which we hold their behaviour.
Finally, I want to highlight our partnerships. Safety is a conversation in which everybody takes part. We are responsible as the providers of the platform, but it is important that we take into account the feedback provided to us by members of our global safety advisory board, which comprises five leading organisations that are experts in different fields, from child protection to domestic violence. These organisations have privileged access to our platform and give us advice before we launch any features. Based on that feedback, we will modify our settings to make sure the community can remain safe.
We have a number of local partnerships in Ireland. We work with the Safer Internet Centre, Webwise, and we have an INHOPE node. During the previous session the committee mentioned the UK Safer Internet Centre, which is funded by the European Commission under the Safer Internet Programme. We work on an ongoing basis with all of the Safer Internet Centres across Europe and, more specifically in Ireland, with Webwise and ISPAI, of which we are a member. We provide these organisations with a direct communication channel for policy casework through which they can bring to our attention cases they have received through their hotlines, so that we can review each case and facilitate a solution but also, if necessary, modify our policies and processes when we find gaps in our platform. We are by no means perfect. This is an ongoing conversation we have with all stakeholders.
I wish to draw the attention of the committee to our partner Samaritans, which enables us to provide people who are feeling suicidal, or friends of people who are feeling suicidal, with specialised support. Therefore, if one notices that somebody within one's environment is feeling sad or having suicidal thoughts and reports it to Facebook, we will contact both the reporter and the person at risk and share the Samaritans' contact details in order that further support can be obtained. As with any report submitted to us, these reports are confidential. At no stage will the person who has made a report be identified to the person who has been reported, or the other way around. It is important to note that our blocking features enable one to prevent interaction with somebody if one is feeling harassed. Again, it is important that we empower users to protect themselves, aside from the features that we have designed. One of those features is the blocking mechanism, which prevents interaction in both directions, from the person who has been blocked and from the person who is taking the action.
This is a very brief presentation on our safety features. We have tried our best to condense all the material. I invite the committee to visit our Dublin offices. The safety team and I would love to host the committee in the office and walk it through some of the procedures in more depth so that it can understand how seriously we take the protection of our users.