Oireachtas Joint and Select Committees
Thursday, 7 March 2013
Joint Oireachtas Committee on Transport and Communications
Social Media: Discussion (Resumed) with Twitter and Facebook
11:15 am
Ms Patricia Cartes:
I thank the Deputy for asking this question. Real identity is very important to us and I am glad to have the opportunity to explain it. Facebook is based on one connecting to the people who matter to one. Therefore, when one creates an account under one's real name, one will be approving friendship requests from people one knows. In a way this helps to regulate the community, because people will report others whom they think are impersonating their friends, and if they do not know somebody they can mark that person as unknown to them at the point at which they receive the friend request. In this way the community is self-regulating and will bring fake accounts to our attention. We also have a number of our own mechanisms to detect fake accounts, including a number of classifiers, which are automated mechanisms and are quite powerful. We will also examine factors such as one's date of birth, changes in date of birth and changes of name, all of which is automated. One cannot modify one's name more than a certain number of times. One can use nicknames, but one will still have to use one's real name. If one continues to modify one's name after reaching that threshold, our user operations team will have to approve any further changes. That in turn enables us to locate fake accounts. All in all, we have both proactive and reactive measures to verify the authenticity of accounts.
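[Editorial note: as an illustration of the name-change threshold Ms Cartes describes, the following is a minimal Python sketch. The limit value, class name and review-queue mechanism are assumptions for illustration only, not Facebook's actual implementation.]

```python
# Hypothetical sketch: a name-change threshold with escalation to manual review.
NAME_CHANGE_LIMIT = 3  # assumed threshold; the real value is not stated in the testimony


class Account:
    def __init__(self, real_name):
        self.real_name = real_name
        self.name_change_count = 0

    def request_name_change(self, new_name, review_queue):
        """Apply a name change directly while under the threshold;
        beyond it, escalate to a human user-operations review queue."""
        if self.name_change_count < NAME_CHANGE_LIMIT:
            self.real_name = new_name
            self.name_change_count += 1
            return "applied"
        # Past the threshold, only the user operations team may approve further changes.
        review_queue.append((self, new_name))
        return "pending_manual_review"
```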
On the issue of turnaround times for reports, we strive to provide a 24-hour to 48-hour turnaround time. However, the turnaround time varies depending on the type of abuse. We also prioritise issues depending on what is happening in the real world. If we take the example of the London riots, which was raised here yesterday, our platform was not used for the co-ordination of violence because we modified our supports to prioritise any related reports. In turn, when the community realised that we were providing timely support and turning the reports around very fast, they started co-ordinating the clean-up operations rather than the violence. We were looking at what was happening in that country, and we were also looking at who reported the content. If a minor reports a piece of content to Facebook, we will prioritise that report more highly than any other, which means that minors will see a shorter response time when they submit reports to us.
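[Editorial note: the prioritisation described here, in which reports from minors and reports tied to an active real-world event are handled first, could be modelled with a simple priority queue. The weights and field names below are assumptions for illustration, not a description of Facebook's systems.]

```python
import heapq
import itertools

_counter = itertools.count()  # tie-breaker so reports of equal priority stay in arrival order


def priority(report):
    """Lower scores are handled sooner."""
    score = 100
    if report.get("reporter_is_minor"):
        score -= 50   # assumed weighting: minors' reports jump the queue
    if report.get("linked_to_active_event"):
        score -= 30   # assumed weighting: reports tied to a live event (e.g. riots)
    return score


def enqueue(queue, report):
    heapq.heappush(queue, (priority(report), next(_counter), report))


def next_report(queue):
    return heapq.heappop(queue)[2]


# Example: a minor's report is processed before an ordinary report filed earlier.
q = []
enqueue(q, {"id": 1, "reporter_is_minor": False})
enqueue(q, {"id": 2, "reporter_is_minor": True})
assert next_report(q)["id"] == 2
```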
How many profiles have we removed? It is important to note that over time we have modified the way in which we proceed. One will hear many bullying experts speak about what measures have to be taken, whether punitive measures or restorative measures. We believe that trying to bring users into compliance on our platform is a very effective mechanism. If one is bullying somebody or uploading content that is inappropriate, we have the ability to restrict some of one's features. If one is uploading photographs that violate Facebook's terms, we can remove one's ability to upload photographs for a certain amount of time, and one will be prompted to read our community standards before one can fully engage with the site again. Of course, if the violation is very severe, we will seek to remove the account entirely. What we have found over time is that when people are taught in this way, through what we call educational checkpoints, they are likely to react well to the education and modify their behaviour to come back into compliance.
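[Editorial note: the "educational checkpoint" flow described here, restricting a feature for a period and requiring the user to re-read the community standards, might look roughly like the sketch below. The durations, feature names and severity rule are assumptions for illustration, not the real system.]

```python
from datetime import datetime, timedelta


def apply_checkpoint(account, violation):
    """Restrict the offending feature for a period and require the user to
    review the community standards before full access is restored."""
    if violation["severity"] == "severe":
        account["status"] = "removed"          # very severe violations: remove the account
        return account
    feature = violation["feature"]             # e.g. "photo_upload"
    account.setdefault("restricted", {})[feature] = (
        datetime.utcnow() + timedelta(days=violation.get("days", 7))  # assumed default duration
    )
    account["must_review_community_standards"] = True
    return account


def can_use(account, feature):
    """A feature is usable only once the restriction has lapsed and the
    community standards have been acknowledged."""
    until = account.get("restricted", {}).get(feature)
    still_restricted = until is not None and datetime.utcnow() < until
    return not still_restricted and not account.get("must_review_community_standards")
```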
The number of accounts we remove might not be as informative as one would want it to be. It is much more interesting to see how many users we bring back into compliance over time.