Oireachtas Joint and Select Committees

Wednesday, 1 August 2018

Joint Oireachtas Committee on Communications, Climate Action and Environment

Moderation of Violent and Harmful Content on the Facebook Platform: Discussion

12:00 pm

Joe O'Reilly (Fine Gael)

I join others in welcoming Ms Sweeney and Ms Cummiskey. I thank them for attending. I cannot sufficiently impress upon them the absolute horror of the public, including people we meet, about the availability of violent images, hate speech and pornography. The risks associated with this kind of thing are a cause of great concern to parents, teachers and the general public.

I have some specific questions. There is an obvious question that will be on the minds of those who are watching us live today and those who will review our proceedings in the print and broadcast media tomorrow. If Facebook can remove all of these images and take corrective action now, why did it not have a monitoring system in place that would have made such corrective action unnecessary? In other words, if Facebook can cure its ills now, why was it not able to cure them previously? Many people would react with incredulity to the suggestion that Facebook did not know about this until it was revealed in the Channel 4 programme and the subsequent media coverage. That would be disturbing on one level if it were true. If it is not true, then it is disturbing on a different level.

I ask the witnesses to address the public concern in that regard. It is the first question in the minds of many of those watching these proceedings.

It is admirable that an increase in staffing is being considered but, in light of the billions of Facebook users, the billions of euro in turnover generated and all the other factors involved, is the proposed increase too limited? Is an insufficient amount of money being put into the staffing of the monitoring section? I ask the witnesses to comment on that specifically. Are they satisfied by the proposal to increase the staffing level from 7,500 to 20,000? Are there plans for a further expansion of the section? That staffing level does not seem adequate in light of the global responsibility of the section. The staffing level at the time the witnesses last appeared before the committee was insufficient to address the political concerns in connection with Cambridge Analytica. The controls at that time were inadequate. Will the proposed increased controls be sufficient?

The witnesses stated that there are three options when dealing with flagged material: deleting it, leaving it in place, or marking it as disturbing. What is the effectiveness of marking a post as disturbing? How is that justified and within what limits is it done? Disturbingly, that categorisation may act as an incentive to or encouragement for some users.

I wish to revisit the issue of it being financially attractive to keep people online such that they are exposed to advertisers, and the memo of Mr. Andrew Bosworth in that regard. Do the witnesses consider that to be the case? Some evidence suggests that keeping people online no matter what is financially attractive and, sadly, that that is being done. Why is Facebook not far more proactive in terms of immediately eliminating cyberbullying or other such behaviour at source? Would it have a far more aggressive policy in that regard were it not for the financial attractiveness of keeping people online?

As a parent and former primary school teacher, I am interested in the very serious issues in regard to children under 13. Many parents watching this meeting are very concerned by the issue. Is Facebook sufficiently vigilant in terms of its method of identifying those under 13? I ask the witnesses to expand on that insofar as they can. A trainer featured on the "Dispatches" programme stated that "we just like pretend that we are blind" in that regard, which is a very scary quote. Is Facebook pretending to be blind? Is it adequately vigilant in terms of the method of assessment? If the witnesses cannot answer that today, I ask that they send the committee a briefing note because it is an area of enormous concern.

Will there be major changes in the training provided? The issue has been well discussed. When will such major changes be made and how will the committee know the process has been completed? Is the staffing level sufficient to provide such training? Will it be offered immediately?

Cyberbullying relates to my first point on children under 13 but is a distinct concern of itself. A study by the Law Reform Commission indicated that 16% of the large sample of young people surveyed had met somebody online. That is a shocking statistic on which I ask the witnesses to comment.

If I understood the witnesses correctly, Facebook accepts and welcomes the concept of a digital safety commissioner. Am I correct in that regard? Implicit in such welcome is an acceptance that there must be external as well as internal regulation. I am hopeful that a digital safety commissioner will be put in place and that the legislation in that regard, which will be scrutinised by the committee, will be accepted. However, even if that is done, it would not excuse Facebook from carrying out major in-house reforms.

Is it necessary to leave offensive material online such that it can be reported to the Garda in this country or the relevant law enforcement agency in other countries?

Could it not be either reported or taken down immediately? What is the merit in leaving it up? There would have to be an awfully shocking or compelling case made for leaving it up for the purposes of law enforcement, particularly in light of the harm or collateral damage that could be done. Should it not be a case of having to inform the law enforcement agencies, full stop?
