Oireachtas Joint and Select Committees

Tuesday, 6 November 2018

Joint Oireachtas Committee on Communications, Climate Action and Environment

Digital Safety Commissioner Bill 2017: Discussion (Resumed)

11:00 am

Ms Niamh Sweeney:

I thank the committee for asking me to be here today. I will give a brief overview because I have been before the committee on a few previous occasions when I provided more lengthy explanations of our approach to safety.

Every day, more than 1.4 billion people around the world use Facebook. They post everything from photos and status updates to live videos in many different languages. Deciding what stays up and what comes down involves hard judgment calls on complex issues ranging from bullying and hate speech to harassment. It is why we developed our community standards with input from outside experts, including academics, non-governmental organisations, NGOs, and governments around the world. Our community standards have been publicly available for many years and, in April of this year, we published for the first time the more detailed internal guidelines used by our review teams to enforce them. We decided to publish these internal guidelines for two reasons. First, the guidelines aim to help people understand where we draw the line on nuanced issues. Second, providing this detail makes it easier for everyone, including experts in different fields, to give us feedback in order that we can improve the guidelines, and the decisions we make, over time.

We also have a safety advisory board, which comprises leading organisations working on child abuse, domestic violence and Internet safety for children and women. Facebook consults these organisations on issues related to online safety and keeping our community safe.

For the content review process, we use a combination of artificial intelligence and reports from people to identify posts, pictures or other content that may violate our community standards. These reports are reviewed by our community operations team, who work 24/7 in over 50 languages all around the world.

This year, we have doubled the number of people working on our safety and security teams to 20,000. This includes more than 7,500 content reviewers specifically. We are also investing heavily in new technology to help deal with problematic content on Facebook more effectively. For example, we use technology to assist in sending reports to reviewers with the right expertise, to cut out duplicate reports, and to help detect and remove known terrorist propaganda and child sexual abuse images before they have been reported. Last month, we also announced that we have been increasingly using artificial intelligence and machine learning to detect child nudity and previously unknown examples of child sexual exploitation imagery. Content review on this scale has never been done before, as there has never been a platform where so many people communicate in so many languages across so many countries and cultures, but we very much recognise the responsibility we have to get this right.

I am here today in the context of the committee’s detailed scrutiny of the Bill. Members will recall that I gave an outline of our thoughts on this topic at the 1 August hearing I attended, but I am happy to be here again today as the committee considers the Bill in detail. We understand the motivation behind the establishment of a digital safety commissioner, particularly the appeal of having an independent, statutory body that is authorised to adjudicate in cases where there is disagreement between a platform and an affected user about what constitutes a harmful communication, or to provide a path to appeal for an affected user where we have, in error, failed to uphold our own policies. We also acknowledge the Bill's efforts to ensure its scope is not overly broad, in that an appeal to the digital safety commissioner can be made only by an individual where the specified communication concerns him or her.

We very much see the benefit in having a single office with the ability to oversee and co-ordinate efforts on the promotion of digital safety throughout communities, much of which has been captured in the Government's Action Plan for Online Safety. It is only through this multi-pronged approach, of which education is a critical part, that we can begin to see positive changes in how people engage and protect themselves online.

In its 2016 report the Law Reform Commission, LRC, in addressing the nature of harmful communications, stated the following:

While there is no single agreed definition of bullying or of cyberbullying, the well-accepted definitions include the most serious form of harmful communications, such as ... so-called “revenge porn”; intimidating and threatening messages, whether directed at private persons or public figures; harassment; stalking; and non-consensual taking and communication of intimate images ...

We agree with the LRC with respect to all of these types of communication, a very stark example of which was given by Mr. John Church in his opening statement. The sharing of non-consensual intimate images, harassment, stalking and threatening messages are all egregious forms of harmful communication and are banned both by our community standards and, in some cases, by the law. We fully support the commission's proposals to create new criminal offences to tackle the non-consensual sharing of intimate images and online harassment where those offences are clearly defined and practicable for a digital environment. We have also stepped up how we tackle the non-consensual sharing of intimate images on our own platform; more information on this can be found online, and I have shared the link with the committee in our opening statement.

However, as Deputy Ó Laoghaire outlined in his remarks today and previously, the proposed Bill is currently unclear as to what precisely constitutes a harmful communication.

There is no definition included in the Bill, but it does appear from the draft that this concept is intended to be broader than content that is clearly criminal in nature, much of which has been touched on. The exact parameters are left undefined, which could lead to uncertainty and unpredictability. I acknowledge Deputy Ó Laoghaire's remarks about the need to include a definition, and we welcome the opportunity to work with him on that if possible. Previously, on 25 October, he said he felt there was a need for legal certainty, and we very much welcome that.

As the LRC stated in 2016, the Internet:

enables individuals to contribute to, and shape debates on important political and social issues and within states with a repressive regime the Internet can be a particularly valuable means of allowing people to have their voices heard. Freedom of expression is therefore the lifeblood of the Internet and needs to be protected.

The report continues:

...balancing the right to freedom of expression and the right to privacy is a challenging task, particularly in the digital and online context. Proposing heavy-handed law-based measures intended to provide a remedy for victims of harmful digital communications has the potential to interfere with freedom of expression unjustifiably, and impact on the open and democratic nature of information sharing online which is the internet’s greatest strength.

I am aware that Deputy Ó Laoghaire is alive to these issues and has referenced them in his earlier contribution.

I will shorten my contribution because the committee is under pressure. I will finish by pointing out that Facebook has put community standards in place because we want our community to feel safe and secure when they use our platform. We are committed to removing content that breaches those standards, and we are keen to continue to engage with this committee and others as the Bill moves into the next phase of the legislative process.

I thank the committee.