Oireachtas Joint and Select Committees

Thursday, 7 November 2019

Joint Oireachtas Committee on Communications, Climate Action and Environment

Session 2: Industry Perspective

Dr. Monika Bickert:

I will not read my entire submission. I will just hit some high points. I thank the Chairman and the members of the Oireachtas Committee on Communications, Climate Action and Environment for inviting me to speak to them and to the other International Grand Committee members today. I am the vice president of content policy at Facebook and am based in our Menlo Park headquarters in California. I joined Facebook in 2012 after serving 11 years as a US public prosecutor and as regional legal adviser at the US embassy in Bangkok, Thailand. I now lead Facebook's global content policy team. My team's responsibilities include developing and enforcing the rules for how people can use our services. My remit also now includes work to further our company's goal of facilitating appropriate regulation of content on social media platforms and the broader Internet.

Facebook welcomes governments and regulators taking a more active role in addressing harmful content online. Protecting the people who use our services is a top priority, to which we continue to dedicate a great deal of time and resources, but we do not believe any company should tackle these issues alone. This is why we work together with governments, civil society, experts and industry peers to develop rules for the Internet that encourage innovation and allow people the freedom to express themselves while protecting society from broader harms.

The aim of the committee’s session today is to advance international collaboration in the regulation of harmful content, hate speech and electoral interference. Facebook shares this goal and I am grateful for the opportunity to share our thoughts on how to meet it.

Freedom of expression is a core value of our company. Facebook exists to give people a way to connect and express themselves. At the same time, we want to make sure that people using our services are safe. That means we must make decisions every day about what is and is not acceptable among our community of 2.8 billion people. Some of these decisions are clear but many are nuanced and involve balancing competing principles like voice, dignity, safety and privacy. We work hard to get these decisions right and we have community standards that define what is acceptable content. Those standards are informed at every turn by relationships with hundreds of civil society groups and experts around the world.

We invest heavily in technical solutions to quickly identify potential violations of our rules. For example, more than 99% of the terror propaganda we remove from the site is content we identify ourselves using technical tools before anybody has reported it to us. We also have more than 10,000 people working around the clock to assess whether content violates our rules and to remove it if it does. We respond to the overwhelming majority of reports of potential violations within 24 hours. We publish transparency reports about how effectively we remove harmful content. These documents, which are publicly available, show how much content we are removing in each category and how successful we have been in trying to identify that content before it is reported to us. Nevertheless, we know that with such a diverse community, the lines we draw will never please everyone and we will never be perfect in enforcing them. To address these challenges, Facebook is creating an independent body called the Oversight Board, to which people can appeal our decisions on content they have posted, but we know that this too will not solve all of the challenges we face. We believe that a more standardised approach would be beneficial. Regulation could set baselines for what is prohibited and guide the development of systems to better address harmful content.

In the area of elections, regulation could address important issues such as the definition of political advertising, who is allowed to run political advertisements and what steps those persons must take before doing so. Regulation could also address how political parties can and cannot use data to target political advertising. We believe that Facebook and companies like it are central to solving these complex challenges, but it is clear that we cannot, and ought not to, do this alone. In that spirit, we look forward to collaborating further with governments, civil society, industry and all of the people who hold a stake in creating rules for a safe and innovative Internet.
