Oireachtas Joint and Select Committees
Thursday, 7 November 2019
Joint Oireachtas Committee on Communications, Climate Action and Environment
Session 2: Industry Perspective
Mr. Marco Pancini:
I thank the committee for the opportunity to join its deliberations today. I lead YouTube's public policy work in Europe, the Middle East and Africa. I have over a decade of experience in online safety issues and I am centrally involved in our work to keep our users safe every day. This committee's work over the past year has addressed a range of critical topics, including privacy, misinformation, election integrity and more. In our previous testimony before this committee, we outlined Google's commitment to information quality and how continued collaboration improves the ways we all address harmful content online. Today, I will focus on YouTube's efforts and outline improvements we have made to address misinformation on that platform. I will also highlight opportunities for greater collaboration among companies, government and civil society to tackle this challenge.
YouTube is an open platform where anyone can upload a video and share it with the world. The platform's openness has helped to foster economic opportunity, community, learning and much more. Millions of creators around the world have connected with global audiences and many of them have built thriving businesses in the process. At the same time, YouTube has always had strict community guidelines to make clear what is and is not allowed on the platform. We design and update our systems and policies to meet the changing needs of both our users and society. Videos that violate those policies generate a fraction of 1% of total views on YouTube, and we are always working to decrease that number. In fact, over the last 18 months, we have reduced the number of views of videos that are later removed for violating our policies by 80%.
Our approach towards responsibility involves four Rs. We remove content that violates our policy as quickly as possible. We raise up authoritative voices when people are looking for news and information, especially during breaking news moments such as elections. We reduce the spread of borderline content and harmful misinformation. We set a higher bar for what channels can make money on our site by rewarding trusted, eligible creators. Over the past several years, we have used those four approaches to address misinformation.
While we remain vigilant against new threats, we are proud of the progress we have made. We have raised up quality content by, among other things, implementing two cornerstone products, namely, the top news shelf in YouTube search results and the breaking news shelf on the YouTube homepage. These products highlight authoritative sources of content and are now available in 40 countries. We have worked especially hard to raise up authoritative and useful information around elections. For example, earlier this year we launched information panels in YouTube search results so that when users looked for information about official candidates running for seats in the European Parliament in May, we showed them authoritative information.
We have continued our strict enforcement of YouTube’s policies against misleading information and impersonation. From September 2018 to August 2019, YouTube removed more than 10.8 million channels for violating our spam, misleading content and scams policy, and more than 56,000 channels for violating our impersonation policy. We have also undertaken a broad range of approaches to combat political influence operations, on which we have reported regularly over the past two years. For example, in September we provided an update on disabling a network of 210 YouTube channels that we discovered were behaving in a co-ordinated manner while uploading videos related to the ongoing protests in Hong Kong.
We have also worked hard to reduce recommendations of content that comes close to a policy line without violating it, including attempts to spread harmful misinformation. Thanks to changes we have made over the past year, views of this type of content coming from recommendations have fallen by more than 50% in the United States. YouTube has begun experimenting with this change in other countries, including Ireland and the UK, and we will bring it to other European markets soon. We know that this work is not done. That is why we continue to work with law enforcement, industry and third-party experts around the world to evolve our efforts.
I will conclude by discussing opportunities for greater collaboration. The EU code of practice on disinformation is an important foundation that we can all build on. Launched just over a year ago, the code was developed in light of work that we and others had been pursuing with experts and publishers around the world to elevate quality information and support news literacy. As part of the process, we have provided regular reports on our efforts to address disinformation and we have highlighted the work that we can do collectively in this regard.
We must continue to support collaborative research efforts. For instance, we have invested in research on the detection of synthetic media, often referred to as deep fakes, and have released data sets to help researchers around the world to improve the detection of synthetic audio and video. We have also made the data about our election advertising efforts available in the transparency report. This information is available to everyone, including governments, industry and experts, and we can work together, using this data, to improve our collective response. We strongly believe that addressing harmful content online is a shared responsibility, which is why we are so committed to meetings and collaborations like these. We are committed to doing our part and we look forward to answering members' questions.