Oireachtas Joint and Select Committees

Tuesday, 16 April 2024

Joint Committee On Children, Equality, Disability, Integration And Youth

Protection of Children in the Use of Artificial Intelligence: Discussion (Resumed)

Ms Chloe Setter:

The Senator mentioned a range of different harms and potential harms there. I wanted to set out that, generally, we work to remove violative content from the platform proactively as much as possible, using technology and artificial intelligence to detect it. We believe we are fairly successful in that proactive removal. Data from the third quarter of last year, for example, indicate that 96% of all content removed was removed proactively, without a report from a user, with 77% of that content having zero views and 91% of it removed within 24 hours. That is across all potentially violative content.

If I may, I will focus a little on child sexual abuse because I think we all agree it is a particularly egregious crime. One of our top priorities as a platform is to ensure we do not host that content and do not provide a safe place for predators. TikTok is designed, essentially, to prevent harm from child sexual abuse. We do not allow under-16s to use direct or private messaging, purely because we recognise the harms, such as grooming, that can happen in private spaces. We do not allow under-18s to livestream, again because there is a higher risk of violations when something is happening live. One cannot download the content of anyone under the age of 15, and all content that goes onto TikTok is reviewed using automated technology that looks for child sexual abuse content. It looks for hashes, that is, digital fingerprints of known child sexual abuse images. All content uploaded to the app goes through that process, which helps to prevent people from reuploading violative and illegal content.

When we become aware of such content we, like colleagues, report it to NCMEC and take immediate action to terminate the accounts involved. We carry out this proactive detection on a voluntary basis. In our most recent reporting to NCMEC, 83% of our reports were what is called "actionable", and that is something we are really striving to increase. It basically means the report contains quality information that can be handed to law enforcement to help protect and safeguard individual young people who may be being abused. The industry average is approximately 50%, and we are trying really hard to ensure we give quality information in order that it can help protect children in real life.

The platform is designed to be inhospitable to offenders. We block key words that are known to be used by them, and we work with partners and external agencies to learn what those key words are. We block known URLs leading to child sexual abuse content, provide deterrence messaging when someone searches for that type of content and do not encrypt our messaging spaces, all of which makes the platform more inhospitable to would-be offenders. We are part of a number of partnerships with expert agencies working to tackle child sexual abuse, and we welcome the ongoing debate in Europe around regulation to tackle child sexual abuse material. If there are further questions, I and my colleague Ms Moss are happy to answer them.