Oireachtas Joint and Select Committees
Tuesday, 20 February 2024
Joint Committee On Children, Equality, Disability, Integration And Youth
Protection of Children in the Use of Artificial Intelligence: Discussion (Resumed)
Karen McAuley:
I will speak to the complaints piece first. To follow up on what Ms Hodnett has said, we are seeking to take a suite of measures under the Digital Services Act, our own legislation and the forthcoming online safety code. As Ms Hodnett stated, we opened a contact centre yesterday to support people with queries they may have in respect of their rights and to allow them to raise concerns they might have with reference to the DSA. For information, the phone number of our contact centre is 01 9637755. Members of the public can also contact us at usersupport@cnam.ie. That is what is live at present.
For context, there are several other pieces to this. The second is a provision under the Digital Services Act relating to trusted flaggers and national digital services co-ordinators. Coimisiún na Meán, as the national digital services co-ordinator here in Ireland, can award the status of a trusted flagger to a third-party entity. Last week, we published an application form and guidance for organisations that may wish to apply to us to be a trusted flagger. Trusted flaggers are entities with the necessary expertise and competence to identify illegal content and notify it to the platforms, rather than to us. The idea is that trusted flaggers who identify illegal content on a platform have a priority pathway to notify the content to the platform in order that action can be taken by the platform. That is the second strand.
My apologies if my reply is a bit long-winded. The third stream relates to nominated bodies. Under our legislation, namely, the Broadcasting Act 2009, as amended, we are required to set up a nominated body scheme whereby third-party entities will be selected by an coimisiún to notify us of harms or issues relating to platforms and their compliance with our online safety code. That is a third stream we are working on. We have started work this year to develop that scheme.
With regard to individual complaints, members will be aware that sections 139R to 139ZB, inclusive, of the Online Safety and Media Regulation Act make provision for an individual complaints scheme. Our plan for this year is to start work on developing that individual complaints scheme with a view to having a scheme drafted by the end of the year. The legislation sets out the process we need to go through, including consultation and the issues the scheme will need to cover. We hope that by early next year at the latest, we will be in a position to present that scheme to the Minister. In the first instance, the scheme will focus, again with reference to the Act, on complaints relating to children and harmful content affecting children online. I hope that gives the Senator a sense of the breadth of options that are available and coming down the line.
With regard to vulnerable adults, we are aware that many individuals and organisations are rightly concerned about online harms affecting vulnerable adults. That came through in the submissions we received in response to our call for inputs last year on how we should develop the online safety code, as well as in the consultation we have conducted on our draft online safety code.
My colleagues may wish to speak a bit more about vulnerable adults in the context of the code. We are aware that, at a minimum, with AI tools and recommender systems it is not only children and vulnerable adults who may be more at risk but also individuals who have protected characteristics. Some people, by virtue of, for example, having a disability, a particular ethnicity, nationality or immigration status, being LGBTQI+, being women and so on, may be more at risk. It is less about the technology itself than about its uses, its impact and how it is applied.