Oireachtas Joint and Select Committees
Wednesday, 25 February 2026
Joint Oireachtas Committee on Arts, Media, Communications, Culture and Sport
Regulation of Online Platforms and Supports to Improve Online Safety and Participation: Discussion (Resumed)
2:00 am
Evanne Ní Chuilinn (Fine Gael)
Apologies have been received from Deputies Brian Brennan, Gibney and Pádraig O'Sullivan. Deputy Keogh will attend in substitution for Deputy Carrigy.
Today's meeting has been convened to continue the committee's scrutiny of matters relating to the regulation of online platforms and supports to improve online safety and participation. This, for now, is our final session on this topic. I welcome the following witnesses to the committee room: from the Department of Culture, Communications and Sport, Mr. Seamus Hempenstall, principal officer with responsibility for digital policy and media plurality; Ms Sinéad O'Hara, principal officer, EU Presidency and media regulation; and Mr. Paul Kilkenny, assistant principal officer, digital policy and media plurality; and from Coimisiún na Meán, Ms Niamh Hodnett, online safety commissioner; Ms Karen McAuley, director of policy for children and vulnerable adults; and Ms Rafaela Knapper, director of platform supervision.
The format of today's meeting is that I will invite the witnesses to deliver an opening statement, which is limited to five minutes. These statements will be followed by questions from members of the committee. As the witnesses will probably be aware, the committee may publish the statements and presentations on its webpage. Is that agreed? Agreed.
Before we move on to today's discussion, I will clarify some limitations in relation to parliamentary privilege and the practice of the Houses as regards references witnesses may make to other persons in their evidence. The evidence of witnesses physically present or who give evidence from within the parliamentary precincts is protected, pursuant to both the Constitution and statute, by absolute privilege in respect of the presentation they make to the committee. This means that they have an absolute defence against any defamation action for anything they say at the meeting. However, witnesses are expected not to abuse this privilege and it is my duty as Chair to ensure that this privilege is not abused. Therefore, if witnesses' statements are potentially defamatory in relation to any identifiable person or entity, they will be directed by me to discontinue their remarks and it is imperative that they comply with any such direction. Members are reminded of the long-standing parliamentary practice to the effect that they should not comment on, criticise or make charges against a person outside the Houses or an official either by name or in such a way as to make him or her identifiable.
I now invite the witnesses to deliver their opening statements.
Mr. Seamus Hempenstall:
I thank the committee for the invitation to discuss the regulation of online platforms and supports to improve online safety and participation.
At a meeting of this joint committee on 26 November 2025, this Department outlined the progress made in recent years when it comes to ensuring the safety of everyone, young and old, online. Key milestones include the passage of the Online Safety and Media Regulation Act in 2022, which saw the establishment of Coimisiún na Meán in 2023 and the adoption by an coimisiún of Ireland's first online safety code in 2024.
Online safety is inherently cross-cutting in nature and is, therefore, a whole-of-government priority and responsibility. For example, Ireland's online safety framework comprises the online safety code, the Digital Services Act, which is under the policy remit of the Department of Enterprise, Tourism and Employment, and the terrorist content online regulation, which is a policy responsibility of the Department of Justice, Home Affairs and Migration. Coimisiún na Meán sits at the heart of the framework and its remit continues to expand as more legislation is enacted across Government that is relevant to online safety.
This changing landscape represents significant progress. However, it is also abundantly clear that significant challenges remain and Ireland's online safety framework is being tested. Among these serious challenges was the creation and widespread dissemination of illegal and harmful AI-generated content through the Grok artificial intelligence, AI, tool. There is also the ongoing challenge of keeping our children and young people safe online as new technologies come on stream.
In relation to the Grok AI issue, we saw how the European Commission, as lead regulator under the Digital Services Act in respect of very large online platforms such as X, opened an investigation into how X sought to mitigate the very clear and apparent risks presented by its AI tool. Coimisiún na Meán was instrumental, as part of the Digital Services Act framework across the EU, in working with the European Commission to bring that investigation to commencement. This shows how online safety cannot be addressed by one EU member state alone but can be progressed by EU member states and the European Commission acting together.
The Government has also acted. The Minister, Deputy O'Donovan, updated the Government on 4 February on the steps being taken to further enhance online safety, including advocating with the European Commission for the inclusion of this type of AI-powered feature as a prohibited practice under the AI Act, which is being led on by the Department of Enterprise, Tourism and Employment. The Minister wrote to Coimisiún na Meán to request that it develop an information campaign to raise public awareness of online safety issues and the potentially criminal nature of certain types of content, and also that it collaborate with the AI advisory council to review and update taxonomies, or categories, of online harm to ensure AI-enabled harms are captured.
In relation to protecting children online, the Minister has been clear that it is his number one priority. The Office of the Government Chief Information Officer is leading on the integration of an age verification tool into Ireland's forthcoming digital wallet. It is envisaged that a pilot will take place in the coming months, with the support and participation of social media platforms.
The pilot is aimed at supporting implementation of the online safety code. The code sets out that where platforms have adult-only content, they need to ensure children under 18 years of age do not encounter it, and they can use robust age assurance, including age verification, to do that. It is important to say that the wallet itself and the age verification tool in it will be voluntary and that the tool is intended to act as one of a number of options for platforms to avail of in order to fulfil their regulatory obligations. Participation in the pilot will also be voluntary.
Members of the committee will have noted approval by the Government of the updated national digital and AI strategy. In the strategy, the Government commits to actively working with other like-minded EU member states on proposals to restrict access to social media for children and young people. The Minister recently met his French counterparts on this and engagement is continuing at official level. As set out in the strategy, the Government's preference is for an EU-level decision on the so-called "digital age of majority" and that such a decision would reflect the report of the European Commission's expert panel, which is expected to be published in due course. However, Ireland reserves the right to pursue this domestically if an approach is not agreed at European level. Notwithstanding this, any proposal will be subject to the proper and normal processes of consultation and discussion, including with children and young people.
The priority the Government attaches to online safety will be reflected in our Presidency of the Council of the European Union this year. A youth forum will take place in the lead-up to Ireland’s Presidency, with the key objective being to listen to young people's online safety concerns and to ask them what they would like to see done to make the Internet safer. Obviously, restrictions on access to social media would be an issue for discussion. Ireland will work with all stakeholders to ensure that any additional legislative proposals made during our Presidency are effective in addressing harms posed to children online.
Ms Niamh Hodnett:
I thank the committee for the invitation to attend today. I am the online safety commissioner at Coimisiún na Meán and am joined by Karen McAuley, our director of policy for children and vulnerable adults and Rafaela Knapper, one of our platform supervision directors.
I would like to take this opportunity to outline our approach to regulating online platforms to improve online safety for everyone, especially children and young people, and to give an overview of our work to empower people and make them aware of their rights online. At Coimisiún na Meán, we have been working hard to improve online safety since our establishment in 2023. We take our regulatory role seriously and we also recognise that online safety requires a whole-of-society approach, which we support through various initiatives. We know that people have concerns about the content and behaviour that they encounter online and especially what their children see online.
Central to our work since our establishment has been the development of the online safety framework, which came into full force in July of last year.
It consists of three pieces of legislation, namely, the Digital Services Act, the terrorist content online regulation, TCOR, and the Online Safety and Media Regulation Act. They are the basis for our online safety code.
Under our online safety framework, we hold platforms to account with regard to their obligations to keep users safe online. This includes the obligation for the larger services to assess and mitigate risks arising from how they are designed and used, to ensure they do not cause harm. Under Irish and EU law, platforms are not treated as publishers. They are obliged to remove illegal content once it has been reported to them; otherwise, they risk becoming liable for it. All platforms have to protect the safety, security and privacy of children online. Specific obligations exist for video-sharing platforms to restrict video content that is harmful to children, that incites hatred or violence on grounds of protected characteristics, or that constitutes terrorist content or child sexual abuse material. They must have parental controls. Platforms that allow adult content, such as pornography and extreme or gratuitous violence, must put in place effective age assurance measures.
We will have further responsibilities under the AI Act, including in the area of prohibited AI practices. Children can be exposed to inappropriate content and addictive design through AI recommender systems or chatbots. Generative AI can be used to create child sexual abuse material and non-consensual intimate imagery. We share the public’s concerns about nudification apps.
We have opened investigations into several platforms regarding their compliance with the Digital Services Act and we are supporting the European Commission’s investigations. In response to regulation, we have seen platforms introduce age assurance, prohibit pornography or violence in terms and conditions and introduce additional parental controls. Our supervision teams continue to evaluate the effectiveness of these approaches and they will escalate further issues to investigation and enforcement, as required.
We are responsible for platforms which have their EU headquarters in Ireland, which is many, but not all, of the most popular services used by children and adults here. This means that we work closely with the European Commission and our fellow regulators to ensure that EU rules are followed and people enjoy the same levels of protection, regardless of where the services they are using are based. We are also members of the Global Online Safety Regulators Network, where we are working to drive common approaches to improving online safety around the world, while recognising that legal frameworks differ.
We all want to enjoy the benefits of being online while being better protected from harms. Our website provides information about a range of support services available for children and resources for parents, teachers and young people, including videos and guides on how to report content. It also includes videos and lesson plans, which we have developed with Webwise and shared with schools. We support Webwise’s Safer Internet Day. This year, its theme is, “AI Aware: Safe, Smart and in Control”. Together with the Data Protection Commission, DPC, we produced guidance for teenagers on using AI.
We are running a national awareness campaign for parents and young people on the right to report content they believe to be illegal or harmful to the platform where they see it. If people are not happy with the outcome, they can contact our contact centre on 01 9637755 or by email at usersupport@cnam.ie. The campaign is supported by the Department of Health and will run to the end of this month.
Evanne Ní Chuilinn (Fine Gael)
We will proceed with questions and answers from members. There are a couple of substitutions but we are beginning with Senator Noonan.
Malcolm Noonan (Green Party)
I welcome our witnesses. First, I welcome Mr. Seamus Hempenstall’s statement from the Department regarding the youth forum and the voices of children and young people. That is essential. I am not sure whether there is a mechanism by which we can do that. Perhaps, it can be done through Comhairle na nÓg. That is to be welcomed.
I have three questions. They are all directed to Coimisiún na Meán. In the context of the online safety framework under Irish and EU law, Coimisiún na Meán stated in its opening statement that platforms are not treated like publishers. Why is this? They are publishing or allowing content to be published. This content is harmful. It is disinformation and misinformation. The platforms generate their profits from these posts. That is my first question.
Ms Niamh Hodnett:
There are different positions under European law, which is where this derives from. It derives from the e-commerce directive, which was originally intended to promote e-commerce and Internet usage in general. It was held that platforms could be mere conduits, or downpipes, as such. Obviously, things have evolved a long way since 2000 to where we are now. The Digital Services Act, another piece of EU legislation, looked at this again. That Act came into effect in 2024. It provides that platforms have a duty to assess the risks on their platforms and take mitigation steps to ameliorate those risks, as can be seen in Articles 34 and 35. Platforms also have an obligation to protect the privacy, safety and security of children, which is something we are particularly concerned about here. Things have moved on, but platforms are not the same as a broadcaster or radio station, which we also regulate. However, when platforms are put on notice that content is illegal or harmful, they are then obliged under the Digital Services Act to take steps to address it. That is why we encourage the Senator’s constituents to report. It is why we have information on our website on how best to do that.
Malcolm Noonan (Green Party)
When it comes to removing illegal content, we have heard before at this committee that the fines are simply not fit for purpose. In many cases, the platforms are not even paying them. They only pay a small portion. The profits they are generating far exceed any fines that could be imposed on them. Is there a more effective regime of sanctions against social media platforms? The bigger question is how we make them comply.
Ms Niamh Hodnett:
Our approach in Coimisiún na Meán is supervisory-led. I can pass to my colleague, Ms Rafaela Knapper, director of supervision, to talk more about how that works. Where we become aware of a platform’s non-compliance after that supervisory engagement, the matter can be escalated into an investigation. There are formal steps set out in legislation that we follow in an investigation and, in the final analysis, we can make a decision and impose a fine. We have been supporting the European Commission’s investigations and we have opened some of our own.
In general, we have received a lot of judicial reviews in this space. Even as we take investigative steps, we have 12 judicial reviews currently open with regard to platforms. A number of those relate to opening investigations. This is a new regime and it is being applied for the first time. Therefore, both we, as the regulator, and the platforms seek clarification from the court as to how far we can go.
Substantive fines can be imposed: up to 10% of turnover under the Online Safety and Media Regulation Act, up to 6% under the Digital Services Act and up to 4% under TCOR. It is not about the fines, however. We want to see behavioural change. We want the platforms to come into compliance. We engage with them regarding all of these matters. If, as an alternative to opening an investigation, they come into compliance – some of them do – then that is what we want to see.
In respect of what the law should be in the future, I have seen this committee discuss that issue. I have been following this committee’s work. We are all at the stage where all options are being considered and no stone should be left unturned to find the appropriate approach in addition to fines.
Malcolm Noonan (Green Party)
I welcome the additional responsibility under the AI Act and the opening of investigations into nudification apps. Has the coimisiún any powers when it comes to recommender systems, such as the default switch-off of recommender systems?
Ms Niamh Hodnett:
This is something we considered when we were consulting on our online safety code. We first started to consult on the code before the Digital Services Act came into effect. Once it came into effect, we engaged with the European Commission through a process called the technical regulation information system, TRIS, in order to be able to bring the online safety code into effect. It was agreed that the most appropriate avenue to address recommender systems was through the Digital Services Act, which has a number of provisions in that regard. For example, recommender systems have to be risk-mitigated, and the large platforms have to offer a recommender option that is not based on profiling, such as a chronological feed.
There is a concern that they promote harmful material or have an addictive design. We have been supporting the European Commission on a number of investigations, including a recent preliminary decision on TikTok, which addressed the addictive design of recommender systems. That is part of a process. There is a right of response from TikTok and there will be a final decision. We are also supporting the European Commission with an investigation into Meta and its design. I will hand over to Ms Knapper, who supervises the platform.
Ms Rafaela Knapper:
I can share a little more detail. For instance, Ms Hodnett mentioned the preliminary findings in the TikTok case. In this particular case, we supported the European Commission in relation to the so-called rabbit-hole effect of such mechanisms.
The findings relate to a potentially addictive design and whether the associated risks were assessed appropriately by the platform before the feature was launched, as well as whether those risks have been mitigated. There is, therefore, a double obligation both to assess and to mitigate. This stems from the articles we have heard of already, namely, Articles 34 and 35 of the Digital Services Act. These are preliminary findings, so they are not final yet. The provider, TikTok, has the opportunity to defend itself. If those findings become final and stand, fines of up to 6% of global turnover may be imposed, which is significant.
The Meta proceedings, which we are also supporting, are still ongoing.
Malcolm Byrne (Wicklow-Wexford, Fianna Fáil)
I thank the witnesses for coming here today and for their work in this area. We are very focused on online safety, particularly in the case of children and young people, and looking at what more we can do, apart from the online safety code, to address some of these issues. One of the suggestions that has been mooted is to look at a social media ban that does not allow access to apps for those under the age of 16, which is similar to the model in Australia. I know Coimisiún na Meán, including the online safety commissioner, has a good relationship with the eSafety Commissioner in Australia and is monitoring what is happening there. I am glad that the Department and Coimisiún na Meán mentioned the voice of young people. There is the youth advisory panel, the establishment of which was a recommendation of this committee. Reflecting on all of that, what is the view of Coimisiún na Meán with regard to a possible social media ban for under-16s? What is the opinion of the youth advisory panel?
Ms Niamh Hodnett:
I thank the Deputy. I will start by talking about our work in this space and then pass to my colleague, Ms McAuley, in relation to the Youth Advisory Committee. We note with interest, as mentioned by the Department, the national digital and AI strategy, which was published last week and addresses this issue by looking at how Ireland is continuing to work with like-minded member states to explore the right answer in relation to protecting children and young people from harmful content online and ensuring they are in an age-appropriate space. We do what we can with our online safety framework. We have the Article 28 guidelines as well, under which we want platforms to be responsible for the safety, security and privacy of minors. However, there is clearly more work to be done in this space.
We entered into a co-operation agreement with the Australian eSafety Commissioner towards the end of last year and she will be sharing with us her review of how Australia's ban has been progressing. Australia introduced the ban in December. We will study that with great interest. We note that President Ursula von der Leyen of the European Commission has set up an expert team to report on this area and we will also learn a lot from that.
One of the key points Deputy Byrne made was that we have to bring young people into this conversation and hear from them in this space. We have consulted our Youth Advisory Committee on this. I will hand over to Ms McAuley on that.
Ms Karen McAuley:
I thank Deputy Byrne for his question. As he will know, our Youth Advisory Committee is a statutory committee established under our founding legislation. For those who are not familiar with the committee, it comprises organisations that work with and on behalf of children and young people, such as the Irish Society for the Prevention of Cruelty to Children, ISPCC, and Webwise, as well as young people under the age of 25. Since we established the committee in late 2023, we have had the opportunity to engage with it on a wide range of issues related to our remit. From last September-----
Malcolm Byrne (Wicklow-Wexford, Fianna Fáil)
Will Ms McAuley reply very specifically to this question?
Ms Karen McAuley:
Last September, we took the opportunity to consult the committee on its views around further measures to strengthen the protection of children from online harm, including the idea of introducing age restrictions with respect to young people's access to social media. In the round, its general view is that it does not favour the introduction of such restrictions. I will briefly try to clarify its reasons for that.
The Youth Advisory Committee made the point that social media is very much embedded in children's lives. It referenced in particular what it described as a lifeline, particularly for young people who may be vulnerable and isolated. It also suggested that a ban risked pushing young people into unregulated online spaces, thereby shifting the problems elsewhere, where they are harder to reach. It suggested that a ban might deter young people who succeeded in circumventing it from reporting concerns and potential harms. It also suggested that a ban could be difficult to enforce and that young people would find ways to circumvent it, for example through VPNs. Finally and importantly, it suggested that young people might see a ban as a form of punishment for their engagement with social media.
Overall, the committee favoured the more nuanced approach currently being taken by regulation. That is the approach we are looking to take in supervising and enforcing the existing rules and expectations under the online safety framework. It felt that that approach was more aligned with children's rights in the round and their evolving capacities.
Malcolm Byrne (Wicklow-Wexford, Fianna Fáil)
I agree with the views being expressed by young people on this issue. I am not convinced that the social media ban for under-16s would work. In terms of the online safety code, education is a more effective way to address some of the harms. We also need to tackle what is happening on the platforms.
With that in mind, I was conscious that Ms Hodnett mentioned she had been monitoring our committee and the AI committee as well and looking at the issues. What measures in addition to fines may be appropriate? I have a concern that some companies simply regard the fines as a business cost.
We have the online safety codes in place. Clearly, some companies are engaging far more honestly and effectively with those codes than others. I praise those, and some of them came before our committee, but we do have a problem with some of them. Let us be blunt, X is among the chief culprits. What further measures do we need? Do we need to look at holding individual directors of companies liable? What further action is needed to make those companies that are not co-operating with Coimisiún na Meán pay attention and, as Ms Hodnett said, move towards compliance? We do not want to be levying fines. We want to see companies complying.
Malcolm Byrne (Wicklow-Wexford, Fianna Fáil)
It does but that is in respect of non-co-operation. My question relates to failure at the safety by design stage. Are there measures there that we need to address?
Ms Niamh Hodnett:
We are following with interest work from Spain, which announced measures at the start of this month. First, it announced measures looking at a digital age of majority and at how to protect under-16s through age verification. It also looked at whether criminal sanctions should be considered around the promotion of illegal content through recommender systems and whether that should also result in director liability. I know this committee has been considering these matters also.
Malcolm Byrne (Wicklow-Wexford, Fianna Fáil)
I welcome the fact that Coimisiún na Meán is examining this issue and following the lead that has been taken by Spain. That is significant. Will it be providing a report based on what it is seeing on the Spanish side or has the commission adopted a position with regard to the Spanish approach?
Ms Niamh Hodnett:
As a regulator, we take the law as we find it and apply it as we see it. However, we are following what is happening internationally, as a member of the Global Online Safety Regulators Network and otherwise. Our colleagues in the Department would be taking a position in relation to policy or legislation. That is not a matter that we would-----
Malcolm Byrne (Wicklow-Wexford, Fianna Fáil)
If I may briefly ask, does the Department have a view on that?
Mr. Seamus Hempenstall:
When we passed the Online Safety and Media Regulation Act, we included the provisions that Ms Hodnett discussed as a step towards what the Deputy is asking about. Doing something like he mentioned would be very complex. It is not something that the Department is ruling out, but the Department's focus at the moment in terms of online safety is on supporting Coimisiún na Meán in implementing the legislation and regulation that we already have. It is still very new and it is a signal change from what applied before. I echo what Ms Hodnett said about wanting to see changes in behaviour by the platforms. That can be seen, for example, where TikTok Lite did not happen and Meta introduced Instagram teen accounts. We might all have issues or arguments about how effective or otherwise the protections the platforms are putting in place are, but we cannot argue that they are not acting in response to regulatory actions and legislative innovation in this space.
What the European Commission did in terms of the preliminary findings is very important. As has been reported, it can apply to other platforms also. We need to support Coimisiún na Meán and we are doing so in terms of the legislation we already have and the legislation that is coming in. The issue of pursuing individuals and companies is not off the table. I suppose that is the way to put it.
Malcolm Byrne (Wicklow-Wexford, Fianna Fáil)
That is welcome.
Rónán Mullen (Independent)
I have huge respect for my colleague, Deputy Byrne, but to welcome the mere statement that such actions are not off the table is setting the bar very low for our expectations. To the experts in the area I would say that people should be an awful lot more ambitious. We should not at this stage be talking about things being not off the table. There is a clear feeling out there, which is widespread among parents, that our legislators and our regulators are not getting a hold of these problems. The fact is that not a single provider of harmful material has been prosecuted in this country. This is an indictment of a self-preserving regulatory system where everybody has the best of intentions but nobody is giving leadership.
I note in particular what was said about the youth advisory committee and the consultation with youth. With great respect, a number of different items of feedback from young people were given, regarding the dangers of people going to darker places and children's rights. I really did feel on thinking about it that these are exactly the same lines of argument that come from the social media companies. Have the witnesses engaged with the work of Professor Jonathan Haidt, the social psychologist? Do they have any familiarity with what he and other experts say, for example, about how young people themselves report feeling harmed by social media? Is this on their agenda or on their radar? Is it on their radar that, for example, experiments and research have been done which show that young people who withdraw from social media for a certain period of time feel the better for it? It is not just adults looking in who are worried about what this is doing to children but it is the children themselves who are often reporting the harm. Will the witnesses speak to this issue for me? All I heard from them there were the reasons young people think it might be a bad idea to legislate for a ban. I wonder whether our society is becoming so infantilised that we are looking at legislation by young people, who may themselves be very prone to the manipulation of very powerful forces, economic and otherwise.
Ms Niamh Hodnett:
I will start and then hand over to Ms McAuley. We consult widely with everyone. As well as the youth advisory committee, we consult with the Irish Safer Internet Centre. Hotline.ie deals with notifications and is very effective at getting CSAM or non-consensual image shares down. We deal with the National Parents Council, which represents parents' concerns, and the ISPCC. We also deal with a number of researchers. I lead the policy, research and strategy division. We deal with a number of researchers as well as the one mentioned by Senator Mullen. We have consulted Stanford on it. We ask every academic we meet about this. We are very concerned exactly about all of the items Senator Mullen has outlined. This is what our online safety framework is seeking to address. The code does require, in line with the legislation-----
Rónán Mullen (Independent)
Link to this: Individually | In context
If I could just interrupt, will Ms Hodnett not reflect back to me what she hears Professor Haidt, to take a very prominent example of a leading academic in this area, and experts like him say about social media, young people and their access to it?
Rónán Mullen (Independent)
Link to this: Individually | In context
No, I am asking Ms Hodnett what she thinks he is saying.
Rónán Mullen (Independent)
Link to this: Individually | In context
Will Ms Hodnett speak on it for me?
Ms Niamh Hodnett:
It outlines the concern, which we all have and which is real, that children are suffering from eating and feeding disorders, self-harm, suicide and toxic algorithmic feeds, that they spend undue amounts of time online and that there is an opportunity cost regarding what they could be doing with that time otherwise. Yesterday we had-----
Rónán Mullen (Independent)
Link to this: Individually | In context
The second issue is bullying and they are not necessarily in the order that has been described. There is a range of issues. There is an awful lot of evidence over time. Is Ms Hodnett aware, for example, that the period in and around 2012, when smartphones came in and people could start taking photographs, is associated with clear indicators of a decline in young people's mental health, not only in America, by the way, but also in Europe and Britain? Not only this, but it is also associated with a decline in academic performance.
Ms Niamh Hodnett:
Yes and just yesterday we had an associate professor from UCD, Dr. Melissa Bohnert, give us a presentation on a decline in STEM scores with regard to reading and mathematics in children who spend more than three hours a day on social media. She drew this evidence from the data in the longitudinal study Growing Up in Ireland. We are very conscious of these issues. Cyberbullying is prohibited in the terms and conditions under the online safety code. It is a big focus of the Online Safety and Media Regulation Act. In this regard, I will hand over to my colleague Ms McAuley to discuss a pilot we have on cyberbullying. It is an area we are very concerned about.
Ms Karen McAuley:
In fairness to the members of our youth advisory committee whom we consulted, they are very aware and concerned, as we all are, about the risks and harm that children and young people can be exposed to online. We asked them whether they felt additional protections might be needed to safeguard children from harm and, in that regard, whether a ban might be a positive way forward. They shared their views with us and I will try to summarise them. They told us in response that they are acutely aware of the risks that are there and want to see them addressed. I can assure Senator Mullen on this. Something we are working on at present is putting in place a pilot programme to provide support to children who may be at imminent risk of harm from online content. In particular, we are looking at non-offence-specific types of harmful content, especially online bullying. We have been engaging with member organisations, such as the Irish Safer Internet Centre, the ISPCC, Webwise, the National Parents Council and Hotline.ie, as well as with a number of platforms popular with children, with a view to working together to implement-----
Rónán Mullen (Independent)
Link to this: Individually | In context
I am running out of time and I am very grateful to Ms McAuley.
Rónán Mullen (Independent)
Link to this: Individually | In context
We are all under time pressure. Do the witnesses agree with me that we need to see prosecutions of people who are harming young people online and we need to change the situation in which it is easier for a young person to access pornography online than it is to buy a pair of shoes because they need a credit card to get the pair of shoes but not to access pornography online? I worry that with the best of intentions the witnesses are engaging in a cat and mouse game in regulation. We do not have prosecutions. We do not have enforcement. Children right now are not safe from a range of online harms. Do the witnesses accept that we need much more speed and urgency and possibly more direct legislative measures?
Ms Niamh Hodnett:
Some of the things we have seen since we started include that Tumblr originally allowed pornography and extreme and gratuitous violence but, as a result of our online safety code, it changed its terms and conditions not to allow this adult content that we are concerned about. X does allow pornography and extreme and gratuitous violence but it did introduce age assurance. I can hand over to Ms Knapper, who is the supervisor of X, to discuss this. We are assessing whether these measures are effective but it did take a step to introduce age assurance as a result of our online safety code and as a result of our writing to it when the code came into effect. It did appear to put something in place at that point in time. We have seen TikTok Lite being pulled. We have seen a number of platforms take steps to make their recommender systems safer or to remove toxic feeds from children. We have only been in existence for around three years. We have seen the platforms take a number of steps since we came into existence. Even with regard to Grok, which was horrendous at the start of this year, we have seen a number of steps taken by X since we engaged with it. We have opened an investigation into it. Ms Knapper can speak about this but we are limited in what we can say because it is an ongoing investigation.
Evanne Ní Chuilinn (Fine Gael)
Link to this: Individually | In context
We will have to come back to it because we have run out of time for Senator Mullen.
Alison Comyn (Fianna Fail)
Link to this: Individually | In context
I thank the witnesses for coming before the committee, some of them for a second time which shows the seriousness with which they take this. What we have noticed in recent months and weeks is the fact that words matter. When we speak about bans and restrictions, which have been mentioned several times today, they are seen as draconian and somehow taking away some of the fun. When we are speaking about age limits on alcohol, it is not called a ban on fun or freedom. It is something that has to be done and is necessary to protect and prevent harm and danger. This is exactly what we should be doing here. We have been watching it being watered down in recent weeks and people balking at the idea of a ban. We want to protect our vulnerable people and young people online and prevent them from seeing the worst of what is out there.
If that is what people are taking as a ban, I am still in favour of protecting them to the maximum.
Like the Leas-Chathaoirleach, I was a journalist for 30 years. I know the value of words. I also used to have to adhere to the strictest legal and ethical standards. Every single word we published was subject to the highest scrutiny. It carried legal consequences. We adhered to defamation law. If something harmful was printed, the publisher was held accountable. I am here to suggest that these platforms are anything but passive. The social media platforms have enormous influence over what people see, what trends and what goes viral. They determine prominence through algorithmic recommendation systems. They monetise engagement. They design amplification structures. Yet in law, as we have said, they are treated primarily as intermediaries rather than publishers. I am not suggesting that we automatically try to reclassify them as publishers. That would be oversimplistic and likely disproportionate. How could we look at a statutory duty of care to tighten up those laws so they have a greater duty of care and responsibility for what is published on their platforms?
Ms Niamh Hodnett:
Under the Digital Services Act, they do have obligations akin to a duty of care. They have to risk-assess the harms coming from their platforms and how their recommender systems are organising that content. They have to take mitigation steps, a number of which are set out in detail, including adapting the design of those platforms under Article 35 of the Digital Services Act. There are those additional obligations on them. They do have a duty of care in relation to children. They have to protect the privacy, safety and security of children under Article 28 of the Digital Services Act.
Under our code, they have to restrict or prohibit in their terms and conditions cyberbullying, eating and feeding disorders, self-harm and suicide, dangerous challenges, child sexual abuse material, terrorism and the incitement to hatred or violence on all the protected characteristics and they have to apply those terms and conditions. They also have to use the age assurance in relation to the adult content, the pornography and gratuitous violence. They do have additional obligations set on them under the Digital Services Act and online safety code akin to a duty of care in that regard.
Alison Comyn (Fianna Fail)
Link to this: Individually | In context
Does Ms Hodnett feel it is going far enough? In my opinion, it is clearly not. Could we tighten it up and bring in more legislation to bring it to a statutory duty of care?
Mr. Seamus Hempenstall:
Generally, there is always the possibility of bringing in new legislation. When the Senator says "statutory duty of care", in terms of the regulatory framework we have, that translates into the things that Ms Hodnett was discussing, in terms of the platforms being required to make sure that the risk from content that comes up on the platform is mitigated. It is new and puts all the obligations on the platform. I do not think anyone would say that the situation right now is satisfactory, but I also think it is important to respect that there is a process involved in doing that. Improvements can be seen in how the platforms are acting as a result of that. Anybody would say they would love to see it being done faster. It does illustrate how online safety as a concept cuts across government and across society.
As a former journalist, the Senator knows the value of quality journalism and media pluralism. We are looking to promote that through the European Media Freedom Act and through our national counter-disinformation strategy. We are trying to co-ordinate actions to minimise the availability of harmful content like disinformation so that young people and adults can get access to quality news and establish the facts themselves. That is why there is Webwise and a range of other education and training initiatives around media literacy to support people to make their own choices. While we have a duty and want to make sure we regulate the platforms correctly, we also want to support people to be able to make their own choices about what they see and do and the content they consume.
Alison Comyn (Fianna Fail)
Link to this: Individually | In context
Would a tiered liability model, say distinguishing between hosting and amplification, be legally viable? There is a huge difference between simply allowing people to put up what they want and amplifying it, with the algorithms pushing it to a wider audience.
Mr. Seamus Hempenstall:
There is evidence that this is happening. I cannot deny that. At the same time, the European Commission investigation that both Ms Hodnett and I mentioned is specifically looking to address those issues to prevent platforms from amplifying harmful content. That is something we looked at as well in the national counter-disinformation strategy. I think the Senator would agree there is no silver bullet or single approach that is going to solve it. It is not just about regulation; the other initiatives I mentioned, which are about promoting awareness and education, are also important.
Alison Comyn (Fianna Fail)
Link to this: Individually | In context
When we had the platforms before the committee, across the board, they were not in favour of a ban or further restriction - whatever we want to call it - but I still think it is something we should be exploring. Does Ms Hodnett think the age verification is going to be firm enough to try to protect our young people, or would there need to be extra provisions?
Ms Niamh Hodnett:
That is a positive way of looking at it. What is the right age or the minimum age for somebody to use a social media platform? Is it 13? Is it 16? What is the minimum age? How can the platforms be sure, either through age verification or age assurance, that someone is a particular age so that they are applying their terms and conditions, and that the space they offer is age appropriate? Whether it is 13 or 16, is it age appropriate for children of that age to be on that platform? Is there adult content on it? Then it is certainly not suitable. Is that being marked as 18? We have seen the platforms introduce age assurance in relation to adult content. As I said, we are assessing whether it is effective.
We are supporting the European Commission with a number of investigations they have into pornography platforms - the co-ordinated action throughout the EU in relation to that. Article 28 is an obligation under the Digital Services Act that is addressed to the platforms. In line with what the Senator is saying, it is about a gradated approach. Hosting service providers have certain obligations under the Digital Services Act. Under our terrorist content online regulation, they have obligations too if they host terrorist content. There are additional obligations on a platform as opposed to a hosting service provider. Those platforms have to protect the privacy, safety and security of children. The very large platforms have these additional obligations under a gradated approach for risk assessment. I might hand over to Ms McAuley because she was leading-----
Evanne Ní Chuilinn (Fine Gael)
Link to this: Individually | In context
We might have to come back on that point because we have gone way over time. We will do another round.
Joanna Byrne (Louth, Sinn Fein)
Link to this: Individually | In context
I will start with some questions for Coimisiún na Meán. In the Department's opening statement, it said in reference to the investigation into Grok that this shows how online safety cannot be addressed by one EU member state alone, but can be progressed by EU member states and the European Commission acting together. I would actually challenge the Department on that. I do not believe that is a statement of fact. It has allowed both entities to sit back on their hands and allow somebody else to step up and take the lead on this. Both of the opening statements referenced briefly the Online Safety and Media Regulation Act 2022. The Department went as far as calling its passage a "key milestone". The bit I cannot get my head around relates to the 2022 Act, which is within the jurisdiction of Irish law. It is targeted at video-sharing platforms that are based in Ireland, such as Facebook, Instagram, YouTube, TikTok, Pinterest and Tumblr; the list goes on. The purpose of the Act is to protect children and the public from harmful and illegal content. This law gives the regulator the power to impose significant penalties and fines - €20 million, or 10% of annual turnover if it exceeds that - but it has refused point blank to act on that. That is a power that has been given to the regulator, and the Department is saying that it is a "key milestone". I do not think anybody who has been affected by this scandal, or the many scandals like it that have come about - and will recur into the future, if enforcement is not stepped up - will agree that it is not about the fines, or disagree that the regulator wants to see behavioural change. Of course it wants to see behavioural change, but would enacting fines not prompt that behavioural change more quickly? Do the witnesses get where I am coming from in that regard? I simply do not understand why the imposition of fines on X under Irish law for child pornography generated and shared on a platform has not happened.
I would like an honest answer on that. I think the people who have been affected deserve an honest answer on that.
Ms Niamh Hodnett:
I fully agree with the Deputy. Ms Knapper and I will answer this question because we worked hand in glove in relation to the Grok issue. There will come a certain point where we cannot say more because of the live investigation.
As soon as we returned to work after Christmas, this issue blew up. We set about it immediately. We met with the Garda, the DPC and hotline.ie. We briefed the Department and Ministers, including the Taoiseach and Tánaiste, on this matter and we assessed the best legal options to address it. From a criminal perspective, that was our engagement with An Garda Síochána. We met a very senior member of An Garda Síochána on this, who was also before this committee to talk about the matter. We also met the DPC, which has since announced its investigation of that matter. We met hotline.ie because we went out with our statements to tell the public the best chance to get material taken down was to report it to hotline.ie, as it would get the CSAM and the non-consensual intimate imagery, NCII, taken down swiftly and it could share such reports with the Garda.
In terms of our own regulatory measures, we assessed whether we should use the online safety code or the Digital Services Act, DSA, and what the best tool was to get the best result in relation to this matter because we were very concerned about the sharing of CSAM and NCII. Our online safety code deals for the main part with video and video-related content, whereas this matter mainly involved images, so we thought the best option was to use the Digital Services Act. That is another tool in our online safety framework where we work hand in glove with the European Commission and where fines of up to 6% of turnover can be imposed. We immediately met the European Commission on this matter to discuss it. Together with the European Commission, we met X. I will hand over to Ms Knapper, who is the supervisor of X and is working on that investigation.
Ms Rafaela Knapper:
On the relevant law that applies here, we considered at the time that it was the DSA and those Articles 34 and 35 obligations. It was about assessing the risk in relation to a new feature that was being launched as well as mitigating that risk after it had been assessed. As this is within the competence of the European Commission, we considered at the time that this was the most appropriate way of addressing the issue. As Ms Hodnett mentioned, we worked very closely with the European Commission. It opened an investigation towards the end of January - I believe it was 26 January. We are supporting that investigation. It will assess whether that assessment of risks, as well as the mitigation of risks, has taken place in the appropriate way. That also includes the risk of disseminating illegal content online. That relates to the type of content referenced, such as NCII and CSAM content.
Joanna Byrne (Louth, Sinn Fein)
Link to this: Individually | In context
I accept that, but the Online Safety and Media Regulation Act is very broad. It relates to harmful online content, which includes a broad range of offence-specific harmful content online. It goes as far as covering over 40 different criminal offences. I think there are 44 - do not quote me on that - criminal offences in Irish law that it could be utilised for. That gives a very broad range to target these platforms and to really cull and eliminate this content. What I am hearing, specifically from the Department - Mr. Hempenstall referenced this - is that it is very complex to use the Act; I think those were the words he used. If we were to follow the Online Safety and Media Regulation Act, it would be very complex to use it, but it is not being ruled out. Why have we got this tool under Irish law that is too hard to use, or that we are choosing not to use, yet is sufficiently broad to allow convictions, sanctions or penalties on such a large scale that it would perhaps deter any future use like this online?
I respect the comment that we have seen a number of the platforms taking steps, but that is nearly excusing them. We have seen a number of steps the platforms have taken, but we have also seen an increase in cases being reported. Behind that, there are lives impacted. That is what it needs to come down to. It is how people feel safe or how people feel unsafe, particularly women and children. I just do not think the tool is being used. It is being shied away from. It is hard to accept that we are allowing the Digital Services Act and our counterparts in Europe to lead on this when we have sufficient laws of our own. If we are not using these laws, why are years and years of work being put into Irish law only for Coimisiún na Meán to decide not to use it? I do not understand that.
Ms Niamh Hodnett:
We share the Deputy's concern about women and children in particular wanting to feel safer online, which is not the case currently. In relation to this matter, we still have an incident response that Ms Knapper led. We assessed what the best tools were to address this issue. Had there been more video content, we may have decided to use the online safety code. The online safety code applies to video content and video-related content in relation to the measures within it. As it was predominantly images we were coming across, we considered that the most appropriate tool to have the most appropriate remedy was the Digital Services Act. We are supporting the European Commission. We are meeting X and the European Commission team on that. We are supporting that investigation, so we are working effectively on that.
Peter Cleere (Carlow-Kilkenny, Fianna Fail)
Link to this: Individually | In context
I thank all our guests; they are very welcome. I thank them for their opening statements. While we will get into technical information in a second, it is important that we never lose sight of the fact that children, vulnerable people and younger people are at the heart of everything we are doing here. There is enormous responsibility on the people in this room - Coimisiún na Meán, the Department and us as legislators in the Dáil and Seanad - to get this right. The speed at which technology is changing is phenomenal. We talked about Grok two weeks ago, but there will be another Grok or something else in the coming weeks and months. We can never lose sight of the speed at which new technologies will come on board, but what is also important is how quickly we react to that.
It is in that context that I will start my questioning. I will focus on Coimisiún na Meán first. It stated that its "website provides information about a range of support[s] ... available for children and resources for parents, teachers and young people, including videos and guides on how to report content." How many issues have been reported to Coimisiún na Meán?
Peter Cleere (Carlow-Kilkenny, Fianna Fail)
Link to this: Individually | In context
How many reports did it get last year, for example?
Ms Niamh Hodnett:
I have those figures. We have a contact centre that we use. We set it up in February 2024. In relation to our contact centre and the reports that we got, we received 1,388 in 2024 and 2,385 in 2025. In 2026 to date, bearing in mind we are just coming up to the end of February, we have received 530 contacts, 31 of which relate to Grok.
Peter Cleere (Carlow-Kilkenny, Fianna Fail)
Link to this: Individually | In context
I thank Ms Hodnett for having those figures and for the work Coimisiún na Meán put in to having them ready for us. Obviously, we can see there is a huge increase from 2024, albeit over an 11-month period. We are looking at a 70% or 80% increase in the number. If there were 530 in the first two months of this year, we are on target for well over 3,000 for 2026. A huge number of reports are coming through to Coimisiún na Meán in terms of the contact centre. If somebody has an issue and they ring either an coimisiún or the contact centre, what are the next steps? What happens at the moment?
Ms Niamh Hodnett:
We have a team that deals with the contacts that come in. Some of them might be queries, some might be complaints, some might relate to traditional media and some might relate to digital media. We triage those and provide information or help in relation to that. Some of them are complaints, which are passed from the contact centre to our complaints team. They might also be shared directly with the supervision team. If the matter relates to what we call a P0, that is, if someone's life is at risk or a child is at risk, we always say to go to the Garda first in those instances. When those P0 contacts come in, they are the ultimate priority for us. They are escalated and dealt with straight away as part of our incident response protocol. I might hand over to Ms Knapper, who leads on that.
Peter Cleere (Carlow-Kilkenny, Fianna Fail)
Link to this: Individually | In context
I am sorry to cut across, but I am conscious of my time. I am happy enough with the answer that Coimisiún na Meán has a really strong process when something comes through to it and that it will be dealt with appropriately. I will bring it down a layer, where somebody has an issue with something on TikTok, Meta, Instagram, X or whatever platform is there and they report it to the platform. What comes to Coimisiún na Meán is fine but it is a tiny percentage of what is reported directly to the platforms. How do we know that is being followed through correctly? The answers we got a couple of weeks ago from these companies that were in were disgraceful in terms of the data. We have no line of sight.
For example, a couple of weeks ago, an abusive message came through to me on one of the platforms. I reported it, as it was very nasty.
The feedback I got was that there was nothing to see, it was fine and did not break the-----
Peter Cleere (Carlow-Kilkenny, Fianna Fail)
Link to this: Individually | In context
What was said was vile, but the social media platform turned around and said the case was closed. What legislation is there? How can we check that the vile things being reported - it has been far worse for other people but I had that experience myself - are dealt with properly when these companies are self-regulating? That is not the way to do business.
Ms Niamh Hodnett:
We fully agree. That is why we have the regulation in place. Self-regulation alone does not work in this space. The ability to report has to be easy to use and easy to find. If you do not like the answer you get when you report, as the Deputy understandably did not in the case he outlined, you have the right to escalate that by way of an appeal. We have opened a formal investigation into X in relation to how it is dealing with-----
Peter Cleere (Carlow-Kilkenny, Fianna Fail)
Link to this: Individually | In context
I do not mean to cut across Ms Hodnett, and I totally accept what she is saying, but while I make a complaint and get an answer in one, two or three days, that message or post is still up. Ms Gráinne Seoige was before the committee recently. She spoke eloquently about her own experience and where the trail stopped. Messages were sent and forwarded many times, but if the person is not prepared to give that information to the Garda, the investigation dies. It is absolutely crazy.
How many staff does Coimisiún na Meán have?
Peter Cleere (Carlow-Kilkenny, Fianna Fail)
Link to this: Individually | In context
How many are dealing specifically with the regulation of online platforms, supports, improved safety and participation?
Peter Cleere (Carlow-Kilkenny, Fianna Fail)
Link to this: Individually | In context
Has the commission enough staff, given the pace at which things are progressing?
Peter Cleere (Carlow-Kilkenny, Fianna Fail)
Link to this: Individually | In context
When we think of the billions of posts and users on these platforms, 300 staff, even if they were all involved in this work, does not sound like a massive number. The commission has, for example, full responsibility for all those companies that are headquartered in Ireland. Is the commission getting any support from Europe at all?
Ms Niamh Hodnett:
I want to say that we acknowledge the support we get from our own Department. We are engaging with it in relation to sanction for additional staff for those new functions, particularly under the AI Act. That will be very important. We work closely with the European Commission. For the investigations we mentioned earlier to the Deputy's colleague, we worked closely with the European Commission. It is important to remember that there is an organisation like ours in every EU member state. We work hand in glove with the regulators in each of the member states on these matters.
Peter Cleere (Carlow-Kilkenny, Fianna Fail)
Link to this: Individually | In context
The commission has, for example, responsibility for Meta, TikTok and the other companies that are headquartered in Ireland. Is the commission getting any support at all from its European counterparts in terms of body numbers or advice, or is it completely on its own?
Alan Kelly (Tipperary North, Labour)
Link to this: Individually | In context
I ask Ms Knapper to be brief. Time is up.
Peter Cleere (Carlow-Kilkenny, Fianna Fail)
Link to this: Individually | In context
The Leas-Chathaoirleach was much better in the Chair. She allowed us to add time.
Ms Rafaela Knapper:
It is important to note that we are looking at the systems rather than at something at content level. While all of that is concerning and could eventually point towards systemic failure, we are looking into the systems rather than into potentially thousands of pieces of content that have been reported. There are obligations, especially under Article 16 of the Digital Services Act, for example, to allow for easy-to-access and user-friendly reporting. It also includes a requirement for those reports to be assessed effectively and a requirement to notify users of the outcomes of those reports and to allow for an appeals option. The committee may be aware that we have a horizontal review in relation to Article 16 that is still open. We have opened two investigations on the back of that and have seen changes being implemented.
Keira Keogh (Mayo, Fine Gael)
Link to this: Individually | In context
I thank the witnesses for coming and being on top of their briefs. I will start with the Department. We are going to be rolling out an age verification app through the digital wallet. Did we seek to develop it and is this our own pilot instead of trialling the second edition blueprint that is out at an EU level? Is this that pilot, only we are implementing it at a national level? Will I keep going with my questions and let the witnesses answer them all together or will I ask them one by one? I will keep going. What will be the start date for the trial, roughly?
My third question relates to the ages, demographics and geographical areas that the children will be coming from. I think that I am correct in saying that X is the only social media company based in Ireland that allows adult content, including pornography and gratuitous violence. It has not come to the table at all. Its representatives have not come before any Oireachtas committees. I know when the AI Minister of State reached out, they gave her an online meeting. They are not very forthcoming in coming to the table. This is going to be a voluntary approach. What other social media companies are coming to that trial if X is the only one that is limited to over-18s? Others, such as Instagram, Snapchat and TikTok, are all 13 and over. What social media companies are going to come to test the pilot? That is the start of my questions.
Alan Kelly (Tipperary North, Labour)
Link to this: Individually | In context
For the benefit of people watching, Mr. Hempenstall might explain what the OGCIO is.
Mr. Seamus Hempenstall:
It is the Office of the Government Chief Information Officer, which is based in the Department of public expenditure and digitalisation. I thank the Chair.
The Deputy is right that the European Commission is piloting what it calls a "white-label mini wallet". It is essentially open-source code, so anyone can download and use it. The OGCIO is saying that it has progressed work on a digital wallet to such an extent that it makes more sense, resource-wise, to implement the age verification tool we would like to use through the digital wallet as it stands now rather than engaging with the European Commission on its pilot. The OGCIO has had discussions with the European Commission about its mini wallet, and those discussions are ongoing. The current project is to do it through our digital wallet. I do not have a timeframe for the commencement of the pilot. We are hoping to do it in the coming months. It would run for a number of months. While we have talked about it with the OGCIO, we have not yet put a time limit on how long the pilot might run because we want to make sure that we get enough people to participate in it. The pilot would be voluntary.
In terms of demographics and involving young people, the digital wallet is based on MyGovID, which, in turn, is based on the personal public service number, PPSN, dataset in the Department of Social Protection. People over 16 are eligible for MyGovID. The pilot will necessarily be confined to people over the age of 16. How we involve people over 16 - those who are 17 - compared with how we involve people who are over 18 is something we need to work out in more detail.
Keira Keogh (Mayo, Fine Gael)
Link to this: Individually | In context
In relation to the social media companies, is it just going to be X? What are we-----
Mr. Seamus Hempenstall:
We are trying to support implementation of the code because that is where the legal age verification requirement for over-18s is placed. We have invited all the video-sharing platform services that are established in Ireland. We wrote to them before Christmas. We have not had a definitive answer from X. I will put it that way.
Keira Keogh (Mayo, Fine Gael)
Link to this: Individually | In context
Will the Department be doing different limits for 16-year-olds? If Instagram comes to the table, it is not limited to over-18s, anyway. Does Mr. Hempenstall know what I am saying?
Mr. Seamus Hempenstall:
Yes. What we do not want to do is to confuse the idea of having a digital wallet with an age verification tool as an aid for verifying age for the purposes of the code with the idea of the digital age of majority. We are focusing initially on plus or minus 18. We have, as I said, invited all the platforms to participate. There is, I think, support among most of them for it. The extent to which they will be involved in doing the pilot is something we have to work out. We are at the stage where these questions need to be answered, and that is what we are doing.
Keira Keogh (Mayo, Fine Gael)
Link to this: Individually | In context
I will move to the representatives of Coimisiún na Meán. I thank them for coming before the committee again and for all their work. I do not think the general public fully understands what the commission does, with the greatest of respect. It is our online safety regulator and there is confusion between age assurance and age verification. I think the public thinks that the commission is here to protect us from everybody online in every which way. The online safety code applies to video sharing and very large platforms. Do we have a way to go? If I look at the "How to report" page, some of the video guidelines are not working at the moment, by the way. We will get that up to date.
There is nothing on Discord, Roblox or Twitch because they are not based here. People are going to go to Coimisiún na Meán because they think it is our online regulator. Twitch has over 20,000 users in Ireland, but it does not feature with Coimisiún na Meán at all because it is based in Germany. I feel like we have a long way to go. Similarly, Discord is a huge threat to our young people’s safety, as is Roblox. What is the commission’s view on that, particularly as it is not currently responsible for regulating them because they are not based here and some are not very large platforms? Members of the public expect that it does regulate them.
Ms Niamh Hodnett:
That is a very good question and one in respect of which we could be clearer in the campaigns we are running in this space. That is why it is important that there is an authority like ours in every EU member state. Roblox, for example, is based in the Netherlands. The Dutch authority opened an investigation in relation to Roblox recently. That helps to protect users. Roblox is very popular with children in Ireland. I will hand over to my colleague Ms McAuley on the pilot-----
Ms Karen McAuley:
We have approached a number of platforms which are popular with children about their potential participation, on a voluntary basis, in this pilot programme. It is about reporting concerns and having the opportunity to escalate concerns that fall within the scope of the pilot to those platforms, with a view to the platforms examining them once we have triaged the information that came to us and are confident that the concerns meet a threshold.
Alan Kelly (Tipperary North, Labour)
Link to this: Individually | In context
I call Senator Ní Chuilinn. I apologise to the Senator because I should have called her first.
Evanne Ní Chuilinn (Fine Gael)
Link to this: Individually | In context
That is all right. I thank the Chair and the witnesses, as always. I will pick up with the digital wallet, the pilot and the challenges involved. The witnesses stated that they are looking at all the questions that are popping up now because they are looking at implementing it. The youth advisory committee stated that it would not be in favour of this type of ban. It is a voluntary pilot. The wallet, as it stands, would not work for under-16s. Are there challenges around what kind of data we will get out of a pilot if it is voluntary and might not work with under-16s? The kids do not want a ban, so will they engage with a pilot? Are there any questions about what kind of data we will get and how robust it will be?
Mr. Seamus Hempenstall:
The idea of banning or restricting access to social media for under-16s is separate from the idea of doing the age verification pilot. We will always need a method by which users can verify their age. Not that I have done it, but if you want to buy alcohol online, you would need to verify your age. You can imagine that betting sites would want to have a method for verifying users’ age. Australia did a report before it introduced its restriction in respect of under-16s - or delay, as it calls it – and there were 40-odd ways of verifying your age. That might include age estimation measures too. There are already ways by means of which you can verify your age, but a digital wallet is really secure and robust, and it will be very safe for users. It is based on the PPSN, which is our unique identifier. It will have great benefits. The idea is that it is essentially a public good for the Government to offer an age verification option as part of the digital wallet. What my colleagues will also be saying is that the digital wallet is much broader than just age verification.
Evanne Ní Chuilinn (Fine Gael)
Link to this: Individually | In context
It can do a lot more, obviously.
Mr. Seamus Hempenstall:
It is about access to public services, but it is also relevant under the Digital Services Act, particularly Article 28 on the protection of minors. The guidance the European Commission published in July of last year identifies that the digital wallet is a good way for users to be able to verify their age. We are focusing on the under- and over-18s because, while it may be slightly narrower, it is set out already in the online safety code. Irrespective of that, what we are seeing from platforms is general support for the principle and for being involved in helping with it, but again we have to clarify with them and the OGCIO the start date, the pilot, the numbers we need and how exactly it will work to ensure we protect users’ rights. The Data Protection Commission and data protection officers will have views on that and we need to make sure the data transfer is safe and secure. Whatever feedback we get from it, we will use to inform how we go forward with the legislation.
Evanne Ní Chuilinn (Fine Gael)
Link to this: Individually | In context
On resourcing, would any of the moderation or regulation of the pilot or wallet be done by AI or will it be done by humans?
Evanne Ní Chuilinn (Fine Gael)
Link to this: Individually | In context
That is a genuine question. I do not know the answer either.
Evanne Ní Chuilinn (Fine Gael)
Link to this: Individually | In context
Deputy Cleere asked Coimisiún na Meán about resourcing. It said it was grateful to be able to get to the ceiling of 303 staff. Is any of its regulation or monitoring done by AI or is it all done by boots on the ground?
Ms Niamh Hodnett:
In getting to our 303 staff, we want to set up a technical insights and digital team. We are currently hiring a chief data and technological officer to be able to set up that particular division because we think it is important to be able to get those technological insights. Many of our staff have come from the tech sector, including Ms Knapper, who is here with me, so we do have a lot of insights into these matters from those who have worked in the sector previously. I will hand over to Ms Knapper on how we supervise-----
Evanne Ní Chuilinn (Fine Gael)
Link to this: Individually | In context
Sorry, what I mean is whether any of the work Coimisiún na Meán is doing to regulate and monitor what is going on is done by AI? It is just a yes-no answer.
Evanne Ní Chuilinn (Fine Gael)
Link to this: Individually | In context
In that context, how do the witnesses feel about the platforms using a lot of AI regulation? Much of their monitoring work is done by AI, so I wonder how the Department and Coimisiún na Meán feel about humans not regulating a lot of the content?
Ms Niamh Hodnett:
AI models are really useful in this space in the context of being able to monitor huge amounts of content at scale. Human monitors would not be able to deal with that same amount of content. Of course, they are not 100% reliable, so even if the AI model can monitor and get it right for 99% of content – which is to be welcomed because it stops humans having to review that very harmful and upsetting content as well – it still leaves a residue which needs a human in the loop to assess that 1%. It is to be welcomed that AI is used to moderate the sheer volume of content that is uploaded onto these platforms.
Evanne Ní Chuilinn (Fine Gael)
Link to this: Individually | In context
Does the Department have an opinion?
Mr. Seamus Hempenstall:
I agree that it is worthwhile having the AI involvement in content moderation. These platforms are global, so it supports them in doing that 24-7. As I understand it, their AI models are always evolving on the basis of the data they get. As Ms Hodnett said, it does reduce the amount of harmful content a human would have to look at as well. The platforms would say, not that I am a spokesman for them, that there are human moderators at the end of the process as well. We think that is important too, especially in relation to content that might be borderline. Genuinely illegal content is usually quite clear. Where content might be borderline, then a human moderator might be important.
Evanne Ní Chuilinn (Fine Gael)
Link to this: Individually | In context
When they were here, the platforms – with the exception of X, obviously, which did not turn up - told us that recommender systems are not binary. They are not black and white, so it is not as simple as switching them on and off. Why is that? Why can we not just switch them off for some of the stuff we are hearing about? Take TikTok, for example. If you are a 14-year-old girl who goes on TikTok, within 20 minutes you will see some harmful content relating to eating disorders. In those instances, why can we not target switching off those recommender systems for young users?
Ms Rafaela Knapper:
It is not so much about turning it off but how the recommender system is designed. We believe it should be designed in a way that does not increase or push illegal or harmful content to users. There are certain obligations, especially in relation to recommender systems, which I think we mentioned earlier under the DSA. That includes Article 27 on the transparency of how those systems work and giving options for how a user has more authority to decide what type of content they see. If there is more than one option, they should be able to understand that and make a decision based on it.
There is also Article 28, which is most relevant here and which relates to the protection of minors.
Obviously, we all care that the recommender system is designed in a way that protects minors and puts their security, privacy and safety to the forefront. This is the TikTok investigation that Ms Hodnett mentioned. There have been preliminary findings recently in relation to the so-called rabbit-hole effect. Those are preliminary as opposed to final, but it has been found that in this case, the risk was not assessed accurately and was not mitigated with the users at the centre and protection of minors in mind. It is not final, so the company will be given the right to defend itself. Those are the relevant regulations regarding recommender systems. It is less about turning it on and off and more about how it can be designed in a way that improves user experience rather than disseminating illegal and harmful content to particular users.
Alan Kelly (Tipperary North, Labour)
Link to this: Individually | In context
We have spent months on this topic now. We have had lots of different hearings, and some witnesses have appeared twice. We have heard no evidence from anyone which states that a ban on under-16s is a good idea. We have had zero submissions in that regard. That is not an issue for the witnesses. There is confusion between the digital wallet, age verification and a ban on under-16s. That is a fact. There is confusion in these buildings in respect of this matter, never mind outside.
From a Government point of view, there needs to be some form of segmentation with regard to three different things. I am a former IT manager, so I get this stuff. We are using PPS numbers as a unique identification code. This is the only unique identification code we have for people in this country. I understand the manner in which we use that all the way through. I was in favour of the public services card. I was not one of the people who was opposed to it. We should have done a lot of this a long time ago. I understand the wallet. Age verification of 18 makes sense. The under-16s thing makes no sense. If I pick up my phone, I can go into settings and turn on a VPN. A ban is not practical, and it is also probably socially unfair. I do not think it will work. That is probably the biggest issue if we skip everything else. That is the first thing.
I have a concern about what has been done at EU level versus the direction we have taken. I will be open about this. Is it the case that we are pursuing the wallet versus the EU mini-wallet because, frankly, the likes of this committee or the public accounts committee will absolutely tear the Government apart for wasting money? Is it the case that it is the right thing to do to continue on the path we are on rather than engaging and opting for the European project, for want of a better phrase?
Alan Kelly (Tipperary North, Labour)
Link to this: Individually | In context
Which we should bring in front of this committee, because it seems to have more of the answers. I mean no disrespect.
Mr. Seamus Hempenstall:
That is fine. The OGCIO was of the view that an age verification tool would be better taken forward as part of our own digital wallet and not as part of what the European Commission is doing. As I understand it, there are five or six EU member states working on it with the European Commission at the moment. We are not one of those, but we are pursuing it.
Alan Kelly (Tipperary North, Labour)
Link to this: Individually | In context
There could be good reasons for that. It is just something we need to flesh out. I have a couple of points. The process relating to Coimisiún na Meán investigations is too cumbersome. I am not sure how many times I referred stuff that was said about me online, and nothing happened. It is too cumbersome. It does not have an endgame, and people give up. We need to pull that into the discussion here, because people do not pursue matters. Gráinne Seoige's evidence is one example, but there is lots of other evidence. People just give up. It is way too cumbersome and goes on too long.
Is the Department pursuing a plan to make executives of some of what I call e-commerce companies - because that is what they are - criminally liable when there are severe situations and cases where they are not dealing with harmful content? As far as I can see, they build in fines as part of their business costs. Unless people are going to be held criminally liable, I do not think we will be able to get there. Is that being looked at?
Mr. Seamus Hempenstall:
I reiterate that there is already provision in the OSMR Act to do that. When I said it was complex, I meant that there are a number of steps to get from the start of an investigation to the point where you would be able to prosecute someone. You can also see, from the fact that there are 12 judicial reviews being taken by platforms against Coimisiún na Meán, how litigious the platforms are being. I cannot say that we have a specific plan to strengthen provisions on executive and secondary management liability, but it is not something that is off the table. We are not saying that we would never look at it or that we will not look at it, but I cannot say there is a specific plan.
Alan Kelly (Tipperary North, Labour)
Link to this: Individually | In context
It is something that is under discussion.
Alan Kelly (Tipperary North, Labour)
Link to this: Individually | In context
This is a road we will have to go down. Algorithms come up quite a bit. In an ideal world, I would agree with Ms Knapper. Unfortunately, however, I believe the world has gone so far that we have to go backward in order to move forward. These companies are ruthless. They have no morals. When it comes to algorithms, the idea that we would condition them and that, over time, they will learn or change is not going to work. We have to give people the power under legislation - and I will bring in this legislation if I have to - in order to ensure that when it comes to these e-commerce companies, you have a choice. We can do all the work Ms Knapper is talking about as well, but we have a choice. The choice is that I, as an individual, can turn off recommender systems or turn them on if I want to do so. Underneath that, we can have all the work Ms Knapper is doing. Unless we get to a point where people can take control, I do not believe we will ever change these companies. Is that something Ms Knapper disagrees with?
Ms Niamh Hodnett:
I will start on that and then pass to Ms Knapper. We considered this as part of the consultation on the online safety code, in the context of whether the recommender systems should be switched off. The main thing is that we wanted them to be made safe. We engaged with the European Commission on that through the TRIS process we had to follow at the time the Digital Services Act came in. That Act has obligations expressly called out in relation to algorithms, including the obligation under Article 38 that the large platforms have to offer a recommender system option not based on profiling, such as a chronological feed, which may be akin to what the Chair is suggesting. We have been supporting the European Commission regarding a number of investigations into Meta, TikTok and others relating to algorithmic concerns because we share the Chair's concerns on that.
Alan Kelly (Tipperary North, Labour)
Link to this: Individually | In context
I appreciate that, but it does not fully answer my question. Ultimately, I have been briefed to legislate for this and for the platforms. You are never going to be able to deal with this unless you reverse the power relationship. The relationship has to be that all e-commerce companies have to work within the realm of legislation in this country. Otherwise, it is not going to happen. We should legislate for this. That is something I will suggest as part of this committee's recommendations, which is that people have the option to turn these off. Underneath all that, I respect 100% the work to make the algorithms and recommender systems safe. Unless we reverse the balance, however, these companies are not going to listen. We have enough evidence. We do not need any more. Tomorrow morning, if I were a Minister, I would be legislating for this. You take back control, and that is where we have to go. For me, that is far more important than trying to impose a ban on under-16s' social media, which is, frankly, impossible. There are also probably moral reasons against doing it.
Malcolm Byrne (Wicklow-Wexford, Fianna Fail)
Link to this: Individually | In context
While I agree with the Cathaoirleach, I am not sure about using the slogan "take back control". That did not work out so well with the last group that ran with it.
Malcolm Byrne (Wicklow-Wexford, Fianna Fail)
Link to this: Individually | In context
I thank the witnesses for their work. What we are trying to do is look at mechanisms to ensure greater compliance with online safety. Let me look at this another way. There are general product safety regulations that apply throughout the EU to most products, whether they are sold online or offline. If I manufacture children's toys, I have very clear safety requirements, including reporting requirements. There are data protection assessments that companies and so on have to do. If I manufacture a financial product, I have to go to the Central Bank to report on it. I am becoming more convinced that, as in other sectors, if companies develop products, particularly AI, we should look at reporting requirements about these products before they are released on the market, for example, that the necessary safety impact assessments have been done and the products are in line with the online safety code. It would not be a guarantee in every case but the mechanisms would be in place. Is this an approach that has been or would be considered? If we were to look at implementing it, what resourcing would be required to follow through on it? Our regulators regulate in so many areas before products and services come to consumers. The problem I see in much of this space is that a product is released onto the market and it is only afterwards that, as regulators, the commission and others have a responsibility to step in. Has any thought been given to it?
Ms Niamh Hodnett:
Under the Digital Services Act, the large platforms are supposed to risk assess regarding changes to the design of products or services, risk assess with regard to harmful or illegal content and put mitigations in place with regard to that. That was part of the Grok issue. Under the AI Act, there are two different provisions I can think of. Article 5 is on prohibited practices, which means that they should not be there at all. It is a prohibited practice and that is one where we expect to get enforcement powers from August 2026. That is where we will be looking for sanction for more staff with regard to the AI Act. There is also high-risk AI-----
Malcolm Byrne (Wicklow-Wexford, Fianna Fail)
Link to this: Individually | In context
It is the safety side that is the real concern.
Ms Niamh Hodnett:
There is also high-risk AI and the need to do risk assessments there. The AI Act mirrors the Digital Services Act in terms of carrying out risk assessments, which differs from looking for prior authorisation to launch a product or differs from many data impact assessments that are also carried out under the GDPR.
Mr. Seamus Hempenstall:
I am not a subject matter expert on the AI Act but I have to echo what Ms Hodnett said about the framework within it being analogous to the framework in the Digital Services Act around the risk assessments a company is supposed to do on the products and services it has so that they are safe for the people for whom they are intended. Safety by design is another way of putting it. Perhaps the Deputy is talking about something stronger but the current framework is set up like that.
Malcolm Byrne (Wicklow-Wexford, Fianna Fail)
Link to this: Individually | In context
It is and I am aware of the obligation that is there but I am concerned. This is about building public trust. We are seeing how, particularly with regard to some of the AI-enabled programmes, they are released, it is a case of companies saying they did the assessments and it is only afterwards that the regulator catches up. Should we put reporting requirements in much earlier? In much of the other legislation, such as the European general product safety requirements, somebody's name has to go in around that product so that that person is responsible for signing off and the regulator can hold that individual responsible.
Ms Niamh Hodnett:
It is something to consider. I have been following the Oireachtas Committee on Artificial Intelligence, which we appeared before yesterday. Our chairperson Mr. Jeremy Godfrey was there, together with our colleague Mr. Tiernan Kenny. Amendments to the AI Act are being considered as part of the digital simplification agenda. The Government has published material on this in the national digital and AI strategy, with particular consideration given to prohibited practices in light of AI-generated CSAM and AI-generated non-consensual intimate image sharing. Given that the AI Act is under consideration, the time may be right to consider this matter.
Rónán Mullen (Independent)
Link to this: Individually | In context
I agree with the Cathaoirleach about the need to legislate for the option to turn off recommender systems. I am in strong friendly disagreement with him about the non-advisability of a ban on social media access for those under 16. I have a letter on my desk to go to him asking that we would bring in somebody like Professor Jonathan Haidt, who I mentioned here earlier, because he and experts like him are very clear about the damage that social media is doing to young people. The evidence of that is in, and in a considerable way. He would probably say that it is a question of the availability of devices as well as the content on those devices, so we have more work to do before we draw conclusions at this committee about that. I will be in touch with the Cathaoirleach.
Regarding the digital wallet, the Data Protection Commission ruled in 2019 that there was insufficient legal basis for the card beyond social welfare functions. Later investigations found aspects of facial imaging and biometric processing unlawful under the GDPR. The Government has challenged some of that. I foresee problems with this. People are speaking about problems relating to privacy. Is it not a better focus for us to put the onus on providers of content to engage in age verification? If a person goes online and looks at a pair of shoes, there is no problem. Nobody needs age verification to look at a pair of shoes. If the person goes to buy that pair of shoes, he or she presumably needs a credit card. Age verification of a third-party kind is necessary before you can even have that credit card and presumably there is some age verification obligation in respect of that transaction. My point all along has been that if a person sees pornographic material online, be it linked to a paid-for service or not, they are already consuming that content due to its nature. Is it not possible to put a legal obligation of age verification on providers of such content such that at the gateway before a person can even see material of a certain kind, they would have to engage in a zero-cent credit card transaction, for example? Is that not a better and potentially less controversial way with regard to people's privacy rights and the concerns people have about a State-controlled digital identity system?
Mr. Seamus Hempenstall:
The idea of the digital wallet is not to be a State-controlled digital identification system. The wallet is designed to enable safer and more efficient access to public services more generally. Under the EU framework legislation, called the electronic identification, authentication and trust services, eIDAS, it will not be mandatory for anyone to have the wallet but it would put obligations on providers of public services to enable people to use the wallet to access them. That is the wallet in general. As the wallet would be based on the public service identity, PSI, dataset and adhere to the required standard of data protection, it would represent a robust way of enabling people to-----
Rónán Mullen (Independent)
Link to this: Individually | In context
I heard Mr. Hempenstall say that earlier, but could he address my proposal involving the credit card system, which involves third-party age verification before someone can have a credit card? Could that not be a more satisfactory and potentially less controversial way of ensuring age verification? Could we not pass a law stipulating that before a site can make any kind of pornographic content available - to take pornography as a particular social concern - it requires people to have some kind of age verification equivalent to credit card usage? Would that not solve the problem if it was legislated for?
Rónán Mullen (Independent)
Link to this: Individually | In context
If there is a very clear way in which someone under 18 - I presume I am correct in this - cannot buy a pair of shoes on the Internet or cannot engage in a gambling transaction because of the need for a financial device such as a credit card, which is time-honoured and seems to be unimpeachable, is that not a clear and practical way forward instead of this cat-and-mouse regulation as to who can and cannot and whether it will be self-regulation or some kind of mixum-gatherum where the commission is working with platforms, waiting for them to do the right thing, they are not doing the right thing and we still have no prosecutions?
Can we not just legislate for something that already works to keep under-18s from accessing certain services?
Rónán Mullen (Independent)
Link to this: Individually | In context
That is not working. Meanwhile, they cannot buy the pair of shoes without a credit card. Is this not a no-brainer?
Mr. Seamus Hempenstall:
If the Senator would let me finish, there are apps available or age verification techniques that might involve, for example, someone scanning their driving licence or showing a photo of their passport page. That would verify their age as well. My point is there are a range of age assurance measures and I am not sure that it would be the most efficient way to ensure-----
Rónán Mullen (Independent)
Link to this: Individually | In context
Am I correct in saying that Mr. Hempenstall cannot see any problem with what I am proposing?
Alison Comyn (Fianna Fail)
Link to this: Individually | In context
Until all the harmful content that I would not want an eight- or ten-year-old to see is removed from social media platforms, I still think we need to talk about restricting access to social media accounts. I do not think it should be removed completely from our discussions. The last time we spoke about this, on 4 February, I suggested mandatory ads being carried on social media platforms outlining the potential harm of the use of social media for anybody of any age. Has that been discussed or would it be considered?
Ms Niamh Hodnett:
That was raised in the US by the leading medical adviser under the previous administration, namely whether there should be warnings in relation to social media in particular. I will hand over to Ms McAuley, and it might also address some of the issues Senator Mullen raised. She led our work on the Article 28 guidelines at the European Commission, which are about how to protect children and which touch on some of the issues Senator Comyn raised.
Ms Karen McAuley:
I will try to be brief. I hope it will be helpful in both cases. As I am sure the Senator is aware, Article 28 of the DSA places a positive obligation on online platforms that are accessible to children - not for children or intended for them, but accessible to children - to put in place measures to ensure a high level of privacy, safety and security for children. The European Commission published guidelines last year. We were heavily involved in that area of work, together with the other digital services co-ordinators, DSCs, across Europe. Those guidelines outline a range of measures designed to ensure, if implemented, that children enjoy age-appropriate experiences online while being safer from harm. The guidelines go into quite a lot of detail in relation to age verification. They speak about a number of use cases for it. One use case is to mitigate the risk of children accessing pornography. Alongside that, as was spoken about earlier, the European Commission has been developing a white label app. As Mr. Hempenstall said, a number of countries are piloting that app, including France, Italy and Greece.
Alison Comyn (Fianna Fail)
With all due respect, Ms McAuley is addressing Senator Mullen's question, but I asked about these particular ads. We are bombarded by ads for everything. There should be a particular ad that outlines the potential harm of the length of time you spend on social media, or of using social media at all.
Ms Karen McAuley:
Apologies. The guidelines also include recommendations in relation to information being placed by online platforms on their platforms about advice and supports for children. It is not the same as what the Senator is talking about but there is an onus on platforms to provide guidance to children via their platforms in terms of helpful information, access to supports and so on that may benefit them.
Alison Comyn (Fianna Fail)
It is not specifically an ad that says the use of social media at a particular age or stage of life, or for whatever length of time, may cause harm.
Alison Comyn (Fianna Fail)
The likes of the nudification app on Grok dropped without much fanfare. We had to scurry and react to it. It took some time and people were harmed; there is no question. What is to say they cannot just do that again? How prepared are we for somebody, even another tech company, doing exactly the same with a brand new type of application? How ready are we to react next time?
Ms Rafaela Knapper:
The obligations on platforms to risk assess and mitigate prior to any launch already exist. If platforms fail to comply, there may be consequences, including investigations and potentially fines at the end of the road. As Ms Hodnett alluded to, we have our own internal response mechanism in place. This is a framework that allows us to deal with such situations in an effective and rapid way. It brings together people from across the organisation to make sure we are looking at an issue from all possible angles. That might mean outreach to some of our fellow regulators or stakeholders. For example, Ms Hodnett mentioned that in the particular case of the Grok issue earlier this year or late last year, we reached out to An Garda Síochána, the Data Protection Commission and hotline.ie. On the other hand, it might involve direct engagement with the provider where an issue occurs. Last but certainly not least, if there is a pan-European element, it involves working with the digital services co-ordinators where similar things might be happening, as well as with the European Commission and our counterparts there.
That is our internal response. We are quite satisfied that it sets us up to deal with such a situation as quickly as possible.
Alan Kelly (Tipperary North, Labour)
As feedback for Coimisiún na Meán, a lot of people do not know what it does or what it should be doing. An awareness campaign would be helpful to show what it can and cannot do - and may I emphasise what it cannot do. It would probably help legislators to focus on what legislation it needs to help it. On this topic especially, having dealt with it for so long, I often hear people asking what Coimisiún na Meán is doing about this. Then I have to go through the specific issue to show where it can and cannot do things, how long it takes and what its capacity is.
To the Department, I will revert to the wallet and age verification. It is probably drifting into the OGCIO's territory, but that office is not in front of us. I would probably end up in a techy conversation if it was. In 2024, as I understand it, the European Parliament rubber-stamped what the Council had said in relation to the creation of a digital wallet to be in place by 2026. It has to be in place this year, which it will be, judging by the evidence the Department has given. An age verification app was also developed in tandem with that, and six countries are piloting it. The wallet will be broader in years to come, including your driver's licence and all of that sort of stuff. Is the age verification - over or under 18 - linked to it? We are doing our own thing with the wallet. We are not doing the mini version. On age verification, are we separate from Europe as well, or are we taking a lot of it and using it? I accept I am asking the Department because we do not have the other crowd here.
Mr. Seamus Hempenstall:
There is the European digital wallet and there is framework legislation called the eIDAS regulation. That is what has to be in place by the end of the year. It provides the framework for every European member state to develop its own digital wallet. The idea is that each digital wallet will interoperate with the others.
Alan Kelly (Tipperary North, Labour)
We are not part of the mini wallet across Europe. We have been advancing something better, I hope.
Mr. Seamus Hempenstall:
We are developing our own digital wallet. The idea is that it will meet the standards and requirements and be nominated as our digital wallet under eIDAS. As part of that, we are doing the age verification tool. The mini wallet was started by the European Commission last year, not in 2024. That was because the Commission had released the guidance under Article 28 of the DSA, which said that the European digital wallet, with an age verification tool in it, could be used to verify age and that this would be a good way to protect children from harmful content.
Then the Commission said, let us develop a mini wallet as a kind of interim solution. The decision that Ireland made was through the OGCIO as well. We have pursued our own digital wallet to such an extent that it makes more sense-----
Alan Kelly (Tipperary North, Labour)
That is fine. I get that. I am really trying to find out about this massive interoperability. I was down this road before, when I was an e-commerce manager for Bord Fáilte and Fáilte Ireland a long time ago. What I am trying to get at is information about the age verification process specifically, not the mini wallet. Surely there would be a lot of commonality there. With regard to age verification, not the wallet, are we taking it from what is being done in Europe or are we taking it in a partial way? Is it like the Kilmeaden cheese: take the best and get rid of the rest? Are we doing that or are we replicating? I actually do not know. That is why I am asking.
Mr. Seamus Hempenstall:
I am not going to sit here and tell the Cathaoirleach that I know precisely the answer. We are piloting our own age verification. It is a credential. The way it would work is that a person would be able to send a platform a simple "Yes" or "No" signal to say that he or she is either over or under 18 years of age. The platform would have a mirroring kind of application or facility on its website. I know for sure that the digital wallet is designed to be interoperable with every other digital wallet in the EU.
Alan Kelly (Tipperary North, Labour)
Will the age verification mechanism in the EU versus that which will apply here be interoperable?
Alan Kelly (Tipperary North, Labour)
In the modern world, it has to be.
Mr. Seamus Hempenstall:
I do not know what each EU member state is doing in respect of age verification. As I understand it, Denmark has its own state ID and has an e-version of that. A person can scan that in order to get into a nightclub. That is the form of age verification operating there. It is a simple "Yes" or "No". Another example in Ireland would have been the Covid cert. People could have that scanned for various purposes. I suspect the idea would be that once we have tested it here and put in place the appropriate legislation to allow people to use it in the wild, so to speak, we would be able to use it as an over-18 or under-18 credential in the rest of the EU. I cannot confirm that, however.
Alan Kelly (Tipperary North, Labour)
I will have to dig a layer underneath, I am sorry. We will probably have to talk to the OGCIO.
Alan Kelly (Tipperary North, Labour)
You did well, in fairness.
Alan Kelly (Tipperary North, Labour)
That concludes our public engagements for today. I propose that we go into private session to deal with some housekeeping and correspondence. I thank all the witnesses for coming in. I look forward to writing to the Ministers with our report in the coming weeks.