Oireachtas Joint and Select Committees

Wednesday, 17 September 2025

Committee on European Union Affairs

Engagement with Coimisiún na Meán

2:00 am

Barry Ward (Dún Laoghaire, Fine Gael):

We have received apologies from Deputy Eamon Scanlon.

I am delighted that we are going to engage with representatives from Coimisiún na Meán, which is designated as a digital services co-ordinator under the Digital Services Act. This is one of the most significant EU legislative measures in recent years. Under our work programme, this committee is examining the Act's impacts, with a particular focus on online safety, digital regulation and governance, and increasingly advanced AI-generated content.

We are joined by Dr. John Evans, digital services commissioner; Mr. Wayne Moroney, director of platform supervision; and Mr. Tiernan Kenny, director of communications and public affairs at Coimisiún na Meán. They are very welcome. I thank them for giving us their time.

Before we begin, there is a note on privilege - I am sure they are already aware of this, but we will go through it anyway - and housekeeping matters. Witnesses are reminded of the long-standing parliamentary practice that they should not criticise or make charges against any person or entity by name or in such a way as to make him, her or it identifiable, or otherwise engage in speech that might be regarded as damaging to the good name of the person or entity. Therefore, if their statements are potentially defamatory in relation to an identifiable person or entity, they will be directed to discontinue their remarks. It is imperative that they comply with any such direction.

Members are reminded of the long-standing parliamentary practice to the effect that they should not comment on, criticise or make charges against a person outside the Houses, or an official, either by name or in such a way as to make him or her identifiable. We do not have any members attending online, so we do not need to go through the provisions in relation to joining online.

I invite Dr. Evans to deliver his opening statement. He has five minutes. I ask him to keep to the time if possible. I will then open the floor to members.

Dr. John Evans:

I thank the committee for the invitation to meet it. Since the establishment of Coimisiún na Meán in March 2023, we have been focused on the development and implementation of a robust online safety regime for Ireland. The result of these efforts - our online safety framework - is now fully in place and there are now binding rules for platforms to keep people safe online.

The framework is composed of three pieces of legislation, namely, the Digital Services Act, which I am mainly here to talk about today, the Online Safety and Media Regulation Act and the Terrorist Content Online Regulation.

The Digital Services Act is a crucial element of the framework. It has rebalanced the relationship between users of online services and the online platforms themselves. Under the DSA, users have new rights, including a right to accessible and user-friendly ways of reporting illegal content, and platforms have new responsibilities to make it easier for users to report illegal content, to respond to these reports and to have terms and conditions that they enforce.

As Ireland’s digital services co-ordinator, an coimisiún has a central role to play in the enforcement of the DSA for Ireland and Europe, as many of the largest online platforms are established here. Coimisiún na Meán was one of only a small number of digital services co-ordinators established on day 1 of the DSA in February 2024 and we have been working tirelessly on the supervision of online platforms since then. Our approach is supervision led and risk based, and among our strategic priorities are the protection of children, democracy, fraud and scams, hate and public safety. We take complaints from members of the public via our contact centre, which has processed 2,800 contacts since February 2024, with 30% of those relating to digital services. We also handle complaints from other EU member states and work closely with the European Commission and other digital services co-ordinators across the EU.

As an organisation, an coimisiún currently employs almost 250 staff members, having grown from just 40 only two and a half years ago. Approximately 100 staff work on online safety in key areas, such as platform supervision, investigations, regulatory operations, user complaints and user experience, policy and research. We have rapidly scaled up our platform supervision teams, which have identified and engaged with over 150 services in 2025, including 15 very large online platforms and over 130 intermediary service providers of other kinds.

Through our regulatory collaboration with the European Commission and the other digital service co-ordinators across the EU, we have informed the development of the DSA Article 28 guidelines for the protection of children, the code of conduct on countering illegal hate speech and a co-ordinated regulatory action on below-threshold pornography platforms to ensure that children here and across Europe cannot access pornography. We are also assisting the European Commission with two investigations under the Digital Services Act. Through this work, we have been able to strengthen our relationships with the Commission, deepen our understanding of platform behaviours and use the outputs of these investigations to inform future enforcement action.

We also want to empower young people, as well as their parents and teachers, with knowledge on how they can protect themselves online. We have developed educational resources, issued to every post-primary school in Ireland, on the rights of users online. These resources include lesson plans for junior and senior cycle students, and videos, infographics and posters. Later this year, we will run a further campaign on online safety in conjunction with the Department of Health.

For the rest of 2025 and into 2026, we will continue to prioritise three key areas, namely, democracy, the protection of children, and fraud and scams. Our experience of the local, European and general elections last year means we are well prepared in advance of the upcoming presidential election. In the coming weeks, we will be bringing online platforms in to meet with us, the European Commission and other stakeholders to assess their readiness for dealing with election-related issues, such as the protection of civic discourse and electoral integrity. Those are our priorities.

We will also continue to prioritise the protection of children online. Under our online safety code, platforms that allow pornography or gratuitous violence must use effective age assurance to ensure that children are not normally able to access this kind of content. Platforms are also required to have parental controls in place. The code has been fully in force since July and we know that platforms have begun to make changes to meet these new requirements. We are currently reviewing all of the designated video-sharing platforms to assess their compliance with the code and will take further supervisory, investigative or enforcement action as required.

We also recognise that financial scams and fraud are a concern to the Irish public and this year, we granted trusted flagger status to the Central Bank of Ireland, which means that online platforms that receive reports about financial scams and frauds from the Central Bank must treat those reports with priority. The work of the Central Bank of Ireland in this area will also feed into Coimisiún na Meán’s identification of trends in financial scams online and will be instrumental in establishing an informed, evidence-based approach to our ongoing supervision activities.

The past few years have been busy for Coimisiún na Meán. We have established ourselves and grown to meet our expanded regulatory remit, implementing the online safety framework for the benefit of people in Ireland and across the EU.

We have begun to see some behavioural changes from the platforms that we regulate, but we are not complacent about the risks that people still face online.

As we continue to supervise the online safety framework, I reassure all users of online services, whether parents, children or indeed elected representatives, of three important points. The days of platform self-regulation are over. Coimisiún na Meán is here to ensure that platforms meet their regulatory obligations, and to ensure that everyone, especially children, can have a safer online experience. If the platforms do not comply, we will have no hesitation in launching enforcement action and investigations, which may result in fines that can run to hundreds of millions of euro for the largest platforms.

We also want people to know that they have new rights online, which we would encourage them to use. Where they see illegal content online, or content they think might break the terms and conditions of an online platform, then they should report it to the platform where they saw it. If they cannot find the report function, or if the platform does not respond to their report, they should contact Coimisiún na Meán. These reports are crucial for us in building a case for regulatory intervention and in holding the platforms to account. We are currently finalising a review on Article 16 of the DSA - a priority area concerning the ability of users to report illegal content. We expect to make decisions on next steps in the coming weeks.

I again thank the members for their invitation to meet today and for the support that elected representatives have provided to Coimisiún na Meán in recent years. I know that they share our goal of keeping everyone, especially children, safe online. My colleagues and I are happy to answer any questions they may have.

Barry Ward (Dún Laoghaire, Fine Gael):

I thank Dr. Evans. I call Deputy Ó Murchú, who has five minutes for his questions and the answers to those. I ask everyone to keep it brief.

Ruairí Ó Murchú (Louth, Sinn Féin):

Tá fáilte roimh na finnéithe. Dr. Evans has said that the entire framework for the online space is now in place. I ask about Coimisiún na Meán's interaction with the online platforms. We are talking about the likes of TikTok, Meta incorporating Facebook, Instagram and WhatsApp, and YouTube. There are probably a million and one streaming services, including those operated by young people.

Anybody who has been online recently will have seen a number of brutal murders, including the murder of Charlie Kirk. There have been attacks in this State and across the world on politicians and others. There have been nefarious attacks even on politicians' family members. Nobody is against robust debate and everyone is for freedom of speech. However, while we can have all the rules and regulations we like, the people who have the power are the online platforms. With the resources Coimisiún na Meán has, including European and domestic legislation, I still see this as a five-a-side team going up against Bayern Munich or Barcelona in their prime, simply on the basis of the resources the platforms have and, at some level, the international backing they have from the likes of the Trump Administration at this point in time.

We are all very worried. The fact is that advertising works. Many people are not in a very good space. They end up in what we would have called groupthink years ago; now their own thoughts are focused back on themselves and reinforced. Some of those thoughts are not necessarily very positive. Beyond that, I would say that it is not a very safe mental health space for many people and the outcomes can be catastrophic.

I really appreciate that Coimisiún na Meán is attempting to do battle. Could anything else be provided to it to make the battle a bit more equal? If the witnesses can answer all that in two and a half minutes, I will be impressed.

Dr. John Evans:

I thank the Deputy for his support, which is appreciated. I do not think of Coimisiún na Meán as being alone in this regulatory space. There is an equivalent to me in every EU country and there is also the European Commission, which has significant responsibility in this area. We often think of it not even as a network but rather as a system, like the system of financial regulation, which is much more mature than the online safety regime but similar in characteristics.

All the time we are looking at systemic risks on the platforms.

The Deputy called out political violence in particular. If content depicts gratuitous violence-----

Ruairí Ó Murchú (Louth, Sinn Féin):

I refer also to misinformation and disinformation.

Dr. John Evans:

I appreciate that. There are many different kinds of content in respect of which different kinds of issues can be raised, from content breaching intellectual property, IP, rights all the way through to content that is designed to undermine the political process or content that is harmful to children. If a platform has as part of its terms and conditions that it will not permit certain content to be shown to minors, it needs to put in place an age assurance function. Over the past few days, I have noticed that some of the Charlie Kirk material was placed behind what we call an interstitial on some platforms. This means individuals will be prompted with a warning that what is behind the prompt may be upsetting. Sometimes, users will be prompted for age verification.

Ruairí Ó Murchú (Louth, Sinn Féin):

However, they can remove it.

Dr. John Evans:

Yes, they can remove it. This is the line we walk all the time. Some platforms do not permit certain kinds of content under their terms and conditions, but all platforms are supposed to take down content that has been flagged as illegal content.

Ruairí Ó Murchú (Louth, Sinn Féin):

Do they take down illegal content?

Dr. John Evans:

Yes, they do.

Ruairí Ó Murchú (Louth, Sinn Féin):

However, they probably do not do so in a very speedy sort of way.

Dr. John Evans:

I mentioned in my introductory remarks that one of the inquiries we have open at the moment relates to Article 16 of the Digital Services Act, which concerns the notice and action mechanisms on platforms. We are looking at ten platforms. As I said earlier, we are about to produce some results relating to that inquiry. We have been looking at how easy it is for people to report content. If people are not satisfied with that process, they can complain to us. As stated, that helps us to build a picture.

Seán Crowe (Dublin South West, Sinn Féin):

The witnesses are very welcome. I am a complete novice when it comes to many of the technical issues. I will try to express some of the concerns ordinary people have. Scams are the big one, with not a day going by on which people do not get a text on their mobile phone or a call on the house phone. I do not know anyone who has not been affected by it at this stage. It seems to happen all the time. What resources does Coimisiún na Meán have to tackle that type of fraud, not just online but through texts or at the front door? The pursuit of ad click revenue is leading to prompts such as "Simon Harris has been arrested" or stating that Micheál Martin has invested in something or other or has a skincare routine and all this type of old nonsense. The prompts seem to attract people. What can the coimisiún do about this?

Dr. John Evans:

Our part of that picture is the online part. We do not have much to do with text scams, say, but we do have a role in the online part. Earlier, I outlined a number of priority areas for us. Two key areas we have been pushing so far are democratic integrity and the protection of minors. However, this area of fraud and scams is one we are developing. My colleague Mr. Moroney is looking after this for me and helping to develop that capability. I ask that he explain a bit about what we will be doing.

Mr. Wayne Moroney:

The area of scams is particularly difficult and incredibly complex, especially when it comes to those reported in the areas of content and advertising. We have done a couple of things so far that have had very useful practical outcomes. First, we joined the anti-fraud forum of Banking and Payments Federation Ireland. This allows us to connect with other stakeholders, other regulators, industry and interested parties. It enables us both to share information on the rights that users and entities have under the Digital Services Act and to learn from them where they are seeing particular issues to which we may wish to give further attention in the future.

Also, in May of this year the Central Bank of Ireland was approved and certified as a trusted flagger under the DSA. As Dr. Evans said, this is a powerful status that allows an entity to bring priority attention to its reports, so that platforms are required to prioritise those in their actions. The Central Bank is obviously the most experienced and professional organisation when it comes to identifying and dealing with fraudulent content in respect of financial services, although this is a wide area. There are issues, much like the Deputy mentioned, that go well beyond financial content. We are beginning to develop a relationship with other bodies, such as the Advertising Standards Authority, and to understand what is in the marketplace. I always remind people that the important powers we have under the DSA come down to the notice and action mechanism, as Dr. Evans said, as well as advertising transparency under Article 26 and the advertisement repositories under Article 39. These obligations provide transparency to users in terms of what they are seeing and why they are seeing it. If users cannot understand something, do not trust it or find that it looks strange, they can absolutely report that content to the platform, and the platform is expected to take the report seriously and act on it.

Seán Crowe (Dublin South West, Sinn Féin):

Some people will say that the difficulty with some platforms is that if they try to pursue the complaints process, there are huge difficulties related to that. Some platforms are worse than others. I will not go into which one, but I think we all know which platform I am talking about.

The difficulty in combating misinformation was mentioned. How can we combat misinformation when the owners of certain social media platforms post incredibly partisan comments calling for changes of government in Ireland, Britain and other parts of the world? Some platform owners even claim that they can get people elected. The owner of one platform has claimed to have gotten the President of the United States elected. We are talking about people who are multibillionaires. There is a challenge in terms of online safety, threats to democracy and hate when the platform owners have enormous reach. That is the big challenge we are actually facing. The imposition of a fine on these particular companies is derisory in the context of the capital they actually have. I am thinking in terms of what can be done and what Coimisiún na Meán can do about this. Up to now, the issue does not seem to be going away; it seems to be getting worse.

Dr. John Evans:

I alluded to this in my response to Deputy Ó Murchú's question. The issue is the distinction between misinformation and disinformation on the one hand, and illegal content on the other. The Digital Services Act targets misinformation and disinformation via Articles 34 and 35, which deal with systemic risks. If misinformation and disinformation on a platform present a risk to electoral integrity, to public health or to a number of other areas - I think the Deputy has these two key areas in mind - that platform can be taken to task by the regulator. The European Commission has responsibility for that part of the Digital Services Act. I can talk a little about some of the cases that the Commission has open. It has a portfolio of approximately ten open cases. There are four cases open into pornographic websites. There were three cases open into TikTok but there are now two. Coimisiún na Meán is involved in those. There is one open case into X which is looking, among other things, at whether the risks presented by misinformation and disinformation are adequately addressed by mitigations on that platform. The precise kinds of issues the Deputy spoke about are being tackled right now. The regime is still young, at under two years old, so it will take a bit of time for the cases to come through and see the light of day. I expect to see results on that quite quickly.

Seán Crowe (Dublin South West, Sinn Féin):

I have gone over my time but maybe at some stage we could talk about the mysterious bots that are based in different countries. Maybe at some stage we will get a chance to go into whether anything can be done about them. Again, we are talking about the influence these mysterious, anonymous computers have on the way people think.

Mr. Wayne Moroney:

I would be happy to speak about that.

Michael Murphy (Tipperary South, Fine Gael):

I thank the commissioner for his opening statement. I would like to explore a number of issues today around election integrity and child protection. However, first I wish to ask about the resources available to the commission. Dr. Evans mentioned it had scaled up to 250 staff. I am conscious of the sheer size of these platforms. Has the commission the appropriate resources to be able to carry out appropriate enforcement?

Dr. John Evans:

The last digital strategy produced by the Government committed to well-resourced regulators in the digital space. That is borne out by our experience. We got up and running remarkably quickly for a State agency, moving from 40 staff to 250. Not all of them are working in online safety. Some are working in more traditional broadcast sectors, media development areas and so on. We have further sanction that will bring us to about 300. As we continue to develop our pipeline of cases, we may need more resources for some of those areas, but at the moment we are getting what we have asked for.

Michael Murphy (Tipperary South, Fine Gael):

As we prepare for the presidential election, how comfortable is Dr. Evans that platforms are ready to deal with misinformation and manipulation? What is the level of co-ordination between the commission and the regulators and electoral authorities? How co-ordinated is that co-operation? Is it robust enough for real-time response?

Dr. John Evans:

I will ask Mr. Moroney to come in on this in a second but, in short, it is very co-ordinated. Last year, because the Digital Services Act was new, the European Commission published guidance to platforms making recommendations about mitigating the risks to electoral integrity that online platforms present. It set out how digital services co-ordinators should interact with the European Commission on one level and with the electoral authorities locally. Part of that involves preparation for each election. We have experience of doing three of those last year and we have one more now with the presidential election coming up. I will let Mr. Moroney explain what we are planning for that.

Mr. Wayne Moroney:

One of the things we have done in co-ordination with other digital services co-ordinators and the European Commission is develop a toolkit for the approach to elections. It provides guidance for digital services co-ordinators on how to approach engagement with platforms and other national authorities. Next week we are holding our second electoral round table with the platforms and authorities here in the country. This is an opportunity to present information and concerns to the platforms and to engage in a question and answer session with them to express our concerns and the concerns of those in attendance. This is where we start to hold the platforms to account for their obligations and commitments under the DSA election guidelines and other features of the DSA. That will be followed by bilateral meetings with the platforms and the European Commission and, ultimately, engagement up to and after the election on the overall process itself. We do this co-ordination not only for ourselves; we also participate in and support round tables with other digital services co-ordinators across Europe. This is a large-scale effort for all of us to learn and share best practice in the approach to safe elections.

Michael Murphy (Tipperary South, Fine Gael):

I have two quick questions, on the new online safety code and the Media Freedom Act. On the online safety code and age assurance, to what extent are platforms genuinely complying? Is the commission seeing resistance, or does it believe further legislation is needed at European or local level? Will the witnesses comment on the Media Freedom Act and getting the balance right between ensuring democratic debate on the one hand and online safety regulation on the other?

Dr. John Evans:

The online safety code and the Digital Services Act work in a complementary way. They come from different legislative measures. The online safety code comes from the audiovisual media services directive. It applies to video-sharing platform services. The Digital Services Act applies to all intermediary service providers. What is different in the online safety code is that we were able to be a bit more prescriptive about what we wanted. For example, if a video-sharing platform service wants to show pornography or gratuitous violence and make it available on its platform, it needs to put in place age assurance measures that ensure that minors do not normally encounter that material.

Part B of the code came into effect in July and we are now looking at all the video sharing platform services that we-----

Michael Murphy (Tipperary South, Fine Gael):

Are the witnesses seeing any resistance to that compliance?

Dr. John Evans:

Are we seeing some resistance? We have already successfully defended three judicial reviews on the online safety code.

Michael Murphy (Tipperary South, Fine Gael):

So, we are seeing resistance.

Dr. John Evans:

Regulated firms do not like being regulated. Putting in place measures is costly. One needs to turn the regulatory levers to make that happen. They are exercising their due rights in defending themselves. It is not an entirely smooth process and neither would we expect it to be.

Michael Murphy (Tipperary South, Fine Gael):

Finally, what about the Media Freedom Act and getting the balance right in terms of democratic debate and enforcement?

Dr. John Evans:

We are taking on some responsibilities under the Media Freedom Act and we are building up the team. We will be working closely to make sure that it works well with our other online safety instruments. Freedom of expression and all of these rights are what we look to balance.

Michael Murphy (Tipperary South, Fine Gael):

I thank the Chair for the leeway on the time.

Barry Ward (Dún Laoghaire, Fine Gael):

I have two other members but they have just stepped out. Unless another member wants to come in, I have some questions. Deputy Gogarty is indicating.

Paul Gogarty (Dublin Mid West, Independent):

I thank Dr. Evans for his presentation. I have a couple of issues and queries relating to Coimisiún na Meán's three key areas for the next year: democracy, the protection of children, and fraud and scams.

On the democracy side, as we know there are a lot of so-called "troll farms" or "troll factories", some of them based in St. Petersburg in Russia, China and various Asian countries. In many cases, these actors are put in place to try to disseminate misinformation in relation to elections or just distrust in public authorities, whether it is related to climate change, free speech, etc. With the advent of AI, how close are we to tools where the IP address of someone making a post can be flagged with an alert to say that the IP address is likely from a troll farm or another suspicious location? A lot of the time, it looks like someone from the USA who is just throwing in his or her tuppence worth of MAGA blah-blah-blah, but that person could be in Russia or somewhere else.

I also have a follow-on query relating to something that was raised in 2022. I do not know whether the directive has been changed in this regard. Some political parties that were engaging troll farms, or semi-legitimate troll farms, to do a bit of spin online were stopped, but it was suggested that PR firms did not face the same level of scrutiny. What is there to stop PR companies employed by political parties from using the dark arts to foment misinformation towards other political parties in an election campaign?

I have a specific query on child safety. I previously asked the Tánaiste to consider bringing in a ban for under-16s similar to what the Australians are working on at the moment. In anticipation of that, has Coimisiún na Meán done any work in terms of looking at what is happening in Australia and looking at the pluses and minuses, including the pitfalls? It will be hard to enforce.

What Australia is trying to do is get to a level where, while it might not keep everyone under 16 years of age out of it, it will keep a lot more under-16s out of it.

I have a specific question about Roblox because it allows communication. A case has been brought to my attention where one child was contacted by, and made friends with, someone outside the European Union. They then created a Discord account and started having conversations. It is hard to know if the other person was a predator, but there was some sexually inappropriate commentary from someone older. It is very easy for someone on Roblox to make this sort of contact with 12- or 13-year-olds, so I wonder how tight Coimisiún na Meán will get in this regard.

As regards spam, trolling and stuff like that, I have reported about 50 instances on Facebook where someone is talking about a free meet-up, a missing dog or some form of information. When you go into these profiles, you realise the same message has been sent throughout the USA - all around the place - and it is shared on community Facebook pages. Every time you report this as a fake profile, Facebook comes back and states that it does not violate its community standards. Basically, Facebook is right now, every day, facilitating tens of thousands of spam phishing accounts where the article might turn into clickbait or is trying to engage someone in scamming, and Facebook does absolutely nothing about it. What is Coimisiún na Meán's role in enforcing tighter scrutiny by Facebook in relation to these types of scams?

Dr. John Evans:

I thank Deputy Gogarty. There are a few things in there and I will bring in my two colleagues to help me answer that. On the first question, there is a role for regulation but there is also a role for media literacy. In Ireland, we rank reasonably high on European scales for how media literate our population is. I think we are sixth in Europe. We would not be perceived as a high-risk country but often we are standing in the shoes of a European regulator when we are doing our work under the DSA and that is not the case across Europe. There are some democracies which are more susceptible. I ask Mr. Moroney to explain a bit about foreign information manipulation and interference, FIMI. That is part of the election guidelines and part of the mitigation measures that the platforms are supposed to put in place so he will say a bit about that. He can also say a bit about the fraud and scams piece which the Deputy asked about. Then I ask Mr. Kenny to say a little about the Australian experience and how that is going so far because we are watching that closely and we work with our Australian colleagues quite a bit.

Mr. Wayne Moroney:

I will mention FIMI first, along with co-ordinated inauthentic behaviour, CIB, as it is known to platforms. Under the DSA election guidelines and the DSA in general, both are behaviours that are either prohibited by terms of service, in which case they can be reported and should be stopped, or prevented because they are considered to be a systemic risk, particularly when it comes to elections. As for our options, there will be engagement with platforms in the coming weeks, both at a round table and bilaterally. This will be one of the topics we bring up as part of the election guidelines and we will continue to engage on these matters.

In response to the Deputy's specific question about identifying locations by IP address and perhaps tracking people, I am not entirely clear on the technology available to identify and track. I am aware that there are also masking opportunities when it comes to IP addresses. That would not be exactly under our remit, so it might be quite difficult to identify.

Moving on to fraud and scams in particular and the Deputy's question about our role, in his case, when he reports things, it is obviously very disappointing to receive a response he believes is incorrect. There are options to appeal that decision. Also, if he is unhappy with the service and the answers he has received, it is our role to take a complaint from him and potentially turn it into action with the platforms and, beyond supervision, potentially move towards enforcement. As always, when it comes to material like that, I would strongly encourage any user, including the Deputy himself, to report it to Coimisiún na Meán as a complaint. We may be able to take advantage of that in our work too and seek a resolution.

Mr. Tiernan Kenny:

What I would say regarding teens and social media is that we think children should be able to have safe and age-appropriate experiences online. As a reflection of how we live in the offline world, there are some things we do not permit children to do until they reach a certain age.

As Dr. Evans said, we have put in place a host of regulations over the past year or two. A lot of that is aimed at keeping children safe online. We are starting to see some of the platforms make changes in response. It is very important to hear the voices of young people in this debate because they will be directly affected. At Coimisiún na Meán, we have established our youth advisory committee and we regularly engage with it. We consulted the committee on the development of the online safety code and discussed the European Commission's Article 28 guidelines with it. The Deputy might have seen that European Commission President Ursula von der Leyen announced last week, as part of her state of the Union address, the establishment of a European task force to look at the possibility of a social media ban. That is certainly an area of active debate.

When you speak to young people, you find that while they can have challenging, difficult or dangerous experiences online, they also recognise that there are benefits to being online. People who are growing up discover their identity there and find like-minded tribes. They are exposed to different forms of creativity and there are educational advantages as well. Obviously, the risk with a ban is that all of this goes away.

As Dr. Evans mentioned, we very closely co-operate with Australia's eSafety Commissioner. We are both members of the Global Online Safety Regulators Network. We will stay in close contact with our Australian counterparts and will learn from their experience as the ban is implemented.

Photo of Rónán MullenRónán Mullen (Independent)
Link to this: Individually | In context

I might need more than five minutes. I may come back in after the Chair with a supplementary question, if that is okay. I thank our guests for being with us.

I am very concerned about the term "age assurance". As I have said previously, I worry that this is something that is less strict than age verification. Will the witnesses give me an idea of what exactly is meant by age assurance and how it will work in practice?

One of my concerns is that pornography platforms that are not based in the EU cannot be reached by Coimisiún na Meán. Am I correct in thinking that there are platforms that can make stuff available in Ireland that could be seen by children? Parents are increasingly worried about the impact of pornography, in particular on underage persons. In the United States, certain pornography providers have literally been forced offline in certain states because of laws that make them criminally and-or civilly liable if young people access pornography. Does what Coimisiún na Meán is doing have any kind of teeth like that? In my view, that is what people want. Is age assurance enough? Does it fall short of age verification in terms of strictness? Is it the reality that platforms outside of the EU can put stuff online and that Coimisiún na Meán has no way of reaching it? I understand that platforms which do not have their EU headquarters in Ireland cannot be reached by Coimisiún na Meán, but I presume that equivalent entities in other countries can deal with them. Does Coimisiún na Meán have the power to take down or neutralise platforms where it is or can be established that young persons can access pornography online? I would like some explanation and reassurance on the whole business of what is meant by age assurance, because I am concerned that it does not go far enough.

In the context of recent events, does Coimisiún na Meán think that social media platform posts which glorified the assassination of a well-known public figure, for example, should be removed? Is Coimisiún na Meán capable of taking action if such material turns up on digital or social media platforms? Does Coimisiún na Meán regard that as incitement to violence? Is there anything it can do about such things?

Dr. John Evans:

I might answer the last question first and then ask Mr. Moroney to talk a bit about a co-ordinated action that is happening in the EU around pornographic sites. I will address one of the aspects of that first before I hand over to him. Then Mr. Kenny will talk about some of the technology and practices relating to age assurance.

Photo of Rónán MullenRónán Mullen (Independent)
Link to this: Individually | In context

Thank you.

Dr. John Evans:

I understand a bit more about what happens there.

On the question of glorifying violence, it again comes back to whether the content is illegal. It could be illegal for different reasons. Incitement to violence is a possibility. Some platforms go further than that. They do not allow content that would not be illegal but that would be against their terms and conditions. Different platforms police those in different ways.

Photo of Rónán MullenRónán Mullen (Independent)
Link to this: Individually | In context

Does Coimisiún na Meán have the power to influence how platforms police it?

If they take a laissez-faire approach, is it all the same to Coimisiún na Meán? Is it left up to them to decide what their standards are?

Dr. John Evans:

I have two responses to that. First, if a platform is not enforcing its terms and conditions - for example, if it is not taking a laissez-faire approach and has terms and conditions that limit certain content - and if somebody flags content and the platform is just letting it through, then the person involved can appeal. There is an out-of-court dispute settlement process that can help with that.

Photo of Rónán MullenRónán Mullen (Independent)
Link to this: Individually | In context

Platforms are held to their own standards, not to those of Coimisiún na Meán.

Dr. John Evans:

They are held to their own standards, namely those that appear in their terms and conditions.

Photo of Rónán MullenRónán Mullen (Independent)
Link to this: Individually | In context

Okay.

Dr. John Evans:

If a platform is taking a more laissez-faire approach, to use that term again, then we work back to the situation of determining whether the content is illegal. If illegal content is flagged, it must come down. A platform has responsibility for that content when it becomes aware of it.

On pornography and the question of EU extraterritoriality, the way the Digital Services Act is set up is that a company can choose where it is headquartered and established in the EU. For example, many of them are here, some are based in the UK and some in the US. Snapchat is based mainly in the UK but has nominated the Dutch authority as its regulator. If a platform has neither nominated a regulator nor established itself here, any one of the DSCs in Europe can pursue it if it wants. Where a platform has nominated a regulator or is established here, the European Commission or the digital services co-ordinators in the different member states can go after it. I will hand over to Mr. Moroney to talk a bit about what we are doing on the pornography side.

Mr. Wayne Moroney:

What we are describing is the co-ordinated action on pornography that is taking place. This is ongoing work with other digital services co-ordinators and the European Commission. The goal is obviously to intervene with these platforms and take action against them. The first step is a categorisation and mapping piece whereby each jurisdiction is responsible for identifying and understanding the services under its remit and then, possibly, doing a bit of local prioritisation as well. The next step is to engage these platforms and ensure they are complying with the DSA to protect children and are engaging with their obligations. Subsequent to that, we will be looking at potential enforcement action. Every step will be done in co-ordination with other DSCs so that we can learn from them and they from us, in the understanding that we are all moving together as one and largely taking similar actions at the same time. I suppose it is bringing a fairly large boot to the table, the goal being that we identify and tackle all platforms and work as best we can to close the gaps the Senator is speaking about, namely platforms that are maybe not identified or do not appear to be under our remit straight away.

Photo of Rónán MullenRónán Mullen (Independent)
Link to this: Individually | In context

I am still not clear what doing their best means or why it is left to them to do their best. Where is the age assurance? What does that mean and how far do they have to go? What are they required to do to stop children from being able to access their stuff?

Mr. Tiernan Kenny:

I might come in on age assurance and age verification. With regard to age assurance, the most important part is the effectiveness of the measures a platform puts in place. You could look at age assurance as an umbrella term for a range of techniques that are available either to estimate or to verify somebody's age. There are varying levels of friction involved with different approaches and there are obviously privacy considerations around some of them as well. Age verification is at the firmer end, where you are able to positively identify who a person is and what his or her exact date of birth is. There are other techniques, such as age estimation. If you have ever used online banking, you might find it tries to get you to take a picture of yourself, or something where you move your head. Some services can use that to estimate your age to a reasonably high degree of accuracy. That does not involve them having to know who you are.

There is work ongoing at an Irish and at a European level. The Office of the Government Chief Information Officer is developing a digital wallet based on MyGovID.

It is possible that this could provide a service whereby it would produce a token that you would be able to show to a website to tell it you are over a certain age. It would have to start at 16 because that is when you get the code. In theory, however, it could be any age over 16. That is privacy-preserving because the website or platform does not have to know who you are; it just has to know you have verified your identity with a trusted entity. The website can effectively take that token as a guarantee that you are the right age to access the content on it.
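The token flow Mr. Kenny describes can be illustrated in code. The following is purely an illustrative sketch, not the actual MyGovID or EU digital wallet protocol: all function names and token formats here are invented for the example, and a real scheme would use public-key signatures so that websites never hold the issuer's secret.

```python
import base64
import hashlib
import hmac
import json
import time

# Held only by the trusted issuer (hypothetical demo value, not a real key).
ISSUER_SECRET = b"demo-secret"


def issue_age_token(over_age: int) -> str:
    """The issuer checks the user's real identity out of band, then signs a
    claim containing only the age assertion -- no name, no date of birth."""
    claim = {"over_age": over_age, "iat": int(time.time())}
    payload = base64.urlsafe_b64encode(json.dumps(claim).encode())
    sig = hmac.new(ISSUER_SECRET, payload, hashlib.sha256).hexdigest()
    return payload.decode() + "." + sig


def verify_age_token(token: str, required_age: int) -> bool:
    """The website learns only that a trusted issuer vouches the holder is
    over `required_age`; it never learns who the holder is."""
    payload, sig = token.rsplit(".", 1)
    expected = hmac.new(ISSUER_SECRET, payload.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return False  # tampered or forged token
    claim = json.loads(base64.urlsafe_b64decode(payload))
    return claim["over_age"] >= required_age
```

For example, `verify_age_token(issue_age_token(16), 16)` succeeds while a tampered token fails verification. The design point is the one made in the testimony: the claim carries an age assertion and nothing else, so age-gating does not require the website to identify the user.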

There is also the second round of the European digital identity regulation, the purpose of which is to try to bring that in on a pan-European basis. The Commission is developing a white label pilot app that is being tested in, I think, four countries, namely France, Spain, Portugal and either Italy or Greece. I can look that up afterwards. Again, the intention is to give users a frictionless way to prove they are above a certain age while respecting their privacy. You would expect more of those solutions-----

Photo of Rónán MullenRónán Mullen (Independent)
Link to this: Individually | In context

What does Mr. Kenny mean by frictionless?

Mr. Tiernan Kenny:

Sorry. I mean as easy as possible for the user. Some people find it quite invasive to have to prove their identity every time they want to use a certain website or, if they are out and about, they might not have a passport or ID to show, if that is the approach needed. Sometimes, depending on the device, the camera might not work in certain light conditions. There is a host of stuff out there on age assurance, but we have said it is up to the platforms to choose the most effective form for them. We will ensure their approach to age assurance is effective and that, under the online safety code, they prevent children from being able to see pornographic content or scenes of gratuitous violence where such content is permitted by the platform. I am terribly sorry for going very over time.

Photo of Barry WardBarry Ward (Dún Laoghaire, Fine Gael)
Link to this: Individually | In context

That is okay. We will blame Senator Mullen entirely.

Photo of Rónán MullenRónán Mullen (Independent)
Link to this: Individually | In context

I think you will admit, Chair, that it is in a good cause.

Eileen Lynch (Fine Gael)
Link to this: Individually | In context

I thank the witnesses for being with us and for their presentation. The importance of Coimisiún na Meán's role cannot be overstated. It is quite a difficult regulatory role.

While many of the measures coming in under the new legislation are preliminary and very much at an exploratory stage in terms of their results, what mechanisms are in place for monitoring and enforcing compliance by digital platforms with content moderation requirements? I am thinking particularly of elections. We have a presidential election coming up later this year. When it comes to moderating the information on these platforms, what measures can be taken to stop the spread of misinformation and disinformation and to strike a balance whereby we still have transparency and accountability for the public and are not seen to be interfering with the right to free speech and information? Obviously, disinformation, especially through media platforms, is a massive issue. How do the officials see Coimisiún na Meán being able to stop that, particularly as it has difficulties with platforms that are not based here and with different regulatory authorities? How do they see that working out? What has been learnt in the context of past elections and misinformation?

Dr. John Evans:

I will start by saying that there is a difference between illegal content and content that is just misinformation or disinformation. There is a different way of dealing with that than there is for illegal content. Illegal content, once it is flagged and a platform is aware of it, must come down. Misinformation and disinformation are different. The Digital Services Act approaches that from a systemic perspective. Articles 34 and 35 require platforms to mitigate risks their platforms might present. For example, if misinformation and disinformation on a platform is a threat to electoral integrity or public health, the platform is supposed to put in place measures to mitigate that. Mr. Moroney will explain how that works in the context of elections. We are preparing for that right now.

Mr. Wayne Moroney:

The DSA election guidelines are very much an extension of that obligation under Articles 34 and 35. Platforms are responsible for providing appropriate staff and resources for a particular election and for understanding and providing a response to each election in itself.

Platforms are also responsible for mitigating those risks across multiple service areas and must work to ensure that they and their advertisements provide transparent and quality information to users. Beyond that, there are also reporting obligations. It is not just that they have to provide staff and act on the election in good faith and with transparency; they must also provide reporting at the end of the period. They are not only accountable in the context of the guidelines; if they choose not to follow the guidelines because they have their own suite of responses, they must explain what those responses are and why they are good enough in the face of the guidelines.

Alongside that, I reiterate the power of the terms and conditions and the notice-and-action mechanisms, which give users the opportunity to report content and require the platforms to act. If a platform does not act, or its response is unsuitable, that can be reported to us. We can then receive and pursue the complaint.

Dr. John Evans:

I will add to that. I spoke earlier about some of the open cases that are with the European Commission. As an example, part of the open investigation in respect of X involves considering how it deals with the dissemination of misinformation and disinformation online and whether the measures on the platform are adequate to mitigate the associated risks. A similar investigation is open in respect of TikTok. It was instigated around the time of the Romanian election last year. We are involved in that investigation as the regulator in the platform's country of establishment. It is not just a question of the guidelines that are in place; there are also consequences for not implementing and adhering to the guidelines, namely enforcement activity.

Eileen Lynch (Fine Gael)
Link to this: Individually | In context

Are the witnesses confident that the resources are there to deal with these issues? I appreciate the DSA election guidelines are in place. There is also a situation, as the witnesses have outlined, whereby a platform can decide not to adhere to the guidelines because it has its own suite of measures to which it adheres for whatever reason. Who is the decision-maker in that case? What is the role of Coimisiún na Meán when it comes to fact-checking, researching and dealing with the electoral commission? It is something we are seeing more and more. Dr. Evans mentioned the Romanian presidential election last year. I am aware of that situation and the allegations of interference. Are the resources there? What is an coimisiún doing actively in respect of preventative, rather than reactive, measures? What are we doing in that context to promote media literacy among users?

Dr. John Evans:

I will hand back to Mr. Moroney. As part of our election preparations, we are planning a forum with the platforms, the European Commission and civil society and other agencies involved with some of the issues the Senator has identified. On fact-checking, for example, we have invited EDMO to participate. I will let Mr. Moroney explain.

Mr. Wayne Moroney:

I thank Dr. Evans. The DSA election round table is a key feature of engagement between digital services co-ordinators, platforms, local authorities and experts prior to any election. That is happening in many countries now. Next week, we will have our round table. We will use one half of it to present information and concerns to the platforms, and the European Commission will be in attendance. Experts from other bodies, including the electoral commission and those working on safe participation in public life, will also be there. In the second half, we will have a question-and-answer session in which we will explore our concerns around general safety, the platforms' readiness for the election and their adherence to the guidelines, as well as other concerns such as candidate safety and advertising. On top of that, we will consider the benefits of media literacy and broader information. It goes back to users having that power in their hands and being able to engage with content and believe it, or report it if they believe they need to.

I should have mentioned earlier the upcoming transparency and targeting of political advertising, TTPA, regulation, which has had an interesting effect in that many platforms are now moving to prohibit political advertising given the difficulty of complying with that particular legislation. That is an interesting development for us.

Mr. Tiernan Kenny:

I will add one thing in terms of our broader role. Since last year, we have been operating the first two journalism schemes for reporting on the activities of the local courts and local authorities.

There is a lot of media coverage of local authorities inviting in presidential candidates. One of the ways to contribute to that healthy information environment is to ensure there are more trusted sources of information available. That part of our remit contributes to the overall attempt to deal with the challenges of disinformation and misinformation.

Photo of Fiona O'LoughlinFiona O'Loughlin (Fianna Fail)
Link to this: Individually | In context

I apologise for having had to step out for a brief period. There was another commitment I had to attend to.

I thank the witnesses for being here. The work they are undertaking is really important, particularly in the online world we live in now. Part of what they are doing is trying to ensure users find it easy to report illegal content, but it is about how we define "illegal", to a certain extent. People may put up content that is libellous or smearing. I am talking about within election cycles and about my personal experience. When a smear campaign is started online, for that content to be declared illegal, one has to pursue a legal case. When we are talking about a three- or four-week electoral cycle, by the time the matter had the opportunity to go through the courts, if that was the course of action, the campaign would be over. I am genuinely very concerned about candidates from all parties and none undergoing this type of smear campaign. It needs to be stopped in its tracks immediately once it starts. I am trying to figure out the witnesses' role in that with the platforms. Going back to my own experience, it takes time between contacting the platform and getting it to take material down. It may take down one post and another may appear. It is an incredibly frustrating and upsetting process for those involved.

Regarding the Digital Services Act, I would be interested in knowing the criteria the witnesses use to determine whether a platform meets its obligations and in knowing about the challenges the witnesses come across when developing enforcement procedures.

Dr. John Evans:

I thank the Senator for her support. We published research two weeks ago based on a survey of general election and local election candidates from last year. Key findings included that a number had received death threats or had other very unpleasant material directed at them. It is highly concerning, and it comes through in the report, that a number would reconsider whether to put themselves forward for election again.

Photo of Fiona O'LoughlinFiona O'Loughlin (Fianna Fail)
Link to this: Individually | In context

That is horrific, but I am talking specifically about smear campaigns where it is not a death threat but rather the putting out of false information about a candidate or individual.

Dr. John Evans:

There is a distinction between what is illegal and what is just mis- or disinformation. If mis- or disinformation, as I explained earlier, has the potential to undermine electoral integrity, that could be picked up for enforcement activity if the platforms are not sufficiently mitigating the risks. As well as electoral integrity and civic discourse, the European Commission, which has responsibility for this part of the DSA, also has to have in mind other fundamental rights, including freedom of speech. It has to balance these rights.

Photo of Fiona O'LoughlinFiona O'Loughlin (Fianna Fail)
Link to this: Individually | In context

I hear you. It does not give me comfort but I hear you.

Dr. John Evans:

If it is of some comfort, during the most recent election, we put out a candidate pack which explained some of the scenarios in which election candidates can and should report content. The death threats go straight to An Garda Síochána first, but if it is other content the Senator thinks could be against the terms and conditions of a platform, it is worth reporting that to the platform.

For candidates, we identified fast channels to get to the right people inside the platforms during election periods. We also stood up, and will do so again for the presidential election, an internal team that is on high alert at such times. When we get calls or emails to our contact centre, for example, we triage them and attach what we call a "P0 priority" to anything that involves a threat to a minor or to life or, during election periods, anything that involves an election candidate. What that means is that when such a report comes to us, we will advise on how to help as quickly as possible.

Photo of Barry WardBarry Ward (Dún Laoghaire, Fine Gael)
Link to this: Individually | In context

When was the candidate pack put out?

Mr. Tiernan Kenny:

For the general election, it would have been shared with all the major political parties and all the independent candidates for whom we had contact details. It was similar for the local elections, although it was obviously a bit more challenging because there was a much larger number of candidates. I hope you had a chance to see it and, if not, I would be very happy to send it to you again.

Dr. John Evans:

Just on that point, that was something we did jointly with An Garda Síochána. We try to work hand in glove with An Garda Síochána on this as much as possible, especially when criminality is an issue, such as when dealing with threats to life, where An Garda Síochána needs to be in the driving seat.

Photo of Paula ButterlyPaula Butterly (Louth, Fine Gael)
Link to this: Individually | In context

I acknowledge what the witnesses said about the empowerment of young people and how they can protect themselves online. I was following the meeting online, so I know it is a theme that has been brought up already. It is very important to note that when a television series such as "Adolescence" airs, there is not a parent who watches it who does not feel the hairs rise on their arms. We are living in a society where, as parents, we think we are doing pretty okay with our children, yet we are not fully prepared for the threats and the onslaught our children are being exposed to. When I talk about threats and onslaught, I am not necessarily talking about threats against children but about the absolute onslaught of information that is coming at them. I notice my own son just scrolls like an automaton, which I do myself because I am obsessed with news and information. I find myself checking the news, under the guise of working, every morning while I have my coffee, and then I go to reprimand my son. I am very concerned because it seems we are not prepared as parents. We are not equipped as parents to deal with this situation until it is too late. Any sort of protection or remedy we can put in place to protect our children, whether that is excluding them from platforms or ensuring their identity is hidden or that they cannot access certain content, is absolutely vital.

Mr. Tiernan Kenny:

The Deputy has highlighted two concerns: first, the content children are seeing and, second, some of the functionality of the apps that may incentivise them to spend more time online. That is being addressed through the DSA. Dr. Evans mentioned earlier that we are involved in one of the European Commission's open investigations into TikTok, which is looking at the impact of the recommender system and what is called the rabbit hole effect, where users are constantly served the same content again and again and it may eventually end up shaping their views.

The European Commission has also recently published guidelines for Article 28 of the DSA. Article 28 states that online platforms have to ensure the safety, security and privacy of minors using their services, and the guidelines address some of those design elements. For child accounts, platforms are not supposed to have the infinite scroll functionality. They are also not supposed to have the pull or swipe-to-refresh function any more.

They are also supposed to check the content they are feeding into the recommender systems and to consider intervening. We know already that some of the platforms, when dealing with a child's account, will nudge the child away from the platform after he or she has spent a certain amount of time watching content. If a child has watched a number of videos on a particular topic, they will try to reset things by introducing something a little different to break up that rabbit hole effect. We would like to give members some reassurance that progress is being made there. These are issues we are alive to, along with the European Commission and our regulatory counterparts across Europe.
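The intervention described above, breaking a run of same-topic recommendations for a child's account, can be sketched in a few lines. This is an illustrative sketch only, not any platform's actual system; the class name, the topic labels and the fixed streak limit are all invented for the example.

```python
from collections import deque


class StreakBreaker:
    """Illustrative sketch of a rabbit hole intervention: once a child
    account has been served `limit` consecutive videos on one topic, the
    next recommendation is forced to come from a different topic."""

    def __init__(self, limit: int = 5):
        self.limit = limit
        # Topics of the most recently served videos, oldest dropped first.
        self.recent = deque(maxlen=limit)

    def next_video(self, ranked: list) -> dict:
        """`ranked` is the recommender's output, best first, e.g.
        [{"id": 1, "topic": "gaming"}, ...]."""
        choice = ranked[0]
        # A full window of identical topics means the child is in a streak.
        if len(self.recent) == self.limit and len(set(self.recent)) == 1:
            streak_topic = self.recent[-1]
            for video in ranked:
                if video["topic"] != streak_topic:
                    choice = video  # inject something different
                    break
        self.recent.append(choice["topic"])
        return choice
```

A real system would of course sit inside the platform's ranking pipeline and use far richer signals (watch time, age of account, content classifiers); the sketch only shows the shape of the "introduce something different" step the witness refers to.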

Photo of Paula ButterlyPaula Butterly (Louth, Fine Gael)
Link to this: Individually | In context

I acknowledge everything Mr. Kenny said. I note he referenced child apps and child information, but it appears to me, through nieces, nephews, their friends, my own son and so on, that children can access what are not necessarily child apps or content, and that they may be nudged towards certain content. I remember when my child was very young, he watched some cartoons. He loved "Teen Titans". I was working away and I put the cartoon "Teen Titans" on the iPad using YouTube Kids at the time. He was watching away and, as you do when one episode finishes, he clicked on the next one. The next episode was wholly inappropriate. He was beside me and it was only because my attention was drawn to it that I noticed. That should never have been accessible to him. He was about eight or nine at the time. I get the distinct feeling that it has got worse. Platforms are finding ways to get around the system to feed that content in. I just do not think they are overly concerned with the age of their audience. That is a real concern.

Mr. Tiernan Kenny:

I can offer the Deputy some reassurance: our concern is to ensure that platforms only show age-appropriate content to children of whatever age. We have also put obligations into the online safety code requiring video-sharing platforms to have parental controls. Those would allow parents to do things such as set time limits for how long a child can use an app or prevent other people from contacting him or her. That was raised by one of the other Senators or Deputies in relation to Roblox. We are putting measures in place to address those issues. As Dr. Evans said, the regulation is still reasonably new. It is bedding in, but we are certainly aware of the issues the Deputy raised and we are doing our best to improve things.

Photo of Eoin HayesEoin Hayes (Dublin Bay South, Social Democrats)
Link to this: Individually | In context

I thank the witnesses for coming in. I am sorry that I did not make most of the meeting, unfortunately. I had to engage with the Taoiseach in the Chamber. I have a few questions. I hope I am not repeating myself. If I am, or if something has been covered previously, please let me know and feel free to skip over it.

It was stated that the era of self-regulation is over. That is something the Minister, Deputy Helen McEntee, also said in April 2025. I am very conscious that, pre Covid and before the major reductions in the headcount of the policy teams in technology companies that were monitoring the Internet and enforcing certain policies on their platforms, those policy teams were very large. One of the major things those companies did during the reductions in force they went through after Covid was to cut those policy teams. I would like to get a sense of this from the commission. How many people were employed on the policy teams of the big technology companies at their peak? How does that compare with the staffing of the commission's teams? How does it compare with EU-wide responsibilities, the hiring for languages and the use of tools to evaluate and monitor online content?

Dr. John Evans:

I do not know how many people are employed on the policy teams of platforms. I have already discussed a little how systemic regulation is intended to work under the Digital Services Act if a platform's service presents a risk. There are four categories of risk. One is electoral integrity. The second is health. Fundamental rights, including free speech and so on, is another. The last one is the proliferation of illegal content online. If the measures platforms have in place are insufficient to mitigate the risks in those areas, then enforcement action can be taken.

We spoke earlier about misinformation and disinformation. The moderation teams, processes or a combination of teams and technology need to be sufficiently robust to mitigate the risks I spoke about. Some of these issues are the subject of active cases that the European Commission has open. X is involved in one that is addressing some of these issues specifically. That is the way you get at that. There is no perfect size. It depends on what kind of content the platform has, the features the service has and its reach. The higher the risk, the stronger the mitigation the platform is supposed to put in place. If it is not doing that, then it can be tackled under the DSA.

Photo of Eoin HayesEoin Hayes (Dublin Bay South, Social Democrats)
Link to this: Individually | In context

Those specific ways of tackling it under the DSA include things like enforcement mechanisms such as fines. Is that correct?

Dr. John Evans:

Yes, but sometimes it does not go all the way to enforcement. Sometimes it will rest on guidelines. In the electoral context, we were talking about the mitigation measures recommended to guard against deleterious effects on civic discourse. There are a lot of them. There are also the Article 28 guidelines, which concern the protection of minors. They also contain recommendations on how to mitigate some of those risks, including age-assurance techniques. The kind of mitigation measures can often depend on the kind of harm, whether that is illegal content or illegal products, because the marketplaces are caught by this as well. If it is democracy that is at issue, it is the conversations between individuals that matter.

Photo of Eoin HayesEoin Hayes (Dublin Bay South, Social Democrats)
Link to this: Individually | In context

I imagine Dr. Evans is more familiar with a lot of these things than I am, but by way of anecdote, I was an intern in Google in 2011. I sat alongside a lot of these policy teams, and a lot of the other interns were on the policy teams. A lot of that policy work was then subcontracted out. Accenture and other major consulting firms were engaging with companies like Facebook, Google and Twitter, as it was at the time, to build these policy teams. From a legislative and oversight perspective, do we have sufficient resources to keep up with misinformation and disinformation, particularly when the number of users is higher and AI is in some ways corrupting the information environment? Back then, there were thousands of people in Dublin whose exclusive focus was reviewing things like YouTube ads and other material going up on the Internet that was flagged. I know technology can play a role in making those processes more efficient, and perhaps that happened in the intervening years, but I am concerned, at a European level and not just domestically, about whether we have sufficient resources and whether we are enabling Coimisiún na Meán to do that work on behalf of the Irish State and the European Union.

Dr. John Evans:

Another Deputy asked about resources earlier. As I explained, I do not think we will fight this battle alone. There is a counterpart to my position in each of the other member states. We are probably the best resourced so far. That is natural, because so many of the large platforms are established here and we have a special responsibility under the Digital Services Act. Still, there are competent, well-resourced regulators in other member states looking at these issues as well. Somebody did a calculation of the number of people working on the regulatory side in trust and safety, or online safety, across the EU. Between the European Commission, us and the other DSCs, I think the number was around 800 people. That is not the same scale as the number of people who have been working at the platforms in these areas, but it is still a significant number. Content moderation and protection of users are supposed to be things the platforms do themselves. Regulators are not doing it for them, but they are supposed to hold the platforms accountable for the mechanisms and systems they have in place to achieve that.

Photo of Barry WardBarry Ward (Dún Laoghaire, Fine Gael)
Link to this: Individually | In context

How many social media platforms are there?

Dr. John Evans:

I will defer to Mr. Moroney.

Mr. Wayne Moroney:

To be clear, does the Cathaoirleach mean how many there are in Ireland?

I am sorry, but that is a "How long is a piece of string?" question. When we think about social media platforms, there are obviously the larger platforms we are all familiar with, then the smaller platforms that some or many people may have heard of, and then smaller ones again, such as small forums that one might only be aware of in a particular country. In Ireland, we have at least 135 online platforms under the DSA. That is how many are established in Ireland. Across Europe, there must be hundreds, possibly thousands, more.

Photo of Barry WardBarry Ward (Dún Laoghaire, Fine Gael)
Link to this: Individually | In context

Okay. So, with regard to the ones in Ireland, if there are at least 135, am I right in saying that you do not actually know exactly how many there are?

Mr. Wayne Moroney:

We had information from a study that was done a couple of years ago. That came back with, I believe, a figure at the time of 138. Through quality checking when I joined Coimisiún na Meán, we reduced that number by a few and then added a couple more to it. A piece of work is currently being done to rebuild that list and validate it so that we have an accurate number and an accurate view of the population in the country.

Photo of Barry WardBarry Ward (Dún Laoghaire, Fine Gael)
Link to this: Individually | In context

I agree with the members who have acknowledged just what a mammoth task the commission has, but how can the commission regulate an entity if it does not know whether it is there or not?

Mr. Wayne Moroney:

Our approach is risk-based supervision. We prioritise and direct our attention based on risk, informed by a number of factors. The things we consider include the size of the platform, complaints and concerns we receive, intelligence or awareness from other sources, and a variety of other factors that indicate how risky a service may be. While we build that list based on risk so that we can prioritise effectively, at any given time a new service could appear that might also be considered risky. We can take that on board. We can, and will, prioritise based on risk. This is not to say that a platform we have not heard of, or that we are aware of but do not pay much attention to, is not risky. It is simply that, given the scale involved, there are riskier services we are prioritising first. Anything that requires our attention will receive it.

Photo of Barry WardBarry Ward (Dún Laoghaire, Fine Gael)
Link to this: Individually | In context

In answering an earlier question, the witness said he did not know how many content moderators there are, or how many there were before their numbers were reduced. How can the commission be satisfied? I accept what the witness said, that it is not necessarily the commission's job to moderate all the content and that the providers have a role, and I welcome that. However, how can the commission be satisfied that each provider has sufficient moderation staff or moderation teams if it does not know how many they have?

Dr. John Evans:

The way I think about this underlines a sort of call to action on our part, which is the need for people to report when they see illegal content online, or content they think is against terms and conditions. That allows us to gather information on how the platforms are actually dealing with it, which in turn allows us to see patterns in what they are getting right and what they are not. For example, I talked a little about the Article 16 work we are doing at the moment on the notice-and-action mechanism and how easy it is to use. We are looking at ten platforms. The reason we decided to do that piece of work was that there was a pattern in the large number of complaints coming in saying that the mechanism is really not easy to use and is not working well at the platforms, so we decided to look more closely at it. This is the way we get at that kind of issue.

Photo of Barry WardBarry Ward (Dún Laoghaire, Fine Gael)
Link to this: Individually | In context

Okay. If I set up a social media platform in a country outside the European Union and I call it - let us pick a name - Verity Social, and if I start to do things that the commission agrees are unreasonable, illegal, unfair or whatever term you want to use, how does the commission shut me down?

Dr. John Evans:

This goes to Senator Mullen's point about extra-territoriality. A platform like that would have the option of saying it wants to be regulated by this regulator or that one; it can pick a member state in which to do that. If it does not, and a regulator in a member state spots something going on, that regulator can choose to go after it. If that company is trading in the EU, which in this scenario it is, it is subject to European law.

Photo of Barry WardBarry Ward (Dún Laoghaire, Fine Gael)
Link to this: Individually | In context

So does the commission just block it?

Dr. John Evans:

Can we block it? Eventually, yes, but we would be trying to engage first. One might make investigations and impose fines-----

Photo of Barry WardBarry Ward (Dún Laoghaire, Fine Gael)
Link to this: Individually | In context

That is reasonable but if at the end of the day they are rogue, they are not engaging, they are not accepting what the commission is saying and they are pushing on with misinformation, or whatever it is, can they be blocked?

Dr. John Evans:

Yes. It can be blocked in the system, whether it is a commission matter or a regulator matter.

Photo of Barry WardBarry Ward (Dún Laoghaire, Fine Gael)
Link to this: Individually | In context

Can it pop up under some other name very quickly? If it does, do the commission or the regulator have to go through the same rigmarole?

Dr. John Evans:

Like a sort of whack-a-mole activity.

Photo of Barry WardBarry Ward (Dún Laoghaire, Fine Gael)
Link to this: Individually | In context

Yes.

Dr. John Evans:

I have not spotted that yet. The regime is at an earlier stage. I would say the regulatory challenges are with the bigger entities for the moment. We will then move down through that kind of scenario later. For me, that is sort of a mature regulatory situation. It kind of goes to that risk point. If it is a very big entity with a lot of members and users, simply switching names is not going to be an effective mechanism for evading regulation.

Photo of Barry WardBarry Ward (Dún Laoghaire, Fine Gael)
Link to this: Individually | In context

I know I am over time, but I ask the members to indulge me. We have kind of disregarded the time limits in some respects. I have two issues in relation to the larger entities. First, all of us will have the experience of reporting comments, replies or whatever they might be that are absolutely out of bounds, only to have the platform find that they do not breach its community guidelines despite the fact that they might be defamatory or wrong in lots of ways. If people are experiencing this, and I think it is quite common, should they be bringing it to Coimisiún na Meán's attention?

Dr. John Evans:

They should. The way the system is set up means that if you flag content with the platform, you are supposed to get a decision back on whether it is being taken down. You can then appeal that decision within the platform. You may get the same response back, but at least you have that right of appeal. There is then a third element, if you want to go that far. The DSA architecture provides for out-of-court dispute settlement. We certified the Appeals Centre Europe, ACE, which has been up and running since just before Christmas. There are a number of other dispute settlement bodies. What they are doing is putting out decisions about whether, for example, a particular platform's content contravened its terms and conditions. It is all about data for regulators. The pattern we will see coming out of those three mechanisms is important, but if a person is not happy with a platform, perhaps because they did not get a response to their initial flag, they can come to us with a complaint. If they are not happy with the appeals process they have experienced, they can also come to us with a complaint. We will build a picture and figure out our priorities from there.

Photo of Barry WardBarry Ward (Dún Laoghaire, Fine Gael)
Link to this: Individually | In context

The second issue is the algorithms and how and why they push material towards certain users. How much information or insight into the algorithms does Coimisiún na Meán have?

Dr. John Evans:

The algorithms are mainly captured by Articles 34 and 35 of the Digital Services Act. I will let my colleagues come in on this in a second, but if there is any aspect of the provision of a platform service, which includes the underlying algorithm, that creates a risk, the platform is supposed to put in place mitigation measures. If we think a platform is not doing enough, we can act. For example, Mr. Kenny talked about rabbit-hole effects. That is essentially an algorithm issue, or a recommender system issue. In that case it was TikTok. If TikTok has not put in place adequate measures to prevent harmful rabbit-hole effects occurring, it is potentially liable for a fine. That is an ongoing investigation and we will see how that goes. As part of the investigation process, the team will be testing the algorithm. There are experts at the European Commission. We are about to recruit our own team. There is also a centre for algorithmic excellence in Barcelona, which the European Commission uses. In terms of expertise, the big guns are there and are available to use.

Photo of Barry WardBarry Ward (Dún Laoghaire, Fine Gael)
Link to this: Individually | In context

Can Coimisiún na Meán direct the companies to give it access? The algorithms change all the time, do they not? If Coimisiún na Meán is testing them today, how will it know they are not changing tomorrow to do something similar or worse?

Dr. John Evans:

Article 40 of the Digital Services Act is the one that deals with vetted researchers. Under it, regulators and vetted researchers are allowed to get access to information held by the platforms.

How that will work out, in terms of how much visibility of the algorithm we need, is still to be determined, but we are building up that capacity.

Photo of Barry WardBarry Ward (Dún Laoghaire, Fine Gael)
Link to this: Individually | In context

I will come back.

Eileen Lynch (Fine Gael)
Link to this: Individually | In context

I want to raise the role of Coimisiún na Meán regarding online sellers and online marketplaces based outside Ireland and the way they advertise online. Does Coimisiún na Meán have a role in ensuring that these online sellers comply with the same standards and regulations Irish providers have to comply with? That ties in very much with the online social platforms, as advertising for sellers is such a large part of that. What role does Coimisiún na Meán have in regulating that and in ensuring that third-country traders are compliant with the regulations here? If they are not, there is a very unfair advantage and a bias against Irish traders.

Mr. Wayne Moroney:

There are two parts to this, around sellers or marketplaces and around advertising. The DSA assigns different authorities, so while we are the main authority as the digital services co-ordinator, when it comes to marketplaces and those engaging in the sale of products to users, the CCPC is the authority responsible. However, a marketplace is still covered by the rest of the DSA - by us - so there are other concerns there. Even when they are in a marketplace, sellers are responsible under the obligations of the DSA and can be captured either by us for general activity or by the CCPC when selling a product. There are advertising obligations if they are advertising a product or service, either on their own platform or somewhere else. These come in under Article 26 across all online platforms and under Article 39 for advertising on very large online platforms. Those place the same obligations on all services, which involve transparency. Users must know that what they are seeing is an ad, who paid for that ad and why they are seeing it. That transparency gives a bit of power back to the user and places an equal obligation on all the services to collect that information from the seller and provide it to the user.

Eileen Lynch (Fine Gael)
Link to this: Individually | In context

More so with regard to third-country traders, Mr. Moroney mentioned their compliance with the CCPC and the DSA, but how does Coimisiún na Meán enforce that compliance and how does it ensure that traders outside Ireland, or outside the EU in particular, are held to the same standards? Once an issue has been identified, how does enforcement work?

Mr. Wayne Moroney:

This relates to what Dr. Evans said earlier about the obligation on a service outside the EU to nominate a legal representative inside the EU in a country of establishment. When it does that, that country's DSC is its regulator. If the service does not do that, any DSC, given motivation, can say that it has an interest in the service, notify the other DSCs and pursue it. Not complying and not engaging a legal representative is not really an escape. There is a process for us, or any other DSC, to pursue a service to get it in line.

Eileen Lynch (Fine Gael)
Link to this: Individually | In context

What role does Coimisiún na Meán have in the protection of the consumer rights of online shoppers and health and environmental issues, particularly with regard to things like Cyber Week and Black Friday? A lot of that involves consumer protection but is there a role for Coimisiún na Meán to inform shoppers about their rights when it comes to what is going on with these big events?

Mr. Wayne Moroney:

Not specifically on shopping events. When it comes to consumer protection, we would defer to the CCPC. However, we will continue to work on literacy and awareness of services and users' general rights when it comes to services. That involves awareness of terms and conditions, the ability to report content that users think is at issue on the platform or might be illegal content and their rights in being responded to and treated fairly by the platform.

Mr. Tiernan Kenny:

We are a member of the Digital Regulators Group along with ComReg, the CCPC and the DPC. One piece of ongoing work there involves trying to give consumers easy information about which of those authorities to go to when they experience an issue, because we have spoken a lot about trying to make sure people can use their rights.

We are regulators. We love acronyms and very complex terms that have not entered the lexicon of everyday people, so we are doing work to help people know what their rights are, how to use them and who is best positioned to help them.

Eileen Lynch (Fine Gael)
Link to this: Individually | In context

Are there actual campaigns based around that kind of literacy? I know we discussed it earlier regarding election literacy but is it that kind of digital literacy or a more general all-encompassing-----

Mr. Wayne Moroney:

We have not done any joint campaigns under the DRG banner yet. We run our own campaigns as Coimisiún na Meán trying to make people aware of their rights and some of the things they can do under the online safety framework, mainly around what types of content they can report, how to do that and when they should be getting in touch with us.

Dr. John Evans:

The Senator will see that the CCPC puts something out around Black Friday, so it is the more relevant regulator on most of those consumer protection issues.

Photo of Rónán MullenRónán Mullen (Independent)
Link to this: Individually | In context

The witnesses are completely ILWA - in love with acronyms. I wish to ask about two things - malinformation and pornography. The Minister, Deputy O'Donovan, recently defended the need for a national approach to combat what he called online misinformation, disinformation and malinformation. Malinformation, as I understand it, refers to information that is accurate but which might be shared out of context or with harmful intent. We could be talking about truthful information - a discomforting, inconvenient or even inflammatory truth. That is a controversial business. Is Dr. Evans in a position to confirm that the work of the commission has nothing to do with malinformation as I describe it? I understand the other terms the Minister used but I would like to have clarity on that specific question about whether the commission sees malinformation as coming within its brief.

Dr. John Evans:

I have not heard that term before but if I understand the way the Senator describes it, it sounds like a particular type of misinformation and disinformation.

Photo of Rónán MullenRónán Mullen (Independent)
Link to this: Individually | In context

Let us say one had information that was true but likely to cause annoyance, unrest or hostility. It could be about crime statistics and particular cohorts. I do not want to get into inflammatory examples. It would not be misinformation or disinformation.

Dr. John Evans:

I have seen a lot of attempts to categorise, in a scientific way, the different ways of presenting truthful or untruthful information, and it is very difficult: categories overlap or do not overlap, the context changes and suddenly there is a new category of information. I would push back a little and say that I think malinformation is still a kind of mis- or disinformation and, as such, it could get caught as a systemic risk issue under the Digital Services Act. The further you move from straightforwardly illegal content, the harder it is to make those calls while always trying to maintain that balance with freedom of expression.

Photo of Rónán MullenRónán Mullen (Independent)
Link to this: Individually | In context

Leaving aside the question of child protection for a moment, there are porn platforms that display and, therefore, promote violence against women - things like choking, slapping, etc. Is the commission active or does it intend to be active in taking on platforms engaged in the provision of that kind of so-called service?

Dr. John Evans:

Mr. Kenny will talk about the online safety code on this.

Mr. Tiernan Kenny:

We would always say that if something is illegal offline, it is illegal online. It does not necessarily matter if it is pornography or another type of content. We have talked a lot about-----

Photo of Rónán MullenRónán Mullen (Independent)
Link to this: Individually | In context

I am not necessarily talking about illegality.

Mr. Tiernan Kenny:

This goes back to what Dr. Evans was saying about the balance between freedom of expression and ensuring people's safety online. You would have to recognise that there is a lot of behaviour and content online that you would not necessarily condone or endorse, but if it stops short of being illegal, the methods to address it are a bit different. Again, as Dr. Evans said, you could be looking at the Digital Services Act in terms of the risk assessment and mitigation there, if you felt that the presence of this type of content, allied to how the service was designed or could be used by people, was leading to systemic risks in the area of public health. If you talk to some of the public health bodies, they will tell you some fairly damning stories about some of the people who access their services as a result of activities they have engaged in with other people.

Photo of Rónán MullenRónán Mullen (Independent)
Link to this: Individually | In context

I will come back to child protection again because I think most people would agree that freedom of expression really has to take second place there. I just worry, and I am wondering whether the coimisiún is making it too easy for porn companies by not insisting on strict age verification. At the end of the day, if I want to open a bank account, the bank has to know for sure that I am over 18; it is not enough for it to ensure that, normally, people aged under 18 will not be able to open one. Similarly, if I am required to use a credit card in order to get access to a particular service, it can be taken that I am 18 or over. People will wonder why we cannot insist on the same level of difficulty here. If some people are worried about their privacy because of requirements the coimisiún imposes on porn companies, why should we worry too much about that? We are not talking about some socially beneficial service here.

Photo of Barry WardBarry Ward (Dún Laoghaire, Fine Gael)
Link to this: Individually | In context

The question has been asked now.

Photo of Rónán MullenRónán Mullen (Independent)
Link to this: Individually | In context

That is the question I am asking Dr. Evans. Is he going to-----

Photo of Barry WardBarry Ward (Dún Laoghaire, Fine Gael)
Link to this: Individually | In context

It is asked now; I am going to ask the witness to respond.

Photo of Rónán MullenRónán Mullen (Independent)
Link to this: Individually | In context

Yes, I am sorry. I thank the Cathaoirleach.

Dr. John Evans:

There is quite a lot going on in the pornography space in terms of regulation at the moment, but some of it is just gathering momentum. We have only had our online safety code since this summer. The European Commission has announced the opening of an investigation into the four biggest porn providers in Europe. As those investigations are worked through, we will be able to benefit from the precedents they set, and that will have a demonstration effect on other areas. Allied to those actions, Mr. Moroney explained the co-ordinated action on below-threshold porn sites that the different DSCs in the different member states are pursuing. In Ireland, for example, we have five names. The Senator would not recognise them because they are below threshold.

Photo of Rónán MullenRónán Mullen (Independent)
Link to this: Individually | In context

What do you mean "below threshold"?

Dr. John Evans:

I am sorry; it is another term of art. It gets confusing for us too sometimes. The way the Digital Services Act is organised is that the European Commission has responsibility for some articles under the regulation - the systemic ones I was explaining a bit earlier - provided that the number of users in the EU is greater than 45 million.

Photo of Rónán MullenRónán Mullen (Independent)
Link to this: Individually | In context

The very large online platforms, VLOPs.

Dr. John Evans:

Yes, that is right. That is the threshold. Everybody below that number we refer to as the below-threshold providers.

Photo of Rónán MullenRónán Mullen (Independent)
Link to this: Individually | In context

Does the coimisiún have the power to up its level of regulation in this area? If it wanted to go for strict age verification, does it have the power to do that?

Dr. John Evans:

If strict age verification is the only way of achieving effective age assurance, then we can insist on it, but if there are other ways of achieving the same objective, then we would be happy with those also.

Photo of Rónán MullenRónán Mullen (Independent)
Link to this: Individually | In context

However, that is a matter of discretion for the coimisiún.

Dr. John Evans:

It is, yes.

Photo of Rónán MullenRónán Mullen (Independent)
Link to this: Individually | In context

If Ireland is unhappy with the way an organisation that has its registered headquarters in Cyprus, for example, is doing its job - being too lax or whatever - do we have the right to legislate to create a more difficult environment for porn providers, if we wish, in order that our people are protected as we see fit?

Dr. John Evans:

Correct me if I am wrong here, but I think one of the biggest porn providers in Europe is based in Cyprus and is subject to this European action. The way we co-ordinate that is through the European digital services board. The Cyprus digital services co-ordinator is on that board as well. That board is supposed to inform the priorities of the Commission and of the digital services co-ordinators in each of the member states. We are often on the receiving end of those concerns. For example, if someone in Germany or Spain makes a complaint of the kind I am encouraging people to make, but it concerns a platform that is based in Ireland, it comes to us. We get over 90% of the complaints traffic in Europe around this issue. It is, as I said, a system of regulation. It is the DSCs and the European Commission all working together to try to make the online world a safer place.

Photo of Rónán MullenRónán Mullen (Independent)
Link to this: Individually | In context

I thank Dr. Evans.

Photo of Barry WardBarry Ward (Dún Laoghaire, Fine Gael)
Link to this: Individually | In context

I call Deputy Ó Murchú, who is not going to repeat everything he said earlier.

Photo of Ruairí Ó MurchúRuairí Ó Murchú (Louth, Sinn Fein)
Link to this: Individually | In context

I am definitely going to say apologies if I do, and the witnesses should feel free not to answer, but I do not really mean it. That is my level of disinformation and misinformation.

Some 90% of complaints are coming through Coimisiún na Meán on the basis, obviously, of our being home to some of the online platforms. This is a follow-on from my previous question about the interaction the coimisiún has had to date. Dr. Evans said the framework is there, and he has spoken about what people have to do if they see something that falls into a bracket they consider an issue.

My fear is that the entities with the resources and capacity to deal with this, if they really wanted to, are the online platforms. Given the human resources they have, and in this age of AI, they can catch a lot of this if they want to do so. The problem for them with changing the algorithms is that they make far too much money from them. I acknowledge there is a consideration in relation to the European Commission and the Government in that some of these platforms are very big entities from an employment and taxation point of view, etc.

However, at this point, we all are aware that incredibly detrimental information is out there, particularly some of the stuff getting to young men, whether we are talking about misogynistic material or content that is entirely untrue. While people have an absolute right to have conversations on migration, for example, we have seen total untruths being published. With the telling of the lie, it becomes more normalised and acceptable. People might ask what even is the truth at this point, and that is a dangerous place. Young girls are sometimes driven down a particular rabbit hole regarding eating disorders and whatever else, which is utterly wrong and terrible in many ways. That will continue until these companies decide to take action. Fighting them, whether it is the coimisiún or others across the European Commission doing so, will be incredibly difficult.

How can the Government make the coimisiún's job easier by way of resourcing and whatever else? What needs to be done in that regard? I am incredibly worried. Will Dr. Evans talk about some of the engagement he has had with some of the platforms?

Dr. John Evans:

Deputy Hayes made the point that the platforms are so much bigger and have so many more resources available to them than do the regulators. There are approximately 800 staff working on online safety across Europe at this time, and that number is growing. However, that is not really the point. It is up to the platforms themselves, as the Deputy noted, to ensure the services they offer to society are safe. The purpose of regulators and of legislation is to provide the correct incentives to make that happen. The fines available to us and to the European Commission when we find breaches are very substantial.

That is one part of it. If I had a wish list, one thing I would like to see more of is stronger information-gathering powers. We are talking to the Department about that at the moment and it has been quite supportive. Currently, we can only compel the production of information if we have an open investigation. This is quite inefficient as it means we must open an investigation if we want to rely on information to feed our activity and drive behavioural changes on platforms. We are working with the Department on legislative proposals to improve that. Any support members could offer in that respect would be very useful.

Photo of Ruairí Ó MurchúRuairí Ó Murchú (Louth, Sinn Fein)
Link to this: Individually | In context

The big question is how to deal with the algorithms, the recommender functions and other things that make a huge amount of money for the platforms but create a huge amount of the difficulty we see regarding misinformation and disinformation, which can be used by everybody from chaotic characters right through to organised entities, both governmental and non-governmental.

Dr. John Evans:

I will say a little about recommender systems and systemic effects and then I will ask Mr. Moroney to explain, as an example, what we are doing in this regard in the electoral space as part of our electoral preparations.

Recommender systems are part of the service online platforms provide. That aspect of their service may create a risk to society under four broad headings: a risk to democracy and civil discourse; a risk to public health; a risk to fundamental rights such as freedom of speech; or a risk of the proliferation of illegal content and products online. If a platform's service, including its recommender system, presents a risk, it is supposed to mitigate that risk. If platforms do not mitigate that risk adequately, then there can be enforcement action and, ultimately, significant fines.

There are a couple of examples we have talked about. I have spoken about the open investigation on X that the European Commission has. That is in part focused on the mitigation of risk presented by mis- and disinformation. A more preventative measure is what we are doing under the election guidelines.

Mr. Wayne Moroney:

Part and parcel of the DSA election guidelines is that we engage in a pre-election round table with the platforms and local authorities. We have an opportunity to bring forward concerns we have and to question them on their adherence to the DSA election guidelines. This is useful because they are supposed to staff and resource elections sufficiently. They are supposed to understand elections and have a response for a particular election. It is not a general response; it is for that one in particular. Their algorithmic work goes hand in hand with that. It is supposed to be transparent and to address those concerns. Following that election, they are supposed to assess their response and report on it both to the European Commission and also in a public way so there is transparency and accountability on the platforms' performance throughout that period. That can be reflective of advertising algorithms and content moderation. Some of it will be confidential and some will be public. We can engage on that as well.

Photo of Barry WardBarry Ward (Dún Laoghaire, Fine Gael)
Link to this: Individually | In context

I want to ask about access to social media platforms for under-16s. Assuming that becomes policy, is there a difficulty from the witnesses' side? We have talked about age verification and age assurance. Do the witnesses see any practical difficulty in implementing a rule that would restrict access to social media to over-16s or over-18s?

Dr. John Evans:

Technically, such possibilities are envisaged already. The Article 28 guidelines are the part of the DSA that deals with safety of minors and they envisage the possibility that member states may approach age thresholds in different ways. Mr. Kenny spoke of the identity wallet. That is being set up so different ages could be involved. Technically, I think it is possible.

Photo of Barry WardBarry Ward (Dún Laoghaire, Fine Gael)
Link to this: Individually | In context

I want to talk about the downloading of apps. We all download apps for all kinds of purposes. When we do, we are often asked to sign up to terms and conditions. I am a lawyer - a senior counsel - and I have been at it for a long time. I do not read the terms and conditions; they are muck to me. I do not understand them and certainly do not have several hours to sit down, go through them and parse what each term and condition might mean. Is there any realisation of that within the commission? Will it do anything to require app providers to put up, writ large at the start, the specific implications that might be of concern to users?

Dr. John Evans:

That is not a focus of our work at the moment. It is a recognised problem, though, that often when one engages in transactions online, one is bombarded with terms and conditions. The most frequent lie is probably, "Yes, I have read them and consent". It is a problem. That line of work is not something we are looking at at the moment but potentially we are happy to engage further on it.

Photo of Barry WardBarry Ward (Dún Laoghaire, Fine Gael)
Link to this: Individually | In context

Does Dr. Evans accept there are Irish people downloading apps from different sources who give permissions they do not understand they are giving?

Dr. John Evans:

That is quite possible. This goes back to media literacy. People need to be aware of what they are doing online, what they are consenting to and how their information might be used. That is extremely important.

Photo of Barry WardBarry Ward (Dún Laoghaire, Fine Gael)
Link to this: Individually | In context

I appreciate that. Where somebody signs up to an app that takes certain information from them - data, contacts from phones, whatever it might be - to what extent is it permissible for the data operator to share such information with a third party?

Dr. John Evans:

That is probably a GDPR question, to be honest.

Photo of Barry WardBarry Ward (Dún Laoghaire, Fine Gael)
Link to this: Individually | In context

Does Dr. Evans agree it is not permissible or that they should have some kind of express permission to do that?

Dr. John Evans:

I am not a GDPR expert so I am not going to answer that, if that is okay.

Photo of Barry WardBarry Ward (Dún Laoghaire, Fine Gael)
Link to this: Individually | In context

Okay, fair enough. Thank you.

I wish to ask about the programme the commission is rolling out involving people receiving text messages about possible spam or scams. I have received a few of them and every one of them has been wrong. I appreciate the efforts behind it. Is there any update on the programme? Is it working? Can it work?

Mr. Tiernan Kenny:

On the online space, we are involved through some of the networks. I have an awareness of how this is being dealt with. ComReg has initiated the programme to alert people when it thinks spam texts are coming through, but I am not sure how that is going yet.

Photo of Barry WardBarry Ward (Dún Laoghaire, Fine Gael)
Link to this: Individually | In context

On the European social media task force that has been announced, can the witnesses explain to me how that works in tandem with the work the commission is doing? Is it above or below the commission or does it work hand in hand with it?

Mr. Tiernan Kenny:

We are not quite sure yet but we are very eager to find out. As Dr. Evans mentioned, we are one of the largest regulators in European terms. We found out about the task force at the same time as everybody else, that is, when the state of the union speech came out. It is something we will seek input on, as appropriate, based on the experience and expertise we have developed over the past two and a half years.

Photo of Barry WardBarry Ward (Dún Laoghaire, Fine Gael)
Link to this: Individually | In context

The commission has no idea how it is going to be resourced or if that will be shared with the commission.

Mr. Tiernan Kenny:

Unfortunately, we do not. I assume it will be led by the European Commission and DG CONNECT, which is the department we interact most frequently with. It would seem to be its natural home, but I am very happy to provide the committee with an update as and when we learn more about that.

Dr. John Evans:

I attend meetings of the European digital services board. We will have our first meeting since the summer next Tuesday and I expect an update on that kind of issue.

Photo of Barry WardBarry Ward (Dún Laoghaire, Fine Gael)
Link to this: Individually | In context

On non-Irish content, I find that in many different online spaces one is bombarded with content, whether it is advertising or other content, that is not at all designed for the Irish market. Is that an issue of concern for the commission?

Dr. John Evans:

The issue always goes back to whether it is illegal content.

Photo of Barry WardBarry Ward (Dún Laoghaire, Fine Gael)
Link to this: Individually | In context

I think it is legal, but it is definitely undesirable. That maybe rules it out on the commission's side.

Dr. John Evans:

It is still a relevant issue to us if the content could contravene the terms and conditions of the platform's own services. If it is not policing that well, that is an issue for us.

Mr. Tiernan Kenny:

More broadly from the Coimisiún na Meán perspective across our entire remit, it is important that people can access content that is relevant to Irish culture and society. The committee might know that we run the Sound and Vision funding scheme, which produces radio broadcast content and has been going for 20 years. I mentioned the journalism scheme earlier. Overall, it is a consideration for the organisation. In terms of the specific DSA regulation, though, it does not come under our remit.

Photo of Barry WardBarry Ward (Dún Laoghaire, Fine Gael)
Link to this: Individually | In context

Does anybody else want to come back in?

Photo of Ruairí Ó MurchúRuairí Ó Murchú (Louth, Sinn Fein)
Link to this: Individually | In context

Yes. I did not hang around just to listen to the Chair.

Photo of Barry WardBarry Ward (Dún Laoghaire, Fine Gael)
Link to this: Individually | In context

We know that.

Photo of Rónán MullenRónán Mullen (Independent)
Link to this: Individually | In context

You never ask a question to which you do not know the answer. I thought that was a golden rule.

Photo of Barry WardBarry Ward (Dún Laoghaire, Fine Gael)
Link to this: Individually | In context

The problem is, I knew the answer.

Photo of Ruairí Ó MurchúRuairí Ó Murchú (Louth, Sinn Fein)
Link to this: Individually | In context

To follow on from the previous questions, I refer to the Trump Administration and the position it has taken regarding freedom of speech. Sometimes, I think people want to protect certain types of freedom of speech and not others. One can foresee that this will cause issues in terms of the work that Coimisiún na Meán, the European Commission and others want to do in this particular space.

Perhaps this has already been discussed, but the Finns have done a serious amount of work on social media, computer literacy and teaching people to analyse information and make a determination. It is not that people have to buy every conspiracy theory. We have all come across people who asked us about a particular thing they believed and we have shown them what we would say was a reputable source that tore the argument apart. We are then told that they cannot trust that, but some people want to trust a guy walking around on the street shouting and screaming who is obviously a really good source. Unfortunately, we need to get that message to people early. People are choosing to be advertised to by some really nefarious people. McDonald's, Coca-Cola and others would not spend millions of euro on advertising if it did not work.

I am very worried about the future. What can be done here about the media literacy piece? What can be done about operating a system like the Finns have?

Is the commission worried about the Trump Administration and its fight against the regulation of big tech and social media companies?

Dr. John Evans:

I will give a very boring answer to that last question. We are an independent regulator. We are enforcing the law that is in front of us and we will do that to the best of our ability.

Media literacy is hugely important. Mr. Moroney has talked about our election preparedness a few times. One of the things we often say is that electoral integrity is something that goes on constantly. Elections are just events, but the ability of the electorate to discern information accurately and to have that facility is really important. It is even more important now with the advent of AI and its proliferation online. It is easier to create misinformation and disinformation more quickly. There is no doubt that the stakes have been upped. You do not believe everything you read, and now you do not believe everything you see or hear either. You really do need those skills to be able to discern. The risks to society are significant. There is no doubt about that.

Photo of Seán CroweSeán Crowe (Dublin South West, Sinn Fein)
Link to this: Individually | In context

I do not know if the question about the use of robots, particularly from outside the EU's jurisdiction, was answered. The control of data is important when a handful of people have huge control over it. We are all concerned about what happens during elections and so on but, as has been pointed out, the election can be framed in people's minds by the use of data, disinformation or whatever else. My concern is that people can buy robots to filter a particular view, information or whatever else. How can the commission combat that? These are based in different countries around the world. It is very complicated and very worrying because it is about information and data. Those who control the data at the moment have a particular world view in relation to various things and seem to be skewing information to align with that view. People have mentioned different things such as misogyny and people of colour. How can that be combated?

On an aside or separate issue, Dr. Evans mentioned misinformation, fact-checking and so on. I do not know if this was covered earlier but I would really be worried about some of the stuff that is online. You ask how to cure cancer and there is stuff about going on a diet of garlic and onions for six months or something like that and people actually believe it. They may be desperate enough to believe it. It is an extreme example but does the commission have any role in relation to that because, as has been pointed out, people of all ages are reading this and it will have a particular impact on young people?

How can robots, particularly from outside, and that information be controlled? I also asked about that handful of people who control data and about misinformation in relation to health stuff.

Dr. John Evans:

I will ask Mr. Moroney to talk a little bit about the robots. On misinformation and disinformation around health, the Deputy may remember the systemic risk I mentioned when explaining Articles 34 and 35. Electoral integrity is a big one that is on our minds all the time but the Digital Services Act was actually conceived during the time of Covid when this misinformation and disinformation around health issues became really quite prominent. That is why health is called out as a specific area of high risk on platforms. The kind of issue the Deputy was talking about can get caught by the DSA. Would Mr. Moroney like to talk about robots?

Mr. Wayne Moroney:

I will talk about robots in two ways. The first thing I will mention relates to Article 28 and the Article 28 guidelines under the DSA.

There are a couple of measures there which are particularly useful when it comes to a service like Roblox. One is the prevention of contact by strangers. Children should not be contactable by strangers. That is one of the guidelines we would like to see implemented. Alongside that is the moving of conversations off platforms, including through dark patterns, which the DSA addresses. This is common in other types of safety issues as well, where you see a user engaged with and moved across to a different platform, service or form of communication. We are seeking to prevent that type of activity as a means to protect children under the guidelines.

In following the guidelines and using the other aspects of the DSA, we expect a service like Roblox and other services to become safer for users, particularly for children. For Roblox, in particular, that service is established in the Netherlands. The Dutch DSC is the regulator for them. We have a relationship with that DSC as we do with many others. If we receive concerns and complaints here, we can transmit them to that DSC and it is responsible for dealing with that issue and making sure that it is a well-regulated service. It is a relationship that is working quite well. If there are specific concerns, we are happy to receive them and pass them on.

Photo of Barry WardBarry Ward (Dún Laoghaire, Fine Gael)
Link to this: Individually | In context

Finally, because I am conscious of time, Deputy Ó Murchú has a ten-second question.

Photo of Ruairí Ó MurchúRuairí Ó Murchú (Louth, Sinn Fein)
Link to this: Individually | In context

Dr. Evans spoke about the idea of stronger information or data-gathering powers. That makes sense, but what exactly are we talking about and what does the commission need? Are we talking about things such as agent bots, storage capacity and cloud computing in relation to going-----

Dr. John Evans:

No, it is very straightforward. Most regulators have the power to issue statutory information requests. This means that whoever is in receipt of one of those has to respond to it and has to provide the information that is requested. We have that power under the Digital Services Act but only after we open up an investigation. What we are looking for is to have that power ahead of opening up an investigation when we are doing supervision activity.

Photo of Ruairí Ó MurchúRuairí Ó Murchú (Louth, Sinn Fein)
Link to this: Individually | In context

It is about streamlining it.

Dr. John Evans:

Yes.

Photo of Ruairí Ó MurchúRuairí Ó Murchú (Louth, Sinn Fein)
Link to this: Individually | In context

It is not a capacity issue.

Dr. John Evans:

No, it is not about capacity. It is about the power.

Photo of Ruairí Ó MurchúRuairí Ó Murchú (Louth, Sinn Fein)
Link to this: Individually | In context

Does the commission not need any of those things?

Dr. John Evans:

Do we need extra capacity to handle it?

Photo of Ruairí Ó MurchúRuairí Ó Murchú (Louth, Sinn Fein)
Link to this: Individually | In context

No, I meant in the sense of going out and finding it. Seek and destroy, if you will.

Dr. John Evans:

No.

Photo of Barry WardBarry Ward (Dún Laoghaire, Fine Gael)
Link to this: Individually | In context

We will call it a minute. I thank the witnesses because that was a long session. They addressed everything that they could except the questions that I asked them that were outside their remit. I am really grateful for their time and expertise. I acknowledge what other members have said about it being a mammoth task. We do hope they can do it because it is a very important task as well. We wish them the best of luck. They have the support of lawmakers here in whatever way they need it. I thank them for their time and expertise.

We are going into private session now and will then adjourn until Wednesday, 24 September 2025 at 3.30 p.m. We will suspend the meeting for two minutes before we start the private session.

Sitting suspended at 5.38 p.m. and resumed in private session at 5.42 p.m.

The joint committee adjourned at 6.20 p.m. until 3.30 p.m. on Wednesday, 24 September 2025.