Oireachtas Joint and Select Committees

Wednesday, 28 February 2024

Joint Oireachtas Committee on Jobs, Enterprise and Innovation

Operation and Resourcing of Coimisiún na Meán: Digital Services Commissioner

Dr. John Evans:

I thank the committee for the invitation to speak today. I am digital services commissioner at Coimisiún na Meán and I am joined by Mr. Tiernan Kenny, director of communications and public affairs. I will give a brief overview of our work and structure at Coimisiún na Meán and describe the EU Digital Services Act and our role under it, as well as what it means in concrete terms for people in Ireland.

Coimisiún na Meán was established almost a year ago, in March 2023, and took on the functions and staff of the Broadcasting Authority of Ireland. Our remit covers broadcasting and on-demand regulation, media development and online safety. Since our establishment, we have been working hard to grow the organisation. As of this week, our headcount is 102, and we have sanction to hire 160 people, a number we expect to reach in the middle of this year. We are arranged in a divisional structure that we believe is the best way to meet our objectives.

Coimisiún na Meán has five commissioners, who oversee the different divisions. I am the digital services commissioner and I work alongside Ms Celene Craig, our broadcasting and on-demand commissioner; Ms Niamh Hodnett, our online safety commissioner; Mr. Rónán Ó Domhnaill, our media development commissioner; and Mr. Jeremy Godfrey, our executive chair. Ms Craig will step down in mid-March and a recruitment process is under way to replace her.

There are four external-facing divisions. I look after platform supervision and investigations, which focuses mostly on compliance and enforcement of platform obligations in respect of harmful and illegal content under the DSA, the terrorist content online regulation and national law. The second division is regulatory policy, which focuses on consulting with civil society and industry on harms and issues and on making rules for industry to follow. It is overseen by the online safety commissioner, Ms Hodnett. The media landscape division focuses on audiovisual media services, AVMS, providers, sound broadcasters and content producers and is overseen by the broadcasting and on-demand commissioner. The user support division focuses on audiences and users of online services and is overseen by the media development commissioner. In addition, we have a data and technology division, a legal services division and a corporate services division, which includes functions such as governance, communications and finance.

This year, we are putting in place an online safety framework in Ireland. This will apply to the online services that people use every day. The framework will end the era of self-regulation in the technology sector and make online platforms accountable for how they keep their users, especially children, safe online. Platforms must also uphold fundamental rights, including freedom of expression. The framework has three main parts: the EU terrorist content online regulation, for which we became a competent authority in November 2023; the EU Digital Services Act, which became fully applicable on 17 February 2024; and the draft online safety code, which is out for consultation at the moment. It is our intention to implement the different elements of this framework in a coherent way, to avoid unnecessary duplication and burdens for businesses and to make it easy for citizens to know and enforce their rights. We have responsibility for regulating services which have their European headquarters in Ireland. I will explain our role under the DSA in more detail, as this sits within my role as digital services commissioner.

The Digital Services Act is an EU regulation that sets rules for online intermediary services. The definition of an online intermediary service is broad, covering almost any service which is provided online, although it does not include private messaging services. The DSA applies a baseline set of obligations to all intermediary service providers, including having clear terms and conditions for how they can be used. Further obligations are added depending on the functionality and size of the service. The DSA applies a particular set of obligations for online platforms, such as social media services, online marketplaces and app stores. These include providing a way for users to flag illegal content, publishing transparency reports on their activities, allowing users to appeal content removals and devoting sufficient resources to content moderation activities, without solely relying on automated decision-making.

The most stringent obligations apply to very large online platforms, VLOPs, and very large online search engines, VLOSEs. Any platform or search engine with 45 million or more monthly active users in the EU qualifies as a VLOP or VLOSE. These services have additional obligations to assess and mitigate the risks that arise from how their services are designed and used across four categories, including how they can facilitate the spread of illegal content or cause negative outcomes for public order, electoral integrity, and public health. These provisions can help to address issues such as misinformation or disinformation. The European Commission has designated 22 VLOPs or VLOSEs, 13 of which have established their EU headquarters in Ireland.

Each member state has to appoint a digital services co-ordinator, DSC, to enforce the DSA. In Ireland’s case, the DSC is Coimisiún na Meán, with some responsibilities for online marketplaces allocated to the Competition and Consumer Protection Commission. Each DSC has responsibility for regulating the service providers whose EU headquarters are in its country, but the European Commission has primary responsibility for some of the obligations applying only to VLOPs and VLOSEs. As online services can be provided across borders, international co-operation will be important in the application and enforcement of the DSA. The legislation established a group of national DSCs and the European Commission, called the European Board for Digital Services, which met for the first time last week. It is our intention to work closely with the European Commission and our EU counterparts to ensure that the DSA leads to improvements in online safety for people, especially children. This co-operation will be particularly important when dealing with the largest platforms, given the European Commission’s role.

I might say a few words about users' rights under the DSA. A key aim of the DSA is to provide a more predictable and trustworthy online environment for users. The DSA gives users several rights when using online platforms. As well as the obligations to explain their terms of service in plain language and provide a mechanism for flagging illegal content, the DSA also obliges platforms to inform users when their content has been removed, down ranked or demonetised or when their accounts have been suspended or terminated. Users also have the right to appeal content moderation decisions made by platforms to the platforms themselves and then to an out-of-court dispute settlement body where one exists.

Coimisiún na Meán can receive complaints from users of online services who believe their provider has not complied with its obligations under the DSA. On Monday, 19 February, we opened a dedicated contact centre to give users advice and support on their rights under the Digital Services Act. The contact centre will also gather real-world intelligence that will feed into our platform supervision teams and inform An Coimisiún’s supervisory and enforcement activities, so that we can take action when the rules are broken. While it is within Coimisiún na Meán’s remit to assess whether providers of online services are doing what they are obliged to do under the DSA in respect of illegal content, it is not our role to act as an online censor or to tell people what they can or cannot say.

We have published an application form and guidance on out-of-court dispute settlement. This process gives users of online platforms an impartial and independent means of resolving disputes relating to content moderation decisions by platforms, without requiring them to enter costly or lengthy legal proceedings. Separately, we have published an application form and guidance on the new trusted flaggers system. This system will create a fast lane for approved trusted flaggers, and platforms will have to prioritise dealing with reports from them. These trusted flaggers will be independent and have particular expertise in detecting and notifying illegal content.

Internally, we are developing our supervision strategy, taking account of our functions under the Digital Services Act and our responsibilities under the terrorist content online regulation and the draft online safety code. We have adopted an impact-and-risk-based approach to supervision. This assesses the risk of harm occurring, categorises regulated entities and services accordingly, and differentiates our supervisory approach based on the impact category of each regulated entity. The risk and harm framework recognises that the most value can be delivered by focusing on the regulated entities that may pose the highest levels of online harm and risk, and on the types of risk that pose the greatest threat of harm to people.

I acknowledge the work of members of this committee in ensuring that the DSA was swiftly implemented in Ireland and that we were one of just six EU member states to have a digital services co-ordinator in place for the DSA’s first day of operation last week. We are at the early stages of the implementation of the DSA. We will need to work collaboratively with a range of partners to deliver a positive impact for people and to ensure that they can take advantage of the benefits of being online while being protected from harm from illegal content.