Written answers
Monday, 8 September 2025
Department of Communications, Climate Action and Environment
Online Safety
Ruth Coppinger (Dublin West, Solidarity)
1082. To ask the Minister for Communications, Climate Action and Environment his progress to date in the programme for Government commitment to “resource and implement the new EU online safety framework to protect vulnerable groups from online harm, and tackle disinformation”; and if he will make a statement on the matter. [46099/25]
Patrick O'Donovan (Limerick County, Fine Gael)
There is now a significant body of legislation providing the foundation for Ireland’s online safety framework, including the regulation of social media. It is a whole-of-Government priority that children and vulnerable groups are able to avail of the full benefits of the online world without being exposed to harmful content.
Coimisiún na Meán, as Ireland’s online safety and media regulator, is at the heart of that framework. It was established under the Online Safety and Media Regulation (OSMR) Act, which transposed the Audiovisual Media Services Directive, and that Act is one of three main elements of the online safety framework. The other two elements are the EU Digital Services Act (DSA), under which An Coimisiún is Ireland’s Digital Services Coordinator, and the EU Terrorist Content Online Regulation, for which An Coimisiún is a competent authority.
In terms of resources, significant Exchequer funding was provided to An Coimisiún on establishment to allow it to hit the ground running. From this year, as provided for in legislation, An Coimisiún is largely self-funded from levies on regulated entities. This ensures it will continue to have the resources necessary to implement Ireland's Online Safety Framework effectively, as well as its other regulatory responsibilities. Staffing has expanded from around 40 at establishment to just under 240 currently, and will continue to grow to around 300 in the coming months.
Under the OSMR Act, in October 2024, An Coimisiún adopted a new Online Safety Code for designated video-sharing platforms established in Ireland, including TikTok, Facebook, Instagram and X. The Code sets out what platforms must do to protect users; in particular, they are obliged to put in place measures to protect children from harmful online content, including parental controls, and to provide transparent reporting and flagging mechanisms. Since July of this year, Part B of the Code has applied. Part B contains specific obligations for the platforms to put in place robust age assurance measures to prevent children from accessing adult content such as pornography or extreme violence.
Additionally, as Digital Services Coordinator under the DSA, An Coimisiún works with the European Commission to ensure platforms put in place the systems and processes required to minimise the availability of illegal content and take measures to tackle disinformation. In particular, Coimisiún na Meán works with the European Commission to ensure platforms address the risk of exposure of children and young people to illegal and harmful online content, including by means of age-verification or age-assurance, setting children’s accounts as private by default, adjusting recommender systems and allowing children to block or mute users, for example. The European Commission recently published new guidelines on the protection of minors under Article 28 of the DSA, which outline recommendations to strengthen online safety for children in EU Member States.
In terms of tackling disinformation, in April of this year, the Government published Ireland’s first National Counter Disinformation Strategy. It acknowledges that freedom of speech must be respected, alongside the rights of others to privacy and data protection, as well as protection from discrimination. The strategy is designed to support and develop tools that help people make up their own minds about disinformation, as opposed to the Government deciding what is true or untrue. The EU Code of Conduct on Disinformation, within the framework of the DSA, has been in place since July of this year and focuses on transparency, accountability and collaboration among stakeholders. Platforms must carry out risk assessments and take measures to reduce the overall availability and amplification of disinformation, in line with the DSA and the EU Code of Conduct on Disinformation.
The Online Safety Framework is relatively new, and the focus now is on implementation to ensure platforms meet their regulatory obligations. Failure to adequately fulfil these obligations can lead to significant financial sanctions, and continued non-compliance can lead to criminal sanctions for senior management under the OSMR Act.