Written answers
Thursday, 25 September 2025
Department of Communications, Climate Action and Environment
Online Safety
Brendan Smith (Cavan-Monaghan, Fianna Fáil)
280. To ask the Minister for Communications, Climate Action and Environment when additional legislative and regulatory measures will be implemented to prevent online abuse and the online dissemination of misinformation and disinformation; and if he will make a statement on the matter. [51096/25]
Patrick O'Donovan (Limerick County, Fine Gael)
Abuse of anyone, whether offline or online, is abhorrent and I condemn it. It is important that anyone who feels threatened or harassed online reports the content to the relevant platform directly in the first instance. If there is a fear of immediate danger, An Garda Síochána should be contacted immediately. The investigation of possible criminal offences, including abusive, threatening or offensive communications as detailed in Coco’s Law (the Harassment, Harmful Communications and Related Offences Act, 2020), is a matter for An Garda Síochána.
In relation to preventing online abuse, there is now a significant body of legislation providing the foundation for Ireland's online safety framework. Coimisiún na Meán, as Ireland's online safety and media regulator, is at the heart of that framework. It was established under the Online Safety and Media Regulation (OSMR) Act, which transposed the Audiovisual Media Services Directive, and that Act is one of three main elements of the online safety framework. The other two elements are the EU Digital Services Act (DSA), under which An Coimisiún is Ireland's Digital Services Coordinator, and the EU Terrorist Content Online Regulation, for which An Coimisiún is a competent authority.
Under the OSMR Act, in October 2024, An Coimisiún adopted a new Online Safety Code for designated video-sharing platforms established in Ireland, including TikTok, Facebook, Instagram and X. The Code sets out what they must do to protect users; in particular, designated platforms must protect minors from content which may impair their physical, mental or moral development, and protect the general public from content which incites hatred or violence, or is racist or xenophobic.
As Digital Services Coordinator, An Coimisiún works with the European Commission to ensure platforms put in place the systems and processes required to minimise the availability of illegal content and take measures to tackle disinformation. Under the DSA, the largest online platforms must swiftly remove illegal content once it is reported or identified. This includes content related to existing criminal offences, such as the abusive, threatening or offensive communications covered by Coco's Law.
The focus now is on implementation to ensure platforms meet their regulatory obligations. Failure to adequately fulfil these obligations can lead to significant financial sanctions and continued non-compliance can lead to criminal sanctions for senior management under the OSMR.
In terms of tackling disinformation, in April of this year the Government published Ireland's first National Counter Disinformation Strategy. It acknowledges that freedom of speech must be respected, alongside the rights of others to privacy, data protection and protection from discrimination. The Strategy is designed to support and develop tools for people to make up their own minds about disinformation, as opposed to the Government deciding what is true or untrue. It focuses on the effective implementation of existing legislation and regulation rather than proposing additional legislative or regulatory measures, given that much of the regulation in this space is relatively new.
The EU Code of Conduct on Disinformation, within the framework of the DSA, has been in place since July of this year and focuses on transparency, accountability, and collaboration among stakeholders. In line with the DSA and the Code, platforms must carry out risk assessments and take measures to reduce the overall availability and amplification of disinformation. Enforcement for the largest platforms is being led by the European Commission.