Written answers
Wednesday, 21 May 2025
Department of Culture, Heritage and the Gaeltacht
Online Safety
Barry Heneghan (Dublin Bay North, Independent)
69. To ask the Minister for Culture, Heritage and the Gaeltacht if he is considering the introduction of a ban on access to social media platforms for children under the age of 16 years; if he will outline any legislative or regulatory measures being examined to enforce such a ban, including the use of age verification systems; and if he will make a statement on the matter. [26298/25]
Patrick O'Donovan (Limerick County, Fine Gael)
Online safety for children is a Government priority. Ensuring children do not see illegal, harmful or inappropriate content while being able to avail of all the benefits of the online world is vital for children’s development.
In that context, I have asked my officials to examine international initiatives that are underway or currently being considered elsewhere to protect the safety of children, including in relation to limiting access to social media platforms for children under the age of 16.
Ireland, along with all other Member States, is subject to the Digital Services Act (DSA), a European regulation which requires maximum harmonisation across the European Union. This means that any initiative taken at national level must be in compliance with the provisions of the DSA. As such, Ireland works closely with other EU member states on effective and practical solutions that respect the provisions of the DSA and have regard to the rights of children while protecting them from harm, so that they can safely enjoy the benefits of increased connectivity.
Coimisiún na Meán, Ireland’s online safety and media regulator, is at the heart of Ireland’s online safety framework, which comprises the Online Safety and Media Regulation (OSMR) Act, the Digital Services Act and the EU Terrorist Content Online Regulation.
As Digital Services Coordinator under the DSA, An Coimisiún works with the European Commission to ensure that platform services address the risk of exposure of children and young people to illegal and harmful online content, including by means of age verification or age assurance, setting children’s accounts to private by default, adjusting recommender systems, and allowing children to block or mute other users.
Under the OSMR Act, in October 2024, An Coimisiún put in place a new online safety code for designated video-sharing platforms established in Ireland, including TikTok, Facebook, Instagram and X. The Code sets out what these platforms must do to protect users; in particular, they are obliged to put measures in place to protect children from harmful content online, to have age assurance measures where appropriate, and to provide parental controls and user-friendly, transparent reporting and flagging mechanisms.
Part A of the Code, which has applied since November 2024, provides that designated services shall establish and operate age verification systems with respect to content which may impair the physical, mental or moral development of minors. Part B of the Code, which will apply from July 2025, contains specific obligations, including that designated services shall implement effective age assurance measures to ensure that, for example, adult-only video content cannot normally be seen by children.
An Coimisiún is responsible for the enforcement of the Online Safety Code and it is important, in considering any further measures, that the Code and the provisions set out in the DSA are given time to be implemented in full so that their potential in protecting the online safety of young people is fully realised.