Written answers
Wednesday, 19 March 2025
Department of Enterprise, Trade and Employment
Social Media
Gary Gannon (Dublin Central, Social Democrats)
499. To ask the Minister for Enterprise, Trade and Employment the steps he is taking to ensure that the expansion of the Appeals Centre Europe to additional social media platforms is effective; how the oversight of these platforms will be integrated into existing regulatory frameworks; and if he will make a statement on the matter. [12633/25]
Gary Gannon (Dublin Central, Social Democrats)
500. To ask the Minister for Enterprise, Trade and Employment the role of Coimisiún na Meán in assessing the decisions of the Appeals Centre Europe, particularly in cases where platforms fail to abide by EU law, including what actions the agency will take if it finds that platforms are not complying with the centre’s rulings; and if he will make a statement on the matter. [12634/25]
Gary Gannon (Dublin Central, Social Democrats)
502. To ask the Minister for Enterprise, Trade and Employment given that the Appeals Centre Europe has ruled against a company (details supplied) in half of its decisions, how the Government plans to support the centre in holding social media giants accountable for content moderation failures; the measures in place to ensure that platforms adhere to the centre’s rulings; and if he will make a statement on the matter. [12851/25]
Gary Gannon (Dublin Central, Social Democrats)
503. To ask the Minister for Enterprise, Trade and Employment in light of the 1,500 complaints received by the newly established Appeals Centre Europe, how the Government is ensuring that social media platforms (details supplied) comply with the EU’s Digital Services Act by providing accessible and transparent avenues for users to challenge content moderation decisions; and if he will make a statement on the matter. [12853/25]
Niamh Smyth (Cavan-Monaghan, Fianna Fail)
I propose to take Questions Nos. 499, 500, 502 and 503 together.
I thank Deputy Gannon for his questions.
The Digital Services Act (DSA) is an EU regulation that introduces new obligations on providers of online intermediary services, including online social media platforms, to protect users and consumers of online services. The DSA is designed to provide better protection of fundamental rights; more control and choice over online experiences; stronger protection of children online; and expedited removal of illegal content. The regulation has applied in full since 17 February 2024.
The DSA introduces additional specific rules for Very Large Online Platforms and Very Large Online Search Engines, namely those reaching more than 10% of the 450 million consumers in Europe, in view of the elevated risks they pose for the dissemination of illegal and harmful content. The European Commission has responsibility for supervising and enforcing the obligations that apply to Very Large Online Platforms and Very Large Online Search Engines.
The Digital Services Act 2024 designated Coimisiún na Meán as Ireland's Digital Services Coordinator for the DSA, and the Competition and Consumer Protection Commission as a second competent authority with responsibility for the provisions of the DSA that apply to online marketplaces.
The Digital Services Act 2024 empowers Coimisiún na Meán and the Competition and Consumer Protection Commission to investigate providers established in Ireland where they have grounds to suspect an infringement of the DSA. They have the power to impose administrative sanctions of up to a maximum of 6% of the provider's turnover for a failure to comply with an obligation under the DSA.
Coimisiún na Meán and the Competition and Consumer Protection Commission are independent in the performance of their functions.
The DSA obliges providers of hosting services, including online platforms, to put in place a mechanism to allow any person to notify them of the presence on their service of specific items of information that the individual considers to be illegal content. Providers must take decisions on whether the content is illegal in a timely, diligent, non-arbitrary and objective manner and act expeditiously to remove or to disable access to illegal content.
Providers are also required to establish an internal complaint-handling system so that a person who submitted a notice can lodge a complaint against the provider's decision. The DSA further provides for an out-of-court dispute settlement mechanism to resolve disputes relating to those decisions, including complaints that have not been resolved by means of the internal complaint-handling system.
The EU DSA empowers the Digital Services Coordinator to certify out-of-court dispute settlement bodies established in the State that satisfy certain requirements detailed in the Act. These requirements include that the body is impartial and financially independent of providers of online platforms and that it has the necessary expertise in relation to one or more areas of illegal content.
The decisions of out-of-court dispute settlement bodies are not binding. Certified out-of-court dispute settlement bodies must report annually to the Digital Services Coordinator on their functioning.
The Government has no role in the assessment, certification, or supervision of out-of-court dispute settlement bodies.