Dáil debates

Wednesday, 13 December 2023

Digital Services Bill 2023: Second Stage

 

1:50 pm

Gerald Nash (Louth, Labour)

I welcome the ambitious Digital Services Act regulation and recognise that European law in this area continues to be the global leader. I also welcome this Bill. The Labour Party will support its passage through the stages to ensure it is in force by 17 February. In much the same way that the general data protection regulation, GDPR, led the way in privacy rights, the Digital Services Act leads the way in social and market responsibility in online spaces. ComReg called the Digital Services Act a paradigm shift in the regulation of digital platforms in Europe. The Digital Services Act recognises the wide-ranging influence wielded by providers of very large online platforms and very large online search engines and places an important wider social responsibility on them - rightly so.

Recital 3 of the Digital Services Act states:

Responsible and diligent behaviour by providers of intermediary services is essential for a safe, predictable and trustworthy online environment and for allowing Union citizens and other persons to exercise their fundamental rights ...

We in the Labour Party could not agree more. Far too often, as Members will know, providers fall below the standards right-thinking people across this country and the EU would expect. In far too many instances, they also fall well short in the enforcement of their own guidelines and the fulfilment of their commitments. We commend the intention of the measures of the Digital Services Act, particularly with respect to the most influential online operators – the very large online platforms, as the Digital Services Act designates them. These currently include Google, YouTube, Facebook, Instagram, TikTok, X and more.

The Digital Services Act importantly identifies four systemic risks: illegal content, including child sexual abuse material and illegal hate speech; negative impacts on rights under the Charter of Fundamental Rights, including discrimination and privacy; foreseeable negative impacts on the democratic process and public security; and the manipulation or co-ordination of disinformation campaigns, especially regarding public health, mental well-being and gender-based violence.

It is quite clear from Recital 12 that what is to be determined as illegal content would include hate speech and unlawful discriminatory content. It is right that after many years we are updating our hate speech laws to take account of the ways in which we now communicate at mass scale. These could not have been envisaged even 20 or 30 years ago. Our laws in that area need to be updated. Given the events in Dublin on 23 November and the many appalling events across the country in recent times - blockades of buildings intended to house applicants for international protection, protests at libraries, violent protests outside these Houses and at many other locations across the country - any legislation that strengthens the hand of regulators of the online space is to be welcomed. I could fill the Dáil record, as could many others, with instances where large online platforms played pivotal roles in stoking fear among communities and in platforming misinformation, and where the platforms themselves have manifestly failed to address those issues properly. In some cases they have point-blank refused to act. This legislation is clearly much needed.

The existence of the Digital Services Act and the facilitation of its operation in Ireland by this Bill will not instantly put a stop to illegal content, misinformation, or human rights abuses proliferating in online spaces. However, even apart from the new regulatory framework, what we hope to create is a different culture in those firms. The director of the EU Agency for Fundamental Rights, FRA, Michael O'Flaherty, recently stated:

The sheer volume of hate we identified on social media clearly shows that the EU, its member states, and online platforms can step up their efforts to create a safer online space for all.

The FRA has recently conducted extensive research, which reveals that women are the primary victims of online hate, facing abusive language, harassment, and incitement to sexual violence on a regular basis. This will of course come as no surprise to public representatives who are women. The number of hateful posts targeting women was almost three times that of those targeting people on a racial basis. Given this study and what we have witnessed here and around the world, it is clear that enforcement will be a mammoth task. A year ago, it was widely reported that the CEO of X had disbanded that platform's trust and safety council. This was the advisory group of approximately 100 independent civil rights, human rights and other organisations the company formed in 2016 to address hate speech, child exploitation, suicide, self-harm and other problems relating to the platform. Only last week, citing legal matters, the company refused to come before a committee of this House to answer questions. All of this serves to demonstrate the uphill task at hand for us as legislators, and for the regulators we seek to empower.

I question the assignment of two existing bodies to undertake this mammoth but meaningful and important task, namely, the media regulator and the CCPC. The Bill provides these agencies with extensive and necessary powers to enforce the Digital Services Act and the legislation underpinning it. However, powers under statute can only be executed where the agency has the appropriate capacity to do so. The media regulator, the primary agency with oversight of the Digital Services Act, is only newly established. Its four commissioners were only appointed in March and it is still openly recruiting to fill formative roles.

The significant financial sanctions available under the Digital Services Act require that the media commission be empowered to issue fines of up to 6% of the annual turnover of the service provider. Considering that the DPC can impose fines of up to 4% of annual turnover for infringements of the GDPR, 6% is significant. If the annual turnover of a very large online platform like Amazon is taken into account, there could, in theory, be in excess of €28 billion in fines issued were it to fail to comply with the relevant provisions of this legislation. Developing the necessary processes, procedures and expertise to exercise such powers has often proved difficult for the Data Protection Commission, and still does to some extent, with investigations taking considerable time to complete. That is notwithstanding the objective fact that the Data Protection Commissioner in Ireland does a good job.

When the next very large online platform emerges and is so designated by the EU, the platform will have four months to comply with obligations under the Digital Services Act, as will agencies in member states. Assisting the media commission with the enforcement of the provisions will be no easy task. This recently formed body is being given significant powers and responsibility to regulate all non-gatekeeper intermediaries operating in Ireland and to assist in international regulation. Without robust establishment, the media commission may struggle to meet its brief, as the DPC did after the GDPR became law. The additional duties envisaged for the media commission will require significant staff resources. To further complicate this task, this Bill apportions partial responsibility for Digital Services Act oversight to the CCPC when it comes to online markets. This will give it similar powers and responsibilities with respect to Articles 30, 31 and 32 of the regulation, which relate to platforms where distance contracts for goods or services are facilitated – essentially online marketplaces. It is of concern to me that, in responding to the Department's open call for opinions, the CCPC made a detailed submission with respect to the Digital Markets Act but was silent on the Digital Services Act, perhaps indicating that the CCPC did not foresee a role for itself at the time. The decision to split the enforcement responsibilities for such a monumental piece of legislation quite frankly needs further interrogation. It requires more detailed explanation. I understand the explanation given by the Minister in his earlier contribution. I only hope that this Government and those that succeed it commit to properly staffing both of these agencies in undertaking this important work.

Given how desperately necessary the Digital Services Act and this Bill are in our increasingly digital world, I sincerely hope that splitting the competent authority across two pre-existing agencies will not become problematic. The stakes could not be any higher. We have to get this right. While I understand the urgency to pass this legislation, given that enforcement capabilities must be in place by February 2024, that tight timeline is not the fault of this House, the committee or the assigned agencies. We have a choice, and it is to do this right.

I will work with the Minister. We may introduce amendments on Committee Stage, but all of us in opposition are anxious to get this right. We will work with the Government because we understand the importance of this Bill. However, in our view, Government follow-through on resourcing the media commission and the CCPC must be a priority.
