Seanad debates

Thursday, 8 February 2024

Digital Services Bill 2023: Committee and Remaining Stages

 

9:30 am

Dara Calleary (Mayo, Fianna Fáil) | Oireachtas source

They are very technical issues but I will absolutely engage with them as part of their work post-enactment of the Bill to build confidence in that space.

To come back to Senator Keogan's overall points, this Bill does not define hate speech. It is a very technical Bill that also relates to content moderation. With regard to content moderation, the DSA seeks to define the rules on content moderation by providers. There are at least two sides to that particular kind of implementation work. The DSA defines the circumstances under which providers should moderate content and the consequences of moderating or not moderating content as required. The DSA also provides for the rights of users when their content is moderated, so there is a provision in that space. Online platforms that do not comply with these rules may face implications with respect to criminal or civil liability for distributing the illegal content of others. Furthermore, they may face administrative penalties imposed on them by the DSC of establishment.

The DSA introduces two instruments to safeguard against providers taking arbitrary decisions on content moderation. Article 14 of the DSA requires that the terms and conditions of the service be clear and easily accessible, and that they inform users about the procedures for content moderation. Online intermediaries are required to exercise due diligence, objectivity and proportionality when moderating content, taking into account the fundamental rights and interests of the parties concerned. For content moderation decisions, platforms must now provide obligatory information to users if their content is removed or restricted. That will be upfront and available to everybody.

With regard to the contesting of decisions, recipients of services will be able to easily and effectively contest decisions made by the providers of online platforms concerning the illegality of content, or its incompatibility with terms and conditions, that negatively affect them. Providers of online platforms are required to provide internal complaint-handling systems which meet certain conditions that aim to ensure the systems are easily accessible and lead to fast, non-discriminatory, non-arbitrary and fair outcomes. They are subject to human review where automated means are used. The DSA requires that providers of intermediary services publish reports, at least once per year, on the content moderation they have engaged in during that period. That would meet, unusually enough, both Senator Keogan's and Senator Higgins's aims in terms of transparency around this. Those reports must include the number of notices submitted under the notice and action mechanism, categorised by the type of alleged illegal content; the number of notices submitted by trusted flaggers and any action taken on foot of those notices; and any use of automated means for content moderation.

The DSA rules for trusted flaggers, and indeed for all users reporting such content, impose measures to remove or encourage the removal of illegal content while respecting freedom of expression. Again, it speaks to what I feel are very robust checks and balances that address the concerns, albeit different concerns, expressed by Senators Keogan and Higgins.
