Oireachtas Joint and Select Committees

Wednesday, 14 July 2021

Joint Committee on Tourism, Culture, Arts, Sport And Media

General Scheme of the Online Safety and Media Regulation Bill: Discussion (Resumed)

Mr. Ciaran Moore:

On the ethical algorithms, we are aware of a lot of work that has been done around their transparency. The Digital Services Act and other European legislation also contemplate this area. One of the areas we are particularly concerned about is that many of the algorithms present content to increase engagement, and younger people may not have the same awareness that they are going down a rabbit hole of particular types of material. We know that in the UK there have been a number of cases of younger people who have died by suicide and whose social media use, on examination, was found to be heavily dominated by this type of material. We have also undertaken research with the University of Bristol looking at how people who have experienced suicidal ideation use the Internet. We see that they initially use it in quite a targeted way, to seek support or to look for something specific, but in a period of distress they see a huge amount of content, much of it prompting them that if they watch this, they can also see something else. It is important to interrupt these patterns. A great deal of our research on ways to engage with people who are suicidal is about interrupting these circles of thought. The online environment is dangerous because it is still designed to present the most engaging content to people.

The issue of content moderators is related. The people who will identify patterns of complaints are the content moderators. If parents are complaining that their child is seeing an awful lot of something, it is the content moderators who should be empowered to escalate that. In Samaritans Ireland, we have approximately 60 years of experience in looking after our volunteers. We care for them and we have formal debrief procedures, informal peer support procedures and rotations so that they are not doing the same work all of the time. That is partly for the mental health of our volunteers, but it is also because they are better able to support others if they are not burned out and are in a better place.

In the past, we have brought some of this experience to bear. We have been funded by the National Office for Suicide Prevention, NOSP, to share some of these techniques with other front-line staff. It is important that this is part of the experience of content moderators and that the regulation of this industry recognises the specific health and safety risk to moderators' mental health. It should be regulated in that manner. In the same way one would regulate an industry involving toxic chemicals or requiring safety passes, this regulation should address that particular risk, and it should be legislated for. We have guidance on that issue, which Ms Hamra might talk about, as well as about the algorithms.
