Oireachtas Joint and Select Committees

Thursday, 5 November 2020

Joint Oireachtas Committee on Education and Skills

Bullying in Schools: Discussion

Dr. Tijana Milosevic:

I am unable to comment on Fortnite, but I am happy to look into the research that has been done on the game. I can tell the committee that our research, and in particular a Covid study which I cited earlier, found a 60% increase in cyberbullying on online multiplayer games during the pandemic. It is an issue that we need to look into further, and there needs to be more research done into how cyberbullying specifically happens. I am happy to provide the committee with that.

At the same time, there is a large body of research on the influence of gaming, and of aggressive games, on children, which dates back to the early days of the digital era. It is not a straightforward case that games which contain some elements of aggression and violence necessarily lead children to become more violent. It is a large research field with extremely contentious conclusions and contradictory evidence. We must be careful to avoid creating a panic around specific types of games, because that has happened before. There have been instances where one individual game has been blamed for something that is a broader cultural phenomenon. Therefore, we need to take care, but I am happy to provide examples for the committee.

In respect of the legislation and different examples from Europe and further afield, the Online Safety and Media Regulation Bill was, in a way, inspired by the Australian e-safety Bill and the creation of the office of eSafety Commissioner. I am hopeful that the Online Safety and Media Regulation Bill can facilitate collaboration between policymakers, the educational system, the charities and the social media companies, and that it will be able to assist with some of the problems that we are seeing on social media platforms. Other models have tackled hate speech specifically, such as the Netzwerkdurchsetzungsgesetz, NetzDG, law in Germany, which requires companies, on receipt of a notification of the presence of such content, to act within a certain amount of time and take it down. Those models can be more or less effective, but hate speech is a different issue from cyberbullying. I would also be careful about following the example of such takedown systems. Under the e-Commerce Directive, which dates back to the early 2000s, companies are exempt from liability on the basis that they are online intermediaries.

There is a greater acknowledgement that companies are not only intermediaries but are actively curating our content and data. The exemption from liability is currently being debated, and there is an emphasis on a duty of care, safety by design, and age-appropriate safety mechanisms. These efforts will move towards understanding how the design of algorithms and of the platforms might contribute to certain behaviours. The issue of digital ecology, and how cyberbullying happens in the context of children's interactions online, is one which is extremely important and does not get sufficient attention. This includes the number of likes one receives and the tendency to behave in ways that augment the attention one is getting. There is a certain design to digital media that perhaps inadvertently enhances certain aspects of our psychological characteristics. For instance, one of the platforms introduced a change whereby users can no longer see how many likes other users have received for an image they have posted. Efforts such as these, which are not directly aimed at suppressing cyberbullying but instead consider how the design of technology can inadvertently influence our behaviour and that of children, might be the way to actually understand why all these efforts have, to an extent, been ineffective. It is because we have failed to address the broader cultural issue of what we get from social media interactions.

When I previously mentioned age gating, I raised the issue of whether children should have mobile phones in a certain place and at a certain age. In my opinion, and that of the community of researchers, one has to understand that taking technology away does not take away the problem. There is the issue of peer pressure to consider. It could be, for example, that the parent of a nine-year-old child does not think he or she should be on social media. According to the terms of service of the social media platform, he or she is not supposed to access it until the age of 13, and if one also considers the GDPR in Ireland, the age of digital consent is 16 years. At the same time, the child's school social circle and social life are happening on a certain social media platform. Therefore, excluding him or her may have the unintended consequence of cutting him or her off from the broader discussion of what is happening and from participation in it.

Going back to the issue of children's rights and the UN Convention on the Rights of the Child, we have to protect children, but at the same time, children have rights of participation. Therefore, ensuring that there is a balance is very important, and we must encourage design that is conducive to delivering the outcomes that we want to see. The question of how to deliver this is an incredibly difficult one, and one that companies and regulators are trying to address.
