Oireachtas Joint and Select Committees

Wednesday, 7 July 2021

Joint Committee on Tourism, Culture, Arts, Sport And Media

General Scheme of the Online Safety and Media Regulation Bill 2020: Discussion (Resumed)

Mr. Ronan Lupton:

Ultimately, the key response to the individual complaints mechanism relates to the rules of engagement. My preference is that any regulator be an advocate for the complainant, but the key query relates to how many complaints come in and how they are filtered. That is the real challenge here. We have experience, as I said, with the Data Protection Commissioner and ComReg, where consumer complaints lines - call centres, almost - had to be set up to deal with the volume coming through. While a mechanism to allow complaints to be made is called for, the first port of call must be the social media company: one complains about the content at issue and makes the company take action to take it down within the scope of the law, as it will then be aware of the deleterious content, whether it is misinformation or defamation, criminal or otherwise.

One would then take the steps to report, but the question is whether society or the citizen knows the steps to take, and the answer is probably not. We cannot have a situation where people are told to run off to the digital safety czar and make their complaint, only for the digital safety czar to say it does not meet the threshold because it is freedom of expression-type content. An example would be a case before the courts recently involving Salesian College. Vulgar abuse is not defamation; it is simply vulgar abuse. You may not like it but it is, unfortunately, a feature of freedom of expression. The idea in that case was to try to unmask the person making the vulgar statements. That did not go ahead and the case has been withdrawn, but it was before the courts and there is a written judgment in terms of what occurred. If we take that model, you would normally complain to Facebook first to say, "I have somebody abusing me. We do not know who they are. Do something about it." If it fails to do this and to take steps, you should perhaps get a log from that particular platform provider, send it to the social media czar or commission and ask it to advocate on your behalf, rather than simply going to the social media commissioner or information society commissioner and saying, "I have this complaint. You go deal with it. You find out what my problem is and take the initiative." I think this would be a disaster. However, if there are rules of engagement and the platforms buy in to them, be it through a code of conduct or some form of legislative narrative that tells them they must engage in this manner, ultimately, it will be cleaner and we will be able to resource it properly and come to some meeting of minds without recourse to the courts to try to take the content down and deal with it in a manner that is proper.

I take the Acting Chairman's point about the misinformation-disinformation dynamic. As I mentioned earlier, unfortunately or fortunately, depending on what side you are looking at it from, freedom of expression is codified in the Constitution and, unfortunately, people sometimes treat that as a carte blanche mechanism to spread misinformation about vaccines, efficacy and all sorts of nonsense, and they get away with it. That is a deficiency on the Statute Book that needs to be addressed separately to this Bill. I am not saying it should not be in here; perhaps in or around heads 49A, 49B and 49C there may be room for codifying what should not come out. Going back to Senator Malcolm Byrne's question about how one defines the harm that takes place in respect of that, it is very different because, ultimately, a regulator would take care of a doctor or lawyer who was misbehaving or doing something he or she should not do and would have a regulatory sanction. However, when you are dealing with something like this, it is an unfortunate fact of life, both in the off-line and online worlds, that people say things that are fundamentally factually and scientifically incorrect, and we simply have to deal with them. The Australian model may go too far in terms of how it does things. It is not dealing with the same level of European law and the dimension of rights which, as Ms Fitzpatrick mentioned a second ago, will be a ready-to-wear set of legislative principles under the Digital Services Act, which will affect all 27 member states. Of course, we know that Britain has done its own thing anyway and is barking in some aspects. That is my contribution. I will happily pick up on any further questions that come from that thread. I am conscious of time, so I will hand over to Ms Fitzpatrick.
