Oireachtas Joint and Select Committees
Wednesday, 7 July 2021
Joint Committee on Tourism, Culture, Arts, Sport And Media
General Scheme of the Online Safety and Media Regulation Bill 2020: Discussion (Resumed)
Mr. Ronan Lupton:
I can give a view on that, although it is probably one the Senator has heard before. The people who can be damaged most by algorithmic decision-making online are children. As adults, we disclose certain information on our devices, whether knowingly or not, and algorithms then take effect. We know that they are a fact of life on the Internet but those who will be troubled most are those who do not understand what they are doing on the Internet. The question is whether the legislation can provide some form of regulatory guidance or code of conduct on algorithmic decision-making in respect of minors. The Data Protection Act looks at targeting and advertising, and there are actions happening in that regard that are fundamentally good. Ultimately, however, it is a global question. It is something that irritates many people. If people are not happy with their algorithms, should they run off to the digital safety commissioner and say that the social media companies' algorithms are showing them content they do not want to see? That is the type of complaint that an individual complaints mechanism would fall over and die on. If people understood how the technology works - it is a citizen information exercise - it would make life a little easier. We are all subjected to it. I am not saying that it is not a pain to have algorithmic decision-making happening but we need to focus on who is most likely to be hurt by it, namely, minors. They can be exposed online to dietary issues, alcohol sales, gambling, etc. The question arises as to who is responsible for setting the parameters on the devices they are given and who is giving them the devices. Invariably, the answer is their parents, or they have done it themselves without telling the parent, having extracted vast sums of money from their parents by saying that they need the latest and greatest in technology, but that is another day's work.
The Senator's question on freedom of expression was a good one, if he does not mind me saying so. Freedom of expression online is codified, but we also have the Defamation Act. Under it, if a defence is reasonably likely to succeed - that is an open situation - the reality is that someone will not get an injunction from the courts. A court will look at the case presented to it and see whether there is a possible defence that is reasonably likely to succeed. If there is - a clever lawyer will have put one forward - then someone will not get an injunction, meaning that he or she will then have to take costly and lengthy defamation proceedings to resolve the issue. While I do not want to get into the details of the Carlow example, it was a serious situation. One can find businesses and institutions that do good work being subjected to crank complaints. I am not saying that happened in the Carlow context, but those bodies can be subjected to complaints where people are exercising freedom of expression but doing so wrongly. For businesses or individuals, trying to vindicate the right to a good name becomes an expensive and lengthy process, and they have to go to court to do it.
Can this legislation fix that? There is an effort to do so in or around heads 49A to 49C in terms of providing recourse to a regulator to take data and information down, but what will the parameters be? We should be careful not to have censorship by the Executive without democratic legitimacy. Reverting to my point on misinformation and disinformation, we should seek to put into legislation clear guidelines for the regulator on when it would be appropriate to take down information. There should be takedowns - there are no two ways about it - but it is a question of when doing so is appropriate. We do not want a regulator going rogue and telling platform providers or others to take something down, that it is bold and, although they have not really assessed it, it is prima facie not correct or appropriate. We cannot have that. We would die as Ireland Inc. commercially, never mind anything else.
There are protections in the Constitution to deal with this issue. As the committee can probably tell from the way I am able to address this question, it is an area that I practise in, but the issue is vexed and problematic. There is room on the Statute Book for separate legislation in the criminal context, for example, by enhancing the incitement to hatred legislation or through new legislation that a concerned Senator might introduce as a Private Members' Bill to shore it up.