Dáil debates

Wednesday, 21 September 2022

Online Safety and Media Regulation Bill 2022: Second Stage

 

3:47 pm

Alan Kelly (Tipperary, Labour) | Oireachtas source

I have a huge interest in this area, originating from a previous occupation I had before I made the crazy decision to get involved in politics. This is very important legislation, particularly in the context of the way the world is now. I have a 12-year-old and a ten-year-old. As parents, we try to control the platforms they watch and where they get information but, as the ads on the radio say, you cannot control it totally. It is impossible to do so. We try to protect them from various information sources and forms of bullying and such things, but it is almost impossible. The Bill must set out a regulatory framework that will work. It is a good start, and I appreciate that the Minister has taken on a number of amendments. I will have another amendment later which I suggest the Minister should also take on board.

I have sat in the Minister's chair before - I had to do too many Bills when I was sitting there - but I believe there is too much contained in this Bill. I personally would not have done it this way. There is a lot of stuff there that should be in a separate Bill. The whole issue around online safety should be in a bespoke Bill. That is just a personal comment. This Bill is trying to do so much in one go.

The transposition of the audiovisual media services directive, Directive (EU) 2018/1808, has been in train since as long ago as when I was an MEP. I sat on the consumer affairs committee and I remember that it was orienting towards looking at issues like this at a very early stage, and that was a decade ago. When we look at how we transpose these directives, it is important that we do it in a bespoke way that is suitable to our circumstances in this country. For instance, as users of content we are different in many ways from those in other jurisdictions. We have a large volume of e-services in this country. We have a hugely sophisticated population. Every country should regulate its services in its own bespoke way. What I am saying is that we cannot transpose the directive generically.

I welcome the new operating framework for online content and the provisions relating to public service programmes and how they will be delivered. We need to look at that going into the future. Everything is available on demand now. I use public service programming to watch sports on RTÉ and TV3, or Virgin Media One as it is now called, and for the news. Everything else is accessed on demand. Whatever leisure time we have is spent viewing on-demand content, whereby couples or groups of people can watch whatever they want together, at a time that suits them. However, there is an issue in respect of the volume of platforms. These platforms are constantly changing and will forever change, and there is a concern about how we ensure the legislation is robust enough to be able to cover those changes.

According to an article I read last night, TikTok is the most used app among certain age groups in this country, more so than in many other countries. I can see that from my own children. I am not really on TikTok. I had better not admit to anything here. The reality is that there is a need for these platforms. Even in a political sense, we will all be on TikTok by the next election. If you are not, you probably will not be elected. It is as simple as that because it will be how we communicate with younger people. Snapchat was the app of four or five years ago. It is probably now declining in use. All these platforms evolve, so how do we ensure these distribution channels will be covered going forward?

I welcome the way in which authorised officers will operate and the powers to be implemented. I will come back to that later. I am concerned about the volume of resources required, and I am glad the Minister referred to this in her speech. This will involve huge resources, especially given the report that was published last night, because it totally changes the volume involved. I will come back to that later.

I also welcome the funding that will be provided to local community broadcasters, etc., and the resources for media providers. We will need a little more detail on that, if I am being honest, as regards the mechanism.

With regard to the difference between what is real and what is fake, and between what is genuinely public news and what is not, we all know that what has happened in politics throughout the world over the past five to seven years is very dangerous. In fairness, the Minister acknowledged this in her contribution and said she will introduce clarification through amendments. We need to ensure we have a process by which we deal with this through the Bill.

I have an issue with crossover with the online safety commissioner. It would be helpful if this was dealt with in some way. I can see a lot of overlap. The Minister might attend to this when she comes back with her reply.

With regard to individual complaints and what we found out yesterday, we probably should have known about it a little bit sooner. In fairness to the Minister, if she were sitting here she would say the same thing. The way in which expert groups and NGOs will report is fine, but "scale" is referenced a number of times by the Minister and in the Bill. This phrasing cannot stay as it is. It will have to be amended in some way. What sort of scale? Will it be scale of operation? Is it that we would go after TikTok, but if it is a new platform set up by a 12-year-old in Carrignavar, County Cork, with 200 people on it who are acting in an inappropriate way, we could not go near it? Will it be the volume of inappropriate behaviour? In that sense, "scale" would mean it does not matter what the platform is if there is a lot of volume. The word "scale" and how it is dealt with in the legislation needs to be redefined across the board. It is too open-ended.

The most important component is sanction for harmful content. This is something on which I will table an amendment, which I will explain later. We need to define "harmful content" in section 139A to deal with the issue of disinformation. I will get back to this. My colleague Senator Sherlock, as has been referenced previously, has raised issues which I believe the Minister will take on board. These include how information on baby products is communicated. It is not just the whole process by which this is done and the impact it can have purely for greed; there is also the profiling of the data, which is slightly dangerous. How do we ensure this does not happen? It could be used for other forms of profiling and other purposes. I believe the Minister is taking it on board.

Self-regulation was never going to work. The content moderators who are out there on a range of platforms should be brought into this legislation because they need to be made responsible. We could pick any social media company and its director of online moderation; at present, it is self-regulatory. Why can we not state through this process that every platform of a certain scale - that word again - must appoint somebody who is legally responsible under the Bill? This would make them think a lot about how they act. We have seen it so often when people make complaints. In the past I have been involved in advising people who made complaints about content that was very upsetting. In fairness to my colleague Deputy Howlin, he drove home the issue of Coco's law, as the Acting Chair knows. It is very good legislation. Why can we not have a situation whereby the Oireachtas states through the Bill that somebody in each company has to be responsible for this role? These people would then be accountable. If they are part of the regulatory process in a formal way, we will have a lot more success.

When it comes to the issue of individual complaints, I note that the Minister has said she will have to come back to it. She will provide for it in amendments. Commencement will probably happen after the regulatory process has been set up in a more formal way. I understand this. I do not have a huge objection to it because I believe we need to do the appropriate groundwork and formation, but there should be some form of formal system. To get back to the point I made about bringing the content moderators into the regulatory process, if there are companies that continuously adopt the Ryanair model of complaint management, which is effectively to forget about it or joke on Twitter, and continuously ignore the complaints process and do not deal with it, then the severe fines in the Bill should kick in so that they have to deal with it. I hope the Minister can follow my vision. Content moderators would become a legal part of the process. They would have to be appointed in each company over a certain scale, and I agree the phrase "certain scale" would have to be included. When complaints are not dealt with, and a volume of complaints is never dealt with, the legislation should kick in to fine those companies heavily. Through such a process that includes content moderators, the Minister would deal with issues of volumes of complaints and the Ryanair model of not dealing with complaints. Through this process and through the fines that would be brought about, the companies would be told they were not dealing with genuine complaints and they would not get away with it. Genuine complaints would be defined under the Act, and I will help the Minister with an amendment on this.

I know the officials have considered this with Canadian and Australian legislators, who have gone through all of it. I believe this country can lead the way in this regard. We have a responsibility in this country, given the volume of e-business, IT and online companies we have on our doorstep. I ask the Minister to actualise the vision I have put out there. We need to go this far. I will suggest an amendment to the Minister so that she can consider it as part of her deliberations.

The Minister has said she will probably need 160 staff and possibly up to 300. I can see it going beyond this and if necessary it should do so. There will be issues with the merger of various State organisations. I was involved in a number of such mergers. I am sure the Minister will work very closely with the unions to ensure it is done in an appropriate way.

The Minister said that for harmful online content not related to existing criminal offences, for instance cyberbullying, a risk test must be met for it to be considered harmful. This is like the word "scale". We need to define "risk test" and I will table an amendment on this.

With regard to the issue of streaming, all services in Ireland will be under the regulation. We have capacity in the EU to ensure all streaming services in the EU can be reached through this. However, many online streaming services that come into this country do not come from within the EU. This is something the Minister needs to bear in mind. They come from all over the world. The majority of them are not based in the EU. How do we deal with this? I do not think there is huge capacity there. Let us just say that a lot can be bypassed. As legislators, how will we ensure this does not happen?

Before I get to the amendment I propose, I will raise the issue of the amendments the Minister will bring forward in lieu of the new Electoral Reform Act, in which I have a great interest. They are very important because we all know, from a political point of view, the manipulation and the processes by which these platforms can be used, especially with self-produced shared content, the virality of which can be immense within a short time.

With regard to the issue of disinformation and section 44, I will table an amendment for a new section to deal with content that is classed as harmful. Section 44 goes through content that is classed as harmful because it is against the criminal law, while section 44(3) deals with content that is held to be harmful even though its dissemination does not involve a criminal contravention. Issues such as suicide, and a number of others the Minister outlined, are dealt with in this regard.

However, one major difference from the current law, as set out in the Prohibition of Incitement to Hatred Act 1989, is that the Act criminalised content likely to stir up hatred - this is the important point I wish the Minister to listen to - only if the content itself is "threatening, abusive or insulting". The world has moved on. Hatred can be stirred up now even if the content is not threatening, abusive or insulting. "Harmful content", especially in the online space, and disinformation need to be defined in the Bill, otherwise it will have a large gaping hole.

Content that is written in terms that are non-threatening, non-abusive and non-insulting but that is simply false, made up of complete lies and known to be false, would not be captured by the current definition of "harmful content", even if it amounted to a serious defamation of a particular vulnerable group or individual. One could say something that falls entirely outside the 1989 Act but is equally as dangerous or, in some cases, worse because it is completely untrue.

I appreciate that this might be seen by the ICCL and others as an online restriction of freedom of speech, by requiring the take-down protocols to apply to a category of content that is not, under current law, unlawful. However, we need to deal with it because, if we do not, disinformation is the methodology that will be used to avoid the total impact and construct of the Bill in the first place. If we do not do this, we will miss an essential opportunity.

Within what is currently constructed, people will be able to behave in a way that is harmful to others, and they will provide content which, by its construct, will be bullying or humiliating and will promote or encourage behaviour that characterises certain facets such as an eating disorder, suicide or many other things. I encourage the Minister to consider this. As it is not named explicitly in the Bill, there are other ways in which this harmful content can be provided that are not covered by this Bill. I will table an amendment that I wish to share and discuss with the Minister beforehand.
