Oireachtas Joint and Select Committees
Tuesday, 21 February 2023
Joint Committee On Children, Equality, Disability, Integration And Youth
Integration and Refugee Issues: Discussion
Mr. Mark Malone:
It is plausible that not only are we, our communities and those who are affected carrying the burden of the harms and hurts, but we are also carrying the burden of reporting this material in the first place. We are not resourced to do this work. The onus should not be on impacted and affected communities to do the heavy lifting for massive corporations that are ultimately benefiting, whether or not they are recognised as doing so. YouTube is materially benefiting.
There is a democratic deficit in the sense that our communities are disproportionately impacted by the spread of disinformation. In some cases it is clear propaganda; it is not disinformation or misinformation. It is the replenishment of anger, the mobilisation of fear and the persistent undermining of people's faith and trust in our community infrastructures, such as traditional media, and in our ability to trust one another. We want to highlight that we are seeing red flags that need to be addressed structurally.
In the context of the Digital Services Act, we are involved in a process with the EU around a code of conduct against online hate. These organisations have already made commitments to have regular contact with affected communities about issues that are flaring up. That contact does not exist. It does not happen. I have not seen any organisation or community group have that experience with Facebook, Meta or Twitter. We will say on the record that TikTok has come to us to state there is a problem on its platform and to ask what it can do. The problem has not been solved and the process is not perfect, but it suggests there is a possibility of these organisations engaging and at least trying to show an awareness of the impacts of the hurts and harms, to take some responsibility or at least take some action to mitigate what is happening. We are not seeing that from the other companies.
I am unclear about the legislative process by which the Digital Services Act is given effect here, but certainly in the code of conduct at EU level there is an explicit agreement to have regular, ongoing contact. The actual quote in the joint statement by trusted flagger organisations and IT companies for an action framework on enhanced co-operation is that "IT companies will further consolidate ... dialogue with CSOs [civil society organisations] with regards to tackling hate speech on specific grounds, languages and national contexts, and promote human rights culture and narratives." There is no manifestation of that at present. It also states that "IT companies and CSOs will jointly explore topics and themes on an ad-hoc, but regular basis, in order to facilitate the flow of best practices and expertise between relevant teams and organisations", including those that are here today and many communities, and "including regarding CSO’s early warnings in case of events which may cause sparks of hatred online".
The division between online and offline is quite antiquated. It is not how the world works now. It is not even how generations younger than us think about things. There are frameworks within which those forms of accountability can be exercised, but it is not clear where that is happening yet.