Oireachtas Joint and Select Committees

Tuesday, 16 May 2023

Joint Oireachtas Committee on Foreign Affairs and Trade, and Defence

Disinformation and Hybrid Threats in a Geopolitical Context: Discussion

Dr. Eileen Culloty:

I thank the committee for this invitation to address it on the topic of disinformation and hybrid threats. The Institute for Future Media, Democracy and Society is a research centre based in Dublin City University. Through our research and outreach work we aim to address the major challenges arising from the digital transformation of media, democracy and society. With regard to disinformation, DCU co-ordinates the Ireland hub of the European Digital Media Observatory, EDMO. Through this and other projects DCU conducts research on the effectiveness of disinformation countermeasures, supports the implementation of practical measures such as media literacy campaigns, and contributes to Irish and EU policy development, most notably through our work on the EU code of practice on disinformation and on the national counter-disinformation strategy. I must stress that we do not have expertise in defence, security or cybersecurity. The EU network of EDMO hubs is typically focused on national and public-facing dimensions of disinformation. Nevertheless, I would like to present some insights that I believe may be helpful for this committee to consider.

First, disinformation is a complex concept to define, and certain dimensions of it are more prominent in the defence context than in other contexts. The most commonly accepted definition of disinformation is false information that has been created or shared with the intent to deceive or cause harm. However, in practice it can be difficult to ascertain intentions or to trace false claims back to an original source. Moreover, in many scenarios intentionality matters far less than the harm or outcomes that arise. For example, if false claims lead to widespread distrust in vaccines, or undermine the electoral process, the intentionality behind those claims is secondary to the negative public outcomes. For that reason, and because it is often so difficult to identify intentionality, many of the stakeholders involved in countering disinformation focus their attention on preventing disinformation harms rather than on identifying the source. Yet, it seems clear that identifying sources and their intentionality matters a great deal in the context of foreign affairs, security and defence. The concepts of hostile influence operations, grey zone techniques - such as those discussed by the Institute of International and European Affairs, IIEA, last week - and hybrid warfare all presuppose an actor intending to cause harm. In other words, the way people approach the problem in the context of defence and security is somewhat different from the way people approach it in other spheres. It is not that these approaches are incompatible, but that the resources at work in one sphere do not automatically translate to another.

The second point is that it is important to avoid assuming a cause and effect relationship between exposure to disinformation and public attitudes. Popular discussions of disinformation tend to assume that people blindly believe the content they consume. For example, news and research reports often state how many times a disinformation video has been viewed or shared online. That just tells us about the popularity of that video; it says nothing about people's attitudes towards disinformation. Research on this topic indicates that people engage with disinformation for all kinds of reasons and, more important, people's acceptance or endorsement of disinformation narratives and false beliefs tends to be tied to a wider set of factors, including their levels of objective knowledge about a topic. Put simply, if people have lower levels of objective knowledge about a topic, they are more likely to believe false claims about it. It is also tied to their levels of trust, perhaps in media or government, and to their ideological biases, beliefs and opinions. In the context of defence and security, it is worth thinking about these public vulnerabilities, rather than assuming that exposure to disinformation is itself what creates the problem.

Third, there is growing evidence for the effectiveness of pre-bunking disinformation. Pre-bunking aims to neutralise the effects of disinformation by warning people that there is a threat and explaining how manipulation tactics work. The underlying idea is to pre-empt the disinformation people are likely to encounter and provide them with the tools to recognise it. For example, if people are told that something is a common manipulation tactic, when they see it in the real world or in their everyday media experience, they should be able to recognise and reject it. Given the promise of this approach, researchers are now investigating how to integrate pre-bunking into fact-checking and media literacy, but also into the strategic communication issued by any kind of organisation or public body. The idea is to pre-empt how actions might be distorted or confused. As with all disinformation countermeasures, the challenge lies in reaching the people who need it the most.

Finally, research on disinformation has been hampered by a lack of access to data from the online platforms. Currently, independent researchers and policymakers are unable to determine the true scale and impact of online disinformation because they lack access to reliable data that the platforms hold. There are voluntary EU mechanisms, such as the code of practice on disinformation, under which the major platforms agreed to provide data on disinformation relevant to member states. So far, however, this has failed to generate usable, informative data. In the absence of co-operation from the platforms, researchers, journalists and others must invest considerable time and resources simply to understand what is happening online. A great burden is often placed on public funding, and on civil society, to understand what is happening, when the platforms could often provide a much better picture.

I thank members for their time. I am happy to answer any questions they have.
