Oireachtas Joint and Select Committees

Tuesday, 17 April 2018

Joint Oireachtas Committee on Communications, Climate Action and Environment

Influence of Social Media on Elections and Referenda: Discussion

2:00 pm

Ms Helen Dixon:

I thank the Chairman and the committee for the invitation to attend today. As the committee is aware from my correspondence, this opening statement will address the issue of recent revelations relating to the use of social media and its potential influence on previous and future elections and referenda. As outlined by the Chair, I am joined this afternoon by Assistant Data Protection Commissioner, Ms Joanne Neary, who leads the Data Protection Commission's supervision function in relation to technology multinationals. I am also joined by Mr. Graham Doyle, the head of communications at the Data Protection Commission, DPC.

I wish to start by briefly setting in context the convergence between the use of personal data in the digital or online environment and the objective of influencing the political behaviour of citizens. As the committee is aware, data protection regulation is concerned with the enforcement of data subject rights and the supervision of the organisations that hold, use and control the personal data of individuals. Relevant to the theme of these discussions, data protection law regulates, among other things, personal data processing in the context of online behavioural marketing, of which political micro-targeting can be a subset. The issues of personal data processing on the one hand and electoral matters on the other are distinct from each other, and my office has no role in regulating the core aspects of electoral activity, including advertising and canvassing, other than where personal data is deployed. For example, committee members may be aware that the DPC issued guidance in the context of the last election on canvassing, data protection and electronic direct marketing, and we are currently updating that guidance to reflect the General Data Protection Regulation, GDPR. Recent media coverage of the activity of Cambridge Analytica and its relationship with Facebook, however, means that these two areas of regulation may have been inadvertently conflated.

I will turn now to the issue of micro-targeting, which is central to today’s discussions. Micro-targeting is a type of personal data processing in which an individual’s personal data, for example their online profile in a digital context, is used to identify their interests and preferences across a whole range of issues and to personalise that individual’s online environment. This is sometimes also referred to as online behavioural advertising. This personalisation can be done, for example, by delivering particular content, such as specific advertisements, news feeds and messages, in a particular sequence with the aim of influencing the individual’s views, perceptions and, ultimately, their future choices, actions and behaviours. Political micro-targeting is a particular form of micro-targeting which uses different methods of communication to interact with and ultimately influence prospective voters. The data used to target and segment individuals may be demographic, geographic and/or behavioural in nature, in addition to richer sources of data gathered from the individuals themselves, such as information on their existing political views, personal opinions, lifestyle choices, and information indicative of personality types.

The objective of data protection law, meanwhile, is to protect individuals from, among other things, the unfair processing of their personal data. A key aspect of this fairness requirement is transparency: individuals must know what information is collected about them, how it is used and with whom it will be shared, so that they can control the use of their personal data in a meaningful way, for example by opting not to provide information to a particular organisation in the first place.

As members are aware, recent media reports on the activities of Cambridge Analytica and Facebook have brought into focus the question of whether big data analytics, combined with the micro-targeting capability of social media platforms, is being utilised to influence political and electoral outcomes. Big data refers to the practice of combining huge volumes of diversely sourced information, while big data analytics involves analysing these very large datasets using sophisticated algorithms, that is, complex mathematical models, to identify patterns and trends with a view to informing decisions made on or in respect of individuals, such as deciding which subsets of individuals might be targeted with which selective material in an attempt to steer choices, preferences and behaviours towards a particular outcome. Big data relies not only on the increasing ability of technology to support the collection and storage of large amounts of data, but also on its ability to analyse, understand and take advantage of the full value of that data using these types of analytics applications.

While the focus of today’s discussion is political micro-targeting in the digital or online environment, it can also occur in an offline or real-world context. In any event, while the aim of online political micro-targeting is ultimately to influence an individual’s thoughts or actions in the political sphere, it is recognised by academics and regulators that a possible outcome of political micro-targeting in particular is the manipulation of individuals. One of the core concerns with these sorts of micro-targeting activities is that the autonomy of individuals is potentially jeopardised, because their ability to make choices may be shaped by the deployment of their information in ways that they do not know about, have not been told about and/or do not understand. This may be because individuals have not been given full information that would allow them to objectively assess the material with which they have been micro-targeted, or because their circumstances are such that they simply are not capable of understanding the consequences for them of this type of processing, for example where the individuals involved are vulnerable Internet users such as children. The potential adverse effects of political micro-targeting are amplified when one considers the ubiquity of social media and the amount of time spent on online platforms where political micro-targeting can occur. The actual manipulation of social media users can also be linked in certain circumstances to the creation of "fake news", sometimes disseminated by bots or fake accounts. Such matters fall outside the scope of data protection law.

I emphasise that concerns about the manipulation of voters through political micro-targeting remain largely theoretical at this stage; it has been recognised in the academic context that there is currently no hard evidence of such activity having a tangible impact on the outcome of elections. The way in which Cambridge Analytica is alleged to have obtained and used personal data has been highlighted recently in the context of an ongoing and active investigation by the Information Commissioner's Office, ICO, the UK's data protection authority. That investigation began in May 2017 and concerns the risks of data analytics used for political purposes. It spans some 30 organisations, including Facebook, and encompasses the specific issue of how data was collected through a third-party app on Facebook and shared with Cambridge Analytica. In parallel with the ICO investigation, an investigation by the Electoral Commission in the UK is also under way in respect of campaign financing concerning the 2016 EU referendum in the UK.

It is helpful in the first instance to set out a brief chronology of events. The Guardian newspaper, among others globally, had been running stories throughout 2016 and 2017 alleging the "hijacking of democracy" by data analytics firms such as Cambridge Analytica. The ICO announced its investigation in May 2017 and the Electoral Commission in the UK announced its investigation in November 2017. In December 2017, the ICO confirmed that its investigation involved over 30 organisations, including political parties and campaigns, data companies and social media platforms. As part of the investigation, the ICO issued a demand for access to data held by Cambridge Analytica in early March 2018. Cambridge Analytica is alleged to have failed to respond adequately to this demand and, on 23 March 2018, the ICO obtained a warrant to enter Cambridge Analytica’s premises to inspect for relevant evidence.

The ICO investigation is ongoing with the possibility of enforcement action ultimately being taken. The ICO has also confirmed that it will be making public policy recommendations arising out of its investigation. The Office of the Data Protection Commissioner, DPC, in Ireland looks forward to the outcome of the ICO’s analysis and its recommendations. They are likely to have broader application across EU member states, in particular in respect of the questions of whether the rules for the use of personal data in political campaigns are clear; what data sources are used for profiling the electorate for micro-targeting; and whether there are no-go areas in the context of data analytics and social media in elections.

In summary, micro-targeting for political purposes is a potential technical subset of online behavioural advertising by social media platforms, apps, publishers and Internet sites. The application of these techniques in Europe to process voter data is thought to be more difficult, in part because of EU data protection laws. As I have mentioned, academics in this field are not yet certain what significant real-world effects, if any, political micro-targeting through social media may have. It seems that considerable research, evaluation and investigation will need to take place before concrete conclusions can be drawn about the true risks and consequences of this type of personal data processing. However, the recent revelations have shone a spotlight on the undoubted asymmetries, arising from the inequality of resources and exacerbated now by technology, that are at play between big corporations and individuals, where those organisations collate, control and utilise big data consisting of personal data. Public consciousness of the risks to individual rights to autonomy and self-determination resulting from these asymmetries and these types of processing of personal data is growing. The demand for greater control over and transparency in how and why personal data is used will be facilitated by the enhanced package of data subject rights and the much more stringent obligations on controllers provided for under the GDPR, which becomes applicable across Europe on 25 May 2018.

As a data protection authority, the DPC in Ireland continues to drive awareness of individuals’ rights under both the current data protection regulatory framework and the GDPR, as well as of the risks for individuals when using social media services. We have recently published guidance for users of online platforms to encourage them to control, as they wish, the settings that dictate the extent to which they are subject to behavioural advertising.