Oireachtas Joint and Select Committees

Tuesday, 17 April 2018

Joint Oireachtas Committee on Communications, Climate Action and Environment

Influence of Social Media on Elections and Referenda: Discussion

2:00 pm

Hildegarde Naughton (Galway West, Fine Gael)

I draw the attention of witnesses to the fact that by virtue of section 17(2)(l) of the Defamation Act 2009, witnesses are protected by absolute privilege in respect of their evidence to the committee. However, if they are directed by the committee to cease giving evidence on a particular matter and they continue to so do, they are entitled thereafter only to a qualified privilege in respect of their evidence.

They are directed that only evidence connected with the subject matter of these proceedings is to be given and they are asked to respect the parliamentary practice to the effect that, where possible, they should not criticise or make charges against any person, persons, or entity by name or in such a way as to make him, her or it identifiable. I also wish to advise witnesses that any submission or opening statement made to the committee will be published on the committee website after this meeting.

Members are reminded of the long-standing parliamentary practice to the effect that members should not comment on, criticise or make charges against a person outside the Houses, or an official either by name or in such a way as to make him or her identifiable. I remind members and witnesses to turn off their mobile phones or switch them to flight mode. Mobile telephones interfere with the sound system and make it difficult for parliamentary reporters to report the meeting. Television coverage and web streaming is also adversely affected.

I propose the main witnesses will speak for no more than five minutes and if they wish to share speaking time, they might indicate that to me at the start of the presentation. I understand Ms Joanne Neary, assistant Data Protection Commissioner, proposes to make a presentation on online behavioural advertising and the committee has agreed to hear that presentation after the Data Protection Commissioner's presentation. On behalf of the committee, I welcome Ms Helen Dixon and Ms Joanne Neary and I ask Ms Dixon to begin.

Ms Helen Dixon:

I thank the Chairman and the committee for the invitation to attend today. As the committee is aware from my correspondence, this opening statement will address the issue of recent revelations relating to the use of social media and its potential influence on previous and future elections and referenda. As outlined by the Chair, I am joined this afternoon by assistant Data Protection Commissioner, Ms Joanne Neary, who leads the Data Protection Commission's supervision function in relation to technology multinationals. I am also joined by Mr. Graham Doyle, the head of communications at the Data Protection Commission, DPC.

I wish to start by briefly setting in context the convergence between the use of personal data in the digital or online environment and the objective of influencing the political behaviour of citizens. As the committee is aware, data protection regulation is concerned with the enforcement of data subject rights and the supervision of those organisations who hold, use and control the personal data of individuals. Relevant to the theme of these discussions, data protection law regulates, among other things, personal data processing in the context of online behavioural marketing of which political micro-targeting can be a subset. The issues of personal data processing on the one hand and electoral matters on the other are distinct and separate from each other and my office has no role in regulating the core aspects of electoral activity, including advertising and canvassing activities, other than where personal data is deployed. For example, the committee members may be aware that the DPC has previously issued guidance in the context of the last election on canvassing, data protection and electronic direct marketing and we are updating that guidance currently to comply with the General Data Protection Regulation, GDPR. Recent media coverage of the activity of Cambridge Analytica and its relationship with Facebook, however, means that these two areas of regulation may have been inadvertently conflated.

I will turn now to the issue of micro-targeting, which is central to today’s discussions. Micro-targeting is a type of processing of personal data where the personal data of an individual, for example, in a digital context this could be their online profile, is used to identify their interests and preferences across a whole range of issues, and to personalise that individual’s online environment. This is sometimes also referred to as online behavioural advertising. This personalisation can be done, for example, by delivering particular content, like specific advertisements, news feeds and messages, in a particular sequence with the aim of influencing the individual’s views, perceptions and ultimately their future choices, actions and behaviours. Political micro-targeting is a particular form of micro-targeting which uses different methods of communications to interact with and ultimately influence prospective voters. The data used to target and segment individuals may be demographic, geographic and-or behavioural in nature, in addition to richer sources of data gathered from the individuals themselves, such as information on their existing political views, personal opinions, lifestyle choices, and information indicative of personality types.

Meanwhile, the objective of data protection law is to protect individuals, among other things, from the unfair processing of their personal data. A key aspect of this fundamental fairness issue is transparency for individuals as to what information is collected on them, how it is used and who it will be shared with, so that individuals are positioned to control the use of their personal data in a meaningful way, for example, by opting not to provide information to a particular organisation in the first place.

As members are aware, recent media reports on the activities of Cambridge Analytica and Facebook have brought into focus the question of whether big data analytics combined with micro-targeting capability for delivery on social media platforms are being utilised to influence political and electoral outcomes. Big data refers to the practice of combining huge volumes of diversely sourced information, while big data analytics involves analysing these very large datasets using sophisticated algorithms - complex mathematical models - to identify patterns and trends with a view to informing decisions made on or in respect of individuals, such as deciding what subsets of individuals might be targeted with what selective material in an attempt to influence choices, preferences and behaviours towards a particular outcome. Big data relies not only on the increasing ability of technology to support the collection and storage of large amounts of data but also on its ability to analyse, understand and take advantage of, in particular, the full value of data using these types of analytics applications.
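The pattern-finding and segmentation step described above can be sketched in miniature. This is an illustrative toy example only; the records, field names and targeting rule are all invented and bear no relation to any real dataset or campaign:

```python
# Toy sketch of analytics-driven segmentation: find the subset of
# individuals whose attributes and behaviour match a targeting rule.
# Real "big data analytics" does this at vastly larger scale with
# statistical models rather than a hand-written rule.

voters = [
    {"id": 1, "age": 24, "region": "urban", "issue_clicks": {"housing": 9, "tax": 1}},
    {"id": 2, "age": 61, "region": "rural", "issue_clicks": {"housing": 0, "tax": 7}},
    {"id": 3, "age": 29, "region": "urban", "issue_clicks": {"housing": 6, "tax": 2}},
]

def top_issue(record):
    """The issue this person has clicked on most often."""
    return max(record["issue_clicks"], key=record["issue_clicks"].get)

# Invented rule: target urban voters whose dominant interest is housing
# with housing-themed material.
targets = [v["id"] for v in voters if v["region"] == "urban" and top_issue(v) == "housing"]
print(targets)  # [1, 3]
```

The point of the sketch is only that a decision (who sees what material) is derived mechanically from combined personal data, which is what makes transparency about the underlying logic a data protection concern.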

While the focus of today’s discussion is political micro-targeting in the digital or online environment, it may also occur in an offline or real-world context. In any event, while the aim of online political micro-targeting is ultimately to influence an individual’s thoughts or actions in the political sphere, it is recognised by academics and regulators that a possible outcome of political micro-targeting in particular is manipulation of individuals. One of the core concerns with these sorts of micro-targeting activities in general is that the autonomy of individuals is potentially jeopardised because the ability to make choices is potentially being dictated by the deployment of the individual’s information in ways that he or she does not know about, has not been told about and/or does not understand. This is likely because individuals have not been presented with full information in order to allow them to objectively assess the information with which they have been micro-targeted or because their circumstances are such that they simply are not capable of understanding the consequences for them of this type of processing. I refer, for example, to circumstances where the individuals involved are vulnerable internet users such as children. The potential adverse effects of political micro-targeting are amplified when one considers the ubiquity of social media and the amount of time spent on online platforms where political micro-targeting can occur. The issue of actual manipulation of social media users can also be linked in certain circumstances to issues of "fake news" creation sometimes disseminated by bots or fake accounts. Such latter matters fall outside the scope of data protection law.

I emphasise that with political micro-targeting, the concerns about manipulation of voters through such activities remain largely theoretical at this stage, as it has been recognised in the academic context that currently there is no hard evidence of such activity actually having a tangible impact on the outcome of elections. The way in which Cambridge Analytica is alleged to have obtained and used personal data has been highlighted recently in the context of an ongoing and active Information Commissioner's Office, ICO, investigation. The ICO is the UK's data protection authority. This investigation began in May 2017 and concerns the risks of data analytics used for political purposes. The ICO investigation spans 30 organisations, including Facebook. Encompassed in its scope is the specific issue of how data was collected from a third party app on Facebook and shared with Cambridge Analytica. In parallel with this ICO investigation, an Electoral Commission UK investigation is also under way in respect of campaign financing concerning the 2016 EU referendum in the UK.

It is helpful in the first instance to set out a brief chronology of the events. The Guardian newspaper, amongst others globally, had been running stories throughout 2016 and 2017 alleging "hijacking of democracy" by data analytics firms such as Cambridge Analytica. The ICO announced an investigation in May 2017 and the electoral commission in the UK announced its investigation in November 2017. In December 2017, the ICO confirmed that its investigation involved over 30 organisations including political parties and campaigns, data companies and social media platforms. As part of the investigation, the ICO issued a demand for access to data in the hands of Cambridge Analytica early in March. Cambridge Analytica is alleged to have failed to respond adequately to this demand and, on 23 March 2018, the ICO obtained a warrant for entry to Cambridge Analytica’s premises in order to inspect for relevant evidence.

The ICO investigation is ongoing with the possibility of enforcement action ultimately being taken. The ICO has also confirmed that it will be making public policy recommendations arising out of its investigation. The Office of the Data Protection Commissioner, DPC, in Ireland looks forward to the outcome of the ICO’s analysis and its recommendations. They are likely to have broader application across EU member states, in particular in respect of the questions of whether the rules for the use of personal data in political campaigns are clear; what data sources are used for profiling the electorate for micro-targeting; and whether there are no-go areas in the context of data analytics and social media in elections.

In summary, micro-targeting for political purposes is a potential technical subset of online behavioural advertising by social media platforms, apps, publishers and Internet sites. The application of these techniques in Europe to process voter data is thought to be more difficult, in part because of EU data protection laws. As I have mentioned, academics in this field are not yet certain of what, if any, the significant real effects of political micro-targeting through social media may be. It seems that considerable research, evaluation and investigation will need to take place before concrete conclusions can be drawn about the true risks and consequences of this type of personal data processing. However, the recent revelations have shone a spotlight on the undoubted asymmetries, arising from the inequality of resources and exacerbated now by technology, that are at play between big corporations and individuals, where those organisations collate, control and utilise big data consisting of personal data. Public consciousness of the risks to individual rights to autonomy and self-determination resulting from these asymmetries and these types of processing of personal data is growing. The demand for greater control over and transparency in how and why personal data is used will be facilitated by the enhanced package of data subject rights and the much more stringent obligations of controllers provided for under the GDPR, which becomes applicable across Europe on 25 May 2018.

As a data protection authority, the DPC in Ireland continues to drive awareness of individuals’ rights under both the current data protection regulatory framework and the GDPR as well as the risks for individuals when using social media services. We have recently published guidance for users of online platforms to encourage them to control settings as they wish that dictate the extent to which they are subject to behavioural advertising.

Hildegarde Naughton (Galway West, Fine Gael)

I thank Ms Dixon and invite the assistant commissioner, Ms Neary, to give her presentation on online behavioural advertising.

Ms Joanne Neary:

I supplied the committee with a PowerPoint presentation in PDF format.

Hildegarde Naughton (Galway West, Fine Gael)

It is not available.

Ms Joanne Neary:

I will just talk the committee through it. I thought it would be useful to give a brief outline of online behavioural advertising, OBA. Political micro-targeting may be a subset of it. Many of the committee members may have heard of OBA before. It is a sophisticated form of advertising which occurs in an online space. Advertising networks and other organisations that serve this type of advertising include search engines and social media websites. In simple terms, OBA is a form of targeted advertising which monitors the behaviour of individuals online and shows individuals advertisements based on the information collected. We can break it down into three main parts. The first is the collection of personal data online. The second is that the person is allocated to a so-called "interest segment" based on his or her online behaviour. The third is that advertisements are shown to him or her on the basis of that interest segment.
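The three-part flow just outlined can be sketched as a toy example. All names, segments and rules here are invented for illustration and do not reflect any real advertising network:

```python
# Toy sketch of the three OBA steps: collect behaviour, allocate an
# interest segment, and select an ad on the basis of that segment.

def assign_segment(browsing_history):
    """Step 2: map observed behaviour to a so-called interest segment."""
    if any("bookstore" in url for url in browsing_history):
        return "likely-book-buyers"
    return "general-audience"

def select_ad(segment):
    """Step 3: choose an advertisement based on the segment."""
    ads = {
        "likely-book-buyers": "20% off new-release novels",
        "general-audience": "generic brand awareness ad",
    }
    return ads[segment]

# Step 1: the collected behaviour (hard-coded here for illustration).
history = ["https://example-bookstore.ie/crime-fiction", "https://news.example.ie"]
segment = assign_segment(history)
print(segment)            # likely-book-buyers
print(select_ad(segment)) # 20% off new-release novels
```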

What do we mean when we say that advertising networks are collecting personal information online? Personal information online is collected via technological tools which I will speak about in a moment. The types of information that may be collected concern the behaviour that advertising networks can track online, for example what websites individuals visit, advertisements they may have clicked or products they may have purchased or viewed. This information is collated and individuals are put into interest segments. For example, if I look at particular bookstores online and potentially at a particular type of book, I may be allocated to the interest segment of people who will purchase books in the future.

The ads are presented based on the interest segment the individual is in and that is essentially the flow of OBA.

I mentioned that a number of technological tools are employed in the online context to allow advertising networks to serve users OBA. These tools are called cookies, tags, pixels, beacons and so on. Many members may already have heard of cookies in particular. We all know that when we visit websites, a so-called cookie banner pops up and we may have to accept cookies in order to continue. All of these tools essentially serve the same purpose, namely, to track different types of information online. I mentioned some forms of information that these types of technological tools may gather. To expand on that, these tools are looking, for example, at the clicks a user makes on a website, how many times someone has visited a web page or how many times they have visited part of a web page. If there is an embedded video within a web page, such tools observe how long someone has spent on that video or on other parts of the page. They can be used to track whether an email has been opened and, if so, how long it was opened for; how often someone visits a web page; a person's buying habits as a result of clicks, that is, whether they click through to an item and eventually buy it; and clicks and progression through a site and across different websites.
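As an illustration of how one of these tools works in practice, the sketch below shows the kind of information a "tracking pixel" request can carry back to a tracker's server. A one-by-one pixel image embedded in a page or email is fetched from that server, and the URL's query string reports what the user did. The domain and parameter names here are invented for illustration:

```python
# Toy sketch: decoding the query string of a hypothetical tracking-pixel
# request. In reality the browser fetches the image automatically and the
# server logs the parameters (often alongside a cookie identifying the user).
from urllib.parse import urlparse, parse_qs

pixel_url = (
    "https://tracker.example.com/pixel.gif"
    "?uid=abc123&event=email_open&page=newsletter&dwell_seconds=42"
)

query = parse_qs(urlparse(pixel_url).query)
event = {key: values[0] for key, values in query.items()}
print(event["uid"], event["event"])  # abc123 email_open
```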

To take a really basic example of simple targeting versus OBA and interest-based targeting, targeting would be if I ran a website for a hotel and targeted the individuals who land on that page in relation to other types of services my hotel might offer or perhaps attractions within the area. That would be a simple form of targeting based on the fact that the user is interested in coming to my hotel. OBA, on the other hand, would take the behaviour that the advertising network has seen the user exhibit. For example, if someone has gone to many websites related to spas in the area or something like that, the hotel will then show him or her related adverts. The hotel's website might advertise its spa or a special offer on one of its treatments. That is the progression of OBA and how it can be a more sophisticated form of targeting.

The commissioner has already touched on ways to limit OBA online and how users can do so if that is what they want. A number of controls are available to users to limit their OBA exposure online, and these exist in a number of different fora. There are browser-based controls, for example, in Internet Explorer, Google Chrome or Mozilla Firefox, where there might be ways to turn off particular types of tracking. Similarly, controls are provided by the advertising networks themselves. Those networks which serve ads might also centrally have tools people can use to adjust their OBA preferences. The industry bodies in this area also have very good tools available. IAB Ireland is the Irish trade association for digital advertising. In conjunction with a European affiliate, it has come up with a website, youronlinechoices.eu, which shows exactly what companies or websites are tracking an individual's behaviour and gives the option to toggle them on and off. It is a really useful tool. It is also worth noting that individuals' OBA settings may not carry across different devices; they may need to adjust their choices on their mobile phone as well as on their laptop or PC. Our guidance on tailoring social media and advertising preferences is available on our website. It also details the way in which a user can toggle his or her advertising preferences on specific social media websites. There are other controls available on other websites that people may avail of in that regard.
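The per-company on/off toggle that opt-out tools of the kind described above present to users can be sketched as follows. The company names and the storage format are invented for illustration and do not reflect how any real tool stores preferences:

```python
# Toy sketch of a per-company behavioural-advertising opt-out, similar in
# spirit to the toggles an industry opt-out page presents. True means
# behavioural tracking is allowed for that (invented) company.

preferences = {"AdNetA": True, "AdNetB": True, "AdNetC": True}

def opt_out(prefs, company):
    """Toggle a single company's behavioural tracking off."""
    prefs[company] = False
    return prefs

opt_out(preferences, "AdNetB")
allowed = [name for name, ok in preferences.items() if ok]
print(allowed)  # ['AdNetA', 'AdNetC']
```

Note that, as the testimony points out, such preferences are typically stored per browser or per device, which is why the opt-out may have to be repeated on a phone as well as a laptop.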

Opting out of OBA does not mean one is opting out of online advertising. Opting out of OBA means opting out of the monitoring and targeted advertising which OBA presents to users.

Hildegarde Naughton (Galway West, Fine Gael)

I thank Ms Neary. There may be further questions for her which she can address later. I thank both witnesses for their presentations and will now open up to questions. I will prioritise members of the committee who have indicated. I ask the witnesses to take note of the questions and bank them; I will take two or three members in a row and will then come back to the witnesses.

I might start with a question for the commissioner. Previously, she stated that the issue of friends' data being harvested when Facebook users engaged with an app on Facebook was resolved by Facebook in May 2014, when access to friends' data was restricted by a platform upgrade. The commissioner said that this followed a 2012 recommendation by her office on foot of its re-audit of Facebook Ireland in respect of access to friends' data. Why did it take so long, from 2012 to 2014, to implement that?

I invite Deputy Dooley to ask his questions.

Timmy Dooley (Clare, Fianna Fail)

I welcome Ms Dixon and Ms Neary and thank them for their presentations. They talked about interest segments. Are those interest segments published? Do we have some understanding of what they are or what kind of silos are created? Ms Neary talked about purchasing books. That seems a rather benign example. There may be others that would be helpful.

Is it the case that some social media platforms - I mention Facebook because its representatives are here - track users when the user has left the Facebook environment? Do these platforms have the capacity to travel with the user when they migrate from one stage to another? If so, why? It may not be the case that they do but I am just posing the question.

May I ask another question that is related to the Data Protection Commissioner's work?

Hildegarde Naughton (Galway West, Fine Gael)

Because of our time constraints, I ask members to ask all of their questions together.

Timmy Dooley (Clare, Fianna Fail)

We have written to the commissioner through the clerk asking that she might update us on the job of work she is undertaking in respect of what is alleged as a data breach at Independent News and Media, INM. I do not wish her to go beyond what she would want to say here in respect of the fact that this is before the courts. We have grave concerns about the potential impact that any data breach might have on the work of journalism and specifically the impact on the importance of journalistic sources. Can the commissioner identify for us if she thinks there is any weakness in the law? Does she believe the law needs to be updated? Does she think her office needs greater powers to ensure this and to address what appears to be a lacuna in the law here, if such exists?

Hildegarde Naughton (Galway West, Fine Gael)

I remind members of our Standing Orders. Members will be aware of the sub judice rule provided for under Standing Order No. 59(3): "a matter shall not be raised in such an overt manner so that it appears to be an attempt by the Dáil to encroach on the functions of the Courts or a Judicial Tribunal."

Timmy Dooley (Clare, Fianna Fail)

For clarity, I am almost sure that what I have said in no way seeks to address anything that is going on in the courts. I am looking at what might be done regardless of the outcome of this particular event. I am concerned about protecting the data of journalists in particular because of the role that the fourth estate has in our democracy. It is of fundamental importance. For that reason, I would like a response.

Hildegarde Naughton (Galway West, Fine Gael)

I will ask the commissioner to respond to those questions and will then bring in our next Deputy.

Ms Helen Dixon:

In answer to the Chairman's opening question about the harvesting of friends' data, which was restricted when Facebook implemented a platform upgrade in May 2014, and why there was a delay of 18 months between the recommendation and the platform upgrade, the recommendations in the DPC audit arose in the context of what was essentially the wholesale import into Europe of a US platform that operates on a global basis. The audits that were undertaken by the DPC in 2011 and the follow-up audit in 2012, which were published by Facebook, looked into a whole range of areas and sought to resolve a whole range of issues that were identified in respect of the compliance of this US platform with European data protection law.

Issues that were resolved more expeditiously included, for example, the application of facial recognition on the platform in Europe. In fact, facial recognition has not, up to this point, run on the platform in Europe, partly as a result of the work undertaken by the Data Protection Commissioner.

The issue of access to friends' data when app developers access data on the Facebook platform was the subject of a recommendation by the Data Protection Commissioner, which stated that it could not see a proportionate basis for access to friends' data when a user engaged with an app and sought a resolution of the issue with Facebook. The correspondence that ensued between Facebook and the Data Protection Commissioner between the recommendation and the platform upgrade in May 2014 really reflects the fact that the guiding principle in data protection law around this type of access is that processing must be fair and proportionate. The laws are high-level, technology-neutral and principles-based. An iterative process was engaged in, whereby Facebook asserted its legitimate interests to implement the feature in that way. It argued that the in-app experience of users was greatly enhanced, that the very purpose of these apps was to enhance the experience of users who engaged, and that this experience was enhanced by the inclusion of users' friends when they engaged with an app.

An iterative process took place. Facebook did not agree with the recommendation. The Data Protection Commissioner persisted, as I have said, in the iterative process, resulting in the platform upgrade 18 months later. It is probably worth mentioning in the context of the response that there are obviously many ways to go about persuading a company to make a change to comply with data protection laws or to comply with our interpretation of data protection laws. One way is the litigation route. We have taken that route with Facebook on the issue of transfers of data from the EU to the US. We made an application to the High Court in May 2016 seeking a reference to the European Court of Justice. The High Court ruled in October 2017 that it would make a reference and that reference is still pending. The life cycle of litigation is uncertain and very lengthy. In this regard, I might just mention an important piece of litigation that has been undertaken by the Belgian data protection commissioner, who is litigating against Facebook on social plug-ins. The litigation started in 2014 and it is not yet resolved four years later. The life cycle of taking a litigation enforcement route can be long.

With regard to Deputy Dooley's question on whether the segments are published - he spoke specifically about Facebook - Facebook users can in fact go in and look at what segments they have been placed into with regard to interest-based advertising. For every advertisement served to a user on Facebook, one can look to see why it has been served. A user may well see it is because he is male and in a certain age category. There is a certain amount of information with regard to the segments, and users can opt out of receiving interest-based advertising based on their placement in any of those segments.

Under the GDPR there will be additional obligations on data controllers to publish more information on the fact of their profiling of individuals and how they profile individuals. They will also need to publish the basic logic of the algorithms they are applying. We anticipate we will see from all of the platforms, including Facebook, significant improvements in transparency come 25 May 2018.

In relation to the tracking of Facebook users once they leave the Facebook environment, Facebook will, of course, confirm the position to the committee when it comes before it later, but it is the case that users are tracked once they leave the Facebook environment. There are means by which users can switch off the tracking, but the issues around all of the control settings that Ms Neary outlined earlier relate to how accessible they are and how transparent it is to the users in the first instance that the default is that they are being tracked.

All of these issues are under active examination by our office with Facebook.

Timmy Dooley (Clare, Fianna Fail)

On that particular point, has Ms Dixon given any consideration to limiting the potential for tracking outside of the environment?

Ms Helen Dixon:

This is an issue that does not relate only to Facebook. Recently, in fact just last week, we made a statement to our peers at the Article 29 Working Party of EU data protection authorities. We set out for the working party that, as the Irish Data Protection Commissioner will assume lead supervisory status from 25 May for Facebook and other platforms that operate from Ireland, we intend to set out a strategic plan for the next 12 months in terms of the platforms' collection, processing and tracking of users, which we intend to target for supervision and potential enforcement. Importantly, in the context of setting out that strategic vision, on which we anticipated consulting our fellow data protection authorities, we also set out for them that there is a need to look much more systemically at the whole ad tech sector. It is not simply Facebook that is monetising data for the purposes of serving advertisements. The same applies to Google and many other services operating online. There is a need for us as a group of data protection authorities to divvy out the work and ensure we are tackling all of these issues and not just one platform.

Timmy Dooley (Clare, Fianna Fail)

I also raised the issue of the protection of journalistic-----

Hildegarde Naughton (Galway West, Fine Gael)

If Ms Dixon feels she cannot answer those questions, she does not have to do so.

Ms Helen Dixon:

I cannot answer those questions, other than to confirm we are investigating the matters.

Timmy Dooley (Clare, Fianna Fail)

If I could just ask Ms Dixon to address it in a general way.

Hildegarde Naughton (Galway West, Fine Gael)

Ms Dixon has clarified the position on this.

Timmy Dooley (Clare, Fianna Fail)

I want to object to that. I do not want to get into a-----

Hildegarde Naughton (Galway West, Fine Gael)

There are Standing Orders and we are under time constraints.

Timmy Dooley (Clare, Fianna Fail)

With respect, the Standing Order has no bearing whatsoever-----

Hildegarde Naughton (Galway West, Fine Gael)

Yes, it does. Deputy, please do not question the Chairman on this issue. I have been very clear in private session and here again. I call Senator O'Reilly.

Photo of Joe O'ReillyJoe O'Reilly (Fine Gael)
Link to this: Individually | In context | Oireachtas source

I welcome the witnesses. The commercial premise on which social media are based is to get the names of people and their data and advertise to them. This is how they get a commercial return. Given this is the case, the onus on their side is obviously to make the data available and attractive, to use it as effectively as possible and to attract advertisers. In this context, will Ms Dixon go through what she thinks of the current situation? The people watching us today will be interested to know, as of today, how an individual can protect his or her data. What specific steps can people take? Are there sufficient opportunities for people who might not be very technologically competent to protect their personal data and clearly know how to do it? I am not sure there is an awareness there. How does Ms Dixon think we could improve on what is there now? I know legislation is pending to deal with fake accounts. Fake accounts are very serious and all of us in this room have been adversely affected at certain times by people using fake accounts to disseminate falsehoods or false arguments. Does Ms Dixon think the legislation will be adequate in this regard? It is a very important protection for democracy and for individuals.

Ms Dixon made the point that access to data in the United States is much more easily obtained and Europe is not bad in this regard, in terms of what could happen with the elections in Europe. Will Ms Dixon elaborate a bit more as to how watertight the European situation is and how confident could we be in an Irish electoral setting? When we have an election in Ireland, how confident can we be, based on what Ms Dixon has said about the European model?

Photo of Tim LombardTim Lombard (Fine Gael)
Link to this: Individually | In context | Oireachtas source

I welcome the guests and acknowledge their contributions so far.

On 25 May, a significant milestone, not alone in Ireland but in Europe, will occur with the introduction of the general data protection regulation, GDPR. This will involve a considerable amount of work in how data will actually be protected. One significant point about this legislation concerns fines. In some cases, a fine can be 20% of a platform's gross turnover, which could be an enormous sum. The commissioner spoke about going through a process with the platforms, with litigation the route she is least likely to pursue. However, if an issue went into litigation and a platform were to receive a maximum fine, what would that mean for the platform involved? Does the commissioner feel the fines are so large that she will be tied up in the legal process because the platform cannot afford to lose? Will we have a drawn-out situation of seven years of litigation because of the significant amounts of money that could be involved?

Ms Helen Dixon:

Senator O’Reilly succinctly expressed how the ad tech sector works, as well as how the monetising of personal data operates in attracting advertisers and serving ads to users. There is much talk about whether there should be an alternative model to free Internet services. Inevitably, this would involve accessing services from behind a paywall. Policymakers are looking at all these issues, particularly in the context of the EU e-privacy regulation and whether more services will end up behind paywalls to avoid the tracking of users.

At the moment, users are being tracked in order to offer free Internet services. We have published details online in the past fortnight on how users of the major platforms can use controls and granular settings to ultimately switch off interest-based advertising.

Photo of Joe O'ReillyJoe O'Reilly (Fine Gael)
Link to this: Individually | In context | Oireachtas source

I appreciate Ms Dixon's response. However, are these accessible and easy to use for the average citizen?

Ms Helen Dixon:

No. To date, they have not been accessible and easy enough to use. For example, on the very case the Chair raised, the access to friend data on the app developer platform prior to the 2014 upgrade, there was in fact a user setting, an app platform "off" setting, which users could toggle. Nobody knew what it meant. There was no pop-up or bubble beside it to explain what it meant if one toggled it one way or another. It has been inadequate to date.

We have high hopes of what the GDPR is demanding in terms of obligations on data controllers. We have promises from all the major platforms that we are going to see something significantly new and different around transparency come 25 May 2018. We will be supervising this to ensure we see a whole new level of transparency.

The platforms tell us that they have been using new design methods in order to better engage users around privacy settings. There are also new principles in the GDPR such as privacy by design, privacy by default and data minimisation requirements, which will provide extra supervisory hooks for us in terms of how the platforms operate.

On European and Irish elections, it is important that users who engage on online platforms, and who consume their news mainly through them, engage with the control settings. If they choose to receive interest-based advertising, which some users legitimately choose, they have to be aware that they may be subject to advertising which is pushing them in one specific direction and ensure they consume other types of news from other publishers, rather than relying solely on social media.

Regarding Irish elections, data protection laws and the GDPR are strong enough to prohibit unlawful personal data processing. The ground-breaking investigation the UK Information Commissioner's Office is undertaking into novel uses of data analytics in the electoral context will provide useful policy outcomes which will have application across all EU member states. The UK commissioner has indicated that she hopes to have some outputs from that investigation next month.

Senator Lombard correctly cited that the level of fines under the GDPR will be high. They can come to 20% of global turnover in certain cases. When one looks at the global turnover of the platforms, this could be a significant sum of money. I have attended several events recently at which the MEP and former European Commissioner who proposed the GDPR, Viviane Reding, has said these levels of fines are comparable to competition law fines because it is necessary to grab the attention of the industry, particularly regarding the importance of protecting personal data. Will we end up tied up in litigation processes challenging any decision we make to apply a significant fine? That is certainly a possibility. The stakes are going to be much higher in terms of the outcome of any investigation we conclude, the corrective powers we exercise and the sanctions we decide to apply.

We could look at recent examples from the competition law sphere. Last year, the European Commission fined Facebook €125 million concerning information it did or did not give the Commission during its assessment of the WhatsApp merger application in 2014. That was not appealed. The European Commission fined Google €2.6 billion in the summer of 2016 for favouring its own shopping comparison website. That was not appealed either. It will be circumstantial in some cases as to whether decisions are appealed. However, undoubtedly, there will be more litigation across the board regarding all types of sectors we regulate and where we seek to impose significant sanctions.

Photo of Brian StanleyBrian Stanley (Laois, Sinn Fein)
Link to this: Individually | In context | Oireachtas source

I welcome the witnesses.

Will the new data protection regulations protect us from what happened with Cambridge Analytica? What about the legalities in terms of domestic law and European law? Will those regulations be enough to protect people, given the transnational nature of what are described as the platforms? In her presentation, Ms Dixon stated there is no certainty about how effective political micro-targeting is. Some people in the political world might disagree with that. In the context of Brexit, which is having a significant impact on this island and our neighbouring island, is micro-targeting not relevant?

There is a particular concern about young people under 18 years of age. They are connected into the big bad world or the big good world, whichever way one looks at it. They are vulnerable in many ways.

What is being done to protect the vulnerable from micro-targeting and what is being done to prevent psychometric profiling of juveniles and children - people under 18 years of age? Even in the political context, what is being done? It is of huge concern to parents and people in education.

I do not want to comment on INM but my point is related to that issue. On the collection of data on workers in various industries and the harvesting of data in terms of what they look at on social media and their private data that is stored, has the commissioner received many complaints from workers, representative bodies or whistleblowers in the board room of some companies about other companies in this State? People use computers every day in their workplace and there are ways of hacking those data. Has the commissioner received many complaints or is she pursuing many complaints against other companies in the Twenty-six Counties?

Photo of Bríd SmithBríd Smith (Dublin South Central, People Before Profit Alliance)
Link to this: Individually | In context | Oireachtas source

I thank both witnesses for their presentations. It is fascinating stuff. I was very keen to hear Ms Neary's presentation because of the emphasis on online advertising and how online advertising can grab behaviour, attitudes, likes and dislikes of individuals to then target them for particular messaging. The reason I am concerned about it is because of the upcoming referendum. It is really interesting to note that, although the new data protection regulations in Europe will be brought in on 25 May, the very day the referendum is being held, they will not cover the period between now and then. I have serious concerns, as do many people, given the recent revelations regarding Cambridge Analytica and the use it made of people's information on Facebook. It has been reported that some 87 million Facebook users were affected by the misuse of data by Cambridge Analytica. One of its former employees, Thomas Borwick, has been engaged by the "No" side in the referendum campaign to produce online advertising. That connection in itself is a big worry for what may happen in terms of influencing the democratic outcome of the referendum.

I am a bit mixed up. The commissioner does not perceive that she might have a role in this. She tells us, and this is obviously her legal job description, that she is concerned with the enforcement of data subject rights and the supervision of organisations that hold, use and control personal data of individuals. She then says that her office has no role in regulating the core aspects of electoral activity, including advertising and canvassing activities other than where those personal data are deployed. Could she comment on that and clarify whether she does not perceive her own role to involve querying what might be happening with the use of personal data in the referendum campaign? She has stated her role does not overlap with electoral activity other than where those personal data are deployed. If it has been proved that companies like Cambridge Analytica and others have used personal data to influence democratic outcomes in other countries, would she have concerns about that in the current referendum campaign?

The commissioner says she is concerned about vulnerable Internet users such as children. Does she not think there is a whole cohort of vulnerable Internet users, such as older people a bit like myself, who grew up in an age where we did not automatically learn about social media and how to use it and that there is a cohort of the electorate that is very vulnerable to that type of advertising?

Regarding the impact of the Cambridge Analytica scandal, of those 87 million Facebook users who were affected, has the commissioner an idea as to how many of them are based in Ireland? Given that Facebook has its second largest global headquarters in Dublin, does the commissioner have concerns that there might be more targeting of an Irish audience or is that incorrect? Does she have an idea as to how many Irish Facebook users have been affected?

Photo of Hildegarde NaughtonHildegarde Naughton (Galway West, Fine Gael)
Link to this: Individually | In context | Oireachtas source

I invite the commissioner to answer those two rounds of questions from Deputies Stanley and Bríd Smith.

Ms Helen Dixon:

I wish to correct a misstatement I made earlier. In response to Senator Lombard, I accidentally said that 20% of global turnover was the limit of fines. Of course, it is 4% of global turnover and up to €20 million so my apologies for that.

Photo of Timmy DooleyTimmy Dooley (Clare, Fianna Fail)
Link to this: Individually | In context | Oireachtas source

The Facebook team's hearts have just re-engaged.

Ms Helen Dixon:

Viviane Reding would have been proud of me inflating it to 20% of turnover.

On the question relating to Cambridge Analytica, the transnational nature of global data flows and whether the new laws to be introduced on 25 May will be effective, the first thing to emphasise about the GDPR is that it is a harmonised law. It is intended not only to be a modernised law for Europe but to be harmonised. It is intended that there will be a much more co-ordinated and consistent application of the law across the EU than there has been heretofore, where there have been separate and fragmented transpositions of the 1995 directive in EU member states. To that end, where the Irish Data Protection Commissioner acts as the lead supervisory authority in supervising any entity, we must consult with other data protection authorities as we go and must take the utmost account of the views of other data protection authorities. Ultimately, any of those other data protection authorities can offer a relevant and reasoned objection to any conclusions or findings we make in an investigation. That is with the aim of ensuring there is a consistent and co-ordinated application of the law.

There are also very strong provisions for mutual assistance between the data protection authorities in the GDPR. Currently, it is the case that Cambridge Analytica is a data controller in data protection terms that is located in the UK and subject to the jurisdiction of the Information Commissioner's Office in the UK. However, in conducting her investigation, the Information Commissioner proactively had a call for an hour with me yesterday and is keeping relevant data protection authorities up to date regarding her investigation and we are assisting in any way we can. Consequently, there are ways in which the new regulation can be brought to bear regardless of the location of an entity.

In respect of political micro-targeting, the Deputy mentioned Brexit. As I outlined earlier, the Information Commissioner in the UK is concentrating on the application of data protection law to the use of big data analytics and micro-targeting, so in particular she is looking at what sources of data a company like Cambridge Analytica has. Is it procuring data from data brokers in Europe? Are those data being lawfully procured? It now appears as though it might have procured data from Facebook that was not lawfully procured. The Information Commissioner in the UK is looking specifically at the sources of data, the legal basis for the processing, whether the UK electorate could have been aware that its data were being processed in that way by data analytics firms and so on. In one of her recent appearances before a parliamentary committee in the UK, she said she is not expecting a smoking gun in any shape or form, but her investigation has not yet concluded. It is concentrated around the data protection issues relating to the likes of Cambridge Analytica.

Regarding juveniles and young users of the Internet, the GDPR over and above the 1995 directive is now calling out that children merit special protection in a data protection sense. There are a number of provisions in the GDPR relating to child users.

In particular, Article 8 of the GDPR now allows member states to set an age over which a child will not require the consent of parents to access information services online. As the committee is aware, that age has been set at 13 in Ireland.

There is a requirement now for data controllers, and in particular for online service providers, to show that they are giving children the special additional protection that they merit. This may be in the form of modified services that are appropriate to the age of development of users, and it may certainly be represented in terms of the type of transparency and plain language notices that are offered to children. The platforms that we supervise have told us that we will see differentiated services in some cases being offered to children from 25 May 2018.

In relation to Deputy Smith's questions in respect of behaviours and attitudes and political micro-targeting, the Deputy said she was confused following the opening statement regarding the role of the DPC. Our opening statement was designed to convey that the particular legislation the committee intends to scrutinise today - whether there should be a requirement for an imprint on a political advertisement showing the original funder - is not a matter for comment by this office. We are concerned, however, with every aspect of electoral activity and political activity that concerns the processing of personal data.

In relation to Cambridge Analytica and the 87 million users, who were mostly based in the USA, what was at issue in that particular case was that Dr. Kogan at Cambridge University, the app developer who originally accessed the data from Facebook's platform, is alleged to have transmitted it on, contrary to Facebook's policies, to Cambridge Analytica. He had been doing psychometric research at Cambridge University based on a model called OCEAN, which concerns personality traits of individuals, from openness and conscientiousness right through to where on the scale of neuroticism a person stands. What was asserted was that individuals could be micro-targeted based on these personality profiles and classifications, and I pointed out that social scientists and academics who are actively examining this area say there is no evidence that targeting based on that type of personality and psychometric segmentation has happened or that it has manipulated results.

Further research is needed in this area, but it is undoubtedly the case that there is political micro-targeting, because it is happening already in an offline context as well as an online context. It is not necessarily harmful or a bad thing that there would be some targeting of messages. It happens in an offline context, where posters or particular issues that politicians want to promote are segmented relative to the geographic area in which individuals live and targeted messages are delivered to constituents in that way. Not all political micro-targeting is harmful, and not all of it involves a manipulation of users' data in a way that they would not anticipate.

Individuals have some level of control, albeit the transparency around how they can exercise that control is not yet good enough and we expect to see an improvement from May under the GDPR. Users need to be aware that if they have not exercised their controls online to switch off interest-based advertising, they are being targeted based on segments that the platforms deem them to be interested in.

In relation to the 87 million affected users - and Facebook confirmed these figures - 15 individuals in Ireland downloaded the app that Dr. Kogan published and up to 44,000 Irish users were affected as friends of users who downloaded the app, in that their data were accessed by Dr. Kogan and, it is alleged, passed to Cambridge Analytica.

In terms of vulnerable Internet users, the Deputy is absolutely correct. Children were just one example of vulnerable Internet users. There are many reasons why the GDPR, as proposed and enacted, now requires privacy by design and by default and encodes these new principles.

Under the GDPR, before users have to search and look for controls over what to switch off and switch on, they should already be able to anticipate a level of default protection when they arrive on the platform. This is what we anticipate we are going to see, not just from the platforms but from all sorts of sectors and data controllers, from 25 May 2018.

Photo of Brian StanleyBrian Stanley (Laois, Sinn Fein)
Link to this: Individually | In context | Oireachtas source

My last question was not answered.

Photo of Hildegarde NaughtonHildegarde Naughton (Galway West, Fine Gael)
Link to this: Individually | In context | Oireachtas source

We are under time constraints, so very briefly-----

Photo of Brian StanleyBrian Stanley (Laois, Sinn Fein)
Link to this: Individually | In context | Oireachtas source

The last question I asked was in relation to the example of INM. Are there other workplaces from where the commissioner has had complaints? Are there examples of data being collected or harvested by companies, or people within other companies, about workers?

Photo of Hildegarde NaughtonHildegarde Naughton (Galway West, Fine Gael)
Link to this: Individually | In context | Oireachtas source

Thank you, Deputy. The commissioner has that question.

Ms Helen Dixon:

I beg the Deputy's pardon. I omitted to answer that question. We receive, on an ongoing basis, many complaints from employees and ex-employees of organisations, particularly around CCTV monitoring. Our annual reports each year detail case studies in which individuals allege that they have been subject to inappropriate and excessive CCTV monitoring by their employers. We have also received complaints from individuals, within the employment context, that their emails were accessed without their knowledge, and we have investigated those cases.

There is quite a body of case law from the European Court of Human Rights around the issue of employer monitoring of employees in the workplace and the rights to privacy and to data protection that employees still retain in the workplace, notwithstanding the legitimate interest of employers to operate their business effectively.

If the Deputy is interested, we can provide him with a summary of the case law in that area and some of the case studies that we have published in relation to decisions we have made.

Photo of Hildegarde NaughtonHildegarde Naughton (Galway West, Fine Gael)
Link to this: Individually | In context | Oireachtas source

Could I ask the commissioner to forward that to the committee clerk? I am sure all the members would be interested in it.

I am going to bring in Deputies Eamon Ryan, Lowry and Lawless and Senators McDowell and Higgins, all of whom are most welcome. I will start with Deputy Ryan.

Photo of Eamon RyanEamon Ryan (Dublin Bay South, Green Party)
Link to this: Individually | In context | Oireachtas source

I thank Ms Dixon very much for coming in. I have always had the highest regard for the Data Protection Commissioner. The commissioner has always carried out her work in the best traditions of the Irish public service and the Irish regulatory system, as did her predecessor. I agree very much with her comments that we have to follow guiding principles in terms of digital regulation; she mentioned that processing must be fair and proportionate.

I am hoping that the commissioner will agree that one of the other principles that might apply is to be transparent, wherever possible. It is a good principle in terms of the management of data systems and the regulation of them. In that regard, I have 14 written questions that I want to give to the Data Protection Commissioner, not to have answered now but to receive written replies in the next day or two, if possible.

I have similar questions for the Facebook representative. I want to use this opportunity to put them in written form, as I will not have time to ask all of my questions orally, so I will hand them out in a second.

Following on from the very first question the Chairman asked, can I ask about this issue of third party access to friends' data, which was at the core of the problem in the Cambridge Analytica scandal? As Ms Dixon says, this issue was originally raised by, among others, Maximilian Schrems in his complaint in 2011. He was very specific in raising concerns about the ability of Facebook to do exactly that in its system in Europe.

My understanding is that in that process, with the audits in 2011 and 2012, the recommendation or the direction - I am not sure what word one would use - or the instruction to Facebook to stop that capability was very centre stage. It was not a peripheral issue; it was a very significant issue. In regard to that process, in terms of this audited approach, is it what the commissioner calls "an engaged regulatory process", where it is iterative and where, as the commissioner says, it engages with the company? I heard the commissioner's response earlier but it beggars belief that, having gone through that iterative process and two audit processes - which one presumes had reached a conclusion, the contents of the audit having been discussed, rather than Facebook simply being presented with an audit without any prior engagement - Facebook resisted the implementation of that direction from the Data Protection Commissioner for 18 months.

The witness said taking a litigious route would have been difficult, but the fact that the company ignored that recommendation and the central points of Max Schrems' complaint, and proceeded for that period, allowed Cambridge Analytica to harvest the data, which had not insignificant consequences for democratic systems elsewhere.

In the iterative process the witnesses have described did the Data Protection Commission ask Facebook for written confirmation that such data breaches were not occurring? When did it first become aware of the Cambridge Analytica case and what sanctions, if any, can it apply as a regulator with regard to what happened in that case? I heard the witnesses mention the fine when WhatsApp and Facebook were exchanging data despite Facebook's having made a commitment that would not be done. Is there not still an issue with, for example, WhatsApp, where my understanding is that my friends' data is being harvested for use by the company? It may not be, or at least I hope it is not, transferred to Facebook. Within the company, however, it is harvestable data. Do the witnesses think it appropriate that people's data and contact information, which sometimes can be quite extensive, can be used for commercial purposes without their consent? Is that fair and proportionate on that principle?

On the issue of data surveillance in foreign jurisdictions, and standing up for our country, Judge Gerard Hogan and the Irish court system deserve great credit for the way they have raised such issues with the European Court of Justice to defend the rights of European citizens. Judge Hogan's judgments were exemplary in grappling with incredibly complex and difficult issues. Is it the witnesses' opinion that all Irish data is subject to surveillance by the UK Government Communications Headquarters, GCHQ, when it exits this country on a fibre network going through the UK? What implications of the recent judgments, or of the work of the Data Protection Commissioner regarding surveillance of data in jurisdictions outside the EU, may apply post Brexit? Is the commission considering that and can it help to guide the committee, because we are interested in the wider policy issues and what we should do about that?

Photo of James LawlessJames Lawless (Kildare North, Fianna Fail)
Link to this: Individually | In context | Oireachtas source

The Max Schrems case, which generated an investigation by the Irish Data Protection Commissioner in 2012, related to third party consent, which is exactly what the Kogan app used in 2013 and what came out in the Cambridge Analytica revelations of 2016, 2017 and 2018. Am I right in saying that if action had been taken at that time, none of the sequence of events that followed would have been possible?

When the Cambridge Analytica story broke in The Guardian and Channel 4, we saw the unedifying spectacle of the Information Commissioner in the UK bizarrely appearing on TV to say she would get a warrant and, a week later, appearing to enter with the warrant. It is generally not a good idea in criminal law, or in evidence gathering, to give people a week's notice of one's arrival. I understand there were several legal reasons for that, because I read the detail afterwards, but are we prepared - is the Office of the Data Protection Commissioner sufficiently resourced - to deal with a similar incident if it arose here?

Are the powers there in legislation? If the commission needs a warrant is there a process which can be used to get a judge at night, if needs be? Is the commission confident that the same situation would not arise here and that it has the necessary enforcement and dawn raid powers? If it does not have them the witnesses should tell us what they need and we will see if we can give them the powers.

We are five or six weeks away from the implementation date for the general data protection regulation, GDPR. I am not sure how ready industry is for this, given that the Bill will only come before the Dáil this week. How ready do the witnesses think business and the State generally are for the deadline of 25 May?

Photo of Michael McDowellMichael McDowell (Independent)
Link to this: Individually | In context | Oireachtas source

Can the witnesses reassure me that if, taking a topical example, somebody in Northern Ireland wanted to influence the pro-life, pro-choice issue here, there is no control or sanction in electoral or data protection law that can stop someone putting €1 million worth of selective advertising into that? The Democratic Unionist Party, DUP, did it in respect of Brexit but that is not a negative comment on the DUP. It saw that loophole and used it.

I echo Senator O'Reilly's comment about the need for a clear opportunity to opt out. I am probably one of the least computer literate people in this room and someone like me just brushes aside little messages about cookies and privacy policies. I just want to read an article. I never stop to think what the little pop-up message means, but it should be possible to develop mandatory pop-up information packages, directed by the commission, imposed on service providers in Ireland. In other words, for example, Facebook should every month remind people with a clear message, not three pages of close print, about privacy settings and what a user can opt out of.

Speaking from a position of comparative ignorance, if one switches off advertising through the controls, does that mean the harvesting of data is also switched off? To use Deputy Ryan's point, does it mean that one's service provider can accumulate and store data about one without using it for advertising purposes but for other purposes? I suppose there is very little the commission here can do to stop GCHQ looking at material going through fibres crossing the United Kingdom, or can it do something?

Photo of Alice-Mary HigginsAlice-Mary Higgins (Independent)
Link to this: Individually | In context | Oireachtas source

While the timeline in the presentation begins in 2016, the timeline in the Cambridge Analytica case begins in 2011. The witnesses have spoken of their concerns about litigation but there is an option between recommendation and litigation, which is an enforcement notice.

The Office of the Data Protection Commissioner has the power under section 10 to issue enforcement notices. Is it not the case that, if it were to have made use of that kind of enforcement notice in 2011 and 2012, the Cambridge Analytica breach might not have happened in 2013? Perhaps Ms Dixon could clarify also the question of how many enforcement notices the office has issued against State bodies or private companies? This is a robust existing power that does not require litigation and which can be used. It is a concern that there would be unwillingness, if there is that lead supervisory role, to use enforcement orders. For example, an issue which has come up a few times here is the question of privacy by default. I have been concerned that there is quite a lot of language about switching off settings. Can Ms Dixon confirm, for example, that we must have all privacy settings at the highest level and there must always be a situation of switching on an agreement to any action that shares data, rather than switching off the taking of data, which is a very important distinction? Again, there is a question of enforcement in that regard.

I was a little concerned at some of the language around Cambridge Analytica and the idea that we are still waiting for hard evidence of it influencing political outcomes. Leaving aside the fact that we have a vast commercial infrastructure built on micro-targeted advertising as evidence that it is clearly seen as having an effect - indeed, that is the main product offered by many of the commercial platforms - and leaving aside the fact that we have long-established regulation of electoral advertising precisely because it has been identified as an area that needs regulation, such that it would be extraordinary if we were not already operating on a presumption that it is having an effect, there is also the precautionary principle, which is a key principle in European regulation. Ms Dixon might address the question of the precautionary principle and whether we have an onus to take immediate action in areas where we feel there is a danger, rather than waiting for a long academic process on the proven impacts.

I was also concerned with the idea that we are waiting for the ICO to make public policy recommendations that may have broader application for EU member states. While the ICO has its investigation in regard to Cambridge Analytica, I would think, given the lead supervisory role Ireland holds, that Ireland should be moving ahead and taking a lead, not simply looking forward to the outcomes from the ICO. Given that lead supervisory role, with Facebook Ireland serving 83% of all global Facebook users, I would like Ms Dixon to comment in particular on the issues around Article 9 of the GDPR and what she thinks the implications of Article 9 will be in this area, specifically Article 9.2(d), which concerns the processing of special categories of personal data, that is, data around political or religious opinions and sexual orientation, and which, of course, includes photographs. These are issues which have been identified as needing particular protection and they, of course, form part of the profiling work by organisations like Cambridge Analytica. Will Ms Dixon clarify this specific question? In Article 9.2(d) it is clear the exemptions which are given around processing by political bodies, such as political parties, are for not-for-profit bodies, and there is a prohibition on the sharing of that personal data or on work outside that body. Does Ms Dixon believe this part of the GDPR will provide a prohibition against any for-profit commercial company which seeks to process special categories of personal data without consent? Does she believe that is adequately provided for in Article 9? While the legislation is still under way, Ms Dixon might also feel free to comment on whether she believes this is adequately provided for in the Data Protection Bill.

In regard to the "honest ads" provisions and the online transparency proposals from Deputy Lawless, I believe there is a question around transparency, who is using the data and whether it is with permission. There is an assumption it is only about using the mechanisms around online behavioural targeting but there is the question of that data being shared.

On the gathering and storing of data external to Europe, we have seen online quizzes that are under way not necessarily on Facebook but on Google and other platforms that are seeking to build political profiles, and we have seen companies that have been gathering facial profiles and building databases in that regard. While I know the pending High Court referral will help on that issue, Ms Dixon might touch on it.

Photo of Hildegarde NaughtonHildegarde Naughton (Galway West, Fine Gael)
Link to this: Individually | In context | Oireachtas source

There are many questions and we are short on time. I ask Ms Dixon to keep her answers short and if there is any issue where she feels she can give us more information in writing, we would appreciate that.

Ms Helen Dixon:

I will do my best. In response to Deputy Eamon Ryan, we would welcome receipt of the 14 questions he wants to submit in writing and we will be happy to respond. The Deputy talked about the delay in terms of Facebook implementing the recommendation. Again, looking back, although we never thought what Facebook was doing was proportionate, the issue of evolving social norms cannot be ignored when we are looking at novel applications of technology in an area like social media, which was still quite new in 2010-11. Max Schrems was not the only person calling out this issue about access to friends data - civil society bodies like the American Civil Liberties Union in the US had also been highlighting it.

In terms of the iterative process that was engaged in, we have this type of process with actors in all sectors. Given that there is an interpretation required of what is high-level, principles-based law in any particular area of application, be it in the insurance sector or the banking sector, or on a social media platform, there is a period when there has to be some type of iterative process where we share with a regulated entity what our interpretation of the law is and how it applies in a very particular scenario. It is legitimate that there may be push-back and an assertion of a legal basis or an assertion of proportionality, but, ultimately, that was not accepted by our office and the issue was resolved.

It is very easy to look back and say things were black and white. An example of an evolving social norm that might be helpful to consider in this context is the issue of caller ID. When this first came out in the 1990s, users considered it an invasion of privacy if someone they were calling could see their number because their caller ID was automatically transmitted. Now, very few people would answer their mobile phone if they could not see who was calling them, so there has been a 180° turn in the social norm around caller ID. While it is just one example and I am not saying the access to friends data fell into that category, what I am suggesting is that there is often a process of this kind with all types of sectors and it is not simply black and white.

In terms of when we became aware of the Cambridge Analytica case, it was on 16 March, when the story was broken in The Observer. It was not news to us that friends data had been accessed; it was a feature of the platform until Facebook made the upgrade. What was news to us was that The Guardian had contacted Facebook in 2015 in regard to allegations that the data had not been deleted by Cambridge Analytica.

In terms of sanctions, looking at it under the current legislation, in particular the Irish Data Protection Acts, the committee will be aware there are very few offences under the current legislation and the legislation is directed towards compelling compliance with the Acts. Under the GDPR, clearly, the sanctions are considerably higher.

There are three use cases for data sharing that were proposed by WhatsApp and Facebook when WhatsApp sought to amend its privacy policy after the approved merger. Two of them related to business analytics and security on the platform and the third use case relates to sharing of data between the two entities as controllers for the purposes of ad serving and friend suggestions.

We have prohibited that use and Facebook and WhatsApp have provided us with an undertaking that they will not implement that use case until the DPC is satisfied that there is a legal basis for doing so. Deputy Eamon Ryan asked whether what we are discussing could be done without a person's consent. The answer is "No".

The Deputy also referred to contact lists, which are covered in one of the recitals to the GDPR. This is an area of the law at which we intend to look further, given that the recital under the GDPR clarifies matters. This happens with all platforms and Internet service providers. Platforms give users the option to upload the contact lists on their phones to be managed by the Internet service to facilitate the user. The recital in the GDPR spells out, quite correctly, that the uploading of the contact list from a person's phone is done under the so-called "household exemption" under data protection law. As such, when one acts under this exemption, the full laws do not apply to the uploading of the list. However, further use and processing of that contact list by the platform or any controller requires a legal basis. I am not sure if the Deputy is bringing to our attention the fact that some platform is making further use of material. If so, we would be interested to receive the information.

Deputy Eamon Ryan's final question related to data surveillance and GCHQ post Brexit. The EU Commission is already being pushed by the UK in this regard and will be required to agree some type of mutual recognition of the UK as having an adequate level of data protection. In the absence of such mutual recognition, the UK will undoubtedly seek an adequacy finding from the EU Commission. It will be the responsibility of the Commission to conduct a full analysis in the context of making an adequacy finding in respect of the UK. We will co-operate in that process.

Deputy Lawless referred to Cambridge Analytica. I note that some 300,000 users directly engaged with and gave consent to the downloading of the app. If it transpires to be true that Dr. Kogan passed the data to Cambridge Analytica, that transfer would have occurred in any event, minus the friends data, had access to friends data been restricted. The supervision activity Ms Neary is undertaking with Facebook concerns the case regardless of whether friends data was accessed: how Facebook vets apps, how it controls the permissions it gives apps and what examination it carries out of the privacy policy proposed by an app developer.

On the issue of warrants, we have very strong powers already to enter premises. We conducted almost 100 on-site inspections and investigations last year. Some of those involved dawn raids conducted by the assistant commissioner, Mr. Tony Delaney, regarding a series of prosecutions he has commenced involving private investigators. Our powers are further strengthened under the new Data Protection Bill. While we will continue to have the same powers to enter, we will also gain the ability to obtain warrants from the District Court where there is an objection by a controller to our entry onto a premises. We are confident about those powers.

The Deputy also asked about the level of industry readiness in the context of the GDPR. Our office has undertaken a huge awareness campaign, particularly during the past 12 months. We surveyed the industry, especially SMEs and microenterprises, a year out from the GDPR. We saw that while there was a high level of awareness of the new laws, there was a great deal less preparedness in respect of them. We are rerunning the survey and should have the results next week. We can keep the committee updated on that.

I was asked about the huge number of events at which we have spoken and the type of representative bodies, for example, the Small Firms Association, with which we have engaged. There has been huge engagement in Ireland on the GDPR and there is a high level of awareness. However, there are some areas in respect of which organisations are not fully prepared, including some of the new ones covered under the directive. In particular, the requirement to deliver higher standards of protection for children is an area in respect of which we have not seen the level of engagement we want.

Senator McDowell asked about a situation whereby someone from Northern Ireland might want to influence matters. He or she could seek to influence offline or online. The question related to online and data protection issues. I cannot speak to what electoral laws would apply. I presume finance and transparency provisions would apply to whoever was funding a campaign in terms of making a declaration. In the context of targeting online, such an entity could purchase advertising with Facebook and target based on segments.

I was asked about clear opportunities to opt out. The opportunities are not clear enough. We have been having the same discussion since I became Data Protection Commissioner. Those who have been in the data protection business longer than I have say that we have been having the same discussion for ten or 20 years. This is why Ms Viviane Reding has said that the time has come for punitive fines and a new, modernised, fit-for-purpose law, which is something we will have from next month. Aside from clear opportunities to opt out, free Internet services based on the monetising of personal data and the way in which the ad-tech sector operates require us to unravel and look behind the whole sector. In data protection terms, any publisher or newspaper using social plug-ins and dropping third-party cookies from its website is the controller in those instances and has obligations under the law. This is what we have started to discuss with our fellow data protection authorities. As we supervise the platform and its responsibility, there is a need to look at all of the other players in a sector which has a very long tail that leads to opaqueness for users.

Photo of Hildegarde NaughtonHildegarde Naughton (Galway West, Fine Gael)
Link to this: Individually | In context | Oireachtas source

I am sorry to cut Ms Dixon off. We will send her a transcript of the questions posed by the members and seek written replies to them. I thank Ms Dixon and Ms Neary for attending and for providing such a comprehensive response. The committee will publish the opening statements and submissions on its website. Is that agreed? Agreed. We will suspend for a moment to allow the witnesses to withdraw.

Sitting suspended at 4.07 p.m. and resumed at 4.14 p.m.