Oireachtas Joint and Select Committees

Tuesday, 26 November 2019

Joint Oireachtas Committee on Communications, Climate Action and Environment

Data Collection by Digital Assistants: Discussion

Timmy Dooley (Clare, Fianna Fáil) | Oireachtas source

I thank both the witnesses for their presentations. I think I was among a number of people who raised this issue back when Apple made an announcement - it was probably reported in The Guardian or one of those publications - that a number of people had been let go because of the issue of human review of sensitive private conversations. We were probably all a little concerned that more information than was necessary was being captured. I think part of Apple's defence at the time was, as Mr. Sunderland identified, that human review is necessary to improve the algorithms and assist in machine learning, and to an extent we get all that. However, unless there is an appropriate regulatory regime for the big tech companies, then, as night follows day, liberties will be taken at some point in the future.

The most important aspect of it is that individuals have clear knowledge from the start that the interceptor, as it were, is always on and that it will not always correctly understand the call, whether it is Alexa, Siri or whatever. It is a bit like the information on cigarette packets about the harmful nature of smoking; it requires the authorities to flag the potential pitfalls of an activation that was not intentional. Many people would not have known, prior to this story breaking, that there was the potential for such a level of inadvertent activation. People do not know about it. I know many people who would not be anxious to have it on all the time if they knew the potential for other conversations to be recorded.

There is an issue around protecting citizens from the perspective of civil liberties, which perhaps falls to the regulator. It needs greater support from the State through laws that would require a much greater volume of information upfront and a very clear opt-in, because I agree with the Chair that one is linked to so many different platforms that one just clicks through. Of course, things are buried, and if someone goes to the bother of checking what information is held on him or her, he or she has already agreed to it. Even when it comes to just going through the standard approvals to see what we have given different permissions for, many people just leave it and move on. That is the beginning of a slippery slope towards giving much greater control to the big tech companies.

We have to be clear that people must opt in in a very clear and concise way. It is not about opting in to give the companies something they want at a particular moment. People must opt in very clearly and be shown the consequences of doing so. Those consequences must be writ large, not just in small print. I do not know if that comes from the regulator. I doubt that the regulator having those powers is contained in the GDPR, but perhaps it is.
Perhaps the rules of engagement need to have some other type of legislative basis, rather than data just being captured, so that people are clear from the start. It is important that we get that right.

When there is an inadvertent capture of information, sensitive or otherwise, a log is prepared. It is used at a later stage in the human review to prepare better machine learning and artificial intelligence. In any of those inadvertent captures, are logs provided to the regulator? For every inadvertent activation, is a log prepared and forwarded to Mr. Sunderland as regulator?
