Oireachtas Joint and Select Committees

Tuesday, 13 February 2024

Joint Oireachtas Committee on Justice, Defence and Equality

General Scheme of the Garda Síochána (Recording Devices) (Amendment) Bill: Discussion

Dr. Ciara Bracken-Roche:

I thank the committee for having me here today to contribute to this discussion. I am an assistant professor in the school of law and criminology at Maynooth University and an adjunct professor in criminology at the University of Ottawa, Canada.

I am a specialist in surveillance technologies and privacy, including national and international frameworks and policies. I focus mostly on the development, regulation and use of new technologies for policing, public safety and security purposes.

To echo what Professor Kaye said, I welcome the production of this draft Bill and amendment to bring facial recognition technology (FRT) under the rule of law to help safeguard the Irish public. However, I am concerned that the adoption of FRT could do more harm than good in this context and that it could be counterproductive for An Garda Síochána’s operational goals. New technologies need to be assessed on their own merits, but also in the context of broader organisational operations, and their adoption should occur in tandem with the development and consistent review of Garda practices, resources and existing technologies.

In this statement, I will highlight three key points for the committee relating to the effectiveness and the proprietary nature of FRT, but I am happy to discuss other matters later on. First, with regard to public order policing or the surveillance of public spaces, everyday garments can often render FRT useless because these simple physical barriers stop the technology from seeing the face clearly. If we think of the recent riots, baseball caps, face coverings, face masks, glasses, hoods and umbrellas can all potentially obstruct FRT. Notwithstanding this issue, the chilling effects at play should preclude the use of FRT in public spaces in the first instance.

Second, FRT relies on vast databases of images to operate, using algorithms to extract details from one face and assess its similarity to others. Instead of positively identifying an unknown person, some systems calculate a probability match score between the unknown person and specific face match templates in a database. This means that, instead of a single match, the system returns several candidate matches ranked by probability score. This puts privacy at risk, as individuals who have nothing to do with an event might still be brought into an investigation if their probability score is high enough.
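To make that ranking behaviour concrete, the following is a minimal Python sketch of how a one-to-many search might score and rank candidates. It is not a description of any vendor's system: the cosine-similarity measure, the 0.80 threshold and all of the names and feature vectors are hypothetical choices for illustration, and it assumes faces have already been converted into numeric templates by some upstream model.

    # A minimal, illustrative sketch of 1:N face matching by probability score.
    # Everything here is hypothetical: real systems derive templates with deep
    # models over hundreds of dimensions and search millions of enrolled images.
    from math import sqrt

    def cosine_similarity(a, b):
        # Similarity between two feature vectors, in the range [-1, 1].
        dot = sum(x * y for x, y in zip(a, b))
        return dot / (sqrt(sum(x * x for x in a)) * sqrt(sum(x * x for x in b)))

    def rank_candidates(probe, gallery, threshold=0.80):
        # Return EVERY enrolled identity whose score clears the threshold,
        # ranked by score -- a list of candidates, not a single match.
        hits = [(name, cosine_similarity(probe, template))
                for name, template in gallery.items()]
        hits = [(name, score) for name, score in hits if score >= threshold]
        return sorted(hits, key=lambda pair: pair[1], reverse=True)

    # Hypothetical database of face templates.
    gallery = {
        "person_a": [0.91, 0.40, 0.12],
        "person_b": [0.88, 0.45, 0.10],  # similar-looking but uninvolved person
        "person_c": [0.10, 0.95, 0.30],
    }
    probe = [0.90, 0.42, 0.11]  # unknown face captured at a scene

    for name, score in rank_candidates(probe, gallery):
        print(f"{name}: similarity {score:.3f}")
    # Both person_a and person_b clear the threshold, so an uninvolved
    # individual can be drawn into an investigation purely on a high score.

The point of the sketch is that the threshold is an operational choice rather than a scientific constant, and where it is set determines how many uninvolved people surface in the candidate list.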

Lastly, the proprietary nature of FRT means that the algorithms and processes inside these systems are black-boxed. They are often unexplainable to government institutions and the public alike. The practices employed by FRT companies are questionable at best. One of the largest international providers, Clearview AI, has been fined and sanctioned for inappropriately collecting, using and disclosing personal information, for creating risks of significant harm to individuals who have never been involved in a crime, and for collecting images in an unreasonable manner.
