Oireachtas Joint and Select Committees

Tuesday, 13 February 2024

Joint Oireachtas Committee on Justice, Defence and Equality

General Scheme of the Garda Síochána (Recording Devices) (Amendment) Bill: Discussion

Dr. Nessa Lynch:

Good morning from New Zealand where it is already Wednesday. I thank the committee for the opportunity to speak. It was great to listen to the expertise of others appearing today. I would like to highlight three overarching points from my written submission. This draws on my expertise in biometrics and biometrics regulation, gained from an academic perspective, in various regulatory and ethics roles in the public sector, and in police leadership. Regarding human rights compliance, the risks of biometric technologies in policing have rightly been highlighted in the scholarly and advocacy literature and by my expert friends here today. Facial recognition technology, FRT, involves the collection, analysis and retention of sensitive personal data. As we heard, it poses significant risks to collective and individual rights such as privacy and freedom from discrimination. I agree strongly with this analysis and have made those points in my own work. I also invite the committee to reflect on the wider human rights framework.

The State has a duty to protect the human rights of all. I argue that assessing the use of FRT can be very context-specific. There are situations in which it may be possible to use the technology in a human rights-compliant way or to deter or prevent significant human rights abuses. As an example, I come from a disciplinary background in children's human rights. I found in my work that children and young people from ethnic and social minorities may be disproportionately impacted by the use of biometric technologies due to their use of public space and existing inequities in the impact of policing. I also know that in the growing area of online child exploitation, these types of tools can be used to identify alleged victims and alleged perpetrators, thus fulfilling Articles 19 and 36 of the Convention on the Rights of the Child. These two things can be true at the same time. It is vital to look at those contexts. While perceived social licence, political expediency or public opinion cannot in any way override fundamental human rights, we have to reflect on situations in which technology such as this can be used in a human rights-compliant way where there is a legitimate law enforcement objective.

Moving to the particular use case contemplated in the draft legislation, this is retrospective use, meaning that analysis will occur on already collected imagery. Much will depend on the system and vendor contracted, were these proposals to go ahead. There is a spectrum within retrospective use, ranging from speed and scale improvements, where a single image is compared against footage, to use cases that are very close to live automated use, where there is only a small time lag and the reference database is large. That spectrum needs to be considered as well. Previous work I have done on FRT in the policing context has categorised retrospective FRT as medium risk. This is dependent on the existence of a significant system of controls. I encourage consideration of the European Union's AI regulation, which will in any case be compulsory and which I think aligns with that scheme. As we heard in previous submissions, accuracy and bias remain significant concerns. It is important to recognise that humans are not a gold standard for visual identification, and not to fall into that trap when making comparisons.

Legislation is a broad power which empowers and restricts, but any attempt to implement safe, ethical and human rights-compliant technology requires a comprehensive system of controls. Some of these are within the power of the committee as legislators, or members will have the opportunity to signal their importance as operational matters for the Garda. These would include the legislation itself and regulatory controls such as independent oversight of patterns of use, a requirement for internal and external controls, technology assurance, such as assurance of the vendor with which the State might enter into an agreement, assurance of technology use, audit of use, technical matches, rules on matching and-----
