Oireachtas Joint and Select Committees
Tuesday, 13 February 2024
Joint Oireachtas Committee on Justice, Defence and Equality
General Scheme of the Garda Síochána (Recording Devices) (Amendment) Bill: Discussion
Mr. Simon McGarr:
Yes. Both ICCL and Digital Rights Ireland thank the committee for the opportunity to discuss this proposed legislative scheme.
The use of facial recognition technology, FRT, by police engages many fundamental human rights, including but not limited to the rights to human dignity, privacy, protection of personal data, non-discrimination, protest and freedom of expression, all of which are enshrined in the EU Charter of Fundamental Rights. We call on the committee to urge the Government to reconsider the proposal, as currently presented, to introduce FRT into Irish policing, because we believe the legislation before us poses too high a risk to these fundamental rights. We make this call for several reasons.
We endorse the statement by the Data Protection Commission. The results of FRT are unreliable. It is not a silver bullet solution. FRT involves comparing a biometric template created from a face detected in an image or video against a reference database of biometric templates in an attempt to identify a person. However, even under optimal conditions of image quality, facial recognition technology is not designed to give police a singular positive identification or 100% match for a person. Instead, at best it gives the person running an FRT search a guess list of who the person could be: a list of potential candidates accompanied by similarity scores. A threshold value is fixed to determine when the software will indicate that a probable match has been found. If this value is fixed too low, it creates a high false-positive rate; if it is fixed too high, it creates a high false-negative rate. There is no single threshold setting which eliminates all errors.
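By way of illustration of the threshold point, the following is a minimal sketch, not any vendor's actual FRT system: it assumes cosine similarity between biometric template vectors, and the function names, reference database and threshold values are invented for demonstration. It shows how a search returns a ranked candidate list rather than a single identification, and how the chosen threshold trades false positives against false negatives.

```python
# Illustrative sketch only: a toy model of threshold-based face matching.
# Templates, names and thresholds are hypothetical, for demonstration.
import numpy as np


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity score between two biometric template vectors (higher = more alike)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def match_candidates(probe: np.ndarray, reference_db: dict, threshold: float):
    """Return a ranked 'guess list' of candidates whose score meets the threshold.

    There is no singular positive identification: lowering the threshold flags
    more innocent people (false positives); raising it misses more genuine
    matches (false negatives). No single setting eliminates both error types.
    """
    scores = [(name, cosine_similarity(probe, tmpl)) for name, tmpl in reference_db.items()]
    candidates = [(name, s) for name, s in scores if s >= threshold]
    return sorted(candidates, key=lambda pair: pair[1], reverse=True)


# Toy usage: three reference templates and one noisy probe capture of person_b.
rng = np.random.default_rng(0)
db = {
    "person_a": rng.normal(size=128),
    "person_b": rng.normal(size=128),
    "person_c": rng.normal(size=128),
}
probe = db["person_b"] + rng.normal(scale=0.3, size=128)

for threshold in (0.3, 0.6, 0.9):
    print(threshold, match_candidates(probe, db, threshold))
```

In this toy example the candidate list shrinks as the threshold rises; in a real deployment the same tuning decision determines how many innocent people are wrongly flagged versus how many genuine matches are missed.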
Facial recognition technology is also inherently discriminatory. As the technology stands at the moment, the discriminatory effects of FRT are well documented. While error rates vary depending on the multiple factors that can affect the performance of an FRT system, these errors do not affect all individuals equally. Studies have clearly demonstrated deeply inherent racial and gender biases in FRT, meaning women and people of colour would be more likely to be misidentified and brought to the attention of An Garda Síochána than white men.
FRT can enable powerful, mass, indiscriminate and pervasive surveillance. The implications for police use of this highly invasive technology can vary depending on the purpose and scope of its use.
The use of FRT by gardaí, as proposed in this scheme - to use any images or recorded footage that An Garda Síochána legally retains, or can legally access, to locate, identify and track people in respect of certain crimes - would result in a seismic shift in the surveillance capabilities of Irish policing. This brings us to our fourth point.
The general scheme is unlawful under EU law. The committee must consider: the EU Law Enforcement Directive, as transposed in our Data Protection Act 2018; the forthcoming EU Artificial Intelligence Act; case law from the Court of Justice of the European Union; and recent guidelines published by the European Data Protection Board, EDPB, on FRT. With those in mind, it is our position that this general scheme is not in step with those frameworks. For example, the use of FRT, as provided for, is not clear, precise or foreseeable. It creates a model of indiscriminate surveillance of people in Ireland. It fails to limit the use of facial data to when it is strictly necessary. It fails to ensure that any FRT use would be targeted in terms of the individuals to be identified. It fails to ensure that anyone whose biometric data is processed by FRT is directly linked to a specific crime, as required under the EU law principles of necessity and proportionality.
In conclusion, we ask the committee to urge the Government to reconsider introducing the FRT proposals before us at the moment. We warn that to proceed on foot of ill-defined methods and purposes is to invite not only breaches of innocent people's rights but also the risk that otherwise secure convictions will be put in jeopardy on subsequent appeal if the underlying legislation is found to be unlawful.