Oireachtas Joint and Select Committees

Tuesday, 13 February 2024

Joint Oireachtas Committee on Justice, Defence and Equality

General Scheme of the Garda Síochána (Recording Devices) (Amendment) Bill: Discussion

Dr. Abeba Birhane:

I am very grateful for the opportunity to appear before the Joint Committee on Justice.

In seeking policing FRT legislation for a second time, Ireland runs the risk of inadvertently imposing a technology that is ineffective, inherently flawed, opaque and proven to be discriminatory. A robust body of scientific evidence built up over recent years has demonstrated that flaws and inaccuracies arise time and again. Although often presented as a cost- and resource-effective aid to policing, FRT has proven to be ineffective and intrusive. For example, a recent survey by Big Brother Watch in the UK, which reviewed police use of FRT across Wales, found that of the 508,542 faces scanned, more than 3,000 people were wrongfully identified. An inaccuracy rate of 88% was recorded over the period from 2016 to 2023, and in all that time only three arrests were made. I point to this research to show the ineffectiveness of the technology.

In an audit carried out by Cambridge researchers only last year, which evaluated the ethics and legality of police use of FRT in the UK, the researchers compiled ethical and legal standards for governing facial recognition, drawing on extensive literature and on feedback from academia, government, civil society and policing organisations. They then applied the resulting audit tool to three facial recognition deployments by police forces in England and Wales and found that all three deployments failed to meet the stated criteria and standards.

Although computer vision, the basic technology underlying FRT, has come a long way since Google Photos misidentified black people as gorillas in 2015, and since the huge error rates in identifying dark-skinned faces confirmed in the 2018 study by Buolamwini and Gebru, which found error rates of 34.7% for black women compared with 0.8% for white men, the technology remains deeply flawed. In a recent study carried out by my colleagues and me, which evaluated the latest state-of-the-art computer vision models on classification tasks, we found that black people are still labelled and misclassified as "criminal" and "suspicious person" at a much higher rate. So far, in the US alone, we know of six people who have been wrongfully arrested due to errors in FRT, five of whom are black men and one of whom is a black woman. Again, the discrimination tends to fall on minoritised identities.

I cannot emphasise enough the importance of independence when it comes to evaluations and audits of this technology. Although the technology is deployed to track, monitor and surveil the public, it operates in the dark, without clear oversight, transparency or accountability. This is partly due to the proprietary rights that come with the technology. For example, there is no access to training data, model architecture or other critical information necessary for assessing model performance. For the most part, all this information remains hidden from the public and from independent auditors, essentially making independent audits and evaluations impossible.

If Ireland goes ahead with this technology, it is just a matter of time before it becomes another cautionary international headline. I thank the committee.
