Oireachtas Joint and Select Committees

Tuesday, 13 February 2024

Joint Oireachtas Committee on Justice, Defence and Equality

General Scheme of the Garda Síochána (Recording Devices) (Amendment) Bill: Discussion

Dr. Abeba Birhane:

In reference to the finding which was mentioned in the presentation, again, I am not clear as to the particular evaluation referred to. The National Institute of Standards and Technology, NIST, is a huge organisation which produces plenty of reports. For example, in December 2019, NIST evaluated the effects of race, gender and age on facial recognition software. It asked how accurately facial recognition software identifies people of varying sex, gender and racial background. According to that NIST study, the answer depends on the algorithm at the heart of the system, the application that uses it and the data it is fed, but the majority of facial recognition algorithms exhibit demographic differentials. A differential means that an algorithm's ability to match two images of the same person varies from one demographic group to another. I am presenting this to show that the NIST reports are vast, and I am not familiar with the specific report that found 99% accuracy.
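[To illustrate what a demographic differential means in practice, the following is a minimal Python sketch prepared by the editor, not material from the NIST evaluation or from the witness. It computes a false match rate separately for each demographic group from labelled comparison scores; the group names, scores and threshold are invented for the example.]

from collections import defaultdict

def false_match_rate_by_group(comparisons, threshold):
    # comparisons: list of (group, score, same_person) tuples for image pairs.
    # Returns the false match rate per demographic group: the share of
    # different-person ("impostor") pairs that the algorithm wrongly matches.
    impostor_pairs = defaultdict(int)
    false_matches = defaultdict(int)
    for group, score, same_person in comparisons:
        if not same_person:
            impostor_pairs[group] += 1
            if score >= threshold:
                false_matches[group] += 1
    return {g: false_matches[g] / impostor_pairs[g] for g in impostor_pairs}

# Invented scores: (demographic group, similarity score, same person?)
example = [
    ("group_a", 0.62, False), ("group_a", 0.35, False), ("group_a", 0.91, True),
    ("group_b", 0.71, False), ("group_b", 0.68, False), ("group_b", 0.88, True),
]
print(false_match_rate_by_group(example, threshold=0.6))

[At the same matching threshold the two groups end up with different false match rates; that variation across groups is the differential the NIST study describes.]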

On racism, sexism and other discriminatory attributes in datasets or databases, I cannot speak to the reference database which An Garda Síochána will be using, but I can speak generally about computer vision systems, which are the basis of facial recognition technology. These systems are built by training an algorithm on data for various tasks. In the case of a facial recognition algorithm, one needs millions if not billions of images to train the algorithm so that it recognises, identifies, classifies and does all sorts of tasks on a given image. Auditing that data is usually my expertise, and I audit those types of datasets when they are available or open source. According to almost all audits carried out, these data have a tendency to label images of, again, minority identities with problematic slurs and connotations. As I mentioned previously, in one of my own research projects we found that the state-of-the-art and latest models tend to label images of black people, more particularly images of black men, as suspicious persons and entities. I am talking about the data which are used for training. One has training data, which are the core and the basis for these algorithms. Then there are fine-tuning data and the reference database. Again, I am not very familiar with that in the context of Ireland. I hope that clarifies some of the questions which have been put forward.
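[The kind of dataset audit described above can be sketched very simply. The following Python snippet is an editorial illustration, not code from any actual audit: it scans the labels an open-source training set attaches to images and counts how often terms from a watch list of problematic labels appear. The dataset structure and the term list are assumptions made for the example.]

from collections import Counter

# Placeholder list of labels an auditor might flag; a real audit would use a
# much larger, carefully constructed lexicon of slurs and loaded terms.
PROBLEMATIC_TERMS = {"suspicious person", "criminal"}

def audit_labels(dataset):
    # dataset: iterable of (image_id, labels) pairs from an open training set.
    # Returns how many times each flagged label occurs.
    flagged = Counter()
    for image_id, labels in dataset:
        for label in labels:
            if label.lower() in PROBLEMATIC_TERMS:
                flagged[label.lower()] += 1
    return flagged

# Invented records: (image identifier, labels the dataset assigns to it)
sample = [
    ("img_001", ["person", "suspicious person"]),
    ("img_002", ["person", "smiling"]),
]
print(audit_labels(sample))  # Counter({'suspicious person': 1})

[A fuller audit would also break these counts down by the demographic group depicted, which is the kind of analysis behind the finding about images of black men mentioned above, and the same scan can be run over training data, fine-tuning data or a reference database.]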
