Oireachtas Joint and Select Committees
Tuesday, 24 March 2026
Joint Oireachtas Committee on Artificial Intelligence
Artificial Intelligence, Justice and Law Enforcement: Discussion (Resumed)
2:00 am
Malcolm Byrne (Wicklow-Wexford, Fianna Fail)
We have received apologies from Deputy Gibney. A number of colleagues have indicated that they are going to be running a little bit late.
I am sure our witnesses are aware, but I need to repeat the rules and regulations. I advise members of the constitutional requirement that they must be physically present within the confines of the Leinster House complex in order to participate in public meetings. The evidence of witnesses physically present or who give evidence from within the parliamentary precinct is protected by absolute privilege in respect of the presentations they make to this committee. However, they are expected not to abuse this privilege and it is my duty as Chair to ensure that this privilege is not abused. If their statements are potentially defamatory in relation to an identifiable person or entity, I will direct them to discontinue their remarks. It is imperative that they comply with any such direction.
Members are also advised and reminded of the long-standing parliamentary tradition to the effect that they should not comment on, criticise or make charges against a person outside these Houses or an official either by name or in such a way as to make him or her identifiable.
I thank our witnesses for coming here today. This is our second meeting on the topic of artificial intelligence and looking at the issue of justice and law enforcement as part of our module looking at AI and the State. We are very happy to be joined today by representatives of the Courts Service. I welcome Ms Angela Denning, chief executive officer, and Dr. Owen Harrison, chief information officer. I thank them for being here today. I now invite Ms Denning to deliver the opening statement.
Ms Angela Denning:
I thank the Cathaoirleach and members of the committee for the invitation to appear today. I look forward to discussing the topic of justice and law enforcement. The Courts Service has managed and administered the courts since 1999. As the committee knows, the administration of justice is solely a judicial function. In accordance with the constitutional independence of the Judiciary and the Courts Service Act, I cannot comment on any matter relating to judicial functions or individual cases.
The Courts Service is an independent State agency established to support the third branch of Government, the Courts and Judiciary. Our primary functions are to manage the courts, provide support services to judges, provide information on the courts system and provide court buildings and facilities for court users. We are midway through our modernisation programme to improve access to justice, aimed at creating a digital, user-centric and efficient system. Technology is central to this transformation. Artificial intelligence offers opportunities but also requires safeguards, especially in a justice environment where fairness, transparency and rights protection are critical.
AI tools are now common in daily life and have been used for some time without difficulty in legal research and document review. The Courts Service has a governance framework in place for the deployment of AI, which outlines governance procedures and structures for the responsible adoption, deployment and oversight of AI tools. It establishes the guiding process, documentation requirements, governance roles and supporting structures necessary to ensure AI is used appropriately. It is aligned with the AI guidelines for judges on the responsible use of generative artificial intelligence published in 2024 by the Office of the Chief Justice, which adopt a similar approach to judicial guidelines in comparator jurisdictions. These guidelines were developed because it was considered essential for all judges to understand the benefits and limitations of generative AI. Judges must ensure any use of AI is consistent with the overriding obligation to ensure the integrity and reliability of the legal research and analysis which underpin judicial decision-making. Its use must also be consistent with the overarching obligation to protect the independence and integrity of the administration of justice and the protection of fundamental rights.
Judges remain responsible for their decisions. The use of generative AI on trusted databases can be a useful supplementary tool in some contexts but cannot replace the judgment, empathy and ethical responsibility of judges in the performance of their judicial role. The Courts Service's framework ensures that any AI deployed by the Courts Service adheres to our obligations under the EU AI Act and our own information security standards on generative AI, and that the risks associated with the introduction of new AI technologies are assessed. All new proposed uses must be aligned with the seven principles of the public service AI guidelines, reviewed by our data governance unit and referred for consultation with staff, the senior management team and/or judicial AI working groups as required. A register of all AI systems in use within the Courts Service is maintained and a data protection impact assessment may be required for certain AI deployments.
We are conducting AI trials, including a guidance retrieval chatbot, tools to help judges interrogate large document sets, a proposal to extract information for judges from digital audio recordings and a proof of concept to support interpretation needs in court. We have also deployed Microsoft Copilot to staff and judges with appropriate training to provide secure access to an AI tool. As a public body deploying AI, we must ensure human oversight, clear use-case boundaries, transparency, assessment of risk and AI literacy. Our framework directly supports these obligations. Public trust is the cornerstone of the justice system and that trust must be earned through openness and clear communication about how and why these technologies are used. I thank the committee again for the opportunity to speak with members today. I look forward to answering any questions members have.
Malcolm Byrne (Wicklow-Wexford, Fianna Fail)
We will now have questions and answers from members.
James Geoghegan (Dublin Bay South, Fine Gael)
I thank the witnesses. I will start with the recent report by Lorcan Staines on bail laws because it touches upon the information-sharing systems in the Courts Service. That report arose from the tragedy of Shane O'Farrell's death and all of the failings that took place in relation to his death, for which the State has apologised. The first recommendation in Lorcan Staines's report is that:
Significant and immediate investment should be made in the information sharing systems for both An Garda Síochána and the Courts Service. The accurate recording and sharing of information between these bodies should be prioritised.
What is going on in respect of that?
Ms Angela Denning:
I will start and then I will hand over to Dr. Harrison for the details. The Courts Service was subject to a capability review back in 2017 and 2018. It found that our IT systems at that stage were underfunded. They had grown up organically, so they did not speak to each other. We had a lot of legacy systems, about 150 individual systems in all, none of which spoke to each other. The modernisation programme has sought to address that. We are halfway through. The key part for us is about reducing those systems and improving our standard of data so that they can then speak to each other and, importantly, to other Government agencies. We are at the stage where the Central Criminal Court will go on to our new unified case management system over the Easter break. The Circuit Criminal Court will go over the course of this summer, and then the District Criminal Court, which is the biggest body of work we have, will go next summer. I will hand over to Dr. Harrison then on how we propose-----
James Geoghegan (Dublin Bay South, Fine Gael)
Just before Ms Denning does that, is it correct to say she highlighted a report from 2017?
James Geoghegan (Dublin Bay South, Fine Gael)
Lorcan Staines's report is from this year, 2026.
James Geoghegan (Dublin Bay South, Fine Gael)
That is some nine years after the first report. The number one item he put in his report, at least in the context of bail, was to improve information sharing between the systems of An Garda Síochána and the Courts Service. Has the system improved in those nine years in that specific area and, if so, what does that look like?
Ms Angela Denning:
I will let Dr. Harrison answer that bit. The hole that we were in was very deep. It was not a simple job to lift ourselves from that digital deficit to where we are today. It required significant investment. We had to completely rebuild our capacity and capability internally. Our IT team has gone from a very small one to a team of around 50 professionals.
Ms Angela Denning:
We have had a big change that has required investment from the Government. Thankfully, we have got that and we now have a steady income for our IT capital side as well. That is really important to be able to do that. That is the foundation upon which the data sharing can take place.
Dr. Owen Harrison:
A lot of data sharing is currently happening between the Courts Service and the Garda. At the moment, we share summons data and charge sheet data. The charge sheet data initiative, for example, happened in the last four years. There has been a significant ramping up of data sharing between the two bodies in question. One of the main things to mention is that-----
James Geoghegan (Dublin Bay South, Fine Gael)
Does Dr. Harrison accept that it is one of the biggest deficits identified in Mr. Staines's report, at least in the context of bail? He has flagged it as the number one recommendation for improvement that is needed. His report was, obviously, published only a month or two ago.
Dr. Owen Harrison:
It is recognised that one of the key milestones we want to get to is the sharing of data, particularly in the higher courts. A lot of the data that is shared is in the District Court. The current systems have been in place for donkey's years. As referred to by Ms Denning, we have been steadily making progress to be able to replace those. It has been a long process to put in place a team, build up the expertise and design the systems. It is now that we are starting to deliver the new systems. Ms Denning mentioned that, on 13 April this year, we are going to replace an ancient legacy system, which cannot interoperate with anything else, with what we call the unified case management system. That is the key pillar of our modernisation programme, which started five years ago. The idea is that all our case management work will be put into the unified case management system. What this ultimately means is that when we are done with the deployment of that, we will then have a project to share data from the higher courts with the Garda, which is the major deficit that was mentioned.
James Geoghegan (Dublin Bay South, Fine Gael)
How far away are we from that outcome?
James Geoghegan (Dublin Bay South, Fine Gael)
Is that just for the unified case management system?
James Geoghegan (Dublin Bay South, Fine Gael)
It will be moved to the higher courts.
James Geoghegan (Dublin Bay South, Fine Gael)
If we have the unified case system completed by the end of 2027, would Dr. Harrison envisage it as a two-, three- or four-year time period before he gets the work done, or shorter than that?
James Geoghegan (Dublin Bay South, Fine Gael)
We would be talking 2028.
James Geoghegan (Dublin Bay South, Fine Gael)
Has the Courts Service carried out a peer review? As we know from this committee, AI is constantly changing in terms of its benefits and how it is adapted. Is the Courts Service carrying out any ongoing analysis of what other court systems are doing internationally, how it could benefit from the changes other court systems have made in using AI and how AI could be adapted for our system?
Dr. Owen Harrison:
We are particularly close to the British courts and those in Northern Ireland, Canada and Australia. We have had some very early morning meetings with colleagues in Australia and keep in touch with them regarding not just AI but day-to-day digitisation, digital transformation, adoption approaches and "gotchas" to watch out for. Learning from those services is critical for our modernisation programme.
James Geoghegan (Dublin Bay South, Fine Gael)
Will Dr. Harrison communicate one or two things to the committee that he has heard of, including from committees, that, if adapted for our system, could be a lot more transformative than what happened between 2017 and today in terms of investment in court systems?
Dr. Owen Harrison:
The main thing for us is to stay focused on fundamental reform of the foundations. One cannot put AI on top of paper, and we are still drowning in paper. We need to stay the course and digitise the rest of the journeys through the courts. Once there is a digital foundation and the data is right, it becomes easier to integrate responsible AI on top. The jurisdictions that are furthest along on their digital transformation journeys are the best placed in this regard.
Johnny Mythen (Wexford, Sinn Fein)
I thank the witnesses for attending. We are here to discuss AI. Do the witnesses see a conflict in the use of different AI software for the same rules, especially in the context of language? I refer to the Irish language. I could say that I was in the pub last night and had "loads of craic" with all of the lads. In court, that might mean something completely different. How does AI cope with such language in the system?
The witnesses have said that trust is probably the most beneficial element of the system. Citizens have to have trust in court cases. The witnesses have said that one of the main advantages of AI is a reduction in costs to citizens, but is there a possibility of the quality of justice being undermined in the process?
Dr. Owen Harrison:
I will touch on the first question. One of the key advances in large language models is trying to accurately translate and pick up on the nuances and subtleties of human language. Does it do it right? No. Does it do a better job than it used to? It has vastly improved in the past number of years.
We want to do a proof of concept this year around the use of real-time translation services. It is very much a proof of concept that involves experimentation. It is hoped that we will be able to give members of the Judiciary and the courts options for moving cases along in respect of small procedural matters that need a translation capability that is not immediately available in the room, which could otherwise lead to adjournments in order to bring a person back to get over that hurdle. A number of judges want to see whether we can do anything in this space.
We will test the proof of concept. As is the case with a lot of proofs of concept, we have carried out about nine and only two will go into production. There is often overhype in AI, so we have to thoroughly check the outcomes. Fairness in procedures has to drive everything we do. If there is a potential to undermine the court, such as a risk of mistranslation where something subtle goes through and does not deliver the correct outcome we all want, then it should not be used.
However, we will go through it and test it thoroughly.
Ms Angela Denning:
Údarás na Gaeltachta is building a model for translation from English to Gaeilge and vice versa. I have seen it demonstrated. It has been able to deal with dialects from different parts of the country and very strong country accents. It has been able to cope with all that. I hope that, as with many State agencies, we will be able to piggyback on some of that work for Irish–English translation.
On the question of trust and equality of justice, we provide tools to the Judiciary in the same way that, for years, we provided pens and paper. We do not tell the Judiciary what to do with the tools. The same applies to the tools under discussion. The guidelines are very clear. At the end of the day, the judge continues to make the decisions. I do not envisage equality of justice changing. I envisage that some of the tools may assist people with their understanding of the court process, for example. One of our statutory obligations is to provide information on the courts system to people. We have done a lot of work on that so people will understand what will happen when in court. Our website got a score of 100% from the National Disability Authority in the most recent audit. The Courts Service was one of five public bodies to get it.
Again, it is about people having the relevant information before going to court. Their knowing what the court can and cannot do for them is really important. At the end of the day, the decision-making stays with the Judiciary. This is where the equality piece is really important. We need to make the courts accessible.
Johnny Mythen (Wexford, Sinn Fein)
Is there hard evidence from other countries that costs have been reduced by using AI?
Johnny Mythen (Wexford, Sinn Fein)
We should know that, in the courts, we do not assume anything.
Johnny Mythen (Wexford, Sinn Fein)
Ms Denning is saying one of the main benefits would be a reduction in costs for citizens. Is there any evidence of that from other countries?
Ms Angela Denning:
It is a case of getting good information from a trusted, reliable source. We want our website to be the number one source of information for people making inquiries about court. If good, reliable information can be got for free, it reduces barriers for people and then they can make decisions about how much they want or need to spend.
Johnny Mythen (Wexford, Sinn Fein)
On that, other countries have online dispute resolution for small claims.
Johnny Mythen (Wexford, Sinn Fein)
Is there a blockage in the Irish system in that regard?
Ms Angela Denning:
I do not think it is a blockage. We are not comparing like with like all the time, so the reality is that we always have to go back to the fact that we are a common law jurisdiction. Civil law jurisdictions are very much rule-bound, and it is easier to apply technology where there are strict rules. In Germany, for example, there is a system in place for airline claims, arising from Frankfurt Airport being a hub. We would be interested in where countries have used the technology. We spend a lot of money on translation and interpretation. Spain uses technology in this regard very effectively. We would certainly consider that. Singapore has speech-to-text technology available. Slovenia also has speech-to-text technology; it almost gives subtitles and you can pick your language. That type of thing would be of great assistance to court users and could save the system money by obviating the need to have interpreters available for every small procedural matter. For trials, we will still have our interpreters available, and indeed interpreters for the legal teams. However, it is about the type of thing I have referred to.
On online dispute resolution, Australia has technology called Amica, which is for family law disputes. The first thing it asks you is whether you are amicable.
Malcolm Byrne (Wicklow-Wexford, Fianna Fail)
Does anyone answer “No”?
Ms Angela Denning:
It drafts papers for people who are amicable and actually drafts the court applications for them to have orders made to settle their affairs where they have made agreements between them. Again, however, you have to be very careful. Domestic violence agencies say to us that we need to be really careful about coercive control, for example, in these situations. However, I see potential.
Johnny Mythen (Wexford, Sinn Fein)
Does Ms Denning see the abduction resolution model as a good model?
Johnny Mythen (Wexford, Sinn Fein)
This is a model that basically says AI will pick out the best preference in a situation and put it forward, but it might not exactly be the best preference, and there might be two or three choices.
Johnny Mythen (Wexford, Sinn Fein)
I will give an example. In America, the judicial system used a sentencing programme. There were black people and white people, and black people were receiving harsher sentences for certain offences. That is the point I am coming at.
Dr. Owen Harrison:
Under the EU AI Act, it is not legal to use automated sentencing. It falls under the prohibited section, so it cannot be used. The correctional offender management profiling for alternative sanctions, COMPAS, software, which has been in use in the US for many years and relies on profiling, is blocked. We are well aware of it and of how it can subtly ingrain bias that only becomes apparent over time. Any use of AI, therefore, would have to be in a measured way to make sure bias was not being introduced without it being noticed.
Johnny Mythen (Wexford, Sinn Fein)
I thank the Cathaoirleach.
Malcolm Byrne (Wicklow-Wexford, Fianna Fail)
I thank Deputy Mythen. For our guests who have just joined us in the Public Gallery, we are exploring how we may use artificial intelligence within our court system with the Courts Service. A number of colleagues are online, but I do not think any of them are in Leinster House at the moment. They might signal to me if they are. If not, it then falls to me to ask my questions.
I again thank our witnesses. We see artificial intelligence as a tool. It is not a goal in and of itself. It is a tool that will allow us to see whether it can enable us, in this case, to deliver justice more effectively and efficiently. Certainly, in the case of sentencing, which is prohibited, we would have concerns.
In the context of the administration of justice, though, there are certainly many improvements that can be made. I would like the witnesses to talk a little bit more about case management, for instance, where AI may be used. One of the regular criticisms of the court system, as the witnesses will appreciate, is how long a case takes to get to court.
Additionally, in the area of stenography, there is the actual recording of court cases.
Judges using AI for their research was mentioned. One of the challenges, or issues, often raised concerns consistency of decision-making and sentencing. Can we see a situation whereby AI is used by judges in their research to look at case law? It was mentioned that it is being used on a pilot basis. What are the circumstances in which that would happen?
Ms Angela Denning:
I might start with the last point first. On the research aspect, the judicial guidelines are very clear. They are available on the Judicial Council website, if the Cathaoirleach wants to have a read of them. They are very clear that AI should not be used for core research or core legal analytics. Now, that is not to say that practitioners are not using it. I can see that there are definitely use cases where it is being used for legal drafting, first-cut submissions and so on.
Malcolm Byrne (Wicklow-Wexford, Fianna Fail)
In the UK, a recent LexisNexis study found that 50% of barristers there admitted to using generative AI for research purposes. There is no data here that I am aware of. Would Ms Denning think that it is probably a similar figure in Ireland?
Ms Angela Denning:
The reality is that it is a useful tool. For example, if a practitioner has always practised in a particular area, using AI to summarise cases, where the practitioner knows what the cases are about, is a very smart use of a tool, in the same way people have taken to using cars rather than cycling from A to B if it is a long distance. We can look at it and see that there are very practical uses where it can be used well. The danger, of course, is when somebody who does not know the area of law asks a machine to draft something for them and perhaps does not ask the right question. That is where we get into real trouble. When they read the submission back, they do not know whether it is right or wrong. Human involvement is key here. I think that is the piece about using it for research. Of course, people will use it for research. So much information is available now to practitioners, judges and my staff because of the Internet.
Traditionally, people got a small folder of papers for a court case. Now, the sky is the limit in terms of boxes of papers. We have to be able to provide tools to judges to go through all of that in order to find information. At the moment, there is an inequality of arms because the big firms are using AI. They are very clear that they are using it and they tell us what they are using it for. As regards case management and document management, it has been used for e-discovery for a long time. Judges have to be able to filter through all of that. People are asking why cases are taking longer to decide. The amount of information has increased dramatically with the advent of the Internet.
Malcolm Byrne (Wicklow-Wexford, Fianna Fail)
Would it be fair to say that in pretty short order, we are going to see judges using generative AI? As Ms Denning said, they are still going to make the decision but as regards case management and research within a closed-loop large language model, it is probably fair to say that we are moving in that direction.
Dr. Owen Harrison:
The committee will be aware that, under the EU AI Act, Annex III is not in force at the moment and there is now talk of delay. That annex would classify that type of use, for judicial research and fact interpretation, as a high-risk system. Knowing what is required to put a high-risk system in place is critical to enabling that and to making sure it aligns with the EU AI Act. I imagine that, in time, the answer has to be that responsible use by judges will be a factor. The guidelines are clear. The recommendations in the judicial guidelines that speak to this say that the systems should not be used for core research and legal analysis. However, for scenarios where the judge knows the legal situation and the law in detail and just wants summarisation, rather than having to do the drudgery of the administrative task, it is fine to use it. We can see this maturing over time. As the EU AI Act finally lands on a position, it will help clarify for everyone, including the judges, how they can use it.
Malcolm Byrne (Wicklow-Wexford, Fianna Fail)
What about case management and stenographers?
Ms Angela Denning:
On case management, there are definitely tools on the market at the moment to assist practitioners with case management. On our side, the new systems we are building incorporate elements of case management. In time, we will be in a position to send reminders out to parties when they are supposed to do something within a particular time and a document has not yet been lodged. We will be able to do all of that.
On stenography, I think I was probably among the last run of people who did typing in school. That is a skill that is dying, unfortunately. We are finding it increasingly difficult to find typists to work as secretaries, for example. Stenography is probably also a skill where the numbers are dropping. For that reason, technology will, of course, move into that space.
Malcolm Byrne (Wicklow-Wexford, Fianna Fail)
The committee is looking at what Ireland needs to prepare for. Will we see a situation whereby in a courtroom, in the same way that Ms Denning talked about it for translation purposes, there will be an AI-enabled stenographer?
Dr. Owen Harrison:
We have a proof of concept that we are looking at and it looks like it might actually go into production use. It is not for external use at the moment, but it may be in the future. It is for staff and members of the Judiciary who rely very heavily on the audio recordings of court proceedings. At present, we have to get a human to do the transcription of that audio recording. With the latest AI tests we have been doing over the last couple of years, we are finding that the word error rates are starting to get down to 3%, which is close to what a human does. Typically, a human trained in that area would achieve around 98% accuracy. We are hoping to bring that technology in. It will be one of those uses of AI for streamlining what is currently a fairly labour-intensive administrative process, as the Cathaoirleach can imagine. That being said, if that becomes trustworthy, we could look at making it more widely available in future from the stenography point of view.
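For context on the 3% figure Dr. Harrison cites, word error rate is conventionally computed as the word-level edit distance between a reference transcript and the system's output, divided by the length of the reference. A minimal sketch, assuming the standard Levenshtein formulation (the sample sentences are invented for illustration and are not Courts Service data):

```python
# Illustrative only: standard word error rate (WER) calculation.
# A 3% WER, as mentioned above, means roughly three edits
# (substitutions, insertions or deletions) per 100 reference words.

def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level edit distance / reference length."""
    ref, hyp = reference.split(), hypothesis.split()
    # Dynamic-programming table for Levenshtein distance over words.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,          # deletion
                          d[i][j - 1] + 1,          # insertion
                          d[i - 1][j - 1] + cost)   # substitution
    return d[len(ref)][len(hyp)] / len(ref)

# One substitution ("the" -> "an") over five reference words: WER = 0.2.
print(wer("the witness took the oath", "the witness took an oath"))
```

On this measure, a human transcriber at around 98% accuracy corresponds to roughly a 2% WER, which is why a 3% machine rate is described as close to human performance.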
Malcolm Byrne (Wicklow-Wexford, Fianna Fail)
Dr. Harrison is saying that it has been proved as a concept. When does he anticipate we will see the introduction of an AI-enabled stenographer in a courtroom?
Dr. Owen Harrison:
I do not know. It depends on how it goes. We are very driven by the results of AI. We measure all the proofs of concept we do thoroughly for accuracy, completeness, bias and all those issues. When they fall foul of them, which, to date, a lot of them have, we do not proceed. Who knows what the trajectory of LLM capability will be. If it continues on an exponential kind of capability growth, it could be very viable in the near future. However, if it does not but, rather, follows an S-curve and plateaus out, as some suggest, it may never get to that quality level. It is difficult to predict.
Darren O'Rourke (Meath East, Sinn Fein)
I am sorry that I am caught between two committees, as is often the case. I thank the witnesses for their contributions. A recent edition of The Bar Review gave some international examples, predominantly from the United States, of cases where AI hallucinations were presented in the case either for the defence or the prosecution. That is a significant concern. The AI revolution may be to the text-based world what the combustion engine was to the manual-based world. With so much text and information out there, what are the checks and balances within the Courts Service to protect people from such hallucinations? I will start with that.
Ms Angela Denning:
At the end of the day it is the judge who makes the decision. We have had some cases already where judges have asked people if they had used AI to do something because it did not make sense to them. The cases we have seen reported internationally have been where somebody clearly did not know anything about the area of law and used a tool without asking the right questions. They took the response, did not validate it or do their homework on it afterwards and presented it to a court. Of course, the judge sitting there knew the law and the area of work and said "That is wrong". At the end of the day, that is where the checks and balances finish. They finish with the human set of eyes on things.
Darren O'Rourke (Meath East, Sinn Fein)
Would Ms Denning be confident that the system is sufficiently robust to always have zero error? I suppose there are courts of appeal in that case-----
Darren O'Rourke (Meath East, Sinn Fein)
I am shocked to hear that.
Darren O'Rourke (Meath East, Sinn Fein)
I hear that 100%. Does Ms Denning think there is a different set of considerations in relation to the power of AI compared with anything that has gone before?
Dr. Owen Harrison:
Being slightly speculative, as a potential defence against that kind of injection of nonsense into the court, at a volume and scale never seen before, AI may have a place in detecting anomalies. Just because AI generated something does not mean another AI system could not detect that. Citations of well-known cases that do not exist at all are being relied on, and it will be difficult for any human to pick all of those up. The judicial AI guidelines that are out there speak to that and try to brief judges on how to catch people out in practice. If judges see submissions with a lot of US spelling, that might raise their hackles and prompt them to interrogate the matter in court, check whether AI was used and interrogate further. This is why we were saying before the Deputy arrived that we are trying to focus on digitalisation.
If the paper can be put into digital format and if we can have digital submissions, we can start to use AI to try to counter the threat of AI.
Darren O'Rourke (Meath East, Sinn Fein)
Someone gave the example of a telephone scam where it is obvious it is wrong, such as the prince or whatever, compared with the ones that sound realistic where the scammers have additional information about me or the wider context. I presume we are working on the basis that this stuff is going to get more sophisticated and advanced. I am taking from the Courts Service argument that it is the quality of our Judiciary and the oversight, but also potentially the use of technology.
Dr. Owen Harrison:
We have to be open. Deepfakes will be used in courts all over the world. Anyone on the street now can see that deepfakes are extremely convincing and almost impossible to tell apart from genuine footage. It has gone from those old videos three years ago of Will Smith eating spaghetti, which looked bananas, to now, where they are absolutely photo-realistic. We therefore have to be open to using technology as a tool to defend against the technology polluting the space.
Darren O'Rourke (Meath East, Sinn Fein)
What about the budget and the capacity in the Courts Service to respond?
Ms Angela Denning:
I am grateful that, since we started on our modernisation programme about five years ago, we have been supported to digitise and replace systems. We have replaced almost all our civil systems at this stage. We have replaced almost all our family law systems; District Court family law is to be finished this year. That gives us a firm foundation, which is important because we can make sure our data is right and, after that, we can see what tools would be appropriate to use. However, until we get the paper out of the system and the data into a state we can rely on, we have a lot of work to do.
Darren O'Rourke (Meath East, Sinn Fein)
This is a wider point and might not be for the witnesses to answer. I have heard that some of the legwork or analysis of legal papers that would have been done by graduates or junior professionals is now being hived off to AI tools. Does the Courts Service have any wider concerns about quality and sustainability or the role AI will fill?
Ms Angela Denning:
We had to deal with this issue when we introduced the jury summons office in Castlebar. I said at that time that no one joined the Civil Service to open envelopes for a living, so we brought in an envelope opener and it saved a huge amount of work. No one wants to do that type of drudge work and if the tools can be used to make the work more interesting, people will move to it. What I will also say for the professions is that, during my time in the Courts Service, the professions have adapted time and again. There are areas of law now that are significant areas being litigated before the courts that did not exist previously. Family law did not exist 50 years ago. The courts very much reflect the life we all live and what is going on in society. There will always be new areas of work and new ways to do it and the professions will adapt.
I can understand the concern. People need to know how to do that basic research work. We all studied research methods in college and there is a reason for that; it is so we can understand what we are looking at and are able to analyse and interrogate it. They are basic skills people will still need. When people are asked a question on their feet by a judge, they need to be able to answer it.
Darren O'Rourke (Meath East, Sinn Fein)
It is not just about having the skills, but also having the knowledge and being able to apply them.
James Geoghegan (Dublin Bay South, Fine Gael)
I thank the Chair. I thank the witnesses. It has been an interesting discussion. They are probably aware of what happened in the International Criminal Court, where it was at least alleged that a judge, because he was sanctioned by the US Government, did not have access to his email system, which was operated by a US provider. To what extent is that a new issue for the operating system of the Courts Service?
Is that an issue the Courts Service is considering in the context of sovereignty?
Dr. Owen Harrison:
It is an issue that all public service bodies are facing. It is certainly something I talk about with my chief information officer colleagues across the public service. We are very dependent on Microsoft. I saw the particular case the Deputy mentioned. What we always have to do is look at the way out of that dependency, if it ever came to pass that we needed to exercise such a route, understand our data and make sure that our data is in a place and situation where we can reconstruct the applications and services that are built on top of it. It is a risk. It would be very expensive to plot our way off certain vendors. We are very conscious of this in regard to AI. We are trying to make sure that we have a plug-and-play mentality. LLMs are becoming commoditised to an extent. We are trying not to bake vendor lock-in into our processes and systems. We are very conscious of geopolitical situations and trying to make sure that, as we go, we have ways of switching between vendors realistically and practically. It is a worry.
James Geoghegan (Dublin Bay South, Fine Gael)
The Courts Service is using AI on a pilot basis. Digital audio recording was mentioned. The figures were interesting. If I understood it correctly, it was stated that there is a 3% error rate and a 2% human error rate. It was also stated that, in essence, AI systems can transcribe every digital audio recording of proceedings in the courts, practically to the same level as humans.
Dr. Owen Harrison:
It is close. It depends. In the worst scenarios, where it is a very noisy court - I have done the testing myself and was roped into the experimentation - a human can barely make out what is being said because it is basically a lot of people talking over each other. In a clear court, such as the superior courts, where it is more controlled and the audio is better quality, like here, it would be almost 100% accurate at this stage.
To go back to the Deputy's previous question, the technology we are using there is different. It is not Microsoft technology. We are trying to diversify what core technologies we are using under the hood for that. The word error rate, at its worst, is around 7%. In a normal environment, it is 3% right now.
James Geoghegan (Dublin Bay South, Fine Gael)
The transcription of digital audio recordings can be and is used as evidence in subsequent court hearings. The integrity of that transcription, even leaving the audio aside for a moment, is about as important as it gets. If it is to be used, what is the threshold? Is there a threshold that has to be met for AI-transcribed digital audio recordings as distinct from human-transcribed digital audio recordings?
Dr. Owen Harrison:
The way we are rolling it out is very careful in that you can only get access to the transcription on a screen that shows the actual recording itself. It is designed so that you can search and see the transcription, but you are hearing the audio as well. It is ensuring that a human is in the loop. Because of that concern, what we do not provide in the tool at the moment is the export of an adapted transcription that could be taken away in a Word document. At the moment, we are not using it for that purpose. If it is used in a trial and is depended on, we use traditional transcription. What we are mainly using it for is a judge or member of staff who needs to access the audio recording. They are typically trying to find and confirm what they heard in court, but the court recordings could be three hours long and it takes ages to find the particular case. There could be 100 cases in a busy day or call-over. To actually find it, even taking into account the 3% error rate, you can do a text search, get down to it and then listen. You can see the text. We are trying to keep it combined so that you are not taking a transcription away.
I do not know what the word error rate would be, but it would have to be at least as good as an average human transcriber would do before we would start to rely on it fundamentally. We will have to go bit by bit and keep trust in the process.
James Geoghegan (Dublin Bay South, Fine Gael)
Are AI-transcribed digital audio recordings already being used by judges-----
James Geoghegan (Dublin Bay South, Fine Gael)
-----in the preparation of their judgments?
James Geoghegan (Dublin Bay South, Fine Gael)
I misunderstood Dr. Harrison. What are the circumstances where-----
James Geoghegan (Dublin Bay South, Fine Gael)
Is it that a judicial researcher has access to it? Is that the point Dr. Harrison is making?
James Geoghegan (Dublin Bay South, Fine Gael)
Okay. It is not the case that you could obtain the transcript as a practitioner, that it would be AI-transcribed and that could be-----
James Geoghegan (Dublin Bay South, Fine Gael)
-----entered as an exhibit, affidavit or something else.
James Geoghegan (Dublin Bay South, Fine Gael)
Dr. Harrison referred to an error rate. Hallucinations are obviously different from human errors. How does he assess that risk of hallucinations as distinct from typos and whatever else?
James Geoghegan (Dublin Bay South, Fine Gael)
As we use more AI in our systems, or at least pilot its use, does it increase the level of cyber-risk protection that the court system needs? We talk about larger fears of agentic AI being used by malign actors and seeping its way into systems. I am presenting a science fiction novel here, to some extent, but that is the human, gut-level fear about this. How does the courts system, one of our most important institutions, protect itself from those kinds of fears, however unrealistic those fears might actually be?
Dr. Owen Harrison:
That is a multifaceted question. I will try to answer it quickly. Agentic AI is the ability for AI to do real stuff. As the committee knows, we very much monitor that and do not have it enabled internally in the courts. You would have to be very careful because that is full automation. People who follow the space will have seen a tool called OpenClaw, which automatically does a lot of stuff. You would want to be insane to release that into a public body to do automatic stuff without very strict guardrails. We do not have it deployed at the moment. Everything we use AI for is non-agentic and we control how it is used.
The biggest threat from a cybersecurity point of view is the use of AI to attack. The fastest growing use of AI is on that malicious side of things. Attacks are becoming increasingly sophisticated. AI is automatically scanning for vulnerabilities and automatically scanning the systems it knows you have used. That needs standard, robust defence-in-depth tooling of the kind public bodies typically have. We invest heavily in cybersecurity. We are grateful to the Department for ring-fencing a significant amount of money for us to put into cybersecurity defence every year. We just have to keep doing it and keep being vigilant.
Johnny Mythen (Wexford, Sinn Fein)
Are there any cases where AI-generated legal correspondence was challenged in the Irish courts? A High Court judge said recently that correspondence does not necessarily have to be written by humans.
Ms Angela Denning:
I am not aware of any cases, but that does not mean it has not happened. I am aware of another public body that has now asked people to fill out a particular form if they want to make a complaint to the body. The reason was that an AI tool was not just providing people with the information about the State agency but also continuing on and asking if people wanted it to make the complaint. It was generating the complaint. It was not the human making the complaint. It was the system making the complaint. We spoke about that territory.
We have to deal with everything that comes in. It is about having human eyes on what comes in.
I cannot assume a document is not valid just because somebody used AI to generate it. That does not mean it is not a valid document. If the person signed it, then they chose to sign it, in the same way a person swears an affidavit or signs a contract. It does not matter what tools they used to generate it; that is their signature and it is a valid document. We cannot look behind it.
Johnny Mythen (Wexford, Sinn Fein)
Deputy Geoghegan will like this question. What is Ms Denning’s opinion of lawyers using AI-driven predictive analytics to research the history and preferences of judges in order to win a case?
Ms Angela Denning:
Anyone who has been in courtrooms long enough, or has been practising in a particular part of the country, will get to know this. We all get to know the humans we work with every day. We get to know people's likes and dislikes. I do not think we need technology to help us with this. Anybody who is in and out of a courtroom every day would know how people like cases to be run.
Johnny Mythen (Wexford, Sinn Fein)
What if people are in a different jurisdiction or going to a different area? Obviously, AI is used.
Johnny Mythen (Wexford, Sinn Fein)
I am just putting it out there, that is all.
Ms Angela Denning:
I am not saying somebody will not think that is a great tool after hearing this and go off and design it. The reality is that young barristers ask their colleagues in the bar room every day in courthouses around the country what approach a judge takes to different types of cases. These are the conversations that happen. I am sure people coming in here use AI to run over previous transcripts to see what kind of questions committee members ask.
Johnny Mythen (Wexford, Sinn Fein)
The Judiciary has a great deal of autonomy in the adoption of AI technology. How far does Ms Denning see this going in future? How far does she see it perhaps replacing the work of the judges, or maybe the entire Judiciary itself? What percentage of AI use is acceptable?
Ms Angela Denning:
From what I know, I do not see that happening. Trials and work have been done internationally, and UNESCO has a report that emphasises AI can be an assisting tool but cannot replace the decision-maker. Our guidelines are very much aligned with guidelines internationally. Judges and organisations have considered this question internationally, and, at the end of the day, it is about trusting the system. We are very lucky in this country that we have high levels of trust in the courts and in the Judiciary. One of the reasons we have that is that we can walk into a courtroom anywhere in the country and watch what goes on. That is very important for trust in the system. The guidelines published certainly emphasise that aspect and that the judge remains the decision-maker.
Johnny Mythen (Wexford, Sinn Fein)
I thank Ms Denning.
Naoise Ó Cearúil (Kildare North, Fianna Fail)
I want to go back to people submitting affidavits, or whatever it might be, that have been generated using AI and that have been signed. People's signatures are on these documents. I have no doubt this is happening. Is there evidence that court documents submitted have had hallucinations in them, that is, made-up facts and figures, regardless of people's signatures on the documents? Is this something the Courts Service has identified?
Naoise Ó Cearúil (Kildare North, Fianna Fail)
Okay.
Naoise Ó Cearúil (Kildare North, Fianna Fail)
That is fine, but-----
Naoise Ó Cearúil (Kildare North, Fianna Fail)
I was just about to say that.
Ms Angela Denning:
-----how somebody came to draft a document. I can see where the tools might assist lay litigants, for example, where they cannot get or have not got a solicitor, for whatever reason. It may, in fact, assist people to put a bit of shape on what they are thinking. People have to be careful, though, and know what they are doing. The affidavit they sign has to match the facts of the case and the law they are relying on. The danger is where somebody who does not know anything uses the tool and then does not fact-check or double-check what the system has provided to them. That is the really important bit.
For anybody who goes to court and is relying on an affidavit, that is sworn evidence. They have to be prepared. It is the same as if somebody went into the witness box and gave that evidence. They should know what is contained in the affidavit and be happy to stand over it.
Naoise Ó Cearúil (Kildare North, Fianna Fail)
One thing that consistently arises is the backlog in cases being dealt with. I appreciate that most people in the legal profession will say that we need more judges. I am sure Ms Denning is probably of the same opinion, but I do not want to put words in her mouth. Has the Courts Service looked at ways of utilising AI to reduce the backlog by dealing with cases in a more efficient way, while ensuring that there is no barrier to accessing justice and that there is no bias built into any of the systems?
Ms Angela Denning:
We will start with the word "backlog". When it comes to comparisons, there is a system: the European Commission for the Efficiency of Justice compiles a report on waiting times in cases across Europe every year. Because we have a common law system, cases here are not managed in the same way as they are across the rest of Europe. In the context of the use of the word "backlog", we have a working group of judges, of which Dr. Harrison is a member, which is looking at defining what constitutes a backlog and what is just part of the normal progression of a case. You issue your proceedings and the other side has a certain number of days to do whatever. A time limit is automatically built into cases. That is not part of a backlog. It is a question of how we find a point in time at which cases have fallen into what could be called a backlog.
Naoise Ó Cearúil (Kildare North, Fianna Fail)
When they have overrun.
Ms Angela Denning:
When they have overrun. It is about how we find that point in time. That is about us having good data and we cannot have good data until we have good systems. As we are replacing our IT systems with this single case management system, we are able to identify points where things should be happening and they are not. For the first time, we are, for example, able to better measure the duration of a case. Instead of saying that we think cases are taking three years or however long it might be, we can tell from start to finish how long cases are taking and how many adjournments are involved. We have that type of information. As that information gets better, we will have better data for Dr. Harrison's group to be able to work on.
Naoise Ó Cearúil (Kildare North, Fianna Fail)
Will that information be made public?
Naoise Ó Cearúil (Kildare North, Fianna Fail)
Something that struck me was when Dr. Harrison spoke about cybersecurity. Notwithstanding the acknowledgement of the additional funding provided for cybersecurity for the Courts Service, as we move towards more digitalisation and the adoption of AI within the service, from Dr. Harrison's perspective, what are the particular weak points that exist, or that might exist in the future, which he thinks the service will have to address?
Naoise Ó Cearúil (Kildare North, Fianna Fail)
We can speak broadly.
Dr. Owen Harrison:
Generally speaking, in the public service, it is probably access to the skills and expertise. It is a kind of arms race you are involved in. On one side are the attackers, who may or may not have deep pockets. There is also the commoditisation of the tools the attackers use; in other words, the cost of using those tools to set up an attack is falling. Where there used to be a high degree of skill needed to conduct attacks, it is becoming easier to deploy them. The entities involved are getting so sophisticated that I have read there is even a help desk you can call when paying for ransomware attacks and the like.
We will have to start to use AI more and more as part of our cybersecurity defences in order to be able to deal with the arms race, because the industry moves very quickly to exploit vulnerabilities. We have to have the skills to deploy those tools in the correct way. Broadly speaking, one of the challenges is to make sure that you continue to have the right skills and expertise available, not just contractual expertise but also internal staff. You also have to make sure those staff are continually trained and that they maintain their skills. Engaging with the National Cyber Security Centre is critical.
It provides guidance and training skills. We have a cybersecurity group in-house that oversees all cybersecurity aspects. We are getting all of the members of that governance group to do a training course organised by the National Cyber Security Centre. There are many different types of weaknesses - technical and so on - but probably the biggest defence is to keep investing in our people so we have the right people to defend us.
Naoise Ó Cearúil (Kildare North, Fianna Fail)
There is a wider conversation about the difficulty in hiring people with technical expertise, particularly in cybersecurity and AI in the public services in general. There is a big question around remuneration of that technical expertise in the public sector. That needs to be reviewed because it is competing against not just nefarious characters but private industry and other countries to try to get that talent. I wanted to put that on the record. The witnesses do not have to agree or disagree. I assume it is something they would be in favour of.
Malcolm Byrne (Wicklow-Wexford, Fianna Fail)
This has been a very interesting discussion. I do not think we are moving into a cyborg Judiciary or a "Minority Report" film-type situation but it is clear we are moving to a space where judges who use artificial intelligence will replace judges who do not. Similarly and more broadly within the Courts Service, that will be the case. When I was a law student, it was always a case of going to the library, trying to pull down the relevant journal and find the case and so on. It strikes me that a closed-loop large language model would be beneficial. It would provide access for judges and lawyers to the approved data because decisions are made based on the data provided. Has any work been done in that regard, in particular for judges or practitioners?
Ms Angela Denning:
There are companies that provide those services and very solid research databases that have all the cases and can be filtered in various ways to reduce that effort of going into the library and taking many books off the shelves to try to find what you are looking for. That is available and we provide those tools to judges and legal researchers.
Malcolm Byrne (Wicklow-Wexford, Fianna Fail)
Having read the guidelines Ms Denning set out, to what extent are they being used already?
Malcolm Byrne (Wicklow-Wexford, Fianna Fail)
I was struck by what Dr. Harrison said about cyberattacks and that this is occurring right across the public service. Will he give a rough ballpark figure for what the Courts Service currently spends on cybersecurity to meet the challenges? Have the sources of the cyberattacks been identified? Are they actors outside the State? Are there State actors in some instances, potentially?
Dr. Owen Harrison:
Roughly speaking, just north of €1 million a year is dedicated. That is purely for the straight cybersecurity defence spend but a lot of our systems reinforce that. We have a bigger budget than that and it all contributes to defending us. We also have a dedicated cybersecurity team in-house and outsource to vendors to top that up to have a 24-7 defence. On the sources we detect, it is truly global. There are certain areas of the world which have extremely high activity as regards attacks. Typically, the further east you go, the higher the rate of attacks.
Malcolm Byrne (Wicklow-Wexford, Fianna Fail)
Will Dr. Harrison name those jurisdictions?
Dr. Owen Harrison:
Russia and China are the particular ones. They are demonstrable; we see it in the logs. A huge number of attacks are coming in constantly. Whether they are state actors, we do not know. We see in the networking kit and cyberdefence system that it is constantly under bombardment looking for weaknesses.
Malcolm Byrne (Wicklow-Wexford, Fianna Fail)
Why does Dr. Harrison think it is China and Russia in particular attacking the Irish Courts Service?
Dr. Owen Harrison:
I think they attack everyone. That would be my opinion. I do not think we are particularly special. They will attack any public or private body. Some bodies are more susceptible and attract more attacks than others. Perhaps it is financial and a matter of where they think there are deep pockets. If you attack a State body, it might be more likely to pay out, although State bodies typically do not, as far as I am aware. I do not know whether their choice of targets is well informed.
Malcolm Byrne (Wicklow-Wexford, Fianna Fail)
I will ask about the Courts Service, its own use of AI and the development of policy as an organisation.
Ms Angela Denning:
We have a framework in place that sets out how AI is to be used. It makes sure we are compliant with the EU AI Act and that we have a step-by-step process. If you want to deploy AI, there are steps you have to go through and approval you have to get and so on. We have deployed Microsoft Copilot to all staff and judges. That was a decision we took. The reality is that every student in college is using AI. Those people will leave college and come to work, and AI is part of the kit they use in their day-to-day lives. We had to recognise that people were going to use AI, so we wanted to give them a safe tool that we could manage and ring-fence.
It is very useful. We are about to provide to staff a catalogue of prompts to do things such as putting a document into plain English. We all love our legal language but the reality is that for people to be able to engage with documents, they should be in plain English. We have committed to that and there is a prompt for it. There are also prompts to make sure that if you are doing a presentation or whatever else, it accords with the Courts Service style guide. There are simple things like that on which people spend time. If we can use those tools to speed up the work, why would we not?
Malcolm Byrne (Wicklow-Wexford, Fianna Fail)
The role of the committee is that we make recommendations to the Government on how to roll out this very exciting new technology. We want to encourage innovation but balance it with the necessary safeguards. Is there a recommendation that the Courts Service would like the Government to note in the context of how to adopt AI? I am conscious of the education and training piece that was mentioned. Are there any other specific policy recommendations that the witnesses would like to raise with us?
Ms Angela Denning:
From a CEO's perspective, the training, that literacy piece, is the key thing I would like to see. It is about data literacy and literacy around the systems that people use. That is really important. We are going to keep the humans in the loop, but those humans need to know what they are working with and how it works.
Malcolm Byrne (Wicklow-Wexford, Fianna Fail)
Do the witnesses have any advice for a young solicitor or barrister who is quite frightened about all the talk about their roles being replaced?
Dr. Owen Harrison:
It would be useful for the EU AI Act to give clarity. It has set the cat among the pigeons with the delays. I know it is not necessarily all within our control, but it would be nice if there were clarity from a legal point of view so we know what we are aligning with. We went to a lot of effort. We were one of the first public bodies to introduce an AI policy in June 2024. We introduced a framework last year that was to align in detail with the EU AI Act. There is ambiguity about where that is at now with the digital omnibus package. It would be nice to know what we are playing against so we can align with the legislative framework.
Malcolm Byrne (Wicklow-Wexford, Fianna Fail)
I thank the witnesses. It has been a useful and informative session for all of us. We will adjourn now until 14 April 2026. I again thank the witnesses.