Oireachtas Joint and Select Committees

Wednesday, 5 July 2017

Joint Oireachtas Committee on Justice, Defence and Equality

General Scheme of the Data Protection Bill 2017: Discussion (Resumed)

9:00 am

Dr. Geoffrey Shannon:

I take this opportunity to thank the committee for the invitation to address it on the general scheme of the Data Protection Bill 2017. I had the opportunity of listening to the commentary of Mr. McGarr and Dr. McIntyre and I share many of the views articulated on the general points. I am not going to dwell on the detail of the regulation. I am going to focus specifically on the children's rights issues, many of which have gone unnoticed yet will have profound implications for children, particularly the age of digital consent. Much of what I will say today calls for a much broader consultation on the issues that arise under the general scheme.

As we know, the general scheme of the Data Protection Bill is a crucial step in Ireland's preparation for the implementation of new EU data protection obligations and it provides a much-needed update of existing data protection legislation in this jurisdiction, namely, the Data Protection Acts 1988 and 2003.

The two key items of European legislation reflected in the general scheme are the EU general data protection regulation and the directive on the use of personal data by criminal law enforcement authorities. The general data protection regulation was agreed in 2016 and mandates higher data protection standards for data subjects, imposing increased obligations on data controllers and processors. It focuses on reinforcing individuals' rights, ensuring stronger enforcement of data protection rules and streamlining international transfers of personal data. A point made in the last session is one with which I agree. As a regulation, the GDPR will take effect in this jurisdiction automatically from 25 May 2018 and does not require transposition. Nevertheless, and significantly, the 2017 Bill gives further effect to its provisions and provides for derogations where permitted.

I have identified at the outset of my presentation, for the ease of the committee, five key issues that are central to our consideration. The first is the age of digital consent, which I believe should be set at 13 years of age. The Bill in its current form is silent on this issue. The second is the need for a definition of preventive or counselling services in order that blanket blocking of sites does not prevent access to much needed, and increasingly online, services for young people. The third key issue is the right to be forgotten. I have strongly advocated this right on behalf of young people. We all know that young people sometimes post material online that they later regret. We need to recognise and acknowledge the vulnerability of young people. They need to have the right of erasure and this right is not mentioned in the Bill in its current form. The fourth key issue is the link between data protection rights and digital safety. Many of my reports have highlighted the importance of digital safety. Digital safety is not about creating a nanny state; it is about empowering young people to understand the benefits and downsides of the online world, especially in terms of young people's exposure to cyberbullying. There is another important and profound question in respect of adult data literacy. Much more needs to be done in this jurisdiction. I say this as a person who still has a Nokia phone, which means I am hugely challenged myself in the context of the online world. The fifth key issue relates to the processing of sensitive data. We have seen so much public discussion over the last year on the importance of ensuring that organisations that hold sensitive data protect those data and that sufficient safeguards exist to ensure that citizens' fundamental rights are not breached in circumstances where data end up in the public domain. This is at the very core of the general data protection regulation.

In addition, it would be interesting to know whether children have been canvassed in respect of this Bill, and how they perceive its current form. It must be remembered that children, like adults, have data protection rights under both EU law and the existing Irish data protection regime. Children may not, however, depending on their age and their level of maturity and understanding, be in a position independently to exercise these rights. In this vein, and throughout my presentation, it is necessary to bear in mind Recital 38 of the general data protection regulation, GDPR. I am constrained by time and will not read the provisions but just wish to highlight them. I am very happy to engage in any discussion on the various provisions in the international instruments. This recital states "Such specific protection should, in particular, apply to the use of personal data of children for the purposes of marketing or creating personality or user profiles and the collection of personal data with regard to children when using services offered directly to a child." It continues by stating "The consent of the holder of parental responsibility should not be necessary in the context of preventive or counselling services offered directly to a child." I am particularly concerned about that and the absence of any definition in the regulation of what amounts to "preventive or counselling services". Recital 38 explicitly recognises children as a separate and particularly vulnerable group in society with regard to data protection issues and I believe that it must inform the approach taken in the 2017 Bill in respect of the protection of the personal data of children.

I shall now turn to the digital age of consent, which is a key children's rights issue addressed in the Bill. Part 3 of the general scheme of the Data Protection Bill 2017 sets out the heads of the Bill required to give further effect to the GDPR. Head 16 of the Bill is particularly relevant from a child protection perspective. It concerns the child's consent with regard to information society services and relates to Article 8 of the general data protection regulation, which sets the age under which children require parental consent to sign up to digital services, known as the digital age of consent. Pursuant to Article 8, where a child is below the age of 16 years, data processing shall only be lawful to the extent that consent is given or authorised by the holder of parental responsibility over the child. Member states, however, have discretion to provide by law for a lower age, provided that lower age is not below 13 years. Once the age of consent is set, the data controller is obliged to make reasonable efforts to verify, in the case of children below that age, that consent has been given or authorised by the holder of parental responsibility over the child, taking into consideration available technology. There is no definition of parental responsibility. It is a term alien to Irish law and it needs to be clarified. I shall discuss this aspect later on.

Article 5 of the UN Convention on the Rights of the Child, UNCRC, explicitly recognises that children have evolving capacities and that, as they get older, they have a greater ability to take responsibility for decisions affecting their lives. The aim of the general data protection regulation, in setting a digital age of consent, is to protect young people from commercial online marketing providers, such as social media and gaming platforms. The current situation, whereby the same data practices are used to target teenagers as are used to target adults, is absolutely unacceptable and needs to be tackled.

In head 16 of the general scheme of the 2017 Bill, in its current form, the Irish digital age of consent has not yet been set out. It is silent in this regard. In the explanatory notes to the heads of the Bill, a consultation process on the appropriate age threshold is described as having been completed and it was indicated that the results of this consultation will be submitted to the Government for a decision in due course.

It appears, therefore, that no determination on this critical issue has been made by the legislature at this point in time. I believe that Ireland should take the opportunity now to designate the lowest permissible age - namely 13 - as the age of digital consent for this jurisdiction. This lower digital age of consent has also been recommended by children’s organisations such as the Children’s Rights Alliance. Ahead of this meeting with the committee I took the opportunity last week to discuss the issue with the Ombudsman for Children, who supports my view that the age of digital consent should be set at 13 years of age. A variety of competing children’s rights and practical realities support the argument that the appropriate age, having regard to the permissible age range delineated by the GDPR, should be the lowest age possible.

Members will see in my written submission that I discuss the key rights, which include the right to participate. The right of the child to participate and be heard in proceedings concerning him or her is a fundamental principle of international children's rights law and is enshrined in Article 12 of the UN Convention on the Rights of the Child and in Article 24 of the EU Charter of Fundamental Rights. We heard Dr. McIntyre make reference to the Charter of Fundamental Rights, an instrument that is binding on our courts. We should have particular regard to it. It states that children have the right to such protection and care as is necessary for their well-being and that they may express their views freely, and we need to make sure that happens.

The importance of the voice of the child and the child's right to participate in all matters was promoted recently in this jurisdiction when the Irish people voted in a referendum on children's rights, which must mean something. We also see it reflected in the Children and Family Relationships Act 2015. The focus, however, has primarily been on private family law matters such as guardianship, access and custody. In my view, it is equally relevant in the context of the drafting of new legislation. Ireland needs to do much more in terms of meaningful consultation with children where legislation affects them. That is why I urge such consultation. I am delighted that I now have the opportunity to present a children's rights perspective because, all too often, apart from discrete children's rights issues, the Legislature does not take into account the impact legislation will have on children. Legislation such as this will have a profound impact on children into the future.

The National Strategy on Children and Young People's Participation in Decision-Making 2015-2020, launched in line with the national policy framework, discusses the participation of young people. Its goal is to ensure that children and young people have a voice in their individual and collective everyday lives, and it explicitly acknowledges that giving them a voice in decision-making requires a cross-Government response, with initiatives and actions from all key Departments and agencies.

With the national strategy and the recommendation of the UN Committee on the Rights of the Child in mind, it is unclear whether children have been consulted on the issue of Ireland's proposed digital age of consent. While the explanatory note to head 16 describes a consultation process on the appropriate age threshold, which it states has already been completed, there is no comment in the explanatory note on what this consultation process entailed and, in particular, whom it involved. In light of the child's right of participation, I believe that the views of, at the very least, a focus group of Irish children must first be garnered before any final decision is made on this question.

Information and communications technology and digital media play an integral role in the lives of our young people, as exhibited in the statistics that I have attached to my submission. Instead of reading them out, I will try to assist the committee by providing as much detail as possible on the likely impact of this legislation on children. The committee will see from the statistics how critical it is that children be given an opportunity to have their perspective heard on this legislation. Therefore, I recommend that a consultation process take place to ascertain the views of a variety of age groups of children on the issue of digital consent.

Freedom of expression and freedom of assembly are very important. The right to freedom of expression is a human right that is not confined in its remit to adults. The UN Convention on the Rights of the Child, UNCRC, guarantees a child's right to freedom of expression in Article 13. Further related rights under the UNCRC include the right of access to appropriate information, provided in Article 17, and the freedom to assemble peacefully. Such assembly may take place in an online environment. We need to realise that the world has changed and that these basic human rights are as relevant in the online world as they are in the world to which we are accustomed. That needs to be acknowledged in our legislation. These rights are often exercised by children through their use of information and communications technology.

If members examine the statistics that I have furnished in my advance submission, they will see that so many children spend a great deal of time on the Internet on a daily basis that we need to consider how their rights are protected and vindicated in the online world as much as they are in the offline world. We have been slow to catch up with technology in terms of child protection and children's basic human rights.

In a number of my previous rapporteur reports, I have highlighted the importance of the Internet for children's freedom of information. While there is a genuine need and, indeed, obligation to protect children from the dangers of the Internet, the State must ensure that it does not unreasonably restrict children's civil and political rights, such as the right to freedom of information and expression. The Child Rights International Network, CRIN, identifies instances of Internet service providers being pressured by state authorities to institute blanket filters to block websites containing material that is argued to be unsuitable for those aged under 18 years, even though some of the sites contain material that could be important for the well-being of many under-18s, such as material on sexual education, politics and support groups for alcohol dependency and suicide. Alcohol dependency is an issue on which I have spoken in strong terms in the context of my recent audit report. Society has been reluctant to engage with the issue and the Government has been reluctant to take on vested interests. Now the time is right to deal with the issue and provide supports. Sometimes, those supports can exist online.

The blanket filters to which I referred are arguably contrary to Article 5 of the UNCRC, which deals with children's evolving capacities. Restricting Internet usage for children, for instance by setting the digital age of consent at 16 years, should therefore be approached with caution and the varying rights at play must be borne in mind. The overarching consideration must be whether any such restriction is in the best interests of the child. This is mandated in Article 24 of the Charter of Fundamental Rights of the European Union, which provides that, in all actions relating to children, whether taken by public authorities or private institutions, the child's best interests must be a primary consideration. I believe passionately that, to prevent any infringement of the child's right to express himself or herself freely and to ensure children's access to online information, the digital age of consent should not be set at 16 years but at 13 years, which would be preferable in order to prevent a dramatic reduction in the participation of young people in online services. This is an important point that the committee should take on board.

Due to time constraints, I will move to my next critical point, namely, the definition of preventive and counselling services. The rights of children to participate in matters concerning them, to be heard, to express themselves freely and to access information need to be capable of being exercised effectively by children. On a practical level, therefore, certain realities must be considered to ensure that children are capable of exercising these rights in the context of their online activity and use of digital services. A difficulty may arise in circumstances where the view of the child is not aligned with the view of his or her parents or guardians. For instance, children may wish to access online services relating to sexual health or education, explore LGBT issues or seek support if they are being bullied. Certain service providers in these areas regularly require and retain personal data from the young persons who access their services in order to improve and fine-tune the operation and content of those services. Thus, children's personal data may be processed and retained.

These types of issue may be ones that the children involved, for a variety of reasons, may not be comfortable discussing with their parents or guardians. Children and young people often contact organisations or services in confidence and arguably should be allowed to continue to do so without having to obtain consent from their parent or parents. If the digital age of consent were to be set at 16 years, it would in all likelihood operate to prevent children from accessing these services, something that cannot be said to be in their best interests.

While recital No. 38 of the general data protection regulation, GDPR, specifically provides that the special rules relating to the processing of children's personal data, namely, the requirement for parental consent, should not apply in the context of preventive or counselling services offered directly to a child, whether the variety of service providers envisaged in that context will come within the definition of "preventive or counselling services" is unclear and needs to be clarified. For this reason, I am suggesting that consideration be given to defining "preventive or counselling services" in the broadest possible fashion so that children can avail of support when they need it. I urge the committee to consider this point.

It will be also necessary to provide clarification on whether organisations that provide online support services to children will have to verify the consent of the child's holder of parental responsibility before processing the child's data for not-for-profit use.

My next point is on the related issue of the holder of parental responsibility and what that means. It should be noted that head 16 of the general scheme does not contain any definition of the phrase "the holder of parental responsibility over the child". "Parental responsibility" is a term that is more common in the UK and is not defined in our legislation. This issue arose in the context of the children's rights referendum, when we examined moving away from the terms "custody", "guardianship" and "access", but that has not happened. For this reason, the Bill may be problematic. Who is the "holder of parental responsibility"? The term should be defined to include any parent or guardian of the child, whether automatic or court-appointed pursuant to the Guardianship of Infants Act 1964.

What we saw with the commencement of the Children and Family Relationships Act 2015 was a broad range of family types being taken in from the cold and enhanced rights for a large number of citizens. We saw the creation of temporary guardians, an expansion of testamentary guardians and rights for foster parents under section 6C of the 1964 Act, which means that foster parents, or those who have been looking after a child for a period of one year when no other parent is willing or able to act, can now have guardianship rights. Those who act as substitute parents on a day-to-day basis should be entitled to provide consent. If they are handling the real parenting duties, they need to have commensurate rights. Applying a wide definition to the phrase "holder of parental responsibility" is preferable so as to allow a broader category of persons who may be responsible for a child to give the requisite consent for the child in question.

A further concern relates to the involvement of the "holder of parental responsibility", in that many parents and guardians have lower digital literacy skills than their children. Despite this, the GDPR places the responsibility to manage children's data protection on their parents and guardians where the child is under the digital age of consent. Greater awareness among parents and a more robust information campaign are required. The Department of Justice and Equality produced a good document, entitled "Data protection safeguards for children ('digital age of consent')", but we need to go one step further. This necessitates the development by the Data Protection Commission of appropriate, child-friendly material "which convey an understanding not only of the risks that may arise when personal information is supplied online but also the remedies that are available under data protection law."

I have championed and advocated the right to be forgotten year on year in my reports. We need to consider this matter, particularly in the context of children. In my ninth report, I discussed the right to be forgotten and its importance from the perspective of a child.

The right to be forgotten was held to exist in the seminal Google Spain case. In that decision, the European Court of Justice held that an EU citizen had the right to request that commercial search engines such as Google remove links to his or her personal information where that information was no longer relevant, emphasising that an individual's right to privacy overrode the public interest in access to that information in certain circumstances. Article 17 of the general data protection regulation concerns the right to erasure, known as the right to be forgotten. There is no specific head in the general scheme that gives effect to that article.

The right to be forgotten is not only important for adults, although the public narrative has focused on this dimension. It is probably even more important for children, as they are less likely than adults to be aware that information they post online may be available long term. They may not consider the consequences of posting something online which may last long beyond their childhood. While this is not stated in Article 17 of the general data protection regulation, I suggest that Ireland should take the opportunity to include specific provisions on this issue in the 2017 Data Protection Bill. At a practical level, we know that with increased vetting across the world, searches are undertaken. If a child posts something online, it could turn up in a search and have profound consequences. If there is anything the committee should consider in protecting young people and their vulnerabilities, it is ensuring that the right to erasure reflected in Article 17 of the general data protection regulation is explicitly provided for in the 2017 Bill. As I argued, the relevance for children of the right to be forgotten should be acknowledged. Children should be educated on the matter, and it should be understood that the age at which an individual posted information online is a very important factor in decisions on whether to remove that individual's personal information from sites.

Another issue I have highlighted year on year is cyberbullying and cyber-harassment. These are profound child protection issues. I argue that the Internet is the new child protection frontier. We need to ensure our children are properly protected online. Much more needs to be done in that regard. When we talk about protecting children's data, it is also about protecting children. The general data protection regulation and the general scheme of the 2017 Bill cannot be considered in a vacuum. There are risks associated with young people maintaining an online presence which cannot be ignored. In this vein, the introduction of the Criminal Law (Sexual Offences) Act 2017 is to be welcomed and applauded. The Act demonstrates Ireland’s commitment to better protect its children from online predators and specifically recognises the dangers that come with technological advances by creating a wide range of new criminal offences dealing with child pornography and grooming, with a particular emphasis on the use of information and communications technology in such offences.

A further concern associated with children’s Internet usage is the prevalence of cyberbullying and harassment. Each year when I have raised this issue, I have received a large volume of correspondence from parents saying they are delighted I have done so as their sons and daughters have been affected. The issue needs to be much more prominent on the political agenda. We must take the necessary steps to protect our children as the problem is widespread. One EU study indicated that 21% of children had been exposed to potentially harmful user-generated content such as hate, pro-anorexia and self-harm material. In order to ensure children are protected from cyber-bullying in their online activities and that their personal data are not exploited, regard should be had to the recommendation of the UN Committee on the Rights of the Child that states should "develop effective safeguards for children against abuse without unduly restricting the full enjoyment of their rights". It is notable that Article 6(2) of the general data protection regulation enables member states to "maintain or introduce more specific provisions to adapt the application of the rules to ensure lawful and fair processing".

To further address the issues raised concerning cyber-harassment and bullying, regard should be had to the recommendations I recently made in my tenth rapporteur report, which is soon to be published. I also endorse the recommendations of the Law Reform Commission in its 2016 report on harmful communications and digital safety concerning take-down procedures. When content is put up online, there needs to be an effective mechanism to enable it to be taken down, which would ensure the efficient removal of harmful digital communications. The proposed office of the digital safety commissioner of Ireland would oversee an effective and efficient take-down procedure, regulating a system of timely take-down orders for harmful cyber-communications affecting both adults and children. Alongside the right to be forgotten, there must be a procedure for taking down offensive material posted online in a timely fashion. That is why I am strongly supportive of the recommendations of the Law Reform Commission. The proposal made by the commission regarding the establishment of a new statutory oversight system appears to be a practical and viable solution to the current gap in Irish law. I also recommend that the Government give consideration to chapter 3 of the commission's report to enable progress to be made in this regard and ensure that steps will be taken to establish an office of the digital safety commissioner of Ireland. There needs to be joined-up thinking between the two offices. In light of the publication of the general scheme of the 2017 Data Protection Bill, the office of the digital safety commissioner of Ireland, if established, should be required to liaise with the Data Protection Commission operating pursuant to that Bill. Co-operation between the two bodies would be essential in protecting children and their data.

The provision on the processing of special categories of data is very important. Article 9 of the general data protection regulation concerns the processing of special categories of personal data. I have a significant insight into this issue, as I have spent the last two years conducting an audit of the exercise by An Garda Síochána of its emergency child protection powers, which I am sure will interest the committee. I have looked at over 500,000 fields of PULSE data. It has provided me with a unique insight into this discrete aspect of child protection within the Garda. It is the largest audit worldwide of the exercise by a police force of its emergency child protection powers. It is rich in giving us an insight into how the child protection system operates. By and large, I have found that the Garda goes to great effort to deal with children sensitively in these circumstances. The audit is equally relevant in respect of data protection. When sensitive personal data are gathered, we need sufficient safeguards to ensure they are treated appropriately. That is why Article 9 is important. The processing of special categories of personal data is permitted where it is necessary for the provision of health or social care or treatment or the management of health or social care systems and services, subject to suitable and specific measures being implemented to safeguard the fundamental rights and freedoms of data subjects. I am a big believer in ensuring citizens' fundamental rights are protected. We need to ensure that where there are derogations from the regulation, there are checks and balances. If somebody's sensitive personal data are being used for what are purported to be health reasons, there must be accountability, which must also be the case where there are breaches and data end up in the public domain.

In the general scheme of the 2017 Bill, heads 17 and 18 concern the processing of special categories of personal data. Head 17 permits the making of regulations for the processing of sensitive data where "necessary for reasons of substantial interest", while head 18(1) specifically provides that these categories of sensitive data may be processed where necessary for, among other things, "the management of health and social care systems and services and for public interest reasons in the area of public health". It can be imagined that these exceptions to the prohibition on the processing of sensitive personal data will enable the Child and Family Agency to process such data in carrying out its statutory role. This will inevitably include sensitive data relating to children and young persons.

The 2017 Bill only allows this processing to take place on the condition that suitable and specific measures are adhered to in order to safeguard the fundamental rights and interests of the data subject. As identified in the explanatory notes to head 18, it is as yet unclear to what extent the "suitable and specific" measures referred to in Article 9 and included in the Bill are intended to be additional or complementary to the safeguards already placed on data controllers elsewhere in the general data protection regulation, or whether further safeguards will be required.

I am passionately of the view that consideration should be given to the inclusion of additional safeguards, particularly where a child's sensitive personal data are engaged and are to be processed by an agency such as the Child and Family Agency. This should be explored having regard to Recital 38 of the general data protection regulation and the special protection required therein for the personal data of children.

I thank the members for taking time to listen to me this morning. I am happy to take questions.
