Oireachtas Joint and Select Committees
Wednesday, 20 March 2013
Joint Oireachtas Committee on Transport and Communications
Social Media: Discussion (Resumed) with Google and Digital Rights Ireland
The purpose of this engagement is to hear the views of Google and YouTube on the abuse of social media, including the challenges posed by cyberbullying and online harassment. As I have said before, because social media is changing the way in which many of our citizens interact with one another, we thought it was important that the committee should become aware of the various facets of this topic, particularly in light of the disquiet that has been expressed in some quarters about the lack of curbs on the irresponsible use of these channels. As members are probably aware, the committee has already heard the views of the Office for Internet Safety and the Minister for Communications, Energy and Natural Resources, Deputy Pat Rabbitte, and has held meetings attended by representatives of the social media networks Facebook and Twitter. We have also met the National Anti-Bullying Coalition and will hear from Digital Rights Ireland later this morning. In addition, we are taking submissions in writing via social media channels from interested groups and individuals. At the end of that process we propose to prepare a report and make recommendations to the Houses of the Oireachtas. On behalf of the committee I welcome Ms Sue Duke, head of public policy, and Ms Sabine Frank, media literacy policy counsel, from Google.
By virtue of section 17(2)(l) of the Defamation Act 2009, witnesses are protected by absolute privilege in respect of their evidence to the committee. However, if witnesses are directed by the committee to cease giving evidence on a particular matter and they continue to do so, they are entitled thereafter only to a qualified privilege in respect of their evidence. Witnesses are directed that only evidence connected with the subject matter of these proceedings is to be given and they are asked to respect the parliamentary practice to the effect that, where possible, they should not criticise or make charges against any person, persons or entity by name or in such a way as to make him, her or it identifiable. I also wish to advise that any submission or opening statement submitted to the committee will be published on the committee's website after this meeting. I remind members of the long-standing parliamentary practice to the effect that they should not comment on, criticise or make charges against a person outside the House or an official either by name or in such a way as to make him or her identifiable. I invite Ms Duke to make her opening statement.
Ms Sue Duke:
I thank the Chairman and members of the committee for the opportunity to appear before the committee to discuss how we can ensure the best social media experience for everybody and how we can work with the committee on this important issue.
I work on public policy at Google in Dublin. I am accompanied by Ms Sabine Frank from our Berlin office, who is our expert on digital literacy. We have divided our presentation into three main parts, the first being the benefits of the Internet and social media platforms. I will then give a brief overview of what YouTube is doing to ensure people stay safe and secure online and I will offer a couple of thoughts on where we think we can all work together in this area.
Before I begin, I will give the committee a quick introduction to Google Ireland. Google celebrates ten years in Ireland this year. During the course of that time we have grown from five Googlers in 2003 to more than 2,500 Googlers today. Our recent investment of almost €300 million in our headquarters buildings just down the road in Barrow Street as well as our new data centre in west Dublin has consolidated our long-term commitment to Ireland and will be the basis for future growth and expansion. Much of the work we do from Dublin focuses on helping companies, big and small, to get the most out of the new business models and new opportunities presented by the Internet. I will detail precisely what those new opportunities are.
In the past decade the innovations brought about by the Internet have revolutionised communications and the spread of information across the globe. The Internet has been a phenomenal force for good, promoting culture, boosting economic growth and advancing free expression. Social media platforms such as YouTube have been at the heart of the Internet's success in generating economic value, creating employment and fostering innovation and creativity.
What is YouTube? For those who are not familiar with it, YouTube is our user-generated video-sharing platform, around which users debate, form communities, interact with and inspire one another. YouTube began in our founder Chad Hurley's garage in California in 2005, when he posted the very first video of his friends at the zoo. The following year, Google acquired YouTube, and since then the platform has enjoyed enormous success, becoming the world's biggest video-sharing platform today. The slide on the screen will give the committee a quick snapshot of the story of YouTube and some sense of the scale of content we are dealing with. Seventy-two hours of content are uploaded to YouTube every minute, and there are 4 billion views on YouTube every day. In 2011, we had more than a trillion views on the platform, which equates to about 140 views for every person on the planet. YouTube has enabled creativity and collaboration on an unprecedented scale, allowing the young, the young at heart and the old to engage with communities both locally and globally, to express themselves and to get involved in society. For example, on YouTube today one can submit an idea for a science experiment to Stephen Hawking. One can launch one's music career, as Justin Bieber did, or start a business from one's bedroom. Cork man Jason Sullivan began uploading cartoons last year to YouTube and after phenomenal success on the platform, Sminky Animation, as shown on the screen, has gone on to sign a contract with United Agents in London. The Khan Academy, with which many members may be familiar, has pulled together more than 3,000 educational videos on YouTube, with students around the world viewing up to 200,000 videos every day on topics ranging from calculus to arithmetic and from prehistoric Africa to the Second World War. During the past weekend, members may have seen how RTE used the platform to promote Ireland at home and abroad. It live-streamed two main events: the full St. Patrick's Day parade, and, as part of The Gathering, a one-hour music stream from Áras an Uachtaráin, which showcased a host of Irish talent from Christy Moore to the Frames to the Script.
As with any means of communication, there can be challenges. I am aware that the committee has discussed at recent hearings how social media platforms reflect older and larger societal problems with regard to how we treat one another. In order to address these societal problems we believe we need societal solutions, but we also believe that technology and technology companies have a role to play in ensuring they keep their users safe online.
I turn now to how YouTube is playing its part. There are three pillars to our approach to user safety at YouTube and, more generally, at Google - education, empowerment and protection. At YouTube we are deeply committed to educating young people about how to use the Internet safely and responsibly and to raising awareness about digital citizenship and digital literacy. I shall take members through a couple of the campaigns that are particularly relevant to the committee's current work. The first is the digital citizenship curriculum. We launched an online curriculum for YouTube last year aimed at supporting teachers of second-level students by providing online teaching plans and lesson plans. The curriculum educates students on YouTube's policies, how to keep themselves safe online and how to report abusive content, in order that they can develop a holistic understanding of how to be a responsible digital citizen. The curriculum was first tested in the Ursuline convent in Waterford, where our local YouTube team walked through the lesson plans with four transition year students and their teachers and used their feedback to improve and adapt the programme. Since then the digital citizenship curriculum has been launched globally.
The second project, It Gets Better, was created to show young lesbian, gay, bisexual and transgender teenagers how successful, fulfilled and happy their lives will be if they can get through their difficult teenage years and to show them they are not alone during those years.
YouTube has worked closely with It Gets Better to highlight issues of bullying facing young LGBT teenagers and provide them with messages of support and encouragement. It Gets Better has become a worldwide movement, with more than 10,000 uplifting personal videos from celebrities, world leaders and everyday people. In the current slide, President Obama is seen in a photograph addressing a young gay soldier in Iraq. Messages have also been posted from young out people up and down the country. These videos provide a message of support and solidarity to young LGBT teenagers and are a further example of how our platform is being used for good.
Google has been a significant supporter of the Family Online Safety Institute, or FOSI, a project of A Platform for Good which is designed to help parents, teachers and students to connect, share and do good online. FOSI is an international not-for-profit organisation that convenes leaders in government, industry and the NGO sector to formulate new solutions and policies in online safety. We will launch FOSI's European chapter at our Dublin headquarters on Barrow Street in May and would be delighted to have members of the committee attend to learn more about the work FOSI is doing and its plans for Europe.
In addition to education and awareness-raising on responsible digital citizenship, YouTube has a responsibility to ensure its users have the tools they need to stay safe online. The second pillar of our approach is empowerment. I mentioned at the outset that YouTube is a video-sharing platform made up of and for our users. As it is a community, we have developed community guidelines which set out the rules for being on YouTube. The community guidelines set out what behaviours are not allowed on YouTube, including bullying, harassment and hate speech.
There are two processes that are most relevant to the issue currently being considered by the committee. I said earlier that 72 hours of video are uploaded to the platform every minute. This amounts to hundreds of thousands of videos every day. To manage that much content, we have developed a community policing system which involves our users helping us to enforce the rules on YouTube. The way this works is that the community flags content that is believed to break the rules. Every day, thousands of YouTube users flag potential violations in this way. The mechanism is straightforward and easily navigable by users of all ages. I will talk the committee through what happens when a user sees someone being bullied on YouTube. On the slide being shown now, members will see that there is a flag under every video on YouTube. If I want to report a video, I hit the flag, which triggers a drop-down menu of options. I select "Hateful or abusive content", click "Bullying", and my report is sent to the YouTube policy enforcement team for review. There are a number of other categories in the drop-down menu which help users to identify the behaviours that are banned on the platform. There is also an option to include additional information about a video, including what is in it, who is involved and a time stamp. Reported videos are notified to the policy enforcement team and promptly reviewed to establish whether they break our rules. There are sanctions where rules are broken. Users who repeatedly violate our terms of service will have their accounts terminated, while those who upload particularly egregious content will have their accounts terminated immediately.
The flagging mechanism is also a great way for us to listen to users to enable us to continuously improve our products and services. On the basis of recent user feedback, we have added a flag that allows users to highlight threats of suicide or self-injury. If we receive such a video, we immediately escalate it to law enforcement where the threat is actionable or we believe there is a potential for immediate harm. If a threat is too vague to escalate to law enforcement, we can get in touch with the person who uploaded the video to provide him or her with resources on suicide and self-injury prevention to help the person to start to get the help he or she needs. In addition to the flagging options, we have an easy-to-use reporting tool at youtube.ie/reportabuse to report comments and videos more generally.
A second way we empower users is by providing them with the tools to control how and with whom they interact online. We provide our users with full control over their videos and the comments made on them. As a user, one has a number of options available. The first option is private sharing. If a user wants to limit the exposure of a video, he or she sets it to private. When a video is set to private, only the person who posts the video and those he or she invites can see it. The video does not turn up in search results or on playlists. A further option is to block a specific user. If one chooses to block a specific user, that user cannot comment on a video or contact one privately in any way. One can also control the comments made on one's videos. There are three main options in this regard. One is to approve all comments automatically and allow them to appear below the video; another is to allow comments only after approval; and the last is to disable comments altogether so that no comments appear below a video. The person who posts a video has ultimate control over the video, the comments made and who gets to see it in the first place.
The last strand of our approach is protection. Google has invested heavily in technologies that protect users against bad actors on the Internet. We have a dedicated website, google.ie/goodtoknow/, which educates all users about protecting themselves online. The Good to Know website provides families with helpful tips and advice on how users can protect themselves against identity theft, malware, fraudsters and online scams. People can also learn about securing computers and mobile devices as well as how to get the most out of the Internet. We used Safer Internet Day last month to drive awareness about these online resources by including a link to the Good to Know portal on our search homepage google.ie. In addition to using technologies that provide an extra layer of protection for users, we have also formed partnerships. Google is a member of a number of organisations which work directly with companies to help them to improve safety and security for users. We collaborate with European Schoolnet and anti-bullying groups. We recently supported European Schoolnet's The Web We Want booklet, which is an educational document designed with the needs and preferences of young people in mind. The Web We Want aims to start getting young people to think about how to manage their digital reputations and advises them on their rights and responsibilities online. The booklet was designed to be used in the classroom and for peer-to-peer learning.
Finally, I offer a few thoughts on where industry, governments and other stakeholders can work together better in this area. We believe it is critical to listen to and work with experts and organisations in this area, including those people who have their ear to the ground with regard to the technologies families are using and the challenges they face along the way. We collaborate extensively with experts and draw heavily on their advice and feedback to constantly review and improve our systems. A recommendation that has been made strongly to us is to focus more on peer-to-peer learning.
Based on that advice we recently launched the web rangers project in Israel which brought together over 200 children up and down the country to design and run their own safety campaigns. It was a huge success, with the participants producing 82 safety campaigns that ultimately reached over 20,000 children. The campaigns worked well because they addressed children's issues and spoke to them in their own language. Likewise BeatBullying, the largest bullying prevention charity in the United Kingdom, has a cyber mentoring scheme in which young people mentor their peers. This project has been very successful in reaching vulnerable children. We strongly recommend providing more support for such peer-to-peer learning projects and would be happy to work with the Government in taking the lead on such projects.
I have mentioned that as a platform provider, we have a duty to make sure we give our users the tools they need to stay safe online, but we also believe the Government has a role to make sure parents, teachers and students acquire the digital skills they need to keep themselves safe online. Learning to use the Internet is as important a life skill today as the Safe Cross Code. Digital literacy, for example, how to verify online sources, how to protect information and where to go for help, should be a vital part of every child's education. We believe a coherent digital literacy curriculum in schools would dramatically improve children's safety and would be happy to work with the Department of Education and Skills on any effort in this area. We recently made a submission to the New Zealand Government setting out in detail what such a curriculum might look like. I have brought a copy of that submission for the committee in order that members can read it in detail. If questions arise, we would be happy to respond at a later date.
I hope the committee can see just how seriously we take the safety of our users. We have worked long and hard and invested heavily in ensuring our users have a safe and positive experience online. We are constantly innovating and improving our products. Ms Frank and I look forward to answering questions the committee may have and working with it in the coming weeks in this important area.
I am glad that Ms Duke's presentation points out that YouTube does take everything seriously, particularly any form of bullying. I have one question before opening the floor to members. Will Ms Duke, please, elaborate for the committee on the number of people referring to suicide on the site?
I welcome the delegation and thank Ms Duke for the presentation. I acknowledge the great work the company is doing and wish it well in its future endeavours. It is interesting to see the figures which show how the company has taken off globally in the past ten years and its interaction with the ordinary citizen. The committee is trying to ascertain the legal issues around cyber bullying and posting defamatory material on YouTube and other social media. Ms Duke has outlined the company's policy on flagging and how it deals internally with something dangerous, misleading or malicious on the site. What resources does it put into this work? It must be an enormous job to police the site, judging by the volume of traffic on it.
The Internet is a bread and butter matter for kids growing up today. What data does the company have on the issue of suicide? People are interacting at very vulnerable times in their lives. Has YouTube dedicated any resources to dealing with this issue? Can Ms Duke safely say the company has intervened when people have posted such material or was it too late? That kind of information is hugely important. We would all love to think that from playschool on society is pleasant and that no one makes malicious comments, as everybody wants to live in Utopia.
The question of whether there is a gap or otherwise in the legislation in Ireland or internationally is constantly raised. The Minister outlined it in the opening paragraphs of his contribution before we started these hearings. Many believe that when they are damaged by malicious comment on the Internet, whether on YouTube or any other platform, they have no recourse to the law. What recourse is available if somebody from outside the State posts something defamatory about an individual that is without foundation and damaging? Is there inter-jurisdictional legislation in place? What law or recourse is available to that person? There is a great deal of evidence, anecdotal and factual, that what is happening on the Internet is causing huge damage to a certain percentage of people. We do not have to go outside the Republic to see the numbers who have been damaged. If that reflects the percentage worldwide, a very high percentage of people are affected. Many access this very successful technology, from which the general populace derives great good. Is it necessary for the United Nations or other international bodies to take this crime as seriously as they do any other?
Ms Duke has shown us how much the technology has developed in the past ten years, with 4 billion people using it, and for how many hours each person uses it. What do we need to do? We want to present a report to the Government and others on what we believe is the way forward.
Ms Sue Duke:
I thank the Deputy for his questions. To answer the question about the resources we have invested in this area, a high volume of content is being uploaded all the time. We have developed a flagging system precisely to allow us to handle the content. Once a report is received, it is sent to a YouTube policy enforcement team. We have multiple teams working around the clock, around the globe, covering all geographical regions and time zones, to make sure that once content is flagged, it is examined and if it violates our rules, it is taken down very quickly.
It is not true to say the Internet is an unregulated Wild West space. What is a crime offline is a crime online and legislation dealing with offline crimes applies equally to online crimes. There are regulations and directives that apply to online transactions and behaviour and, in addition, there are self-regulatory models that companies have come up with at a global level to enforce rules on the Internet. Many companies have a community policing system similar to the one we have developed.
The third point relates to the multi-jurisdictional aspect. We accept that the police force has an extremely difficult job to do. It is a difficult job when looking at crimes across a number of jurisdictions, whether online or offline. There could be an improvement in educating law enforcement agencies on the tools and information available to them in investigating online crime. We started to do this with law enforcement agencies in America and are hoping to extend this work to other jurisdictions, including Ireland. That would go a long way towards informing law enforcement agencies in many jurisdictions of how to tackle these issues and co-operate with other law enforcement agencies where crimes are of a transnational nature.
I thank Ms Duke and Ms Frank for their attendance and interesting presentation. I have no doubt that they take abuse and inappropriate behaviour online seriously, which is welcome. Will they outline where and how Google and YouTube generate revenue? The more postings, downloads, views and activity on the site, the greater the revenue. The Korean video of "Gangnam Style" generated 1 billion views, which is extraordinary. How does this activity generate revenue for the company and those who upload content?
Suicide was a secondary option in the drop-down menu shown in the presentation. Is there any consideration given by Google to making it a primary option in the drop-down menu? It is becoming a huge issue and this measure may prove helpful.
From what our guests have said, users have protection. They can choose to vary their privacy settings, but subjects of content do not seem to have the same protection. One of my concerns is how the subject, rather than the user, of content might encounter difficulties. This applies not only to the content but also to comments made afterwards. Last night I looked at a music video on YouTube. It was of a folk singer from this country who had passed away. The subject was in a minority group, but it did not take me long to find a comment that any Member would be lambasted for making. It is still online and it is an incitement issue. It may never have been flagged to YouTube. I understand how the rules work with community policing, but it could happen anywhere. If it was flagged by a subject, does YouTube or Google have tracing capabilities? Where content is flagged and being dealt with by someone, is there a timeline, record or log of how the company deals with the report?
Does Google or YouTube work with the Office for Internet Safety in the Department of Justice and Equality? There are cases in which users violate the rules, which are quite strong. I am pleased that the rules are so effective. When users violate them, there are sanctions and they can be suspended. What happens when users violate the law? Does Google or YouTube contact law enforcement agencies directly or leave it to those who have been injured to do so?
Ms Sue Duke:
On the first point, the YouTube revenue model focuses on advertising and forming partnerships. When the Deputy was on YouTube last night, he probably noticed that we form partnerships with channels, merchants and individuals. Under the advertising model, we share revenue with our partners.
The second question was related to foregrounding the suicide and self-injury category. We recently added this category because the feedback from users was that, where they saw a threat of suicide or self-harm, it would be useful to have a specific box to tick to highlight the issue to YouTube. That is why it is in place, and we use the same dynamic to consider whether it should be a stand-alone category. If users say it should not be categorised under the heading of abusive and harmful content and should be a stand-alone category, that is something we will examine.
The third question concerned videos. If there is a video on which someone is being treated wrongly or abusively, there is the option to flag the content. The second option involves using our privacy complaints process. If someone appears in a video but does not want to be included in it and the video identifies him or her by showing his or her face or revealing a name or personal details, there is a very straightforward form in our help centre which requires the URL of the video, who the person is and how he or she has been identified. As long as the video is not in the public interest and there is no issue of consent, we remove it. I did not include the issue in my presentation, but I am happy to send the details to the committee.
The other issue was related to criminal investigations. It is an extension of my previous comments. We have the privacy complaints process because a video may not break any rule while still violating someone's right to privacy. In that case, we remove it. If someone's right to privacy is infringed, we respond. Likewise, one has the normal legal recourse available offline to report a crime committed against a person or where rights are being infringed. Where there is due legal process, we co-operate with law enforcement agencies in investigating the crime and, potentially, in assisting a prosecution.
I am not talking about law enforcement issues but issues that are flagged. If someone flags an issue, there is the technology to log it. If the issue takes time to resolve, the time it takes to do so is also recorded. Is information available on whether it takes an hour, a day or a week to remove content?
I thank Ms Duke for her presentation.
YouTube has a massive network and a phenomenal number of people use it. It has enhanced society massively, but, as politicians, we always hear about the negatives. We do not hear the good comments. We constantly receive representations about bullying, suicide, abuse and defamation. I assume the company has a legal department to offer advice on these issues and analyse information and regularly update staff in this regard. Perhaps Ms Duke might elaborate on that aspect.
There is also a language problem as social media spread across jurisdictions. How does the company address this issue, given that it has staff in every country? Language usage varies between and within countries, and identifying what is considered abusive by one person or group as opposed to another is an issue. That argument also applies to religion. Somebody in the West might have a different view compared to someone in the East, Africa or on another continent. Is this taken into account when handling complaints? How is this aspect monitored?
With regard to people remaining anonymous, will Ms Duke elaborate on how YouTube identifies who is online and how it takes people offline? For example, it took a long time to have a video from my area on which a number of people threatened others taken down, even with the Garda involved and others making complaints. She has mentioned the flagging system, which is a good idea. How often is it used? How quick is the response to a complaint? Is it 15 or 20 minutes or half an hour? On the basis of representations I receive, it takes many hours.
Ms Sue Duke:
I might ask Ms Frank to say a few words on how organisations are working on this issue at European level, which will speak to the Deputy's question about different cultures, languages, etc. She is our dedicated liaison person for the Insafe project and, therefore, can give the committee a sense of how this is working for all 27 member states.
We have a dedicated legal department looking at these issues around the globe around the clock. We make sure we have people dedicated to reviewing complaints in order that content is taken down as quickly as possible.
With regard to how people are identified on the platform, we have two options. One can use one's real name or a pseudonym. Some people want to use their real name because they are retailers and want to build an awareness of their product and interact directly with people, while others use pseudonyms because they feel more comfortable expressing themselves using a pseudonym. I am aware this issue has been discussed at recent hearings, but this has been a huge help in allowing people to express themselves freely. The Minister, for example, mentioned the Arab spring, which has given a huge fillip to freedom of expression around the world.
My question relates to the complaints issue raised by Deputy Dessie Ellis. I have not participated in the virtual world as frequently as younger generations and, in some ways, politicians are playing catch-up. Perhaps other politicians are more adept than I am. Are the complaints YouTube receives in other jurisdictions similar to those it receives in Ireland or do they differ, particularly on the cyber bullying issue?
Ms Sue Duke:
As I said, I will hand over to Ms Frank to talk about what is happening at European level, but it must be remembered that we have a global platform and, therefore, our policies need to be globally applicable and executable. When we developed our guidelines, we brought experts on global legal frameworks, online safety and related areas into the room. We worked long and hard on developing them and thought a lot about them because of the issues raised by the committee. We consider bullying to be about somebody who is persistently or gravely being attacked or harassed by somebody else. That is common to all cultures. All parents and teachers know when somebody is being bullied or attacked in that way and want to stop it. The committee should rest assured that this is something we experience around the globe. It is an issue to be addressed around the globe and a value common to all cultures.
Ms Sabine Frank:
I am happy to add to what my colleague has said and outline what we do at European level. We have dedicated persons who, between them, speak almost all of the relevant languages. This reflects the fact that videos are posted in different languages and we need to understand what people are saying. We also need to take the context into account because context is key if we are to police videos against our community guidelines. The committee should rest assured that we make sure we have all the knowledge we need to do this work in a proper and meaningful way.
We put a lot of emphasis on education because we believe mobbing and bullying are phenomena in the offline world just as they are in the online world. It is important that we educate teachers and equip parents and students in their early years with the proper tools. We are all educated to read books critically and carefully and to understand the context in books, but we are not as well equipped to do the same in the online world. This is where we see a role we can play, and we co-operate and have many partnerships with experts in this area. My colleague mentioned peer-to-peer projects. We have similar projects in different countries and would love to explore with the committee whether we could do something similar in Ireland. Our experience is that teenagers take it much more seriously if people of their own age educate them, and it is much easier for them to turn to such peers if they have a question or problem. It is also about having a level of trust. They trust their own peer group more than older people. That is why we have invested in many partnerships moving in this direction.
Parents obviously have an important role to play and should equip their children with all the tools they need. We have dedicated special resources on our platforms directed at parents to give them a better understanding. They outline the background to cyberbullying, to whom parents can turn, what they can do, how to spot the first signs and how to react if their children are being bullied.
It is important that parents take quick action in this regard when it comes to their attention. Alongside this, we have significant information about safety tools parents can use on our platforms.
The YouTube curriculum was developed in Ireland and we are happy we can export it to many other European countries. I come from Germany, where we have introduced this education package to the relevant institutions and they like it very much. We try to take the experience from one country, such as Israel or Germany, and import it to Ireland and vice versa. YouTube is also a member of Teachtoday, a cross-industry and cross-NGO initiative which equips teachers with safety resources they can implement in their schools to address issues such as online reputation, cyberbullying and growing up online. We believe education, and not only media literacy, is the key.
Ms Frank feels parents have a key role to play in keeping on top of what their children do with social media and the Internet. How aware are parents of this role? I know of parents who are not properly informed as to what is going on with social media.
Ms Sabine Frank:
We have to empower parents to understand that the online world is no different from the offline world. They might not be on top of all the technology and changes to it but they know how to raise their kids. Accordingly, online education is no different from offline education. This is where we have to empower them.
European studies show parents are aware of their role in this regard but we need to help them, as the Chairman said, to better play this role. Parents have a key role next to teachers and companies such as YouTube in this regard. We all have to take our share. That is why YouTube has dedicated resources to educate parents. In Germany, schools hold parent events after school hours at which parents are helped to better understand the online world. We would be happy to assist the committee on this.
Ms Sabine Frank:
The Government has a key role to play because it can work closely with the schools. We are happy to help the Government by providing resources which can be given to teachers and parents. The Government, however, must introduce media literacy into the curriculum and assist parents in their role in this regard.
I thank Ms Duke for her presentation, which was very good, particularly when dealing with digital citizenship and how YouTube, along with the Internet, can be used as a force for good. The example of the US President, Mr. Obama, dealing with bullying and LGBT groups shows how YouTube and the Internet can be used positively. With regard to the flagging system YouTube has for inappropriate material, in some ways it only happens after the event. There was a case in Dublin recently of a young man dodging a taxi fare after a night out which was videoed and, subsequently, put up on YouTube. It went viral but another young man was identified wrongly as the man in the video and he had to go to court to try to have it taken down. Is there someone in real time monitoring what is going up on YouTube?
I accept that freedom of expression includes the right to remain anonymous. However, if someone uses anonymity as a disguise to put up offensive material or attack someone personally on YouTube, is it possible for the victims to trace the source of the video?
I thank the delegation for its presentation. As a teacher before I came into the Dáil, I believe YouTube is a strong power for good and a great teaching aid. My peers and I used it to enhance our teaching practices. For example, when teaching the safe cross code to six year olds, it is easier to put a YouTube interactive video with the song up on the whiteboard than to have to sing it oneself. However, most schools, particularly primary schools, have robust Internet policies and block access to YouTube to prevent children from seeing unsuitable material. Has YouTube ever considered, with the Department of Education and Skills, a separate platform for YouTube's education videos? It could be rolled out across the European Union.
In my constituency a fortnight ago, the principal of a post-primary school took a robust and concerted approach to dealing with an issue on a different social media platform which has since become a high-profile incident covered in national media outlets. When YouTube receives a complaint from a person about a video uploaded on its platform, how long does it take to remove it? The last thing anyone wants is for people to have to go to the courts to have a clip taken down. Usually, the person's identity becomes known through court proceedings and more people will look at the upload.
I woke up to four inches of snow this morning.
Ten miles away there was no sign of snow, but that is how it goes. The delegation stated that bullying was common to all cultures throughout the world. I have no direct experience of bullying in my own home, but I believe children are being bullied in their homes and that it starts there. I have seen instances of it. From there it progresses to national school level. In some instances it might simply take the form of staring at people. I am concerned about this issue. In the case of three or four young children from a particular national school, the natural progression would have been to the nearby secondary school, but because they were being bullied by people in the national school who were going to move on to the same second level school, they changed course entirely, in some cases at grave expense, by going to private schools to ensure they would not be followed into second level. It concerns me that this will continue. Some children at national school are afraid to tell their parents in case their parents take action against those involved and things get worse. Parents and teachers have a significant part to play. Should national school teachers who have identified bullies in their school make the second level teachers aware of Terry Brennan or Patrick O'Donovan or whoever and explain that the second level teachers should watch out for the bullies coming from national school?
Ms Sue Duke:
I will call on Ms Frank to speak to the point about the dedicated liaison officer to which Deputy O'Donovan referred because she is our dedicated liaison officer and I will ask her to address Senator Brennan's point as well. Deputy Kenny referred to looking at content as it is being uploaded. As Deputy Kenny rightly stated, the vast majority of content going up is good, wholesome, fun, educational content that is being used for remarkable good, not least the safe cross code content in schools in Cork.
Ms Sue Duke:
It is in Limerick. I am sorry. Let us consider the scale of the content. Approximately 72 hours' worth of video is posted every minute. The committee can imagine the difficulty that poses in terms of having someone examine all of the content as it goes up.
YouTube works, is so successful and keeps people coming back because it allows them to take a video and upload and share it instantly. That is an integral and central part of the platform. That is why people love it and why we had 1 trillion views in 2011. It is important to maintain the integrity of the platform for the vast majority of users who use it for good.
It is not a question of either pre-screening the content or doing nothing. I have already referred today to several systems we have in place to ensure that when uploaded content breaks the rules, it is flagged, we get to it quickly and we take it down. We are constantly improving those systems and examining them to see where we can make improvements. For example, we recently introduced the deputise programme, under which we have trusted flaggers in organisations. These are people we know to have a history of flagging content that was abusive and broke the rules. Such people have logged complaints with us which turned out to be legitimate. Now we look quickly at the flags these people log because we know they are a reliable source. Other organisations have done likewise.
The Deputy referred to apps for education. We have a Google apps for education programme. I do not know much about it, to be honest, but I know it does something similar to what the Deputy suggested, namely, it is a bespoke system with content only for educational purposes and which is suitable for students. Similarly, there is a safety mode on YouTube. A parent can set YouTube to safety mode so that mature content which does not break the rules but is not necessarily appropriate for a child will be hidden. This tool is available to parents and I would be happy to send on some more information, especially on the apps for education programme and what is involved in that suite of products.
Ms Sabine Frank:
We will clarify the position on YouTube for education and come back to the committee on it. Reference was made to the deputise project. We specifically partner organisations which have a history in certain subjects. In Germany, we partner with an organisation which strongly fights against racism and Nazi content. It is very knowledgeable about this area and it is a great partner for us to work with, specifically on flagging such content. These are the types of partners we are looking for and we would be happy to engage with more local partners. If the committee could point us to any well-known NGOs here, that would be a great help to us as well.
Senator Brennan commented on where bullying starts and what we can do about it. I fully agree that bullying is an online phenomenon as well as an offline one. In most cases it actually starts offline. We know from statistics that in most online cases we are aware of, the origin is an offline encounter. Many of these children know each other from school and it simply extends to the online world. We need to fight against behaviour that is not trustworthy and respectful of others. I cannot really judge whether the information should be passed across school systems. That is more a decision for the committee and others to make. However, it is important for schools to work precisely with students on reputational issues. Perhaps the Chairman will allow me to give another example. We have developed a campaign called 361 Degree Tolerance. It is a YouTube channel we created with several partners in Germany. We asked students and schools to give us video material setting out how they define reputation and how to fight disrespectful behaviour. This is the third time we have run the exercise and on the first two occasions we received 700 videos from youngsters between 13 and 25 years of age. This is a great way for them to engage and show us how they see respectful behaviour. We know that people who engage in this way are more likely to flag content if they see someone being bullied online. This is where we must get to. People must take action not only if they or their families are attacked but also if they see other people being attacked in the offline and online worlds. This is where we want to help.
The Department of Education and Skills and especially the Department of Children and Youth Affairs are the two Departments to which the Government must link.
If we were to link with Google-YouTube from now on, would it be able to monitor whether there was an increase or decrease in the number of complaints about bullying on its systems? If we were to devise a scheme now, would Google-YouTube be able to monitor that campaign in Ireland to see how effective it was?
I thank the delegation from Google-YouTube for its openness. Its contribution will certainly help the committee in formulating its report. I acknowledge that the presence of Google-YouTube's European headquarters is vital in providing employment for young people here and that it is a success. With regard to the invitation to visit its headquarters, either some of us or the full committee will take it up.
On behalf of the joint committee, I welcome Mr. T. J. McIntyre and Mr. Fergal Crehan from Digital Rights Ireland. I have no doubt that they are well aware of the reasons for this engagement and that we propose to compile a report for the Houses at the end of our consultation process. I will not read the statement on defamation as it was read earlier.
Mr. T. J. McIntyre:
We are very glad to have the opportunity to address the joint committee. Digital Rights Ireland was set up in 2005. We had a concern that while traditional civil liberties were well represented offline by the likes of the Irish Council for Civil Liberties, new issues were arising online that needed an additional input that perhaps did not come from more traditional groups. We have since joined similar groups across Europe in the European digital rights initiative and instituted a constitutional challenge against a European law that we believe violates privacy. This challenge will shortly be given a hearing before the European Court of Justice. We have also engaged with people, including a number of those present, on issues such as online privacy, Internet filtering and blocking.
I am associate dean at the school of law in UCD, where I specialise in information technology law issues. I am also a practising solicitor, working both for social media websites and for users who believe their rights have been violated. My colleague, Mr. Fergal Crehan, is a practising barrister. He has advised Digital Rights Ireland from the outset, including in our constitutional challenge which is before the European Court of Justice. He is also a specialist in this area.
As committee members are aware, abuse of the online medium is very far from being a new phenomenon. On the slide before us we have the first report I was able to find of a criminal conviction for this type of behaviour. It dates back to 1999, when a man was jailed under the old criminal libel legislation for spreading false allegations of sexual abuse against a teacher. That example illustrates that this is very far from being a new issue and that it is one very much regulated by existing laws. Going back to 1999, we see that where the will and the resources are available to allow enforcement, we can achieve results using existing laws.
I have listed on the slide I am displaying a number of the laws that can be used. I had to hold myself back so as to cram them into one slide as I could have filled two or three listing relevant legislation, but I did not want to bore members. I would like to make some particular points about certain aspects of the legislation, some of which have been mentioned in previous hearings. Deputy Dessie Ellis mentioned earlier the phenomenon in his constituency of threats being made via YouTube. It is not always realised that there is an existing offence which applies. The Non-Fatal Offences Against the Person Act provides in section 5 for a distinct offence of making threats to kill or cause serious harm. Therefore, this constitutes a crime which can be prosecuted under existing laws.
Deputy Noel Harrington asked earlier about the rights of the subjects of YouTube videos. Again, these rights are already amply protected by Irish data protection law. There are general rights in section 2 of the Data Protection Acts to ensure one's information is fairly obtained and processed. There is also a specific right in section 6A of the Data Protection Acts which allows a person to object to the processing of his or her personal data where it would cause him or her distress. Perhaps there is not adequate awareness of these rights; we need more education of users about their rights and more resources to enforce them.
That leads us to the point made by Deputy Ann Phelan, who mentioned the jurisdictional issue and the possibility of cross-border enforcement. One of the benefits of data protection law is that it is largely harmonised across Europe. Therefore, we already have mechanisms to which we can turn if material is hosted in the United Kingdom, Germany, France and so on.
We can go to the local data protection regulators and ask them to enforce Irish citizens' rights, which are largely the same in those jurisdictions as they are here. We already have substantial laws in place that deal with most of these points.
I would like to respond to what Deputy Seán Kenny said earlier about traceability. This is covered in two ways. A so-called Norwich Pharmacal order, granted by the High Court in civil actions, can be used to compel the identification of users. Section 8 of the Data Protection Acts enables information to be released by YouTube to the Garda - for example, to aid the force in its criminal investigations. While the tools are already there to authorise disclosure in many cases, whether there is enough awareness of those tools may be another matter.
Mr. T. J. McIntyre:
Absolutely. In 2005, the Irish music industry brought an action against people it accused of uploading music. It obtained the IP addresses of those individuals. It then had to link those IP addresses to particular names and addresses. It decided to seek a Norwich Pharmacal order, which is named after an English case from the 1970s. It went to the High Court and got an order from Mr. Justice Peter Kelly requiring the ISP - Eircom, in this case - to disclose the identities of the individuals in question. This approach has been used quite commonly since then. I recently counted the number of applications for Norwich Pharmacal orders made in Ireland in recent years. We know of six or seven applications in the last three years; the true number is probably much greater because not all of them appear in the media. The jurisdiction to grant such orders is well established.
Mr. T. J. McIntyre:
It can be expensive. One of the problems with Irish law is that we have an access to justice issue that is not unique to this area. A billionaire will find it much easier to assert his rights than an average citizen. Unfortunately, we are not an outlier in this regard. As a lawyer, I do not want to stand here and say lawyers are too expensive, but perhaps that can be said of the courts system generally. Collectively, those of us who offer the technical expertise needed to enforce one's rights are too expensive.
I am concerned that if we make changes to existing laws, we might undermine some of the rights that currently exist. An example of this relates to anonymity. It has been suggested at this committee that a greater degree of identification can be provided for by requiring real-name registration of users, etc. I am sure the members of the committee will be familiar with two great Irish writers who chose to write anonymously for various reasons: Jonathan Swift and Myles na gCopaleen, aka Flann O'Brien, aka Brian O'Nolan, among many other names. Brian O'Nolan wrote anonymously because of his position as a civil servant, just as people today find it important to protect their identities online. Some media columns are written anonymously by politicians so that they have the freedom to speak with a candour they might not otherwise be able to show. I am thinking particularly of the Drapier column in The Irish Times.
Similarly, Barnardos recommends that children should be able to speak freely about issues such as parental break-up via chat lines, forums or blogs. It strongly recommends that children who use the Internet to talk through their issues and engage with others about the challenges they face should never use their real names or reveal their identities online in case this might be used to persecute them in future. There has to be a concern that if we begin to restrict privacy online, it may have unintended consequences for vulnerable groups such as victims of domestic abuse. The Women's Aid website talks about the risks to women whose abusers discover what they do online. Such abusers often use their online identities to track women, or use the fact that women might be seeking help online to prevent them from leaving abusive relationships. Privacy is important for such women.
This issue also arises in a slightly more unexpected context. In 2006, Ryanair brought an action against the Ryanair European Pilots' Association website to seek to identify particular pilots who were posting anonymously to complain about their working conditions, terms of employment, etc. Ryanair essentially claimed to be protecting the safety of its workplace, but the High Court rejected its application. Mr. Justice T. C. Smyth took the view that, rather than engaging in a bona fide application, Ryanair had engaged in a "feigned exercise" designed to divide the loyalty of members of the IALPA union. He said it formed part of a "war of attrition" on Ryanair captains, aimed at dissuading them from exercising their rights through legal battle. There is a concern that if the captains in that case had been identified, they would have been unable to gather together collectively to exercise their right to form a union, engage in union activity and possibly take industrial action.
The final example I would like to give relates to whistleblowers. The committee will be familiar with the case of Mr. Noel Wardick, a senior member of staff at the Irish Red Cross who expressed concerns about the governance of that organisation in an anonymous blog. He was eventually identified after an application was made seeking his identity from Blogger, which is a blogging platform operated by Google. He identified himself and, as a result, was dismissed from his position at the Irish Red Cross. I understand he testified before the Oireachtas on a number of occasions to elaborate on his governance concerns. We are concerned that restrictions which make it easier to identify people will tend to deter potential whistleblowers from coming forward.
A related point arises when we think about how existing telecommunications law might be applied to the Internet. It has been mentioned at previous hearings that the sending of abusive messages is an offence under section 13 of the Post Office (Amendment) Act 1951. If we expand that offence further, there is a real risk that it might have unintended effects on what can be said online. With the permission of the Chair, I will ask my colleague, Mr. Crehan, to address that point.
Mr. Fergal Crehan:
Section 13 of the Post Office (Amendment) Act 1951 does not date back to 1951. It is a much more recent creation. The original Act was most recently amended in 2007. Section 13, as most recently amended, deals with messages sent by telephone, including text messages, that are "grossly offensive ... indecent, obscene or menacing". It also deals with messages persistently sent "for the purpose of causing annoyance, inconvenience, or needless anxiety to another person". This applies to persistent nuisance telephone calls. It also applies to messages "that the sender knows to be false" and to prank calls. If one examines the Oireachtas record from the time this most recent amendment was made, one will find that the purpose of this section is to deal with telephone messages which are person-to-person communications. Any harassment that happens via person-to-person communication is by its nature far more intimate than something one might stumble across on the Internet following a Google search. There was a very good reason Internet communications were not included in that section.
It has been suggested that the section is ripe to be expanded away from person-to-person telecommunications and into Internet communications in general. We have concerns about that. Our first concern is that the concept of intimacy of harassment would be lost. We would be dealing with something that is spoken to the world at large but to nobody in particular. Deputy O'Donovan made a telling point last week when he said it would be a case of moving the focus from talking to a person to talking about a person. Harassment can take place when a person is spoken to aggressively. Freedom of expression issues come into play, however, when it comes to talking about a person. One of the problems we see with expanding this offence is that it would potentially criminalise every web page, every online newspaper story and every RTE broadcast placed on rte.ie that is intended to cause annoyance. It is in the nature of certain kinds of political art or political speech-making that annoyance or provocation is intended. That would be criminalised immediately. Such a measure would create a very subjective crime. We would have to decide what is grossly offensive, indecent, obscene or menacing. If one examines a forum such as politics.ie, in which partisan politics are involved, one finds that tempers frequently run high, especially coming up to elections. One man's vigorous debate is another man's offensive, grossly offensive or indecent message.
It is a very subjective question. Do we allow judges to decide, given that different judges may have different views on what is grossly offensive, or do we attempt to set some sort of standard based on the offence caused to the person who was offended? If we do the latter, we effectively allow the most easily offended people to set the standard for freedom of expression.
It would also break the principle of parity between online and offline speech in that it would make it a crime to say something online which it was not a crime to say offline. There are certain admittedly unsavoury things that are shouted, for example, on the terraces of a football or hurling match but, if the same is said online, it suddenly becomes a crime. There are difficulties of parity in this regard.
A key practical point is that it would add to the already huge workload of the Garda computer crime unit. These would be very work-intensive offences to investigate and prosecute, and we would be dumping a huge number of them onto the desk of a unit which, as we will see, is already significantly overworked. There is a comparable law in England and Wales, section 127 of the Communications Act 2003, which is not limited to telephone messages in the way section 13 is. It has been misused, and the Crown Prosecution Service recently issued guidelines suggesting that prosecutions of this "grossly offensive" kind not be pursued in the public interest. It was felt that they would lead to a debacle and that the focus should move back to person-to-person communications of a threatening nature. In any case, we feel expanding section 13 would likely be in breach of Article 10 of the European Convention on Human Rights; freedom of expression is also guaranteed by our Constitution. I also include a quote from the Sunday Times v. UK case, which states that freedom of expression applies to offensive, shocking and disturbing speech as well as to uncontroversial speech. Indeed, there is very little point in having a right of freedom of expression if it only applies when one is saying things that are universally popular.
With regard to suggestions, we are not just here to say, "Do not do anything". We understand there are issues with the Internet and that certain actions have to be taken. First, we suggest adequate funding for the Data Protection Commissioner. The graph in the slide before committee members shows the funding of the Data Protection Commissioner over recent years, and members will see that, as of 2011, funding is almost back down to 2004 levels. Over this time, a huge number of large data holders have moved to Ireland. Because the likes of Google, Facebook and LinkedIn are based here, the Data Protection Commissioner has one of the biggest workloads of any regulator in Europe, bigger than those of countries many times the size of Ireland, because we uniquely have so many of these companies based here. However, the office is not adequately funded. It has the job of regulating, for example, all of the data Facebook holds for all of Europe, yet it has a staff of 20, so few that they fit in a small office upstairs from a Centra in Portarlington.
We would also suggest more funding for the computer crime investigation unit. The slide before members is from the Irish Examiner last year and points out that it emerged in recent court proceedings that there is a three-year waiting list to prosecute some child pornography offences because the resources simply are not available to the unit. To create a new offence of offensive communications via the Internet would be to dump on this already clearly overworked unit a vast number of low-level, District Court-level offences that are nonetheless hugely work-intensive. We feel that until the unit is adequately resourced to do the job it already has, it would not be a good idea to dump yet more work-intensive offences onto its desk.
We are happy to have been invited to the committee today and we are happy to answer any questions the members or the Chairman might have.
I thank the witnesses for coming before us. Their presentation was excellent and has certainly gone into many of the issues we have been exploring and that we have put to the various service providers over the past couple of weeks. With regard to the point about comments made on Hill 16 or terraces during a football match, when it is said, it is finished. However, when it goes up on the Internet, it stays there. That is where some of the issues arise.
The witnesses cited judgments going back to 2005 in which it was possible to trace IP addresses. The technology that allows false or malicious information to be put up on the Internet has been available for ten or 15 years; it is not a new phenomenon. While some have claimed it is not possible to get that far in investigations, that is, for want of a better word, fictitious.
With regard to the cases, including legal cases, the witnesses have put before us, there is very little knowledge out there among those who are using the Internet. In the past couple of weeks, we have tried to explore this and address the legislative gap with every group that comes before us. There seems to be a whole raft of legislation governing this area. As a committee, we want to highlight this and make people aware of the evidence of today's witnesses, given the detail they have put before us. It is fantastic to see all of the judgments and the quotations they have provided. We need to highlight this in any way, shape or form we can. Again, it was an excellent presentation.
The witnesses spoke about perhaps changing the legislation in an already overburdened system. If there were one or two issues on which they were handed a blank page and asked to highlight issues with regard to ensuring those who use the Internet are protected fully, what would they be?
Mr. T. J. McIntyre:
The Deputy's point on education is a very important one and these committee hearings are a very important aspect of educating people in general by bringing the issue to public awareness. He asked for one or two points on which we could foresee change. First, the problem is largely not one of existing laws but one of access to those laws and access to justice. It is important that people have mechanisms available to them which do not require them to spend several thousand euro on a trip to the High Court for, for example, a Norwich Pharmacal order. One of the best ways in which people can exercise their rights is through the Data Protection Commissioner's office which, as we have seen, is rather underfunded.
Second, with regard to the Garda computer crime investigation unit, investigations in this area are very often necessarily cross-border and technically intensive. For example, they can require looking at the contents of laptops and technical evidence. A problem in this regard is that, in many cases, these are not very complex matters that need to be handled out of Harcourt Street; for the most part, they should be handled at local Garda station level. However, it is the need for specialist expertise that imposes a bottleneck. Greater funding for the computer crime investigation unit would help to address that particular bottleneck.
On the cross-border issue, is there an issue with regard to posts made in a different jurisdiction to where the complainant is living, for example, China or the UK? What blockages exist in terms of them seeking redress?
Mr. T. J. McIntyre:
Ireland has mutual legal assistance treaties with a number of jurisdictions, most importantly the US, because the vast majority of these services tend to be headquartered in the US. If something amounts to a criminal offence, it is already the case that the computer crime investigation unit could make use of that process to get information from, for example, Google in Mountain View, California. The mechanisms are already in place.
One problem is that Ireland has yet to ratify the convention on cybercrime. We signed the convention and agreed to it in 2001, intending to implement and ratify it by 2003, and we are still waiting.
One of the advantages of doing that is that it would help provide for dedicated points of contact between national police forces.
I thank the delegation for its presentation. I agree with everything it said about access to the courts. I said here last week that it is the preserve of the rich - end of story. I know people will say that legal fees have come down but in the real world with which most of us deal, for civil matters, the courts system is the preserve of the rich. If one has to ask the price, one cannot afford to go up to the four goldmines. Anybody who thinks they can go up there about any matter we discussed in the past fortnight would need a serious amount of cash. For the ordinary punter on the receiving end of some of the stuff we are discussing, access to the courts system to vindicate one's name and have it restored is simply not available.
I agree with the delegation in respect of the protection of identity. I note the examples it gave in respect of Drapier and Jonathan Swift. If Drapier said something defamatory about a Member of the Oireachtas, one could take a case against The Irish Times or whichever newspaper published it. The issue is where freedom of speech ends and defamation begins and how one sets margins. In one very high-profile case, an anonymous tweet, in some people's minds, changed the outcome of an election. I am firmly of the opinion, although I know some people would not agree with me, that this tweet did change the outcome of a very significant election. Where does anonymity fit in there? If Mary from Dunloe sends a politically motivated text or tweet to a programme and does untold damage as a result of a baseless allegation being read out on a national or local broadcaster, how does one marry freedom of speech with the untold damage it does to someone's reputation because somebody then says "ah sure, there is no smoke without fire"?
I was interested in what the delegation said about the convention on cybercrime. What are the reasons for the delay in ratifying it, given that it has been knocking around since 2003? Let us take the example of the owner of a national newspaper which publishes a story given to it by Mary, Jack, Billy, Tom and Joe, on which background checks have not been done and to which everybody has added their 10% VAT, so that by the time the story comes to the newspaper, it is totally different. The newspaper goes to court and a judgment is handed down. Based on the delegation's experiences, have any social media outlets or discussion forums been challenged in the courts recently regarding what they permit to be hosted? A newspaper or news outlet that knowingly publishes or allows a newscaster to announce something without checking to see if there is any element of truth in it has a vicarious liability straightaway, will take a hit on it and end up paying the piper. Have any of these so-called forums been subjected to the same treatment?
Mr. T. J. McIntyre:
The answer to that is yes. Not only is that the case, they are constantly under threat of and very often subjected to legal action. The moderators of boards.ie recently posted a picture of a whiteboard on which they had written "It is now five hours since we last received a threat of High Court action." It is the experience of forums such as boards.ie, politics.ie and others that threats are made on a regular basis. One can see from High Court records that cases are filed against Irish social networking sites on a regular basis.
Mr. T. J. McIntyre:
The majority would be settled. The number of cases that have gone to judgment is very limited. The one Irish written judgment on the point is a case called Mulvaney v. Betfair, which involved allegations against Betfair regarding comments a user made in its chat forum. The Betfair case illustrates the answer to Deputy O'Donovan's second question, which concerned the liability in these situations. The liability in these situations is governed by the e-commerce directive, which provides for so-called hosting liability. If a social networking site hosts comments a user makes, it is not liable for the comments until it has been notified of them and given an opportunity to take them down. If a network is notified that these comments are defamatory, in breach of the criminal law or in contempt of court and fails to take them down at that point, it faces liability. That aims to strike a balance between avoiding the need to, for example, pre-moderate the many hours of content uploaded to YouTube every minute and allowing an individual whose rights have been infringed to seek redress.
Mr. T. J. McIntyre:
Where the law stands at the moment is a good point. One enjoys a presumptive right to anonymity. It is a right that can be taken away from one by the High Court in a civil action if a good case is shown against one - not just an allegation but a reasonably substantive case. It is a right that can be lost in the context of criminal proceedings where the gardaí have reasonable suspicion to investigate a possible crime.
I thank the delegation for its presentation. I know it says that traditional rights must be protected online as well as offline. Are there any laws missing in this regard? It mentioned different areas where there are many protections, but are there any areas where protections are missing? If someone is using a computer or smartphone, are there any powers to seize those devices if they are identified? Should a mechanism exist whereby equipment can be seized if it is not possible to get at the person? Are there anomalies in the law that would stop that? In respect of the monitoring of phone calls, certain words that were offensive or used by certain groups triggered off-----
Overseas intelligence networks identified particular words. Does the delegation believe this is acceptable? Does it think people should be able to monitor phone calls in a similar way? I was going to talk about Twitter because some stations have a slight delay so they can monitor messages before they broadcast them. Is that the normal process? Does the delegation have an opinion as to whether they should appear straightaway without being interfered with? The Office of the Data Protection Commissioner seems to be underfunded. We need more monitoring.
If our guests could tell us what should be done in terms of funding for the Data Protection Commissioner, I would appreciate it. I refer to additional funding and how it would be spent, etc.
I thank our guests for their very detailed presentation. There is a great deal of legislation in place, but enforcement seems to be lacking. The volume of traffic on these social networks is so huge that the mind boggles in terms of how one might even begin to trawl through all of the relevant material. It is always possible to deal with serious crime because the relevant laws are in place and the Garda will become involved because of the nature of such crimes. The most difficult crime to deal with is low-level, nuisance-type crime which is not serious enough to pursue, but it causes people the greatest level of discomfort. If a serious crime is committed, the perpetrator will more than likely end up in prison. There is a great deal of low-level, nuisance-type activity on social networks which no one is really too bothered about pursuing because doing so would take up too much time, funding, etc. In addition, no one wants to clog up the courts with cases relating to the matters to which I refer. However, the type of activity to which I refer is very abusive in nature and involves bullying. In order to plug the gap between serious and nuisance-type crimes, would our guests be of the view that it might be useful to establish a body similar to the Press Council which could deal with social media? People could bring their cases before such a body in order to have them examined without being obliged to initiate defamation proceedings, etc., in the courts.
The first matter on which we should focus is that one of the reasons these hearings are taking place is to deal with cyberbullying and the dangerous effects of online abuse or bullying on those who are most vulnerable. The first presentation we received was from the Minister, who mentioned that there might be a gap in the legislation in this regard. In that context, Mr. Crehan referred to the relevant section of the Post Office (Amendment) Act 1953. In view of everything we have heard since these hearings began, I have come to the view that there is not a significant gap in the legislation. Almost everyone who made a presentation indicated that what was illegal offline was similarly illegal online. The difficulty lies in getting to a position where one can deal with this. Each of the platforms and companies which made presentations has different ways of operating and reporting. Some allow users to remain anonymous, while others such as Facebook demand that they identify themselves before being able to use their sites.
There has been a great deal of discussion about defamation and illegal use online. Quite often, those who are the subject of online abuse are well able to defend themselves. I refer to politicians, business people, journalists and others who are in the public eye and would often be online. These individuals are well educated, have access to the courts and know how to proceed in matters of this nature. The difficulty which arises concerns those who suffer abuse and do not have the resources to go to the High Court. In many cases, they would not even know how to report online abuse. Mr. McIntyre has suggested increased funding be provided for the Data Protection Commissioner. In the light of the fact that Google, Facebook and other companies have their European headquarters in this country, that is an absolute must. The economic benefits which these companies bring to the country vastly outweigh the costs we might potentially incur in seeking to establish a balance in this area.
Would there be any merit in creating some type of platform or method of connection, by means of a group or association, in order to assist teenagers who are vulnerable and might become so distressed by what they read online that they might do something similar to that which others have done? We are all aware of the recent cases in this regard. The committee received a presentation from the Office for Internet Safety, an organisation of which nobody seems to have heard and which operates under the aegis of the Department of Justice and Equality. I am very disappointed that the office in question has not played a greater role in the area under discussion. However, I am not sure it is designed to deal with it. In addition, I do not know of any other body which might be in a position to deal with it. Would our guests support the establishment of an entity to deal with the low-level problems to which I refer in order that we might protect the most vulnerable? Having listened to all of the debates in which we have engaged, it is the most vulnerable who will continue to be caught out.
On the gap in the legislation, Mr. Crehan seemed to imply that digital rights were covered under section 13 of the Post Office (Amendment) Act 1953, whereas the Minister for Communications, Energy and Natural Resources stated the Communications Regulation (Amendment) Act 2007-----
Mr. T. J. McIntyre:
Deputy Noel Harrington inquired as to how we might protect those persons who do not have the resources necessary to bring a High Court action. One of the approaches which might be taken as a model is that taken in the United Kingdom, where there is a Safer Internet Centre. This centre is really just an informational resource, but it provides people with a one-stop-shop where they can indicate they have been abused or bullied online. It then provides them with guidance on how they can access the remedies that might be available. It might, for example, show them how to use a blocking function or the reporting function on systems such as Facebook. If the Office for Internet Safety is to develop a role, I would like it to move towards emulating that model.
Deputy Noel Harrington's point overlapped somewhat with that made by Deputy Ann Phelan on this being rather low-level crime. That is an important point in the sense that we are possibly misleading ourselves by focusing on criminal prosecutions and the role of international borders. What is often involved is localised bullying and there is not necessarily an international dimension. In fact, there is not necessarily even a criminal dimension because many of the cases about which we are talking involve children. It is very unlikely, as a matter of practicality, that juveniles will ever be prosecuted, even for criminal offences they may commit in this context. Those of us who are parents would not want children to obtain criminal records for ill-judged tweets they might have issued one night. We might be better served by focusing more on concrete examples of abuse rather than on some of the hypothetical issues which do not necessarily arise.
Deputy Ann Phelan also inquired whether there was a case to be made for the establishment of a body similar to the Press Council to monitor social media. We already have such a thing in the individual complaint resolution systems set up within the likes of Facebook, boards.ie, YouTube and so on. All of these entities operate their own complaint resolution services. I can see merit in having a one-stop-shop, but if one were established, it would threaten the diversity that is extremely important. Ultimately, we are discussing online communities which operate to different standards. Some of these communities are very family-friendly in nature and enforce certain standards in respect of identity. I refer, for example, to Facebook. Others such as Twitter are much more open and freewheeling and allow users to remain anonymous. There are still others which are much more vitriolic and I daresay politics.ie falls into that category. We do not want a one-size-fits-all solution which would, in effect, try to impose a single set of standards on all of these disparate organisations.
The UK Safer Internet Centre might be more useful as a front-end to those individual complaint resolution mechanisms, rather than having a separate body which would adjudicate on complaints about all of the different communities. Ultimately, the governance of each individual Internet community should be, in the first instance - if it does not break the law - a matter for the community itself.
Deputy Ellis also made a point regarding prior filtering or monitoring, by analogy with the practice carried out in England for many years - which may still continue - of monitoring all telephone calls from Ireland and looking for certain suspicious keywords. This is governed by European law. Article 15 of the e-commerce directive provides that we may not impose a general duty to monitor on Internet service providers. This means that the law may not, for example, require the likes of YouTube to pre-moderate every video before it goes up or, for example, implement a keyword filtering system before a video goes live. That said, there is nothing to stop an individual provider from choosing to do that if it wishes. For example, Facebook would certainly be entitled to monitor public posts for bad language, if it wished, and block those posts. Again, I suggest that this is really a matter for the community norms within each organisation rather than a one-size-fits-all legislative solution.
Mr. Crehan will address the remaining points.
Mr. Fergal Crehan:
Mr. McIntyre questioned whether there is a gap in the legislation. We need to ask whether some of these problems can be fixed by legislation and whether there is a lack of legislation to deal with bullying. In a classic case of bullying, a child is beaten up and his lunch money is stolen. In the adult world we call that a mugging and people are sent to jail for committing this offence. We have to ask ourselves if that is how we want to deal with bullying. Are there legal remedies to bullying, or is it something that has to be dealt with on a much lower, more educational level? I would say it is the latter. Laws to deal with bullying exist, but it may not necessarily be a good idea to involve the law. Some kind of educational programme is necessary. There are television advertisements about drink awareness, food safety and various other public health issues. I argue that this is a public health issue in that, when children go to school, they learn to read and write but they also learn how to play with other children, how to get on with people their own age or different ages, and how to respect people who are different from them. That is what they learn in the playground, outside of the classroom. Increasingly, children and adults live their lives online. Education may therefore be needed not only from the point of view of general socialising and learning to play with one's peers but also from a civic point of view. If we are to have some kind of civic education in which public debate is polite and reasonable, this may be something to be considered at an educational level.
Deputies O'Donovan and Ellis asked about tweets and text messages sent to radio programmes. They referred to the famous occasion of the presidential debate. In my experience, radio station presenters have a computer in front of them and they read whatever they think is interesting as a contribution to the programme. The presenter is acting in a journalistic role. There is no filtering except that the broadcaster chooses what he or she decides is a funny tweet or an interesting text, which is then read out. In the event that something defamatory is read out, it is the journalist's and the radio station's necks that will be on the line. It may not be financially worth their while to go after the person who sent the tweet or the text. However, they are free to do so and can join that person as a party to the dispute. There is no question of filtering. Broadcasters decide what they want to read out, in the same way as newspapers. Deputy O'Donovan said that stories often snowball with quotes from various parties; everyone has their 10% added. The journalist and the editor must decide whether their sources are credible or otherwise. If a source is only a tweet, it may not be a good idea to go to print.
There is a distinction, in my view. To be fair, a journalist will not print in a newspaper, "Anne from Kilkenny says...". Neither will a journalist identify a source by a nickname. Not only in the instance of the presidential election but also in other cases, politically motivated texts are sent with the intention of doing damage and are read out. The press office or the individual concerned who has been defamed or had damage done to him or her rings the radio station but to no avail. It is almost like emptying a bag of feathers in a field and then trying to collect them. It is impossible.
Mr. Crehan said the same was applicable to the print media as to the broadcast media. A news bulletin or a newspaper article will never quote as a source, "Noel from Beara". If Noel from Beara is Noel Harrington, that is the way it will be quoted. It will not be simply Noel from Beara, when there happen to be 40 men called Noel in Beara and it is impossible to identify which one is which.
Mr. T. J. McIntyre:
If I may elaborate on that point, what Deputy O'Donovan is describing is really something that relates to heavily regulated media. The media are already subject to obligations, be it to the Press Council in the case of newspapers or to the Broadcasting Authority of Ireland in the case of the broadcast media. Deputy O'Donovan's example is striking because it illustrates that abuses will occasionally arise even in a heavily regulated environment. That is not necessarily an argument for extending regulation to a different context, particularly because the editorial involvement and endorsement are very different in each case.
The Deputy referred to The Irish Times, a newspaper that stands over what it publishes. Its brand stands for a certain level of integrity and truthfulness in the content of the articles it prints. It is not open to The Irish Times to say that it is not willing to stand over what John Waters said today. It would be very misleading to extend that analogy to the Internet, where we do not expect Twitter or Facebook or YouTube to stand over everything their users say. The idea that we should take a principle that has developed in the context of media that select whose words they will print, the editorial line they will take and so on, and then apply that to providers that really are just conduits for the opinions of other people-----
That is not the point I am making. My point is that the information is being relayed onwards through another medium, almost always by means of the broadcast medium of television or radio. I accept that a person may tweet whatever he or she wishes. However, that information is being relayed through a broadcast medium even though it would never be allowed in the print medium, for the reason that it cannot be substantiated. No self-respecting journalist in a print outlet will do that. Why is it acceptable in a broadcast outlet?
Mr. T. J. McIntyre:
I am reluctant to say anything about the particular case of the presidential election because I understand litigation is still pending. More generally, if a broadcaster lends its endorsement and lends its credibility to a particular tweet, that is a much more serious matter. As an individual I am entitled to discount what I read from some random tweeter, whereas I will give it more credibility if it comes from RTE. In that case, we have to think about looking more towards RTE than towards Twitter.
Mr. Fergal Crehan:
Under defamation law, if one rebroadcasts or republishes a tweet, those words are regarded as one's own and one can be pursued through the courts. There is that remedy. I refer to the point raised by the Deputy about access to justice and the cost of same. If a broadcaster reads out a text or a tweet on air, those words are now the broadcaster's own and it will be liable for them.
In some programmes, the broadcaster reads out the tweets and they also appear on screen. Is there any difference in law between the two? I always believed that there was a delay, during which time the texts were monitored.
Is the broadcaster and its background staff responsible?
Mr. Fergal Crehan:
The broadcaster, in its corporate sense, is responsible. Therefore, all of its staff are responsible, whether that is the man or woman in front of the camera or somebody backstage who decides what appears and what does not appear. The distinction is irrelevant because it is the company or corporate entity that is ultimately responsible. It is a good idea that broadcasters filter these things. If they do not filter them, they may face the consequences.
I thank Mr. McIntyre and Mr. Crehan for attending. They have helped us in our deliberations and with our committee work. That brings us to the end of the public sessions which have taken place over the past month or more. I thank the Members for their patience and I thank the secretariat for its hard work. I have no doubt that we have educated ourselves over the period. Our report will contain a lot of useful information. We have heard many viewpoints from interested parties, so the report will benefit anybody who uses it. After today's meeting and our other meetings, many parents will have learned a great deal. The committee must compile all of that information into a report and present it to Government. I thank the delegation for attending today and helping us in our deliberations.