Oireachtas Joint and Select Committees

Wednesday, 23 February 2022

Joint Committee on Tourism, Culture, Arts, Sport and Media

Online Disinformation and Media Literacy: Ms Frances Haugen

Niamh Smyth (Cavan-Monaghan, Fianna Fail)

Our meeting today is with Ms Frances Haugen who is a Facebook whistleblower. The specific theme of today's meeting is to discuss online disinformation and media literacy. Ms Haugen is very welcome and I thank her for joining the meeting today despite time differences and everything else. There is huge interest in what she has to say to the committee. As she knows, we have been working on the Online Safety and Media Regulation Bill. We have heard a lot of what she has had to say through the media internationally and in Ireland. Ms Haugen is very welcome and we look forward to hearing what she has to say. I hope she can bear with me as I have some housekeeping to go through.

I remind members that the format of the meeting is that I will invite our witness to deliver her opening speech, which has been limited to five minutes, and this will be followed by questions from the members. The committee may publish the opening statement on our web page. Before I invite Ms Haugen to deliver her opening speech, I want to explain some limitations on parliamentary privilege and the practice of the Houses as regards references made to other persons in evidence.

The evidence of witnesses physically present or who give evidence within the parliamentary precincts is protected pursuant to the Constitution and statute by absolute privilege. Ms Haugen is reminded of the long-standing parliamentary practice that she should not criticise or make charges against any person or entity by name or in such a way as to make him, her or it identifiable or otherwise engage in speech that might be regarded as damaging to the good name of that person or entity. Therefore, if her statement is potentially defamatory in relation to any identifiable person or entity, she will be directed to discontinue her remarks. Witnesses participating in the committee session from a jurisdiction outside the State are advised that they should be mindful of their domestic law and how it may apply to the evidence they give.

Members are reminded of the long-standing parliamentary practice to the effect they should not comment on, criticise or make charges against a person outside the Houses or an official either by name or in such a way as to make him or her identifiable. I remind members of the constitutional requirement that they must be physically present in the confines of Leinster House to participate in public meetings. I cannot permit members to attend unless they adhere to that constitutional requirement.

I remind members if contributing via Microsoft Teams to identify themselves when contributing for the benefit of the Debates Office staff preparing the Official Report. I ask members to mute their microphones when they are not contributing to reduce the background noise. I remind them to ensure their mobile phones are switched off or on silent.

It is a pleasure to have Ms Haugen with us today, albeit virtually. We know she is speaking to us from the other side of the world and we very much appreciate her taking the time to be with us. This particular debate and conversation with the committee is on disinformation, misinformation and online platforms. We are really looking forward to hearing Ms Haugen's views and ideas on obligations we may mandate on social media platforms to protect users. This is the main ethos and theme of our discussion. With this, I open the floor to Ms Haugen and ask her to make her opening statement.

Ms Frances Haugen:

I thank the committee for inviting me. I am excited to be here. My name is Frances Haugen. Last year I left my job at Facebook as I felt I could no longer be part of a system which harms children, stokes division and weakens our democracies. I disclosed what I felt governments and the public should know about what is happening within these social media giants and the deliberate choices they make to maximise their profits at the expense of public safety and the never-ending pursuit of growth at any cost. I hope my revelations have been valuable for committees such as this one.

It is clear governments all over the world had been searching for answers about how to tackle some of the biggest issues of the day and how to protect their citizens and their democracies, but they did not have enough information even to ask the right questions let alone write the answers. This is beginning to change, with European nations leading the way in developing game-changing online safety regimes. As the European home to some of the most powerful companies in the world, Ireland has a unique role to play in holding these companies to account.

As the online safety Bill acknowledges, self-regulation of social media has not worked. Platforms cannot be trusted to act in the public interest. They are often, as my revelations showed, fully aware of the harms caused by their products and services and yet choose to ignore these in favour of growth and profit. The Bill is an important intervention, as is the EU Digital Services Act. Both regimes say "enough is enough". Their alignment is critical, not least because Ireland will need to implement the Digital Services Act but also because regulatory fragmentation does nothing to help Irish citizens. It only creates complexity and uncertainty at a time when what we really need is clarity. The Digital Services Act and Ireland’s online safety Bill should align on key principles that allow quick, effective regulatory implementation.

There is, of course, no silver bullet to make the Internet safer. The challenge is how to make things easier and not harder. Alignment between Ireland and the EU will be crucial in smoothing this path. Part of this alignment is about how to develop robust rights-respecting regimes that go to the heart of the business model of platforms, addressing the algorithms and design features that enable the spread of disinformation and hate. We all want Ireland’s Bill to succeed, and to do so it must focus less on content and more on how that content is shared, spread and amplified. It is the systems of these platforms that should be the top priority and not the content. These are systems that promote harm, target people with conspiracy theories and drive people to hate and abuse. We cannot simply rely on the deletion or criminalisation of harmful content. This is not only because it risks infringing on free speech but because it does not work. Regulatory regimes that have focused solely on deleting content have failed. There is just too much out there.

We have seen too many examples of how hypervirality and amplification of lies and hate can lead to division and polarisation online, and violence and harm offline.

The US insurrection and Covid infodemic are just a couple of examples. However, this is far from just a western issue. In fact, some of the most egregious harms caused by the decisions made in the shiny glass headquarters of Facebook in Dublin have been in faraway places like Myanmar and Ethiopia. The ethnic violence fuelled by Facebook in those countries is the opening chapter of a book too horrific to read.

That is why I came forward. I can no longer be part of such a company. It is also why I am urging the Irish Government to reflect deeply on its own role and on the responsibilities that come with housing these social media manipulation machines. An important live decision currently rests with Ireland.

A few Rohingya youth from Myanmar have filed a complaint with the OECD national contact point in Ireland demanding justice for the genocidal violence unleashed against their community. They are desperate for financial support to be able to continue their studies from refugee camps in Bangladesh. Facebook has admitted it failed to stop horrific hate speech in Myanmar on its platform. The UN concluded Facebook played a “determining role” in fuelling hate against the Rohingya. Despite this, this trillion dollar company refuses to provide any meaningful remedy for fuelling violence, while pouring money into video games and virtual reality. That is why these Rohingya youth are now appealing to Ireland to intervene.

Ireland has been part of the OECD for 50 years and has committed to promoting its standards for responsible business conduct. It rests with Ireland to evaluate the facts and decide whether it will promote justice or the interests of a trillion-dollar profit machine. I am hopeful that Ireland will do what the Tánaiste, Deputy Leo Varadkar, promised when Ireland got a seat at the UN Security Council, which was to champion "peace and security, conflict resolution and reconciliation."

Finally, as Ireland sets up the new regulator for its online safety regime, I would urge it to learn lessons from the criticisms of its other digital regulator, the Data Protection Commission, DPC. The DPC is widely considered to have stepped back from its responsibilities in properly enforcing the general data protection regulation, GDPR. The tech companies on Ireland's shores, once again, got away with it. As Ireland creates an independent, robust and effective online safety regulator, it must launch an independent review into the DPC so that it too can start to enforce the law thoroughly and boldly.

Governments and citizens around the world will be watching Ireland’s regulatory regime closely. It is reassuring that members are taking this agenda seriously, and I urge them to lead by example when holding these companies to account. Members have the power and responsibility not only to improve the online safety of Irish citizens, but of people across the world.

Niamh Smyth (Cavan-Monaghan, Fianna Fail)

I thank Ms Haugen for her opening statement and the chilling insights from her own experience. We appreciate her sharing that with us. I will now turn to my colleagues, who have seven minutes each for questions and answers, and I ask them to be mindful of that. As members of this committee have only seven minutes, they will probably move quickly through their questions, so I ask Ms Haugen to give short answers if at all possible so that we can cut to the crux of the issues we are trying to discuss.

On that note, I will call the first speaker, Senator Cassells.

Shane Cassells (Fianna Fail)

I thank Ms Haugen for attending and her opening statement. In it, she talked about governments not having enough information to even ask the right questions, let alone get the right answers. She has helped us with the questions.

Ms Haugen spoke on our national radio station at the weekend and said that, in tackling the tech giants, a regulator would need at least 20 algorithm experts at its disposal who have deep product expertise and understand how the small choices made in those systems interact, in order to hold those systems accountable. However, as she said, there is no undergraduate course to train these young students. They are trained in-house by Facebook.

Last night in our Seanad, I asked our Minister for Tourism, Culture, Arts, Gaeltacht, Sport and Media the following question. Even if we pass the legislation in this country, are we fighting with one hand behind our back because we will not have the algorithm experts on the public payroll to fight against companies that make a €40 billion profit and can attract those people to come and work for them? As I said to the Minister, we would need to break into the chocolate factory and steal the Oompa-Loompas to get the magic formula to be able to fight Facebook. Are we fighting with one hand behind our back?

Ms Frances Haugen:

Unquestionably. I want to give people a little bit of context on how difficult it is to hire people who have deep algorithmic expertise and understand not just machine learning, but its consequences on, specifically, social media platforms. I would guess there are of the order of, perhaps, 200 or 300 people in the world with the depth of expertise in how these systems holistically work who could even be hired. The idea that Ireland will be able to go out on the world stage and pay market-competitive salaries to specialists such as this is an undue burden on the Irish people. I do not think it is responsible to force all of the cost of an effective regulator onto the Irish taxpayer alone.

Shane Cassells (Fianna Fail)

What would one of those top algorithmic experts generally be paid at market value?

Ms Frances Haugen:

One could expect total compensation, meaning both salary and equity, of the order of $500,000 to $750,000 a year.

Shane Cassells (Fianna Fail)

We need to employ the equivalent of 20 Secretaries General.

Ms Haugen spoke of how Facebook rewards hate and effectively promotes posts where people are trashing each other and fighting each other back and forth because that drives engagement and advertising. From a political point of view and, in particular, from the point of view of the political system in Ireland, where we have parties that are prepared to fan the hate and distort facts, is Ms Haugen confident that the rewarding of such hate still continues? She mentioned the shiny glass offices of Facebook in Dublin. Is Facebook following at a high level the fallout from rewarding such actions and how, through the promotion of this, it could change the course of future general elections, even in small countries such as Ireland?

Ms Frances Haugen:

I want to be very clear. Facebook never set out to sow discord or to pull us apart. It never designed for that. It never decided that society was too stable and it should pull people apart. No one did that. It faced a business problem, namely, that people were producing less and less content over time on Facebook and Instagram, which is the natural course for all social media platforms. It did studies trying to figure out how it could incentivise and elicit people to produce more content. What it found by experimenting on people - treating them as test subjects and trying different ways of stimulating them to make more content - was that giving them more likes, comments, re-shares and more engagement incentivised them to produce more content. Unfortunately, when it started optimising for engagement - those reactions - instead of just optimising for how much time someone spent on site, it had a side effect of giving more distribution to more extreme content. This is because the shortest path to a reaction is hatred.

The real question we are examining here is why Facebook did not act when it found out this was happening back in 2018. At this point, it has been four years since this change happened. Yet, Facebook knew almost immediately. Within less than six months, it sent researchers into Europe to talk to political parties and found that across Europe, people knew it had changed the algorithm. They said they used to be able to distribute content, such as a White Paper on agricultural policy - things that did not get the most clicks or the most comments - and that people did not fight in the comment threads about the agricultural policy. However, that stuff is important. That is what society is built on. Now, when they try to share that, they get no distribution.

Shane Cassells (Fianna Fail)

Political parties have copped on to this fact and extreme political parties realised they will get a free pass on that. In her opening statement, Ms Haugen spoke, in particular, about the Government-----

Ms Frances Haugen:

Excuse me, Senator. It is slightly worse than that. When those same researchers asked local parties how they felt about it, the political parties across Europe said repeatedly that they were not happy about it because they are now running on positions that they know their own constituents do not like, but those are the ones that get distribution on social media. Therefore, it is influencing the ballot box even before we get to vote.

Shane Cassells (Fianna Fail)

I want to touch on content, in particular, advertising content. Obviously, money is where this is all at. Something that our Bill has not tackled is an outright ban on junk food or gambling advertisements. It is a very complex thing. However, junk food and online gambling are two areas in society where there is massive revenue being generated. From a Government point of view, should we be looking towards an outright ban on these products?

Ms Frances Haugen:

As a general rule, I support rights of governments to manage commerce as is appropriate for their own communities.

This is a great example of where national governments deserve a right to weigh in, especially on things like advertisements, because this is where Facebook makes its profit. I am not currently aware of the regulations or policies across this area.

Shane Cassells (Fianna Fail)

From the point of view of the protection of the rights of children, and based on Ms Haugen's analysis and what she has seen, were advertisements, in particular in the areas of fast food and online gambling, being used to target and exploit children? Is that something that was happening in terms of the harm to children?

Ms Frances Haugen:

I did not work in the advertising area so I do not have the knowledge to answer that question.

Shane Cassells (Fianna Fail)

That would be a big fear for us as well. Ms Haugen mentioned the Irish Government's responsibilities that come with "housing these social media manipulation machines", which is a very strong statement. The irony is that most people in Ireland will get their information on this hearing from Facebook. I would welcome a comment from Ms Haugen in regard to Ireland's reputation as a base for companies such as Facebook.

Ms Frances Haugen:

Ireland holds a unique responsibility because it is the place of establishment for these companies. Even though Europe has led the way in terms of robust, strong regulations around issues such as privacy, unfortunately, Ireland has faced a challenge in that it is falling behind in terms of enforcing that law. The Irish people have an opportunity here to really show that they can stand up to these giants, but we need to change the pattern because, thus far, GDPR enforcement has not demonstrated that Ireland is willing to fund a regulator and hold the tech companies on its shores to account when it comes to tech regulations.

Shane Cassells (Fianna Fail)

I thank Ms Haugen.

Niamh Smyth (Cavan-Monaghan, Fianna Fail)

Apologies have been received from Deputy Dillon, who is unable to attend today. The next speaker is Senator Warfield.

Fintan Warfield (Sinn Fein)

I welcome Ms Haugen and I commend her on and thank her for the work she has done for the public good. Moments such as Cambridge Analytica, the documentary "The Social Dilemma" and Ms Haugen's testimony to Congress give us an insight into what goes on under the hood of the big tech giants. What Ms Haugen is doing is very powerful. I thank her for that, which is probably something whistleblowers need to hear from legislators.

I had also intended to ask Ms Haugen about Ireland's reputation in the tech world but I will instead move to the Online Safety and Media Regulation Bill 2022 which, as mentioned by my colleague, commenced in the Seanad last night. As of publication, the Bill does not include an individual complaints mechanism for citizens who have exhausted all avenues with social media companies to find some resolution or a mechanism for the State to help that citizen to find a resolution. Given Ms Haugen's experience in this industry, why is it necessary to have that mechanism?

Ms Frances Haugen:

To be clear, I support the formation of some type of class mechanism for groups such as, for example, the parents of teenagers, teenagers facing eating disorders, people who have experienced other types of mental health harms from social media, and the victims of violence facilitated by social media. I support the formation of a class mechanism because I believe there needs to be a way for individuals affected by social media to come forward. At the same time, I do not support an individual mechanism because I worry that any regulator that Ireland might establish would be swamped by complaints. There needs to be a way for people to bring forward their grievances, but it needs to be done in a way that scales so as not to overwhelm the system.

Fintan Warfield (Sinn Fein)

If there was a way to ensure that the system would not be overwhelmed, would Ms Haugen support an individual complaints mechanism, at the very least for minors?

Ms Frances Haugen:

I strongly support the formation of a class system. I agree that there are harms that currently are not covered by specific safety Bills or are not being addressed by the social media companies. At the same time, it is important to think about the life journey of a complaint. It is not just about someone being bullied; it is about how we find the appropriate remedies for that bullying. By beginning to form together in classes, we can have more robust conversations on where the systemic interventions should be and how these products need to be changed.

Fintan Warfield (Sinn Fein)

Ms Haugen mentioned that there is no silver bullet to make the Internet safer. We are so far behind in discussing regulation in that we are still talking about the 2D flatscreen while big tech, as Ms Haugen has acknowledged, is pumping money not into safety but into ploughing ahead with a version of the Internet that we are inside of - a very immersive experience in which we participate in meetings, parties, the gym and so on. Questions have surfaced about its harms and the safety of people, in particular women and children. Will measures such as the Digital Services Act and similar legislation to be introduced here in terms of a regulatory framework mark the beginning of governments being able to keep pace with big tech? Will we ever be a match for it, or will that be done by citizens taking action, such as boycotting various platforms, choosing email providers that protect our data and messaging apps that value our privacy? Can we keep pace through regulation?

Ms Frances Haugen:

I have been very heartened by the progress with the European Union's Digital Services Act because it focuses on systems and processes that can be extended into the future. My concern in regard to Ireland's Online Safety and Media Regulation Bill 2022 is that it strongly focuses on content, that is, what content should be illegal, what type of content should be taken down and so on. As stated by the Senator, we are having conversations about 2D screens when Facebook-Meta is trying to shift the conversation to this immersive metaverse video game kind of world. What the Digital Services Act does well is that it focuses on risk assessments and the idea that what we need to do here is figure out how to close the feedback loop. Currently, nobody but Facebook can see behind the curtain of its business. Academics cannot study it and citizens cannot see more than their individual experience. At present, there is no feedback loop whereby if something goes wrong on Facebook, anyone other than Facebook can observe it or suggest corrections. The Digital Services Act's risk assessments close that circle. It states that companies need to start disclosing what the risks are and they also need to be able to listen to concerns from NGOs and Governments and to state what those mitigations are going to be and have them assessed by an independent party. A feedback loop would allow us to begin to shift the process and the way in which these systems work from being optimised for growth and profit to something that is more in the public good.

The Senator spoke of the idea that individuals need to be the solution. That is not expansive enough to be able to help society in the way it needs to be helped. I cannot overemphasise how addictive these products are. Facebook's documents demonstrate that some of the most vulnerable people in our societies are the ones who get sucked the furthest down the rabbit hole. Some of the characteristics that put people most at risk of being hyper-exposed to misinformation include the recent death of a spouse, a move to a new city or the person being recently divorced. The idea is that when people are feeling the most vulnerable, they turn to this quick social fix; they turn to the online community instead of the in-person community. The idea that individuals are the solution and they should boycott these platforms is really dangerous. It risks us leaving behind those who most need our care and need to be reintegrated into the community.

Fintan Warfield (Sinn Fein)

I again thank Ms Haugen for her work.

Niamh Smyth (Cavan-Monaghan, Fianna Fail)

The next speaker is Senator Hoey, who is with us virtually.

Annie Hoey (Labour)

I welcome Ms Haugen and I thank her not only for engaging with us today but also, as stated by Senator Warfield, for performing what is arguably a great service to safety across the world and for giving us a glimpse into what happens behind the scenes. I fear that what we get to see is really only scratching the surface in terms of how we are being manipulated.

I do not know if we, despite the best efforts of Ms Haugen and many others, are able to comprehend the level of manipulation that is happening.

I will jump straight into my questions. I spoke to someone the other day who said they believe that the Internet will ultimately be fully encrypted. There is much conversation around Facebook and encrypting Messenger, etc. Does Ms Haugen think that the ultimate right to privacy, via this encryption, comes at the cost of safety? There are obviously many concerns around child welfare, etc. Does Ms Haugen think that the ultimate goal of encryption in the Internet will have a cost to safety?

Ms Frances Haugen:

It would be a simplistic lens to say that there is an absolute trade-off between encryption and safety. It is important for us to understand how product choices, when integrated together, influence the impacts of encryption. I strongly support the individual's right to access to encrypted communications. The only chat app that gives me my core social media is Signal, which is an open-source form of encrypted messaging. At the same time, we need to understand how product choices can increase the risk of encryption. For example, let us contrast Signal and Facebook Messenger. In the case of Facebook Messenger, you get a direct line to a giant catalogue of whatever people you want. On Signal, if you want to talk to a 13-year-old, you will have to find that 13-year-old’s phone number. In the case of Facebook, you can go and follow whatever the latest meme is, find a group about that meme and then you will find a bunch of middle school kids. You can go on Instagram and do the same thing. We need to think about the interactions between systems. We need to begin to think creatively about what tools we can provide to children in those encrypted spaces. Therefore, I do not think this is a binary trade-off between X and Y. This is why risk assessments are really important. If we allow companies to build and progress on an encrypted pathway without having had conversations with the public, we will never build in systems that say, "Hey, there's a bunch of these different tools that we could be providing to children in encrypted spaces". Yes, it will cost companies a little bit of money to develop them, but that is how we can make sure that we can have both safety and encryption.

Annie Hoey (Labour)

That is incredibly helpful. I thank Ms Haugen. That will be useful for thinking about things going forward. I want to jump to what Ms Haugen mentioned previously around online safety rules that focus on subject matter in a linguistically diverse world. Can Ms Haugen elaborate a little more on that? It is an interesting point. Linguistic diversity is a global situation. For example, there are words that are used in Northern Ireland that would be considered quite insulting, but in the Republic of Ireland they are not.

Ms Frances Haugen:

Exactly.

Annie Hoey (Labour)

That is only just across a border. Could Ms Haugen give a little bit more information on that? I will have one more question after that.

Ms Frances Haugen:

This is a really important question, and I am so glad that the Senator brought it up. We spoke before about the idea that because Facebook operates behind a curtain, it gets to set the rules of conversation. The ways we are talking about safety today have been shaped by Facebook's PR machine. They have put the focus on the idea that AI and censorship, or the removal of content, is the solution. They have said that the arguments we should be having are not about whether Facebook should be able to operate in the dark but about whether we are taking down too much content, or if we need to take down more. They know that when we argue about censorship, we will never come to agreement. However, we could have a conversation about the rules of the road, for example about whether Facebook should be allowed to operate without us being able to ask questions, but they do not want to have that conversation. Therefore, when Facebook comes out and talks about AI as the solution, they are saying that they can come in and write a magical hate speech classifier or violence inciting classifier. A classifier is a machine learning system or an artificial intelligence system. It allows you to say that a post is inciting violence, so it must be taken down. Facebook's own research says this strategy is deeply limited because language is nuanced. As the Senator said, a sentence that could be totally innocuous in one part of a country could be very different and defamatory in a different part of it.

Even for American English - let alone Irish English, British English or Indian English - it can only take down 3% to 5% of hate speech. Facebook's own documents said that because of these nuances in language, in a best-case scenario it would only be able to take down 10% to 20% of hate speech. This is for the languages in which Facebook has chosen to invest. What they did not tell the world was that in order to have artificial intelligence keep us safe, you have to rewrite those systems over and over again in each individual language. Think of how many member states exist in Europe. Think about how many languages exist in Europe. Think of how many dialects exist in Europe. A dialect has to be retrained. Facebook has chosen not to invest in all the languages of Europe. It has definitely not chosen to invest in all the dialects in Europe.

In 2018, Mark Zuckerberg wrote about engagement-based ranking, a kind of ranking whereby Facebook rewards content that gets you to react more. He said that is dangerous because people are subconsciously drawn to extreme content. Even when Facebook asks people afterwards if they enjoyed this content, they say they did not. Subconsciously, our brains have evolved to be drawn to that content. This is super dangerous. It is also disrespectful to a linguistically diverse world. Instead, we need to focus on how the algorithms work, because those changes apply to every single language and dialect at once. What was the Senator's next question?

Photo of Annie HoeyAnnie Hoey (Labour)
Link to this: Individually | In context | Oireachtas source

Ms Haugen mentioned the Rohingya people and the hate speech in Myanmar. She has also referenced the situation in Ethiopia. Could she elaborate? I am interested in whether Facebook Ireland, specifically, could have prevented those situations. How did Facebook Inc. fail to protect these people? If it can happen there, it can happen anywhere else. It is happening in other places. I am interested to hear Ms Haugen’s thoughts on exactly what role we specifically in Ireland had around the atrocity, as well as what we should or could have done with the powers we have, considering that the HQ is here.

Ms Frances Haugen:

I will pose a question that is even closer to home. When I was visiting France in the fall, almost every single governmental minister I met - and I met a bunch - asked the exact same question. They asked me whether I knew how many moderators speak French. They said they had been asking for years, but Facebook would not tell them. The reason Facebook will not tell them is that the number is so small that on certain pie charts, where they list all the money being spent on hate speech, there is no slice for French. This is an interesting question. The French authorities cannot get answers when they ask a basic question about how many people Facebook has staffed to keep speakers of the French language safe. I think approximately 1 billion people in the world speak French. This is a major language. If the French authorities cannot get the answer to that question, or if they cannot ensure there are enough moderators, how is a small ethnic group on the other side of the world supposed to get a minimal level of safety?

I will give a concrete example of how Ireland could have stepped in. Ireland could have stepped in and said it was taking a leadership role in ensuring a basic level of safety on platforms that exist in the country. It could have said that Facebook had to disclose how many people were supporting each language and each dialect, because then we could have had a conversation about it. If that had happened, it would have been disclosed that only one person in the company spoke Burmese. That was one of the things that came out of the UN report. You could also imagine a world where Ireland said it would staff a complaint hotline for people who work inside the company. In the United States, if I work at a hospital, there will be a sign up in my break room saying, "Did you see something that endangered patient health? Call this number."

When I worked in counter-espionage at Facebook, I was literally on the other side of state actors who were trying to weaponise the platform to hurt other nations. I did not know who in the government I could call for help. We know that content moderators are on the front line of basic safety for people. Someone might be getting stalked or threatened; moderators are the first line of safety, and many of them are staffed out of Dublin. Imagine if Ireland had provided a phone number and said that if you believe someone is in danger, you can call this line and we will make sure someone at least listens to you. That could have been an avenue for the Facebook employee who was trying to raise the alarm about the Rohingya but did not have a chain of command to escalate it up through. The UN found that this employee did not feel heard. Those are small things that Ireland could do. It could require minimal transparency, at least, and provide a helpline for people who see things that are problematic.

Photo of Annie HoeyAnnie Hoey (Labour)
Link to this: Individually | In context | Oireachtas source

I thank Ms Haugen. I could ask her 1,000 more questions but my time is up, unfortunately.

Photo of Niamh SmythNiamh Smyth (Cavan-Monaghan, Fianna Fail)
Link to this: Individually | In context | Oireachtas source

Deputy Griffin is next.

Photo of Brendan GriffinBrendan Griffin (Kerry, Fine Gael)
Link to this: Individually | In context | Oireachtas source

I thank Ms Haugen for joining us and for all of the work she has done. She has done the world a great service and it is important that is acknowledged. This meeting is a good opportunity for us to consult her and hear her views. Today we are having a good, moderate and reasonable discussion, with some good contributions from my colleagues, but it is not likely to be amplified on Facebook. Someone would have to say something ridiculous or extremist to get more attention, yet more people need to be engaged in this area.

I wish to ask about online anonymity, particularly in the context of Facebook and Meta. What efforts have been made to reduce the element of anonymity? How difficult would it be for them to clamp down on it? When I was launching my "like" page, I had to produce my passport and basically tell them what I had for breakfast but one does not have to do that across the board. Indeed, with some social media platforms, the lack of individual tracing is just ridiculous. Has there been a real effort on the part of Facebook to look into this area more?

Ms Frances Haugen:

I understand the attraction of wanting to decrease anonymity on platforms like this. I have heard a lot of different people suggest that we should have identity verification for all people on social media platforms. However, one of the important things to remember is that we live in a really diverse world. In most parts of the world, there are no computerised systems where we can verify an ID - where we can say, "Here's a picture of an ID; let's make sure that person is the same person." As long as we live in a world where we cannot actually confirm people's IDs by any computer method, we are going to have a situation where someone who wants to get around any enforcement system will just use a virtual private network, VPN. A VPN is a little pipeline that allows an individual to access the Internet from a different place in the world. People can just pretend that they are coming from one of those places and can get an account that is not traceable. Either we exclude those parts of the world from the system, which is not equitable, or we accept that making sure we know who people are online is extremely difficult. I do not think it is realistically going to happen any time in the near future.

The second thing is that platforms actually have a huge amount of information on people and companies like Facebook should have to work with governmental officials who provide things like search warrants. They should be fast and transparent in providing relevant information when a crime has occurred, for example.

There are two sides to this issue. First, it is very difficult to confirm people's ID online and we should not pretend that this is a magic bullet. Second, companies should disclose the information that they do have. They know, for example, where people access things from and they can infer lots of things about people's friends. They should disclose the information they do have when, for example, a warrant is presented and should not stonewall people.

Photo of Brendan GriffinBrendan Griffin (Kerry, Fine Gael)
Link to this: Individually | In context | Oireachtas source

On a separate matter, it seems that the more psychopathic the leader of a country is, the more of an online presence he or she has. This is possibly more pertinent this week than other weeks and in that context, I wish to ask about Facebook's relationship with Russia. What is happening at the moment in Ukraine is incredibly worrying and could be destabilising for the whole world, not just for Europe. What is the nature of Facebook's relationship with Russia? Has any effort been made to clamp down on nationalistic and jingoistic online commentary that can be very damaging and that we have seen in countries like Myanmar, Ethiopia and all across the world? Does Ms Haugen have any specific information in that regard in the context of the current geopolitical crisis?

Ms Frances Haugen:

There is a major national security problem with Facebook with regard to its lack of transparency. How does transparency relate to national security? Usually when one thinks of national security, one thinks of secrets. In the case of Facebook, it is important to compare what information it shares with researchers versus what information other comparable platforms, like Twitter, share. I worked within the threat intelligence part of Facebook, with the people who are the first line of defence for finding information and influence operations. This is a technique very commonly used by Russia for influencing how politics works in neighbouring countries and in other places around the world. It is also used by China, Iran and many major adversaries who have decided to weaponise one of the most powerful things about western countries, which is their openness. When I worked within the threat intelligence team, I saw a gross level of understaffing. Facebook kept itself behind a curtain. Twitter has said it will make 10% of all public tweets available to anyone in the world to research. It is called the Firehose. I was shocked to find out that many influence operations on Facebook were caught via Twitter, because Twitter made data available but Facebook refused to do so. People would analyse tweets and find networks - sets of accounts that acted in co-ordinated ways to spread messages that destabilised places like Ukraine. They would then take those networks' IP addresses, give them to Facebook and point out that accounts with the same names were operating from the same IP addresses on Facebook. It is unacceptable that we have to rely on Twitter's good actions to keep us safe on Facebook. That is why any online safety regime must include safe harbour for researchers; Facebook is suing researchers who try to keep Facebook safe. It also needs to include mandatory, enforced transparency, as the Digital Services Act does. Unless we have enforced transparency, we will not adequately find those influence operations that Russia uses to destabilise free societies.

Photo of Brendan GriffinBrendan Griffin (Kerry, Fine Gael)
Link to this: Individually | In context | Oireachtas source

I thank Ms Haugen.

Photo of Niamh SmythNiamh Smyth (Cavan-Monaghan, Fianna Fail)
Link to this: Individually | In context | Oireachtas source

Deputy Mythen is next.

Photo of Johnny MythenJohnny Mythen (Wexford, Sinn Fein)
Link to this: Individually | In context | Oireachtas source

I thank Ms Haugen for her guts and grit. The world owes her a debt of gratitude for exposing the system. How important is it that correct and transparent data is mandated and made available to the commission by high-tech companies?

Ms Frances Haugen:

We got to where we are today because Facebook was not transparent. Think about it: we do not let children grade their own homework, because humans are humans. Sometimes we cut corners. Sometimes we are stressed by other things. Facebook has only had to report publicly its profit and loss position and its expenses, and that is what it optimises for. Facebook has lots of small solutions that would have radically reduced things like misinformation or inflammatory content, including requiring users to click on a link before resharing content, which Twitter did. Has anyone been censored by that action? Has anyone been oppressed because they had to click on a link before resharing? No, but Facebook chose not to do that because it decreased the amount of content distributed on the system by a very tiny amount - we are talking tenths of 1% here - and that would have decreased profits. We must have mandated transparency. The public must have the right to ask questions and get real, validated data in return on an ongoing basis. Facebook will tell us that it cannot give us this data on a monthly or weekly basis, but that is a lie. If it pulls the data once - if it writes the code once to pull this information - it has systems that allow it to run that code every day if it wants to. It is basically free. Unless we have mandated, ongoing transparency, there will not be enough eyes looking at these things.

Part of why Twitter is able to find these information operations is that there are literally 10,000 researchers around the world who enjoy looking at these data and who find these security flaws. Unless we have a similar mechanism for Facebook, we will not have a level of public safety that is acceptable.

Photo of Johnny MythenJohnny Mythen (Wexford, Sinn Fein)
Link to this: Individually | In context | Oireachtas source

I thank Ms Haugen. I am particularly concerned about the workforce behind the monitoring of the platforms and their content. Could Ms Haugen comment on the protection, or lack thereof, for the Facebook staff who have to review the content that has clearly been deemed harmful or potentially harmful? Are there workers' rights issues in this regard?

Ms Frances Haugen:

When we talk about whether we should focus on censorship strategies or on transparency and platform changes - changes to the algorithm or design that make the platform fundamentally safer instead of bolting on a censorship safety system - it is important to think about the human costs of trying to implement that censorship system. Time magazine recently came out with a brutal article about how Facebook treats the moderators who are trying to keep people in Africa safe, how grossly underpaid they are and how little they get by way of mental health supports. Time and again over the past five to seven years, we have been hearing stories of people being permanently damaged or coming out of these jobs after three or six months with profound post-traumatic stress disorder. When we focus on censorship strategies instead of strategies that make the systems themselves safer, we end up forcing human beings to spend all day looking at some of the most brutal content in the world. That alone should make us at least pause and ask whether we want a censorship-based solution or to decrease virality and amplification and change the safety choices made in the platforms, because there is a real human impact and cost.

Photo of Johnny MythenJohnny Mythen (Wexford, Sinn Fein)
Link to this: Individually | In context | Oireachtas source

Does Ms Haugen believe protections for whistleblowers are strong enough? Do they need to be strengthened in legislation?

Ms Frances Haugen:

There is a very subtle thing happening in our society today that is important for legislators around the world to be aware of and to think about consciously when writing their whistleblower laws: technology has always outpaced regulation. This has always been, and always will be, true. We are creative little monkeys, always looking for new ways to treat the world, and we often get ahead of ourselves. We push too far, realise the consequences of our actions and then pull back a little towards the public good. Technology is accelerating and is getting more sophisticated and nuanced. As I said before, you cannot get a graduate degree in what I have spoken to the members about today; you have to go to one of these places to be trained in-house or acculturated inside it. If we do not have whistleblower protections that are strong and robust, we will face some very large systemic risks to our societies. Technology is getting more powerful and will get farther and farther from the public good before we realise these problems exist. A closed system is not like the factories of old, where you could measure the pollution coming from a factory and talk to the workers about what was happening inside it. The systems in question are opaque technical systems that hide behind our screens. If we do not have mechanisms to protect the people who come out and explain them to us, we will face large risks to ourselves.

Photo of Johnny MythenJohnny Mythen (Wexford, Sinn Fein)
Link to this: Individually | In context | Oireachtas source

We have heard from many NGOs and charities during our deliberations on this Bill that are working at the coalface of the fallout of online harm, especially to children. How important is it for these NGOs to have a real voice in regulations such as our online safety Bill?

Ms Frances Haugen:

It is important for any safety regulation to include risk assessments, but not just risk assessments conducted by the company in isolation. We should remember that these places are very privileged, very isolated and not diverse. They do not speak for the world. There must not be risk assessments by companies alone; risk assessments need to include the voices of bodies such as NGOs. The reason is that places like Facebook do not reward focusing on the dark corners; they focus on growth and optimism. They reward people who are able to figure out how to advance the company and who perhaps avert their eyes a little. We must include NGOs if we want to ensure all voices in society get heard; otherwise, we will not be able to protect the most vulnerable in our societies.

Photo of Christopher O'SullivanChristopher O'Sullivan (Cork South West, Fianna Fail)
Link to this: Individually | In context | Oireachtas source

I thank Ms Haugen for attending. I welcome her to Ireland in a virtual way. I apologise if some of my questions have been answered. I have not been able to listen to the entire meeting. If I repeat points, I ask to be indulged.

It is now widely known that in 2016 Facebook made changes to its algorithms. These resulted in Facebook becoming a much angrier place. That was evident to all: people tended to share angrier content and posts. The change to the algorithm and atmosphere clearly led to incredibly partisan pages, and partisan organisations were able to exploit it. It has been fairly widely written that the change to the algorithm and to how Facebook was operated and used had an influence on the US election of 2016. Was Facebook fully aware of the material impact of the change to its algorithm?

Ms Frances Haugen:

It is important to differentiate between changes Facebook made to its product before 2016 and changes it made to its main prioritisation system, which puts together the newsfeed. The latter happened in late 2017 and into 2018. Before the US election of 2016, there were other problems with the system. For example, it was found then that a huge proportion of people who joined neo-Nazi groups in Germany joined them because Facebook recommended those groups to them. Sixty-five per cent of the people who joined neo-Nazi groups did so because Facebook saw that those groups really excited people who engaged a lot with Facebook, and suggested that others might like them too. Those kinds of things were happening before 2016. They unquestionably influenced politics around those times. However, we have found since then that Facebook did not invest enough in safety systems and in researchers who were asking these questions. That is part of why we need mandatory risk assessments that do not just involve the company - because it is not going to see the things it needs to see - but that also involve bodies such as NGOs. That was before 2016.

In 2018, Facebook changed from focusing on how long it could keep you on Facebook to "meaningful social interactions", where "meaningful" could include hate speech or bullying; it really just meant optimising for reactions. Facebook, which had researchers in Europe preparing for the EU elections, identified within six months of the change that its systems were rewarding angry, partisan, divisive content over content that looked for reconciliation or a common path forward.

Facebook has also known that advertisements are equally a problem and that a high-quality political advertisement is one that gets reactions. As a result, extreme, polarising, divisive advertisements can be five to ten times cheaper than compassionate or empathetic ones. We need to ask questions such as whether we want to subsidise angry, divisive content when we know we cannot have a democracy when we are partial towards division over reconciliation.

Photo of Christopher O'SullivanChristopher O'Sullivan (Cork South West, Fianna Fail)
Link to this: Individually | In context | Oireachtas source

Ms Haugen just painted an incredibly bizarre picture of the period before 2016, when neo-Nazi pages were being recommended to people whom Facebook thought might relate to or like them. On that approach from Facebook, did any other Facebook staff at the time raise concerns about this approach? If so, what was the corporate response of Facebook to any concerns raised regarding that type of behaviour or approach?

Ms Frances Haugen:

I have talked to people who have worked on the safety systems across the company.

We are talking not only about civic integrity, which was the area in which I worked, but also about the integrity of groups, pages and searches. One of the things we must remember is that, the way the incentives are aligned now, the only numbers with public accountability are profit and loss. Those are the only things Facebook must mandatorily report in the United States. Facebook has shown, for things such as its transparency centre, that it selectively reports information - I can answer questions about this - to paint a picture of the company that bears little relation to reality. When we think about incentives, the people inside the company who find these problems were part of the reason I came forward. There are very kind and conscientious people working at Facebook who have no space to operate. When they find these patterns and raise them, they might have to spend three or six months hammering on doors to get anyone to pay attention. When they do that, they can be labelled as squeaky wheels - they are complaining and are not team players. Unless we have mandatory transparency, there will not be a countervailing force to profit and loss that gives these people space to operate. We want to create a system of incentives whereby, if someone sees a problem such as neo-Nazi groups being recommended, it is obvious and immediate that it needs to be escalated. Today that is seen as a cost centre - someone who is complaining and perhaps even exposing the company to risk. Mandatory transparency makes it clear that we are watching and encourages the company to be creative and find solutions. That is what good regulation does. It helps to pull the centre of mass back towards the public good.

Photo of Christopher O'SullivanChristopher O'Sullivan (Cork South West, Fianna Fail)
Link to this: Individually | In context | Oireachtas source

I thank Ms Haugen. I am going to squeeze in one more question in the ten seconds I have left, if that is okay with the Chair. This atmosphere and approach have led to a situation where, as we have already said, partisan pages and organisations have exploited the platform to a point where many people have trouble distinguishing factual, non-partisan material from that which is partisan and non-factual. This is the point of having Ms Haugen before the committee today. What legislative and algorithmic changes would she recommend to ensure this does not happen again and that we tackle this problem head-on?

Ms Frances Haugen:

I totally understand the emotional attraction of looking for a magic bullet - the one thing we can do to fix things. The real solutions are things such as the way the Digital Services Act is structured. The focus is not on forcing Facebook to do one thing; it is instead about changing a relationship - making sure the public can ask questions and raise concerns, and ensuring Facebook must engage. It must give us data so we can provide accountability, because those processes where we have feedback loops are how we get to a better place. We must put in place public reporting requirements so that we can see, on an ongoing, week-to-week basis, how violence and incitement content is performing and how our ability even to detect that content is working. That transparency would suddenly give the company another goal, and it would have to allocate enough resources to solve those things. That is how we get to a better place: we change our fundamental relationship with the company. We cannot just ask the company to turn a knob.

Photo of Malcolm ByrneMalcolm Byrne (Fianna Fail)
Link to this: Individually | In context | Oireachtas source

I thank Ms Haugen for her contribution to the debate in Ireland and, indeed, internationally. I was conscious of the Rohingya genocide. It was discovered that there was one contractor working in Dublin to monitor harmful content from Myanmar, and he spoke only Burmese, while 100 languages operate in Myanmar. I found it interesting that in September 2018, when human rights organisations wanted to take a case to the International Criminal Court, lawyers for Facebook refused that request on the basis that it would be a violation of individual privacy. Ms Haugen will be aware that there is a big debate at the moment within the European Union. Our concerns, as Europeans, relate to privacy rights, yet Facebook wants to ensure it can continue to transfer data to the United States. Bruno Le Maire, the French economic minister, said last week, "I can confirm that life is very good without Facebook." He went on to say, "Digital giants must understand that the European continent will resist and affirm its sovereignty." Should we, as Europeans, trust Facebook? Should we, as Europeans, allow any of our data to be transferred to the United States?

Ms Frances Haugen:

Trust is not given; it is earned. The way we come to trust people is that we know them, see the patterns of their actions and let them earn our trust. Facebook has had scandals over and over again, and the scandals are very consistent. They take the form of Facebook claiming the world is a certain way, until an internal person sees the consequences of that lie and comes forward. As long as we see that pattern happening over and over again, we must ask what is going on. We need mandatory transparency whereby Facebook must disclose this information, or we are not going to get where we need to get.

Photo of Malcolm ByrneMalcolm Byrne (Fianna Fail)
Link to this: Individually | In context | Oireachtas source

I take from that answer that Ms Haugen thinks the European Union should stand up to Facebook and if Facebook decides to pull out of Europe, so be it.

Ms Frances Haugen:

I worked on Google+ long ago, and people like to make fun of Google+. Every year the European Union exports billions of advertising dollars to Facebook. That is money that leaves the continent. It cost less than €500 million to build Google+, and that was with incredibly well compensated engineers in Mountain View, California. Europe has options. We should not think of these systems as impossibly complex. Europe could lead the way in saying it deserves publicly governed and publicly produced transparent social media. That is something Europe could always do. It has choices.

Photo of Malcolm ByrneMalcolm Byrne (Fianna Fail)
Link to this: Individually | In context | Oireachtas source

I will come in on the question of accountability and transparency in the algorithms, which is key. We are a small country that is trying to enforce accountability on companies. We are trying to deal with the taxonomy of online harms and so on. Does Ms Haugen have a suggestion as to how we might be able to enforce that kind of engineering solution? Does she believe that individual company directors should be held liable if they knowingly mislead regulators?

Ms Frances Haugen:

I strongly believe in the sovereignty of the people with regard to corporations. If an individual lies to the public, that is entirely unacceptable. I want to make sure we contextualise the privileges that social media companies have today. Facebook is a powerful entity with very little transparency. It is not like a factory where one can put out a sensor and see whether it is polluting. The people in Facebook are the only ones who get to see the data, which means they have a higher degree of responsibility to tell the truth. The fact that they lie is even more egregious in that context. I totally believe they should be held responsible. I have forgotten what the Senator's first question was.

Photo of Malcolm ByrneMalcolm Byrne (Fianna Fail)
Link to this: Individually | In context | Oireachtas source

It was around transparency in algorithms and how we can enforce it through legislation.

Ms Frances Haugen:

If we were to hold a conference for a week, we could write a White Paper that fully outlines the minimum amount of information that needs to be disclosed in order to allow public transparency into these systems. One of my goals for the next 12 months is to have a specification of the minimum set of information that would allow supervision of the systems. The most important thing for Ireland to do is to pass an online safety Bill that is aligned with the Digital Services Act across the European Union, which is currently focused on how to have an ongoing process of accountability. Having a taxonomy of the harm caused by content will not work, because focusing on content is a downstream solution. We need to go to the source of the problem, which is the algorithms. Making sure Facebook must report ongoing metrics about the performance of those systems is how we will be able to address the downstream harms caused by harmful content.

Photo of Malcolm ByrneMalcolm Byrne (Fianna Fail)
Link to this: Individually | In context | Oireachtas source

Facebook tried to introduce some independent oversight in the form of the Facebook oversight board. What are Ms Haugen's views on the oversight board?

Ms Frances Haugen:

The oversight board is a wonderful illustration of how Facebook tries to focus our discourse and our examination of the problems that Facebook introduces. If we look at the actual mission of the oversight board, it is not responsible for seeing how these algorithms perform or for ensuring that Facebook releases adequate and accurate data. That is not its charge. Its charge is very specifically about censorship. It asks, "Are we taking down the right content?" or "Are we taking down too much content?" Facebook wants us to focus on a censorship debate and not on a systems debate. It wants that so much that the company spent $100 million to establish a board that can only talk about censorship. There are very well-intentioned, smart and gifted people on the oversight board, but they have been limited in what they are allowed to engage with. That means they cannot live up to the job of keeping us safe.

Photo of Malcolm ByrneMalcolm Byrne (Fianna Fail)
Link to this: Individually | In context | Oireachtas source

Facebook has now rebranded as Meta and is moving into the metaverse space that everyone else will be moving to. There will be challenges for us regulating in the metaverse space as well. Does Ms Haugen think there is anything that we, as legislators, should look at in this regard?

Ms Haugen is aware the Data Protection Commissioner here has responded to some of her criticisms and has offered to meet her. That would certainly be useful. Would Ms Haugen be happy to meet the Data Protection Commissioner?

Ms Frances Haugen:

I am always happy to meet governmental officials who want to keep the public safe. I would be happy to deal with the commissioner. On the question of keeping the metaverse safe, it is a great illustration of why we must focus on our systems-based approach rather than the current harm assessments. The metaverse is new and if we sat down to write a protection plan today, we would know very little about the metaverse.

I will lay out what will happen if we do not implement some recurrent system of mandatory risk assessment. We will see exactly the same thing happen with the metaverse - or perhaps even worse things - that we have already seen on places like Instagram. People have been complaining about many of the issues in my disclosures for years, or even a decade. The only thing that changed was that I brought forward proof that Facebook knew about these harms and that they were real. For as long as ten years, Facebook had been saying that these problems were anecdotal and that this was not the way things actually were. It argued that if people could see everything, they would see such complaints are not borne out in reality. The complaints were real and the company was just lying to us.

What we will see with the metaverse is that people will start having individual problems. Paediatricians, for example, will see that kids are getting more addicted to these immersive systems than they have been even to Instagram. We are not going to be able to prove such an issue because we will each only see our individual experiences of things like the metaverse. We will not be able to see the systematic attacks because only Facebook will get to see behind the curtain. That is why we need mandatory risk assessments or we will have the same arguments and the company will say the issues are anecdotal and not real. The company holds all the power.

Malcolm Byrne (Fianna Fail):

I thank Ms Haugen.

Ciarán Cannon (Galway East, Fine Gael):

I welcome Ms Haugen and thank her for her work in drawing back the veil and allowing us to get some degree of insight into how companies like Facebook and many others work. I happen to believe social media, in the main, is a force for good. The people who founded most of our global social media companies did not set out to do societal harm.

Ms Frances Haugen:

Yes.

Ciarán Cannon (Galway East, Fine Gael):

The communicative power of social media provides all citizens with the ability to question, parse and analyse the actions of government, and this is exceptionally important. Much of the societal change we have seen internationally in the past decade or so has come about as a result of social media. We had a perfect example of that in Ireland when we were the first country in the world to vote for marriage equality, with a huge part of that campaign being developed and implemented through the use of social media. Perhaps I am coming at this from a different perspective. There is an inevitability, in how we communicate as a species, that social media would eventually have come into being in any event, whether through Mr. Zuckerberg or anybody else. We are now moving to the next stage of that evolution through the metaverse, as has been mentioned.

The question is presented to us as legislators with limited knowledge of the inner workings of social media companies. Right now, what is the most effective deterrent to Facebook, Instagram, Twitter and all the other social media companies or the most effective tool to force behavioural change from within these companies and perhaps return them to their initial ambition of providing for a greater degree of communicative power to all citizens? Social media is ultimately about empowering citizens. How do we make these companies go back to their founding principles? Is there a way of doing that?

Ms Frances Haugen:

That was beautiful. I want to be really clear. I really love social media. I have worked at four social networks at this point. The first time I wrote a programme that went out and read the Internet, it was on a social media site called Friendster a long time ago. I was in Europe when I wrote it. I thank you guys for sponsoring my education.

I want to be really clear. When social media is about our friends, families and immediate communities, it is not that dangerous. The things that cause these systems to be super dangerous are driven by a profit motive. We could roll back in time to 2008, when we mostly saw content from family and friends. Facebook runs experiments today in which it rolls back the clock and boosts content from the people to whom we are directly connected, people we know. For free, people get less hate speech, less violence and less nudity.

What is the problem then? If we imagine a world where we only get content from family and friends, after 20 or 30 minutes, a person would finish catching up with those family and friends and go on to something else in life. If Facebook wants to keep people on for hours and get more advertising dollars, it needs to find more content. Facebook has begun forcing people into giant groups of 500,000 or 1 million people, and these are the pathways being weaponised, for example, by Russia. Russia has figured out it can bootstrap these groups. It has found hacks to build them very quickly and it forces out content to people.

What can we do to force Facebook back to a system that is more about our family and friends? By requiring the company to disclose ongoing information about the performance of its systems, even those censorship systems, we can begin to make it attractive for it to do some very basic fixes. Let us imagine Alice writes something and Bob, her friend, shares it. If Carol shares it again, it is a friend of a friend. It might land in Dan's newsfeed and now we are beyond friends of friends. If Dan had to copy and paste in order to propagate it further - he cannot knee-jerk and hit a share button - that change alone would have the same impact as the entire third-party fact-checking system. Facebook does not do it because it would cost a sliver of profit, perhaps half a percentage point, because there would be less content on the system. The system would be radically safer and more pleasant, but it would be less profitable.

We must require mandatory ongoing reporting of metrics that can be audited by the public so that Facebook feels shame when it knows somebody is watching. It should then be willing to use these little tricks it already knows about, which work for everybody in the world and not just people who speak English. The company will not do this unless it has to report something other than profit and loss.

Ciarán Cannon (Galway East, Fine Gael):

Ms Haugen has been involved with building social media platforms. There is an analogy with the construction industry. Thirty years ago in Ireland and internationally, construction sites were very dangerous places to work. Workers were killed and maimed and as a result of the development of legislation, health and safety regulation and particularly the development of tort law and significant compensation that has been paid to workers, it became very clear very quickly to construction companies that safer workplaces were cheaper to operate than unsafe workplaces. Would the taking of direct mass court claims for compensation against all social media companies involved with disseminating damaging information have an effect in changing behaviour within Facebook and the other similar companies?

Ms Frances Haugen:

I will get to the question, but what the Deputy said about employment safety is fascinating. Earlier today we spoke about the idea of people moderating this content. If a system is based on censorship instead of on making products safer, we must have humans in the loop who get exposed to huge amounts of this horrible and toxic content.

They are the humans who actually keep the AI honest or who deal with the nuances of it. Imagine a world where there were better employment protections for the Irish citizens who are the moderators for these companies - people who are developing horrible mental health issues, some of whom are killing themselves. Just giving better employment protections to Irish citizens might end up pushing these companies towards more systems-based solutions instead of censorship-based solutions.

With regard to the larger question of how we make sure we have ways of filing claims, having some kind of class action mechanism where, say, the parents of children who have been harmed could come forward and say, “We have all had the same problem happen and we have been deeply injured”, or “Our children have been deeply injured”, would provide a financial incentive for making sure these platforms make responsible choices.

Ciarán Cannon (Galway East, Fine Gael):

I am hugely encouraged by the fact Ms Haugen is focusing on systemic change rather than content moderation or content censorship, because the latter is something that would concern me deeply. Getting under the hood of how social media companies promote the most incendiary content to create further engagement is a really important perspective. As Ms Haugen stated, this is not about the simplistic lens of just removing damaging content. We tried that, and a couple of items of legislation were proposed in 2015 whereby we sought to make it an offence to annoy people on social media. That is not the direction we need to go.

If Ms Haugen were to recruit the most effective team within the office of our Data Protection Commissioner, or in any other office that has a regulatory role over social media - she mentioned experts in AI and algorithmic development - would behavioural scientists have a role in it? What would be her optimum team to begin bringing about the systemic change she has spoken so strongly about in the recent past?

Ms Frances Haugen:

I commend Facebook on thinking in a holistic way about these problems. There is one thing it does that I think is really important to acknowledge, and I do not want to be somebody who just nags at Facebook, because Facebook does some good things. It probably has the most robust research team of any comparable social media company. We need to keep hiring social scientists, psychologists and people who understand human dynamics. We need to fund research into how these fields play together. We need more intersectional people who focus on both technology and its social impacts. There is a lot of very good research coming out of places like Taiwan around how we build technology that is pro-social, that helps us to have better deliberations and that makes our democracy stronger. We should not think that such technology is inherently oppositional to the market, but it is a thing we must invest in if we want to develop these pro-social technologies, because the market is not likely to give us these things. The market is incentivised to keep us addicted, to keep us coming back over and over. I encourage places like the European Union and Ireland to invest in funding research into these fields.

Ciarán Cannon (Galway East, Fine Gael):

I thank Ms Haugen. I appreciate her giving of her time.

Imelda Munster (Louth, Sinn Fein):

I welcome Ms Haugen. Disinformation is not classed as harmful content in our online safety Bill. Does Ms Haugen think it should be?

Ms Frances Haugen:

Given that nation state actors like Russia know they can harm our open societies because of our openness, we have to include things like information operations and disinformation when assessing the harms of these systems. I want to be clear again that I do not believe that focusing on a content-based strategy scales or is very effective, but I believe that when we are classing harms, we must include disinformation as a harm.

Imelda Munster (Louth, Sinn Fein):

Ms Haugen mentioned the EU Digital Services Act and said it is critical our Bill is aligned with that. Is the Digital Services Act sufficient, in Ms Haugen's opinion, or should additional matters be addressed in it?

Ms Frances Haugen:

We need to make sure that we are not only focusing on illegal content but also on harmful content. Things like disinformation are not illegal. The content that drives young adults, in particular young women, towards eating disorders is not illegal. Things that make people depressed or estranged from their communities are not illegal. We need to think holistically about the harms of these systems and require them to be disclosed to the public. If we only focus on things like terrorist content and things that are grossly criminal, we will not be able to hold these companies to account for the costs of the pollution they are imposing on society.

Imelda Munster (Louth, Sinn Fein):

In terms of the information that Ms Haugen feels Facebook should make available to regulators, some of which she has outlined already, can she expand further on what other types of information we would need from these companies?

Ms Frances Haugen:

A simple example concerns even just disclosing which artificial intelligence systems exist today, and that covers most classifiers. This could be as simple as saying, "You must disclose which of these labelling systems exist [hate speech, violent content and nudity are each a classification system] and which languages are supported, and give us a thousand examples at each score of those systems so that we can see how effective your systems are". Just that one action - which safety systems, which languages, and some samples - would allow us to see things. For example, Facebook's own documents talk about the idea that it treats all Arabic speakers as if they speak one language, even though Arabic is actually six very different language families. There are things like the fact that 75% of counter-terrorism content is classified as terrorism content. If we had something as simple as a disclosure of which safety systems and which languages, with some samples, we would be able to catch that. We would have been able to have a conversation saying things like, "Irish English is different enough from standard English and you do not support it as a dialect, but the Irish people deserve to be safe". Those are the kinds of things where ongoing disclosures would incentivise Facebook to invest in more languages or in the quality of the safety systems in those languages.

Imelda Munster (Louth, Sinn Fein):

Ireland has a history of engaging in light-touch regulation in many sectors, and it led to the bankruptcy of the State just over ten years ago. Given Ireland's position, particularly as the home of the European headquarters of many of these tech companies, can Ms Haugen outline the dangers should Ireland adopt a hands-off or light-touch approach to the regulation of these tech companies?

Ms Frances Haugen:

My strong belief is that the reason Europe is so much further along with things like the Digital Services Act than, say, the United States is that Facebook intentionally invested its safety budget overwhelmingly in the United States so that we would not notice how toxic it was for the rest of the world. Ireland holds a profound responsibility. It does not matter how good the Digital Services Act ends up being if all of that regulatory implementation happens only in Ireland and Ireland under-invests in that regulator. If the regulator is not actually tough and does not hold these firms to the letter of the law, there is basically no last line of defence. With the GDPR, we saw that the Data Protection Commission in Ireland struggled to keep up, and 98% of the cases that have been brought have not been resolved. If we cannot find a robust way to implement the Digital Services Act, I do not know who is going to speak for people like the Rohingya, or where the next Ethiopia will be. Right now, there is another ethnic conflict unfolding that has been driven by Facebook. I strongly hope the Irish people can either stand in solidarity and have a joint European enforcement or can adequately fund a robust and tough implementer to keep the people of the world safe.

Imelda Munster (Louth, Sinn Fein):

I have one further question. I was looking at Ms Haugen’s opening statement, where she took a fairly strong stance in regard to the Data Protection Commission. Will she expand on that?

Ms Frances Haugen:

GDPR was a groundbreaking law. It changed how Silicon Valley operated. I remember working at Pinterest while GDPR was rolling out, and I believe it was vitally important in having companies even just look at the data they were storing. Companies were not even aware of all the stuff they had on people, so GDPR is very important. However, the reality is that because Ireland is the regulatory hub for these large tech companies, it has been flooded with GDPR violation complaints. There are two problems. First, a very limited number of judgments have come out of those complaints, and those judgments, as they have emerged, have been deeply criticised by other parties inside the European Union. Other parties are allowed to assess the judgments, confidentially, before they become public. Time and again, countries such as Germany, Italy and France have come forward and said that, given the amount of resources and time put into the investigations, not enough substance is coming out of them. Second, the judgments are not coming out with enough teeth. I believe that having an effective regulator is very hard. I recognise that. It is very expensive and hard. It is something we are developing from scratch. It is similar to being in the 1970s and developing the first environmental protection authorities. I believe binding together is the way forward, because each individual country in Europe is not going to be able to afford the number of algorithmic specialists required, and there are just not a lot of them in the world. Until we begin training enough of them, we cannot expect each country to have tough enough enforcement.

Niamh Smyth (Cavan-Monaghan, Fianna Fail):

I will now move on to Senator Carrigy. I assure Senator McGahon and Deputy Higgins that I will include them as well. I am not forgetting about them, but I have to go through the members of this committee first. Senator Carrigy has seven minutes.

Micheál Carrigy (Fine Gael):

I welcome Ms Haugen and thank her for what she has done. Her decision to go public will make the lives of many people around the world safer and better. I wish to put that on the record. Part of what is being established in Ireland, alongside the Future of Media Commission, is an online safety commissioner. I would feel a lot safer if somebody like Ms Haugen was in such a position. Many of my colleagues here would agree that we need somebody who knows the systems and the companies, and Ms Haugen would be an ideal person for that portfolio or job.

To be honest, many of the questions I had planned to ask have been asked, so I will not repeat them. A number of my colleagues wish to ask questions. However, with regard to the many millions of false accounts that exist, particularly on Facebook, in our pre-legislative scrutiny we discussed introducing a minimum age for accounts and requiring the online media companies to get more information to make sure, first, that the accounts belonged to people over a certain age and, second, that they were not going to be used falsely for harm. What is Ms Haugen's view on setting a minimum age for any account on social media?

Ms Frances Haugen:

Age verification is an interesting issue. It is a great example of where we do not necessarily need official enforcement with things such as driver licences in order to have much stronger protections for children under the age of 13 years. One of the most shocking things I have brought forward for many people is the idea that Facebook has an estimated age for every person on the platform, and it uses that estimated age, which is different from somebody's stated age, to do lots of analysis to target them with things like advertisements. One can ask how Facebook is able to accurately predict somebody's age. Who a person's friends are helps to estimate that person's age, because if somebody snuck on at 11 or 12 years old, not all of his or her friends lied about their age; some of them waited until they were 13. It also knows the person's interests. There are many ways to figure out approximately how old somebody is. Facebook has found that, for some age cohorts, 10% to 15% of the cohort were on systems like Facebook by the age of ten. There has to be much stronger reporting. Imagine if Facebook had to report, for every cohort of 14-year-olds, what fraction of them were on the platform at 13, 12, ten and nine years old. It could be doing that right now, and it would encourage it to be much more aggressive in finding underage people and taking them off the platform.

There are also systems such as facial recognition. I have been sent multiple studies by start-ups that are investing in facial recognition and analysing facial features to estimate people's ages. If we want to keep people under the age of 13 off the platforms, I strongly encourage this, because the rates of addiction in Facebook's own research are highest for 13 and 14 year olds. Young people have less ability to regulate addictive behaviours. That is why we do not let them smoke. Similarly, there should be much more stringent controls on things like gating how much time somebody under the age of 16 can spend on these platforms, because their self-regulation systems are not as mature.

Micheál Carrigy (Fine Gael):

Based on what Ms Haugen said about 13 and 14 year olds becoming addicted, should we set a limit of 15 years old as a minimum age for being on a social media platform and put the responsibility on those companies to make sure that they enforce that?

Ms Frances Haugen:

I am not a child psychologist, and I encourage the committee to consult experts who have much more education than I do. However, I strongly believe that we probably need greater protections for children on all these platforms. Right now, there are children who are developing self-soothing behaviours whereby, when they feel anxious, they pull up Instagram and just start scrolling. If those children are being made anxious by the content on Instagram - Facebook's research says there are many children who are developing eating disorders and engaging in things like self-harm because of the content they are being exposed to - and if their self-soothing behaviour ends up exposing them to more of the thing that is causing them harm, that is a huge problem. At a minimum, things such as rate limits for people under the age of 15 or 16 are important - an hour, two hours or three hours a day, some limit like that - but we could also be even more aggressive and say that perhaps somebody should not join until he or she is 15 years old. I would support examining that question.

Micheál Carrigy (Fine Gael):

I thank Ms Haugen.

Niamh Smyth (Cavan-Monaghan, Fianna Fail):

This has been an insightful discussion with you today, Ms Haugen. I begin by crediting your great bravery in coming out and speaking so publicly across the world. All my colleagues have said you are doing the world a service, and particularly the next generation. Hopefully, we will be in a position to protect them through your testimony here and to other parliaments around the world. I will begin by quoting you. In one of your discussions you said that Instagram can lead from healthy recipes to anorexia. At another point you said that Facebook's internal research states that 13.5% of teenage girls said Instagram made their suicidal ideation worse, with 17% of teenage girls reporting that Instagram made their body dysmorphia worse. They are the types of things we have been hearing from our witnesses' statements throughout our pre-legislative scrutiny. There are specific instances. Can you expand on the experiences you had with regard to the figures, particularly with young girls and how they are so affected in terms of body image?

Ms Frances Haugen:

I want to unpack that quote about how Instagram leads from things such as healthy recipes and healthy eating to topics such as anorexia. Facebook has reproduced at least four times - perhaps more, as those are just documents I am aware of - what is basically the same study. Facebook has said it publicly previously. Mr. Nick Clegg, who I believe has some new fancy title at Facebook as head of global something or other, wrote a long blog post last year and I encourage the committee to read it, because I find it quite condescending. It shows the type of attitude Facebook has towards the public. It effectively said: "You have been complaining about extreme content on Facebook, but you know what? You picked your friends, you picked your interests. This isn't on us. It takes two to tango."

However, even as he said that, Facebook had multiple instances of studies that took a blank account and followed a couple of innocuous things. Perhaps on Instagram they searched for healthy eating and followed a couple of topics. Then all they did was click on the content Facebook, Instagram and Meta gave them, come back the next day and do it again. The nature of engagement-based ranking is that the algorithms are not smart. They are just scanning for what one might engage with, what is going to draw one in.

The reality is that the reason people develop things like anorexia, start cutting or engage in other forms of self-mutilation is that there is something in our brains that is drawn to these topics. Adolescent children are going through major phases of physical change, and that can be stressful. Kids are following topics like healthy eating and just engaging with the content. There is no two to tango here; this is just the algorithms. As they engage with it, they get led to more and more extreme content. That is super dangerous. These are young, impressionable children who, instead of being supported as they go through a stressful experience in life, are being drawn into an addictive platform that ends up hurting their mental health and even their bodies. Anorexia is not about girls who are too skinny. It is about women 60 years from now who are going to walk the Earth with brittle bones because of choices Facebook made, or a 35-year-old woman who yearns for a child but cannot have one. That is what anorexia is. I encourage the committee to step in and to have a strong voice. The regulator should be adequately funded and supported so that it can be tough and make sure these platforms work for the people and bring out the best in us.

Niamh Smyth (Cavan-Monaghan, Fianna Fail):

Could Ms Haugen identify three - I am sure there are loads of them - design features that we could mandate on these online platforms to address that?

Ms Frances Haugen:

I want to be clear. Going in and saying there are magic bullets, where we just turn this knob this way or that dial that way and it will solve the problem, is not the solution. The solution is measures like the Digital Services Act, where there are ongoing risk assessments every six months or every year and we force the platforms to disclose to us the harms they know of and how they are going to solve other issues that NGOs have identified. We must say they need to show us enough data that we know they are making progress. It is important to have that kind of system and an ongoing conversation with Facebook, and to change our relationship with it so that we stop passively receiving whatever lies it wants to tell us and can instead come in, say we are noticing that kids are getting harmed and ask what its plan is. We need laws like that to change the dynamic in order that Facebook has to work in the public interest.

Niamh Smyth (Cavan-Monaghan, Fianna Fail):

All of us have contacted Facebook regarding issues relating to our own accounts. We are referred to community standards and whether something meets the bar for community standards, which, 100% of the time, it does not. As a result, whatever goes on Facebook stays on it. Ms Haugen has been on the inside. Are those in Facebook laughing up their sleeves at us in terms of the company's view of its community standards?

Ms Frances Haugen:

There are a couple of real challenges here. One is that Facebook has chosen to focus on a censorship-based safety system rather than on what we are talking about, which is looking at the product choices it is making and asking whether they are acceptable risks. When we focus on censorship, it leads to this very elaborate dance where there are incredibly long documents that describe exactly what the characteristics are of violating or non-violating content. This is a little like arguing about the number of angels that can dance on the head of a pin. We should be focusing on why we are giving the most distribution to the most extreme content, not on having these defensible documents, as they call them, where they say: "No, we've thought really hard about this. This is what is violating and non-violating." It is right to say that language is nuanced, especially in circumstances where the system is not tailored to individual cultures. A Member of Parliament in the UK told me about how he was actively sent homophobic rhetoric but, because Scottish slang is not acknowledged by San Francisco, he was never able to get any protections. We need to stop focusing so much on individual pieces of content and focus instead on the overall risks of the system and on making the system safer overall.

Niamh Smyth (Cavan-Monaghan, Fianna Fail):

I thank Ms Haugen.

John McGahon (Fine Gael):

I thank the Chair for allowing me to attend the meeting. I am not a member of the committee. The reason I am here is that I have been following Ms Haugen very closely for the past year. I watched her entire three-and-a-half-hour testimony before the United States Congress-----

Ms Frances Haugen:

I thank Senator McGahon.

Photo of John McGahonJohn McGahon (Fine Gael)
Link to this: Individually | In context | Oireachtas source

-----the European Parliament and the House of Commons select committee as well. I would describe her as having real moral fortitude in coming out of the shadows by herself and taking on a colossal company like Facebook right around the world and speaking to parliamentarians. It takes immense personal character to have the ability to do that and it is to be commended.

It is very difficult to come in at the end of a debate and to ask an original question. I will try to differentiate my points a little bit and put a different slant on it. Ms Haugen lifted the lid on some of the darker practices that Facebook has engaged in. Does she think she has lifted the lid on all of it or that it is just the tip of the iceberg and there are other practices that Facebook is using that we have not yet found out about?

Ms Frances Haugen:

There are unquestionably things that we do not know about. I am one person. Let us think for a moment about the things that I learned about how Facebook is operating. I just did a few searches and was able to capture a lot, but it was only a tiny sliver. The fact that Facebook can operate at the scale that it does, touch the lives of 3.1 billion people and be completely opaque to us is unacceptable. We are going to keep having whistleblowers come forward from Facebook until we have managed to get greater transparency. There are no whistleblowers coming out of Apple. That is because Apple is radically more transparent. People can take its devices apart. They can measure the pollution coming out of its factories. Apple has much less incentive to lie than Facebook does, because Facebook knows that no one can catch it in a lie; it is the only one that has the data.

Photo of John McGahonJohn McGahon (Fine Gael)
Link to this: Individually | In context | Oireachtas source

It was mentioned earlier that we are very much living in a digital world and that technology operates at the speed of light. Does Ms Haugen have any confidence in governments and democracies around the world being able to keep up with big tech, which can outgun governments at times, particularly when one considers what it has at its disposal? Does she think there is any possibility of our trying to keep up with it?

Ms Frances Haugen:

I want to remind people about the original dream of the Internet. I am an elderly millennial; I remember the Internet of the 1990s. The dream of the Internet was the idea that anyone could reach out and impact people on the other side of the world. With systems that are more transparent, like Twitter, which has a fire hose, anyone in the world can analyse one tenth of all public tweets. What happens when we open up and say, "We are going to give access and we are going to do this collaboratively", is that people come out of nowhere to help. There are easily 10,000 people who closely analyse that fire hose on Twitter. Because Facebook has chosen to be closed, however, it has both taken on a higher obligation, because it is the only one that can currently solve these problems, and made life much harder for itself. If we forced Facebook to disclose more data, more people would take college classes where they would use those data. Yelp, for example, does this. Yelp has a data set that lets people use its reviews. As a result, 18-year-olds learn how to work with rough data. Imagine if Facebook had 20,000 or 50,000 students around the world who were always looking at its data. We would all be safer because we had forced Facebook to be more transparent. That is how we are able to do these things.

Photo of John McGahonJohn McGahon (Fine Gael)
Link to this: Individually | In context | Oireachtas source

We get it that hateful and divisive content drives engagement and elicits responses. We get that Facebook's economic business model is built around that. It has been discussed earlier today, but how can we ask a private company to change a business model when doing so will result in it losing money? We have talked a little bit about that today. Could one of the things we ask Facebook to do be to change its algorithm from a tailor-made model that shows us what it thinks we want to see back to what it used originally, namely, a chronological algorithm whereby we would see what our friends posted in the order they posted it? Would that be a solution to try to take away some of the toxicity of the algorithms?

Ms Frances Haugen:

If we talk about chronological news feeds, Facebook is going to come out and say that it has run an experiment and that a chronological news feed has more bad content. I want the committee to hold Facebook to account. Facebook has forced us into lots of large toxic groups. We talked before about the idea that 65% of the people who joined neo-Nazi groups joined them because Facebook recommended those groups to them. At this point, Facebook cannot just change the algorithm; it has to come in and change other things about how the groups product is designed. It has to do these changes simultaneously or, as Facebook says, we will just get more inflammatory content.

However, if Facebook had to report more safety factors in an ongoing way - a weekly update as opposed to a yearly one - or if the public were allowed to request those data, Facebook would suddenly have to take things out of its filing cabinet. It has all of these solutions, for example, cutting the reshare chain at two or requiring people to click on a link. These solutions exist today but Facebook chooses not to use them because of the profit motive.

Part of why it is important to put these regulations in place is that the choices Facebook is making right now will limit its long-term profitability.

Facebook has research saying that, when something is more toxic, people engage in attrition faster, but it needs to have a countervailing force to help it towards that long-term perspective.

Photo of John McGahonJohn McGahon (Fine Gael)
Link to this: Individually | In context | Oireachtas source

I will ask two further questions briefly. Does Ms Haugen believe that the algorithms governing the rabbit hole – the content someone looks for then taking him or her to more divisive and extremist content – still exist in Facebook or has it tried to move away from them a little since the lid was blown on this?

Ms Frances Haugen:

I cannot imagine that this issue is fixed. These are large problems and until there is mandatory transparency, we should assume that they are still there or are even worse. I left last May and people have continued making the algorithms more optimised.

Photo of John McGahonJohn McGahon (Fine Gael)
Link to this: Individually | In context | Oireachtas source

I have two final questions, which I will ask at the same time. Does Ms Haugen believe that Facebook has become too big to handle? Over the past 100 years in the US, antitrust laws were used to break up big companies that had monopolies. Should something like that be used on Facebook?

Ms Haugen fielded a similar question when appearing before the US Congress when a Senator asked her what major changes she would make to make the platform safer for users if she were CEO of Facebook the next morning.

Ms Frances Haugen:

Regarding antitrust, I know that we want simple, sound-bite solutions. "Break up Facebook" is something that I can say in one sentence. It is clear and crisp. When we discuss engagement-based ranking and the idea that these algorithms are giving the most reach to the most extreme, polarising and divisive content, however, those problems are not just happening on Instagram and Facebook. They are happening on TikTok and YouTube. Breaking up Facebook does not solve that problem. Mandatory transparency and having other countervailing centres of mass can solve it.

If I could do any one thing, I would probably open more channels for inviting public participation in safety. For example, when we talk about the need to translate AIs into every language, there is a precedent of companies like Google developing tools that allow communities to translate the companies' systems into their own languages. Facebook could have a strategy like that. For example, there are 95 dialects in Ethiopia and six major language families. Facebook could decide that the only way it could ensure adequate safety in Ethiopia would be to have community involvement in building safety systems. That is probably the single fastest action to improve safety, but it would still not be rigorous enough. We need ongoing transparency and measures like mandatory risk assessments so that we can evolve with time.

Photo of Emer HigginsEmer Higgins (Dublin Mid West, Fine Gael)
Link to this: Individually | In context | Oireachtas source

I thank Ms Haugen for being with us. It is fantastic to have her and I thank her for all of the work she is doing in this space. Her bravery and insight have had ripple effects across the globe. Today is our chance here in Ireland, not just as a society or as legislators, but as a strategic location for the EU headquarters of many of the platforms in question, to use her industry experience and inside knowledge to make the online world a safer space.

Democracy is fragile and there need to be laws to protect it and citizens across the globe. This is true whether you are in Ireland or Myanmar or on Capitol Hill. Unfortunately, social media can become a weapon that threatens to destabilise democracy. Engagement-based ranking is dangerous - Mark Zuckerberg told us this – yet we are living in a metaverse where what we see on our social media platforms are the posts that will get a corporation the most engagement and the most money. He was right, in that it is proving dangerous to our values, our health – in particular, the mental health of our young people – and our democracy.

In Ireland, we used to look across the Atlantic at politics in the US and thank our lucky stars that our politics were not fuelled by money in the way it is there. Unbeknownst to us, though, platforms like Facebook were changing that under the radar. The sad reality is that someone can use money rather than messages to influence the national discourse and shape democratic decisions. That is not the kind of society in which I want to live.

Social media offers great opportunities, especially for keeping families and friends connected. It would probably pose nothing but opportunities if it had not been monetised and become a global multibillion-euro industry, but it did, and that has meant the threats are getting bigger: threats to democracy and stability through how easily people can engage in the viral spreading of disinformation, and threats to our health, including our mental health, through the generation of patterns of online hate messaging.

I was struck by what Ms Haugen said about safeguarding children of 12, 13 or 14 years of age and how modern technology like facial recognition could be adopted to make a digital age of consent work. That is an interesting point because we in Ireland constantly struggle with the debate on how to get the digital age of consent to work without needing to share private data on children as proof of age.

I was interested in Ms Haugen's views on an individual complaints mechanism and how our new commission may end up drowning in complaints. That is something that we hear constantly. We want to strike the right balance between allowing people the ability to escalate their complaints without flooding the system to the point where it no longer works. Ms Haugen mentioned a class action of complaints. In Ireland, we have many great NGOs, for example, Webwise, CyberSafeKids, SpunOut and Bodywhys. Is Ms Haugen suggesting that people complain to them and, if those people wanted it, the NGOs could escalate the complaints to the commission or is she saying that we need to have a system where we hit these companies where it hurts, that is, with significant fines, when we see that they are systemically failing to protect users?

I will run through a couple of questions and then hand over the rest of my time to Ms Haugen. She is saying that the Online Safety and Media Regulation Bill is overly focused on content. That seems to be a concern for her. I would like to hear more about that. Ms Haugen is saying that we need to look more at the root of the problem. How do we do that through legislation? My blunt question is whether the Bill as it stands is fit for purpose. Will it work? Will the EU's Bill work? Is it more important that the EU's Bill work and that the EU give Ireland the resources we need to ensure that we can manage online safety in terms of the social media companies that are based in Ireland but providing services throughout the EU?

My three main takeaways from Ms Haugen's exceptional testimony today are that the algorithm needs to revert to what it was pre 2018 so that reactions and engagements are no longer driving Facebook's metrics; investment needs to happen in AI around local languages and in moderating content so as to minimise the human cost of censorship; and we need mandatory risk assessments and the mandatory enforcement of transparency. How do we as legislators achieve that? What laws need to be introduced? What else do we need to do and what else do we need to incorporate in EU law or our Bill so that we can have a real and lasting positive impact?

Photo of Niamh SmythNiamh Smyth (Cavan-Monaghan, Fianna Fail)
Link to this: Individually | In context | Oireachtas source

I am sorry to say that Ms Haugen only has two minutes to answer all of that. That is all the time we have left.

Ms Frances Haugen:

The most important point I can leave the committee with is that the Digital Services Act, DSA, is amazing legislation. It has been thoroughly vetted, many people have spent a great deal of effort on it and it is very important for whatever Ireland passes to be aligned with the DSA. The DSA has considered the question of how to stay abreast of these platforms in an ongoing way. Members are right, in that the platforms are large and move very quickly.

Ongoing mandatory risk assessments – not just those done by the company, but ones where NGOs and other parties can chime in – are important.

Deputy Higgins asked about handling individual complaints. It is not that difficult to write software that could provide people with a way of making complaints and then forming classes from those. We could have a conversation about that another time, but there is a mechanism whereby there could be a broad range of complaints that would then be channelled to NGOs. This would highlight that 10,000 people were all complaining about the same issue, for example.

Look to Europe. In most industries, the polluter pays.

This is going to be a very expensive problem. Algorithmic specialists are rare; there are few of them in the world and they are incredibly well compensated. I will give the committee an example. When I worked at Pinterest, three other engineers and I were able to increase the total profits of the company by 6%. We increased them by tens of millions of dollars with a project that took us a few months. This is why algorithmic specialists are paid so much: we really can make these things radically more profitable. Ireland will not want to bear the cost of paying enough such people to build a robust enforcer. Taxing the platforms to make sure that there is safe enforcement is the critical path forward. Look to Europe. The Digital Services Act is important. Focus on systems, ongoing risk and the polluter pays.

Photo of Niamh SmythNiamh Smyth (Cavan-Monaghan, Fianna Fail)
Link to this: Individually | In context | Oireachtas source

That sums it all up and it is a great way to finish our engagement today. I thank Ms Haugen for her time. She has given us two hours of constant back and forth and in all of that time has never failed to engage us with her insight, experience and wealth of knowledge on how we, fingers crossed, can do this right in Ireland for the rest of the world and, as she has said, be a global leader in all of that. I thank Ms Haugen.

Ms Frances Haugen:

I thank the committee for the opportunity to be here. This has been a very special experience for me. Part of my family is Irish and I am named after my Irish grandmother, Geraldine Frances Sommerville. It is a big treat for me to be here. Ireland is a small country but one that has had an outsized impact on the world's history. Ireland has stood up to strong, powerful forces in the world before. I know that Ireland can build a strong enforcer that keeps the world safe and speaks up for people like the Rohingya, who have so little voice with which to stand up to this trillion-dollar company.

I thank the committee for inviting me today. It has been an honour.

Photo of Niamh SmythNiamh Smyth (Cavan-Monaghan, Fianna Fail)
Link to this: Individually | In context | Oireachtas source

Thank you and I hope this will not be our only engagement with you. I see us having further engagements if you are willing to do that.

Ms Frances Haugen:

I would love to.

Photo of Niamh SmythNiamh Smyth (Cavan-Monaghan, Fianna Fail)
Link to this: Individually | In context | Oireachtas source

Thank you very much. I thank my committee members also for their good questioning and their participation today; it is very much appreciated.

The joint committee adjourned at 3.33 p.m. until 11.30 a.m. on Wednesday, 9 March 2022.