Oireachtas Joint and Select Committees

Tuesday, 16 April 2024

Joint Committee On Children, Equality, Disability, Integration And Youth

Protection of Children in the Use of Artificial Intelligence: Discussion (Resumed)

Erin McGreehan (Fianna Fáil)

Thank you. Our guests are very welcome to today's meeting. Many topics have been thrashed out already today. We would all agree that social media is an absolute cesspit and that X is the worst, to speak plainly. X is the worst, in my experience, but social media generally is not always a nice, comfortable place. While I do not want to be presumptuous about ages, most of us grew up in the era of disposable cameras and Nokia phones and are grateful for that because we did not have to put up with so much content coming at us. I have been listening to the conversation today and am wondering if we are coming at this from the wrong angle. There is no safe way to lie in the middle of a road, especially for a child. We could give guidance. We could tell a child to wear bright colours, do it at off-peak times or wear a light, but social media is really not a safe place. RTÉ did a "Prime Time Investigates" programme on TikTok focusing on self-harm and suicide content. It looked at the content that a 13-year-old who logs on sees. The researchers did not search for topics, like or comment on videos. They did not engage with any content. They just watched videos shown by TikTok on the "For You" feed, and that feed went straight into content that is dangerous for a child, including content about depression, self-harm and suicide.

Welcome to TikTok as a 13-year-old. It is absolutely frightening. I have four young fellas. One was on the verge of wanting to be on social media. It is not safe to lie in the middle of the road. Car companies make their products safer. They have a responsibility to do that. We hear of safeguards and that the companies are doing various fancy things with algorithms and so on. We also talk about age verification and account verification but we wait for others, such as the European Commission, to do it. Do the witnesses not think that, because their companies make so much money from the products those businesses have created, they have a responsibility to be the best they can be and to put forward proper age verification? You would not let a child lie in the middle of the road. Would the witnesses be comfortable with their own 13-year-olds being on their platforms? Do they believe they are safe places? I would not send a teenager down a dark alley and yet we are allowing this. While it is also the fault of parents, adults and policy, the companies have created these products and responsibility lies with them. There are a great many incidents on all of the platforms, although, to be honest, Facebook and Instagram are probably the safest places. Do the companies not have a fiduciary responsibility to their clients and users not to put them in the middle of the road and to do their damnedest to verify ages, to ensure that young users are not getting racy content and not to send 13-year-olds, young vulnerable minds, straight to self-harm on their first day on TikTok?

While it is a long time ago, I remember being a 13-year-old. It is a scary and lonely place. If someone had shown me self-harm and suicide content when I was 13, I fear what the 14-year-old Erin would have become. I fear what would have happened if I had been educated about self-harm and suicide. We did not get that in the Cooley Mountains in County Louth. You did not get it from your local papers or from school, but we now get it from TikTok, and it is on the companies' platforms, the product they are providing to us and to our children. That is more of a statement but I would like some opinions.