Oireachtas Joint and Select Committees
Tuesday, 16 April 2024
Joint Committee On Children, Equality, Disability, Integration And Youth
Protection of Children in the Use of Artificial Intelligence: Discussion (Resumed)
Mary Seery Kearney (Fine Gael) | Oireachtas source
I thank the Cathaoirleach. I thank the witnesses for the engagement, which has been very good. I appreciate that they sit there and take a lot of stick from us, so I have two positive things to say. First, I congratulate Meta on the creation of Threads. It is brilliant. It is safe. It has proven to be a very positive experience compared with X, which sadly is a vile experience. I say that as a politician. It is just vile from beginning to end. It is a necessary evil, but any time I have ever reported anything, it has never been found to violate X’s community guidelines, so I do not really have any time for X, to be perfectly honest.
TikTok’s algorithm is doing something very positive at the moment. My husband has a private TikTok account. He never posts anything. He would never do or create anything, but it is good for us to look up cat videos and all that kind of funny stuff. We have an eight-going-on-nine-year-old daughter who likes TikTok because she hears other people talking about it, and she is allowed to see it so long as she is sitting beside one of us. She saw a filter she liked, so he let her do a video and supervised her. The algorithm immediately picked up on the fact that she was underage and closed down his account, or at least required him to go back and verify it. I wanted to say I had real-time experience of that happening. It was a fantastic thing to happen in our house, because we got to say, “See? TikTok does not want you there.” That put an end to that demand, which was fantastic.
I wanted to acknowledge those two things. I am aware there is an awful lot going on with Meta. I know from the minutes of the previous meetings that there are an awful lot of very positive things that go on. However, I have a couple of beefs. One is WhatsApp’s age requirement being reduced. That is unforgivable. I do not know why the company would take such a retrograde step at this time. I really do not understand it.
I will push all the witnesses again on safety. I am aware there is a time limit and the user has to put in a code if that time limit is hit, but putting in a code does not flag the fact that there are mental health implications. We can even look at time spent with friends by age group. I appreciate it is about smartphones full stop, but since the advent of social media, the number of minutes of engagement children have with other real, live children has diminished to a frightening extent. If one goes into any coffee shop, there are people sitting with their children and the parent is on a phone and the child is on a phone. There is a lack of human engagement that is just frightening and, I believe, a major contributor to issues with mental health, anxiety and all of that. The message comes up after an hour. I would need to be convinced that an hour is enough. At that point, the app should tell the user to go and talk to a human being. We have warnings on the sides of cigarette packets and in all sorts of other places.
It is a fact that the mental health of younger people is being affected by the length of time spent on social media and its addictive nature. I want to push the issue of what the companies can do to address that mental health element when they have a business model that runs contrary to it. With all the safeguards that are being talked about, they have a business model that runs on keeping users there. By the time they become adults, they are already well addicted. We changed how we ring doorbells and press lift buttons after Covid. We change our behaviour. Young people type with their thumbs and we do not. I am still typing with all my fingers.
That behaviour has changed, but that is only symptomatic of what is going on cognitively. There is a need for very clear mental health flags here.