Oireachtas Joint and Select Committees
Tuesday, 16 April 2024
Joint Committee On Children, Equality, Disability, Integration And Youth
Protection of Children in the Use of Artificial Intelligence: Discussion (Resumed)
Mary Seery Kearney (Fine Gael)
I thank the Cathaoirleach and the witnesses for taking the time to be here in person, which I very much appreciate. My remarks will probably be more pertinently directed to TikTok and Meta because I take the point that X is more for older users.
I come from a place where my view of social media is that, by design, it pushes operant conditioning as behaviour modification. It uses the circular model of trigger, action, variable reward and investment. It involves deliberate manipulation and behaviour modification. It captures attention, and it is that attention which is monetised as the thing advertisers can have access to. That is the basis of the advertising-based business model here.
I read the advisory of the US Surgeon General, entitled Social Media and Youth Mental Health, and I am concerned about young people and their access to smartphones, which I think should be banned. Brain development has changed. Mental health has been affected by the amount of time spent on social media as well as the content that is thereon. I also read the report from Coimisiún na Meán on video platform services and online harms, and its evidence review.
I met Coimisiún na Meán last week and it urges a whole-of-society response to this, with which I completely agree. This is not all laid at the door of the witnesses' platforms by any means. It is for us, as policymakers and legislators, to take action, as well as for the platforms as the technology companies. Parents and caregivers, and children and adolescents themselves, need to set their own boundaries, so I am not abdicating responsibility to anybody else. Do the witnesses acknowledge the harm, the behavioural and developmental modification, that has occurred as a consequence of social media and its design? I looked at some of the statistics on mental illness among college students since the advent of the smartphone.
Since 2010, in the United States, anxiety has increased by 134%, depression has increased by 106%, ADHD is up by 72%, bipolar disorder is up by 57%, anorexia is up by 100% and substance abuse addiction is up by 33%. I have a colleague, Mr. Roderick Cowen, who speaks a great deal about how cognitive resilience lags behind in the design of social media and how cognitive security for young people needs to be built into that design. Consequently, we need transparent risk assessments that proceed from this mental health focus, and impact assessments need to be published. I do not believe there has been transparency around that.
The design and development decisions within social media companies need to prioritise safety and health, including children's privacy and age verification. I note that WhatsApp reduced its minimum age recently. I would not approve of that. The response to the reporting of problem content, for all of the companies but particularly X, is abominable.
Social media needs to come with a mental health warning. If an individual of any age, but particularly a young person, is on a social media platform, a warning needs to come up that he or she has been on for 15 minutes and needs to take a break from it. There needs to be a health warning that comes up to limit the time people spend, but that runs counter to the business model, which is about holding attention. I would like to know what the witnesses' position is and whether they acknowledge the harms. I would also like to know about cognitive resilience and cognitive security and what steps they are taking in that regard. We can talk about all the harmful content. That is all agreed, but that is a distance from the fact that the platforms' very design is something that has brought about behaviour modification.