Teenagers could be banned from spending more than two hours on popular social media services such as TikTok under Government plans. Science, Innovation and Technology Secretary Peter Kyle said he was looking at proposals to limit the amount of time children could spend scrolling on their phones, with a blanket night-time curfew also possible. An announcement is expected in the autumn.
It comes after Deputy Prime Minister Angela Rayner warned “the amount of time people were spending alone online” was threatening social cohesion. Laws requiring websites and apps to prevent under-18s accessing pornography or content that might encourage suicide, self-harm or eating disorders come into force on Friday but MPs have warned this will not prevent the spread of the “misinformation” that played a role in rioting last year.
Mr Kyle said there should be some control over how long children spent looking at content that “isn’t criminal, but it’s unhealthy”.
He said: “I’ll be making an announcement on these things in the near future. But I am looking very carefully about the overall time kids spend on these apps.
“I think some parents feel a bit disempowered about how to actually make their kids healthier online.”
The Minister said children would welcome controls on how long they spent online.
“I think some kids feel that sometimes there is so much compulsive behaviour with interaction with the apps they need some help just to take control of their online lives and those are things I’m looking at really carefully.
“We talk a lot about a healthy childhood offline. We need to do the same online. I think sleep is very important, to be able to focus on studying is very important.”
Mr Kyle told Sky News: “I think we can incentivise the companies and we can set a slightly different threshold that will just tip the balance in favour of parents not always being the ones who are just ripping phones out of the kids’ hands and having a really awkward, difficult conversation around it.”
Age checks are being introduced under the Online Safety Act 2023, passed by the previous government but only now being put into effect.
Sites could demand credit card details or other forms of ID. The regulator Ofcom says acceptable methods include users showing their face via their device's camera, allowing an AI system to estimate their age.
A report by the House of Commons Science, Innovation and Technology Committee earlier this month warned that the new laws did not do enough to keep the public safe. MPs said the unrest and riots of summer 2024 were driven in part by misinformation and hateful content that was amplified on social media platforms.
The Committee, chaired by Chi Onwurah MP, highlighted the role of recommendation algorithms, the automated systems that suggest content to users and can push material most likely to provoke an emotional reaction. Its report said: “Calls to violence were posted across major platforms, in some cases seemingly amplified by recommendation algorithms. Social media and encrypted private messaging platforms were used to organise protests and riots.”
Deputy Prime Minister Angela Rayner expressed similar concerns when she spoke to the Prime Minister and the rest of the Cabinet this week about her plans to improve “the social fabric, trust, and integration in communities across the country” and prevent a repeat of last year’s riots.
A Downing Street spokesperson said she told colleagues that “economic insecurity, the rapid pace of de-industrialisation, immigration and the impacts on local communities and public services, technological change and the amount of time people were spending alone online, and declining trust in institutions, was having a profound impact on society.”