New internet safety measures come into force today, with Labour warning tech firms they “will be held to account” if they fail to comply with the protections. Technology Secretary Peter Kyle said the Online Safety Act will mean a generation of children will not be allowed to grow up “at the mercy of toxic algorithms”.
One of the most significant changes will see online platforms required to have age checks in place – such as facial age estimation or credit card checks – if they host pornography or other harmful content relating to self-harm, suicide or eating disorders. The codes also require platforms to ensure their algorithms do not harm children by, for example, pushing such content towards them online. The new protections will be enforced by the regulator Ofcom.
Firms that fail to comply with the new codes face fines of up to £18 million or 10% of qualifying worldwide revenue, whichever is greater, as well as court orders that could block access to their services in the UK.
Under the changes, platforms must also provide parents and children with clear and accessible ways to report problems online when they do appear.
The act is designed to protect both adults and children online, with the strongest protections designed for children.
Mr Kyle said Labour has “drawn a line in the sand” and that the codes will bring real change.
He said: “This Government has taken one of the boldest steps anywhere in the world to reclaim the digital space for young people – to lay the foundations for a safer, healthier, more humane place online.
“We cannot – and will not – allow a generation of children to grow up at the mercy of toxic algorithms, pushed to see harmful content they would never be exposed to offline. This is not the internet we want for our children, nor the future we are willing to accept.”
He said the time for tech platforms “to look the other way is over”, calling on them to “act now to protect our children, follow the law, and play their part in creating a better digital world”.
Mr Kyle warned: “And let me be clear: if they fail to do so, they will be held to account. I will not hesitate to go further and legislate to ensure that no child is left unprotected.”
Ofcom chief executive Dame Melanie Dawes said: “Prioritising clicks and engagement over children’s online safety will no longer be tolerated in the UK.
“Our message to tech firms is clear – comply with age checks and other protection measures set out in our codes, or face the consequences of enforcement action from Ofcom.”
She told BBC Radio 4’s Today programme that 6,000 pornography sites had agreed to bring in age checks from Friday.
Social media platforms X (formerly Twitter), Bluesky and Reddit, as well as dating app Grindr, are among those that have committed to age assurance measures, according to Ofcom.
The regulator said it has launched a monitoring and impact programme focused on some of the platforms where children spend most of their time, including Facebook, Instagram, TikTok and YouTube, plus gaming platform Roblox.
The Online Safety Act also aims to tackle misinformation and disinformation.