UK Government Launches Consultation On Possible Social Media Ban for Children

Tim Cross-Kovoor 02 March, 2026 

The UK government this morning launched a consultation on how to keep children safe online across social media, AI chatbots, and gaming platforms. It is gathering views on the possibility of banning children under a certain age from accessing social media, as well as on less severe measures.

It’s less than three years since the previous government passed the Online Safety Act, a broad piece of legislation which placed new obligations on social media companies and online platforms, designed to create a safer digital environment for young people. But the government says there’s still a sense from parents that keeping their kids safe online is a massive uphill battle. The platforms children use and the way they use them continue to evolve at pace, while entirely new technologies, such as AI chatbots, introduce their own unique challenges.

“Millions of parents across the country worry about what social media is doing to their children’s sleep, concentration and mental health,” said a statement released by the Department for Science, Innovation, and Technology. “Many feel they are fighting a losing battle against platforms designed to keep children scrolling. They are grappling with how much screen time their children should have, when they should give them a phone, what they are seeing online, and the impact all of this is having. They worry about their children talking to chatbots as if they’re real people and relying on their advice.”

The consultation, which was first announced in January, will run for three months, collecting views from “everyone with a view” including parents, carers, young people, those who work with children, civil society organisations, academics, and industry. Once the consultation is complete, the Prime Minister and Technology Secretary will have new legislative powers to act quickly on its findings. This will mean that “ministers can move within months instead of waiting years for new legislation every time technology evolves”, according to DSIT’s statement.

The government is looking for views on whether there should be a minimum age for social media, and what age might be appropriate. It’s also gauging opinions on alternative legislative possibilities such as forcing platforms to turn off their most addictive features for kids, and imposing mandatory overnight curfews.

What’s appropriate, and what actually works?

The idea of banning social media for children is gaining traction around the world, after an under-16s ban came into force in Australia last year. But as is always the case with this sort of legislation, there are big questions around what actually works in practice, alongside the core concerns around what sort of role the government should play in regulating social platforms. There have been plenty of anecdotal stories of teenagers finding workarounds to the Australia ban, and young people fooling age-gating technologies on platforms which have started verifying their users’ ages.

The government’s consultation will look into these issues too, examining (for example) how age verification enforcement should be strengthened.

There will always be dissenting voices around whether it’s the government’s place to control young people’s access to social media. But the case for action has been loudly re-raised by the recent release of Molly vs the Machines, a documentary which examines the tragic death of Molly Russell, who took her own life in 2017 after being exposed to large volumes of harmful content on Instagram.

And as has long been the case, many of the advertisers whose ad spend funds the social platforms want to see more done to regulate the tech giants too.

Simon Michaelides, director general of UK advertiser trade group ISBA, said that ISBA has supported proportionate regulation of the big tech platforms since 2017. “With the Online Safety Act coming into force, our hope is that enforcement will mean that more is done to remove the kind of inappropriate and harmful content that children and young people are often exposed to,” he said. “If it does not prove to be sufficient, ISBA will continue to work with the industry and Government – as it has always done – to ensure that advertisers’ concerns are reflected in any efforts to revisit the law.”

“Advertisers do not want to appear next to or near to harmful content, or to inadvertently monetise it. They want to understand platforms’ policies on what content they disallow, and how those platforms are working to detect, remove, and prevent it,” Michaelides added. “This information allows them to make their own, informed decisions about where they place their advertising.”

About the Author:

Tim Cross-Kovoor is Assistant Editor at VideoWeek.