Ofcom Introduces Online Safety Measures to Protect Children From “Manosphere”

Dan Meier, 24 April 2025

One month on from the Online Safety Act coming into force in the UK (and from Adolescence landing on Netflix), Ofcom has introduced new child safety measures designed to prevent children from encountering harmful content. Starting in July, providers of online services, including social media platforms, gaming websites and search engines, must implement a range of safeguards as part of duties mandated by the new legislation.

Child protection has been a core pillar of Ofcom’s new online safety regime, aiming to prevent minors from seeing harmful content relating to suicide, self-harm, eating disorders and pornography. “Online services must also act to protect children from misogynistic, violent, hateful or abusive material, online bullying and dangerous challenges,” said the regulator. Under the legislation, Ofcom can fine companies that fail to comply, and in serious cases can apply for a court order to prevent the site or app from operating in the UK.

The announcement follows extensive research by the watchdog into online harms, which found that 60 percent of children aged 13-17 encounter potentially harmful content online over a four-week period. Ofcom also highlighted the risk posed to children by “manosphere” content, which includes men’s rights activists, incel culture and misogynist influencers such as Andrew Tate. According to the research, 52 percent of boys aged 11-14 are aware of – and have engaged with – manosphere influencers.

Taking action

In efforts to counter these harms, Ofcom has introduced 40 practical measures to be implemented by sites and apps used by UK children. These include establishing internal content policies, content moderation functions and complaints processes; ensuring content recommender systems are designed to exclude harmful content from young users’ feeds; giving children the option to block and mute other users’ accounts; and providing “highly effective” age assurance to prevent children from accessing inappropriate services.

Age-checking mechanisms are a particular focus given the prevalence of underage users accessing services by falsifying their date of birth. For instance, Ofcom found that the average age at which children first saw online pornography was 13. The regulator has notified services that allow pornography that they must implement age assurance to stop under-18s encountering this content.

“Failure to implement the necessary age assurance process by 25 July 2025 will result in referral to our Enforcement team, who can take a number of actions, including imposing financial penalties of up to 10 percent of a service’s qualifying worldwide revenue, or £18 million, whichever is greater,” Ofcom told the companies. “In the case of continued non-compliance, Ofcom may apply to the court for an order which would require third parties (such as your bank or internet service provider) to take action to disrupt the provision of a non-compliant regulated service – either by restricting the supply of services to you (such as advertising or payment services), or by restricting access to your service itself.”

“A reset for children online”

Alongside holding providers of online services to account, the watchdog has issued guidance for parents on managing risks to their children’s online safety, warning them that “some popular sites and apps have no minimum age requirements for their users.” The guidance recommends that parents talk to their children regularly about their online activities, ensure they register for online services using their real age, and encourage them to report inappropriate or harmful content.

The regulator noted that the act does not ban children from social media, set minimum age requirements for accessing sites and apps, or cover children’s ownership of smartphones. But the watchdog maintains that “it is the responsibility of tech firms to keep children safe”, and the law requires these companies to consistently enforce their age limits and protect their child users.

“These changes are a reset for children online,” said Dame Melanie Dawes, Chief Executive of Ofcom. “They will mean safer social media feeds with less harmful and dangerous content, protections from being contacted by strangers and effective age checks on adult content. Ofcom has been tasked with bringing about a safer generation of children online, and if companies fail to act they will face enforcement.”

