Ofcom Steps Up Oversight of TikTok, Snapchat, and Twitch

Tim Cross 06 October, 2021 

The UK’s media and communications regulator Ofcom this morning set out new guidance for video-sharing platforms (VSPs), designed to better protect audiences from harmful online videos. Under the Audiovisual Media Services Regulation which came into force last year, Ofcom has the power to regulate VSPs whose European operations are primarily based in the UK – which includes TikTok, Twitch, Snapchat, and OnlyFans. Ofcom’s powers include the ability to levy fines of up to £250,000 or five percent of UK turnover, and even to suspend operations for the most serious breaches of the rules.

The new regulation is the UK’s implementation of the EU’s Audiovisual Media Services Directive, which came into force before the UK had completely left the EU. The directive was designed to harmonise regulation of broadcasters across Europe, but also to bring VSPs under broadcaster-like regulation, thereby levelling the playing field. This includes enforcing rules around harmful content.

Specifically, Ofcom says its role in regulating VSPs is to ensure they have appropriate measures in place to protect users from videos which:

  • might impair the physical, mental or moral development of under-18s
  • are likely to incite violence or hatred based on particular grounds such as sex, race, colour, ethnic or social origin, genetic features, language, religion or belief, political or any other opinion, membership of a national minority, property, birth, disability, age or sexual orientation
  • directly or indirectly encourage acts of terrorism, show or involve conduct that amounts to child sexual abuse, or show or involve conduct that incites racism or xenophobia

Still room for interpretation

The key here is that while Ofcom assesses individual pieces of content in the TV world, for VSPs the regulator looks at the systems put in place to prevent the spread of harmful content, and asks whether enough has been done to effectively protect users. The regulator acknowledges that some content will slip through the cracks, saying that “the massive volume of online content means it is impossible to prevent every instance of harm”.

Last year, Ofcom offered rough outlines on the steps it expects VSPs to take to protect users from harmful content. Today, the regulator has fleshed out this guidance – though there is still plenty of room for interpretation.

The 79-page document released by Ofcom includes three specific measures which VSPs are expected to take. The first is providing clear rules around uploading content: terms and conditions must set out the types of content which can and can’t be uploaded, and must be effectively enforced. The second is implementing an easy reporting and complaint process, which allows users to flag harmful videos. And the third is that VSPs which host pornographic material must have robust age verification in place.

VSPs must also have an impartial dispute resolution procedure in place, so users can contest any decisions which are or aren’t taken regarding the above measures, or any other measures against harmful content which a VSP puts in place.

And alongside these measures, Ofcom has set out five principles with which they must comply: the measures must be effective, easy to use, transparent and fair, and they must evolve in line with changing behaviours and technological advancements.

Hard to judge the impact

Ofcom has previously asked companies to determine for themselves whether they are a VSP and whether they fall under Ofcom’s jurisdiction. So far, the list of companies which have done so includes TikTok, Twitch, Vimeo, Snapchat, OnlyFans, and Triller.

Does today’s new guidance radically change things for these businesses? That itself is unclear.

On the one hand, the measures outlined by Ofcom don’t seem revolutionary. All of the above platforms have terms and conditions which are designed to block harmful content from being uploaded, and each also has a flagging system of some sort in place. So in a sense, it seems the guidance is simply asking VSPs to do things which they’re already doing.

But Ofcom’s guidance is extensive and detailed on exactly how these measures should operate. For example, terms and conditions must be easy to access and understand, steering clear of legalese and jargon. And apps should not direct users to desktop-optimised web pages for their full terms and conditions.

For age verification, meanwhile, Ofcom says that the strength of age assurance systems should be proportional to the risk associated with an underage user accessing the site. For sites with pornographic material, Ofcom explicitly says that self-declaration tick boxes, general disclaimers, online payment methods which don’t necessarily require the user to be over 18, and publicly available information such as name, address, and date of birth are not valid forms of age verification.

So it may well be that while VSPs are implementing some measures to protect users from harmful content, these measures fall short of Ofcom’s standards.

Child sex abuse material in focus

As usual with these things, it’ll partly be a case of waiting and seeing what actions Ofcom ends up taking. The regulator says it will release a report in autumn next year, outlining the steps VSPs are taking to better protect users.

But there certainly seems to be a sentiment from Ofcom that the current systems being used by VSPs aren’t up to the task. “Online videos play a huge role in our lives now, particularly for children. But many people see hateful, violent or inappropriate material while using them,” said Ofcom chief executive Melanie Dawes. “The platforms where these videos are shared now have a legal duty to take steps to protect their users. So we’re stepping up our oversight of these tech companies, while also gearing up for the task of tackling a much wider range of online harms in the future.”

As such, it seems likely that VSPs under Ofcom’s jurisdiction will have to be proactive in responding to the new guidelines, or risk punishment.

And Ofcom says it will take a hard line against the most serious offences, namely distribution of child sex abuse material – including “self-generated” material. “Adult VSPs carry a heightened risk of child sexual abuse material and the rise in direct-to-fans subscription sites specialising in user-generated adult content has potentially made this risk more pronounced,” said a statement from Ofcom. “Given this heightened risk, we expect that VSPs’ creator registration processes and subsequent checks are strong enough to significantly reduce the risk of child sexual abuse material being uploaded and shared on their platforms.”

Beyond this, Ofcom also says its priorities for the next twelve months will include tackling online hate and terror; ensuring an age-appropriate experience on platforms popular with under-18s; laying the foundations for age verification on adult sites; and ensuring VSPs’ processes for reporting harmful content are effective.


About the Author:

Tim Cross is Assistant Editor at VideoWeek.