Online Video Platforms Will Be Regulated Like Broadcasters Under New EU Rules

Tim Cross 12 May, 2020 

The European Union’s ‘Audiovisual Media Services Directive’ (AVMSD) is designed to harmonise European regulation of broadcasters. The EU says the directive ensures minimum standards are met across the EU in areas including the protection of children and consumers and combating racial and religious hatred.

A wide-ranging revision to the directive passed in November 2018. One of the biggest changes was that it extended the rules to video sharing platforms for the first time, essentially bringing regulation of these services more into line with how broadcasters are regulated. The revised directive covers both the content shown on digital video platforms and the ads shown alongside that content.

Once the regulation passed, EU member states had 21 months to update their own national laws to reflect the changes in the AVMSD. Many EU countries, including the UK – which will be adopting the legislation in spite of Brexit – are still finalising amendments to their national laws, which must be passed by September.

VAN spoke with Sarah MacDonald, a partner at Wiggin LLP, a law firm specialising in media, entertainment, technology and IP, who explained who will be affected by the revised AVMSD, and what will be required of them once the new laws have passed.

The AVMSD has been updated to include ‘video sharing platforms’ – who exactly does this cover?

The term used by the EU – ‘video sharing platform’ (VSP) – essentially covers any company which shows video content online for economic gain. That could be through subscription fees, advertising revenues, or some other mechanism. There is no requirement for the content to be over a certain length, so the definition also covers short-form platforms. And it is designed to capture services even where the parent company isn’t based in Europe, or wouldn’t traditionally be seen as a media company.

These platforms are competing for the same audiences as traditional TV services and video on-demand (VOD) platforms. So the directive has been revised to make sure their content is held to similar standards, and similar rules apply. But the revised directive doesn’t just stretch the rules for broadcasters so that they cover VSPs too; the obligations on online video platforms aren’t as all-encompassing as those on traditional providers.

What new rules will be applied to video platforms’ content?

There are some basic provisions around the types of content which can’t be shown – for example, that it shouldn’t be supportive of terrorist organisations.

But obviously many of the video platforms don’t necessarily have control over the content itself because it’s user-generated. The directive is less about regulating content itself, and more about making sure that if content is harmful, it is not shown to those it could hurt. So, for example, video platforms have to make sure that minors are protected from harmful content, which might mean thinking about how users log in and making sure there’s age verification in place.

And the ‘worse’ the content is, the stricter those obligations are going to be. The AVMSD will apply to porn sites, for example, since they’re run for economic gain, and there will now need to be solid mechanisms built into those platforms to make sure they’re not accessible by minors.
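As a rough illustration of that graduated approach, a platform’s access check might tie stricter gates to more harmful content. This is a sketch only; the harm tiers, field names and thresholds are assumptions for the example, not anything prescribed by the AVMSD or used by a real platform.

```typescript
// Illustrative sketch only: the harm tiers and viewer fields are
// assumptions, not prescribed by the AVMSD or any platform.

type HarmRating = "none" | "harmful-to-minors" | "most-harmful";

interface Viewer {
  declaredAdult: boolean; // self-declared 18+ at sign-up
  ageVerified: boolean;   // passed a robust age-verification check
}

// The 'worse' the rating, the stricter the access condition.
function mayView(rating: HarmRating, viewer: Viewer): boolean {
  switch (rating) {
    case "none":
      return true;                 // freely available
    case "harmful-to-minors":
      return viewer.declaredAdult; // lighter gate
    case "most-harmful":
      return viewer.ageVerified;   // strictest gate: verified age only
  }
}
```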

For live streaming platforms, it’s particularly hard to moderate content since it’s being broadcast live. How will they be expected to moderate content?

It will be a question of what the VSP knows about a creator, and what they could realistically be expected to have found out.

They will need to have mechanisms in place for creators to disclose what their live stream might contain. So that might be additional checkboxes that a creator can tick when they start streaming, to say that their stream does or doesn’t contain certain things. And then the platform can use that to block content, or to make sure it’s not shown to groups it could cause harm to.
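A minimal sketch of what that disclosure might look like on the platform side. The checkbox fields and the routing policy here are hypothetical assumptions made for the example, not drawn from any real service.

```typescript
// Hypothetical pre-stream declaration ticked by the creator; the
// fields and the routing policy are illustrative assumptions.

interface StreamDeclaration {
  containsAdultContent: boolean;
  containsViolence: boolean;
  containsGambling: boolean;
}

type Routing = "show-to-all" | "age-restrict" | "block";

// Use the creator's own disclosure to decide who sees the stream.
function routeStream(decl: StreamDeclaration): Routing {
  if (decl.containsAdultContent) return "block"; // not carried at all
  if (decl.containsViolence || decl.containsGambling) {
    return "age-restrict";                       // hidden from minors
  }
  return "show-to-all";
}
```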

If the platform has made the right enquiries and it’s the user that’s got it wrong, you would think that the VSP had done what they could and wouldn’t be blamed for the breach. But it will mean there will be more procedural steps that VSPs need to put in place to show that these enquiries are being made.

How will advertising on these platforms be affected?

For adverts, there is an obligation on VSPs to evaluate the ‘commercial communications’ they show, and make sure they meet a set of minimum standards. The term ‘commercial communications’ covers sponsorship of programmes, and also captures display ads around videos and in-stream ads.

These obligations are very much minimums. They’re around things like making sure that there are no tobacco ads, and that ads for alcohol aren’t shown to minors. You never see tobacco adverts on places like YouTube anyway, but theoretically you could, and this change legally obligates video services to prevent that from happening across the EU.
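Encoded as a simple rule, those two minimums might look something like the sketch below. The ad categories and the shape of the check are illustrative assumptions, not the directive’s actual taxonomy.

```typescript
// Illustrative ad-eligibility check; the categories are assumptions
// made for the example, not the directive's actual taxonomy.

type AdCategory = "tobacco" | "alcohol" | "general";

function adAllowed(category: AdCategory, viewerIsMinor: boolean): boolean {
  if (category === "tobacco") return false;          // banned outright
  if (category === "alcohol") return !viewerIsMinor; // must not reach minors
  return true;
}
```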

Platforms which sell the ads themselves, like YouTube for example, will face additional minimum obligations. One big one is that there must be no surreptitious advertising, and there must be a clear distinction between editorial content and advertising. And there’s a clause requiring commercial relationships to be disclosed to the consumer too.

On user-generated content platforms, that will affect both the platform and the user. So an advertiser or a creator might not make it clear when they’re showing an ad, and that falls on them. But the platforms will also have to make sure there’s a clear way for creators to disclose when their content contains an ad of some sort, and then the platforms will have to make that clear to audiences.
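A sketch of how that disclosure chain might be wired up, from the creator’s upload flag to the label shown to viewers. The field name and label text are hypothetical, not any platform’s real API.

```typescript
// Hypothetical upload flag and viewer-facing label; names are
// illustrative assumptions, not any platform's real API.

interface VideoUpload {
  title: string;
  containsPaidPromotion: boolean; // disclosed by the creator at upload
}

// Surface the creator's disclosure to the audience by the player.
function viewerLabels(video: VideoUpload): string[] {
  return video.containsPaidPromotion ? ["Includes paid promotion"] : [];
}
```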

In terms of how it’s dealt with specifically in the UK, that will depend on who Ofcom appoints to handle regulation. From conversations we’ve had, we understand that will be the Advertising Standards Authority (ASA), and they’ll be able to apply tighter regulations if they choose to do so.

Currently, video services’ ads aren’t counted as broadcast advertising, so they have to comply with the ASA’s CAP Code [a set of minimum standards which non-broadcast ads must meet]. But my understanding, based on conversations with the ASA, is that as a result of the AVMSD they will bring video services’ ads under the BCAP Code [a set of minimum standards which broadcast ads must meet, which is stricter than the CAP Code].

The BCAP Code has more rules for platforms to follow (it contains 33 clauses, compared to the CAP Code’s 22). And some of these clauses are stricter – for example, the BCAP Code bans political advertising, whereas some political communications are actually exempt from the CAP Code.

A lot of video platforms have already been active in regulating their content and ads, despite these regulations not applying to them. How big of an impact will the change have?

It is difficult to say because governments are still transposing the AVMSD into law, so it depends on how strictly they apply the rules. But I think there’s probably going to be a lot of behind-the-scenes work for the video platforms.

There are many references in the AVMSD to video platforms taking “appropriate measures” around certain things. So for example they need to take “appropriate measures” to protect minors from harmful content. We’re not sure what those appropriate measures are, and there will probably be differences in what the different platforms consider to be appropriate.

And as is always the case, it will probably take some VSPs actually falling foul of the regulation before we really understand what those appropriate measures are, and therefore how tight the regulations are in practice.

Many are sceptical about regulation of the bigger video platforms, as historically when they’ve fallen foul of regulations, their punishments have been seen as a slap on the wrist. Will the AVMSD have teeth?

It’s hard to say yet as penalties will be decided by member states.

But the potential for the ASA to apply the BCAP Code rather than the CAP Code is interesting. The BCAP Code has Ofcom as its legal backstop, so Ofcom could be granted powers to apply bigger sanctions to the video platforms than if the CAP Code were applied, where sanctions are limited. Breaches of the CAP Code are usually just met with “pull it down and don’t do it again”.

But we don’t yet know exactly what powers Ofcom might be given.

About the Author:

Tim Cross is Assistant Editor at VideoWeek.