Towards the end of last year, a number of ad tech companies and trade bodies breathed a collective sigh of relief as the European Parliament opted against including a blanket ban on personalised advertising in the upcoming Digital Services Act.
But both the European Parliament and European Commission have shown appetite for introducing firmer rules on use of personal data in advertising, and it looks likely we’ll see some sort of new restrictions in the final version of the DSA.
One target in privacy activists’ sights is the pervasive use of ominously named ‘dark patterns’, specifically picked out by MEP Patrick Breyer, one of the parliamentarians leading the charge on privacy in advertising.
The Basics
‘Dark patterns’ is a general term for techniques in user interface design which attempt to trick, or at least nudge, users into taking actions they wouldn’t otherwise choose to take.
So dark patterns can include anything from extremely small ‘x’ buttons on spam ads, to fake ‘click here to download’ buttons on software download sites.
But the types of dark patterns which concern Breyer are those which nudge users into agreeing to share their data via consent mechanisms. This can happen through a variety of techniques – but the key is that dark patterns are designed to lead at least a portion of users to opt in to sharing data in cases where their actual preference would be to opt out.
Some types of dark pattern are explicitly banned in Europe under the General Data Protection Regulation. And there are privacy activists who argue that all dark patterns are banned by GDPR, though this is more of a grey area.
Because of their deceptive nature, Breyer would like to see use of dark patterns explicitly prohibited in the final version of the Digital Services Act.
The Technical Details
As mentioned, the term ‘dark patterns’ covers a broad church of different techniques – and what does and doesn’t count as a dark pattern isn’t set in stone.
Some cases are clearer than others. Pre-ticking the ‘I consent’ box is an obvious example. Telling users that continued use of a website or app will be taken as a sign of consent is another. Both of these are explicitly prohibited under GDPR.
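To make the first of those concrete, here’s a minimal sketch – in TypeScript targeting the browser DOM, with the wording and markup invented for illustration – of the difference between a pre-ticked box and a compliant one:

```typescript
// A minimal sketch of the pre-ticked-box dark pattern, assuming a browser
// environment. The wording and markup are invented for illustration.

function buildConsentCheckbox(preTicked: boolean): HTMLLabelElement {
  const label = document.createElement("label");
  const box = document.createElement("input");
  box.type = "checkbox";
  // The dark pattern lives in this one line: consent is presumed unless the
  // user notices and unticks. GDPR requires an affirmative act, so a
  // compliant box defaults to unticked.
  box.checked = preTicked;
  label.append(box, " I consent to personalised advertising");
  return label;
}

document.body.append(
  buildConsentCheckbox(true),  // prohibited: the box arrives already ticked
  buildConsentCheckbox(false), // compliant: the user must actively opt in
);
```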
Other examples of dark patterns could include the following (the last two of which are sketched in code after the list):
- Requiring users to click through multiple confusing screens and tick multiple boxes in order to opt out
- Making it unclear which button opts out of data collection – for example, giving users the choice between ‘Accept and Continue’ and ‘Save Settings and Continue’
- Using coercive language – for example, presenting the options as ‘Yes, show me a fantastic personalised experience’ or ‘No, I want a rubbish, bland experience’
- Making one option visually more appealing or obvious – for example, presenting a large, bright green ‘accept’ button above a smaller, less obvious ‘reject’ button
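To illustrate those last two techniques, here’s a rough sketch – again TypeScript for the browser, with all copy, colours and sizing invented – of a consent dialog that pairs coercive button labels with visual asymmetry:

```typescript
// A sketch of a nudging consent dialog: the 'accept' path is loud and
// flattering, the 'reject' path is small, grey and self-deprecating.
// Everything here is invented for illustration.

function buildNudgingDialog(): HTMLDivElement {
  const dialog = document.createElement("div");

  const accept = document.createElement("button");
  accept.textContent = "Yes, show me a fantastic personalised experience";
  // Large, bright and placed first, so the eye lands here.
  Object.assign(accept.style, {
    background: "#2ecc40",
    color: "#ffffff",
    fontSize: "18px",
    padding: "14px 28px",
  });

  const reject = document.createElement("button");
  reject.textContent = "No, I want a rubbish, bland experience";
  // Small, grey and visually recessive, so it is easy to overlook.
  Object.assign(reject.style, {
    background: "none",
    border: "none",
    color: "#999999",
    fontSize: "11px",
    textDecoration: "underline",
  });

  dialog.append(accept, reject);
  return dialog;
}

document.body.append(buildNudgingDialog());
```

Note that neither button is hidden or disabled – the nudge lies entirely in presentation, which is part of why a watertight legal definition is so hard to write.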
These softer techniques are widespread. A 2019 study from Ruhr University Bochum found that 57.4 percent of German websites, out of a sample of 1,000, used some sort of dark pattern to nudge users towards consenting to data sharing.
Some believe that these sorts of techniques are also banned under GDPR. GDPR requires that consent should be “freely given, specific, informed and unambiguous” – and it can easily be argued that if a user is being nudged towards consenting, their consent is not freely given, informed, or unambiguous.
And indeed some regulators have come to the same conclusion. Just two weeks ago, French data regulator the CNIL fined Google and Facebook because both make it harder for users to opt out of data sharing than to opt in (though it’s worth noting that France wrote this principle into national law when it folded GDPR into its national Data Protection Act, meaning the principle doesn’t necessarily hold across the whole EU).
In the UK, the ICO has also offered specific guidance on consent collection, pointing out various examples of dark patterns and stating that they are to be avoided.
But again, if dark patterns were explicitly outlawed under the DSA, the debate on interpretation of the GDPR would be settled.
The Pros and Cons
If dark patterns are outlawed, advocates of the change say it will be a significant win for online data protection, giving people more control over how their data is used and shared online.
But there will be two major challenges, should such a law be passed.
The first will be to define exactly what does and doesn’t count as a dark pattern. If this isn’t written into the law, then it will remain a subjective issue – dark patterns will likely remain common while the publishers and tech companies using them claim that their practices don’t fall under the definition.
This will be hard to do, firstly because new tricks and techniques will likely emerge over time, meaning the law might need periodic updating to account for new nudges. And secondly because some cases really are ambiguous.
For example, when does language go from persuasive to coercive? Publishers would argue that they should be allowed to make the case, within their consent mechanisms, that using personal data helps bring in more ad revenue and funds journalism. But others might argue that this counts as nudging, and would be barred under a dark pattern ban.
And what about Apple’s consent mechanism for use of its Identifier for Advertisers, which warns users that they’re consenting to being tracked across the internet? Could it be argued that this is a dark pattern designed to push users away from consenting?
The other big challenge will be enforcement. Dark patterns may be questionable as a business practice, but they’re widespread. Enforcing a ban could mean taking action against thousands and thousands of different businesses.
Even the practices explicitly banned by GDPR, such as pre-ticked boxes, are still commonly used. So even if dark patterns were to be banned, it could be a long while before they disappear.