The European Commission has confirmed it has opened an investigation into short-form video app TikTok over numerous potential breaches of the new Digital Services Act. The Commission said the investigation covers areas regulated by the DSA including protection of minors, advertising transparency, data access for researchers, and risk management of addictive design and harmful content.
The investigation will gather further evidence before ruling on whether TikTok has breached the rules. If TikTok is found to be in breach, the consequences could be very significant – the DSA enables the EU to impose fines of up to six percent of a company’s global turnover.
Down the rabbit hole
The DSA is one of two landmark pieces of regulation passed by the European Union in 2022 (alongside the Digital Markets Act), aimed at bringing regulation up to speed with the modern tech landscape. While the Digital Markets Act sets out rules and restrictions for the very largest platforms, the DSA applies more broadly to digital businesses, imposing new rules on how they protect their users and respect their rights. However, the DSA does include some specific rules for the very largest tech companies – designated very large online platforms (VLOPs) and very large online search engines (VLOSEs). The idea is that these platforms, given their size and influence, have greater potential to harm consumers, and so must take greater care to protect their users.
TikTok has been designated as a VLOP by the EU, and as such had to submit a risk assessment to the Commission last year, as well as answering the Commission’s formal requests for information. It also outlined a number of measures it is taking to comply with the DSA, which came into full force on February 17th, including turning off ad personalisation for younger users.
But the Commission, based on what it’s seen, suspects TikTok has not done enough to comply.
One of the main concerns is TikTok’s design, which feeds users a constant, scrollable stream of short-form content. This feed is made more potent by TikTok’s algorithmic ranking, which personalises content for each individual user. The Commission says that TikTok may not have done enough to assess and mitigate the risks to users that could stem from this design, which it says “may stimulate behavioural addictions and/or create so-called ‘rabbit hole effects’”. Similarly, the Commission believes TikTok may not have done enough to weigh up and mitigate risks relating to mental health, children’s welfare, and radicalisation.
As mentioned, TikTok has already put a number of measures in place to comply with the DSA. But the Commission says that some of these measures, including its use of age verification tools, may not be “reasonable, proportionate, and effective”.
The Commission is also investigating potential breaches of transparency requirements. The DSA requires TikTok to provide a searchable and reliable repository for ads run on the platform, and to give researchers access to its publicly accessible data.
Implications beyond TikTok
The Commission has not set a deadline for when its investigation will wrap up. But before the investigation closes, the EU has the power to take further enforcement steps, such as imposing interim measures and adopting non-compliance decisions.
If TikTok is found to be in breach of the DSA, the consequences will likely extend beyond TikTok itself. The steps TikTok has taken to comply with the DSA look fairly similar to those taken by many of the other large platforms. TikTok isn’t alone, for example, in relying on age verification as part of its child protection policy. And it’s far from the only platform that could be accused of an addictive design built around algorithmically delivered content.