As with many online services, TikTok generates its revenue from the time individuals spend on its app: it collects and uses their personal data and shows them advertisements. This business model creates an obvious incentive for TikTok to design its services in a way that maximises engagement and time spent on the service.

In practice, this translates into highly dangerous practices, such as pushing extreme content, exploiting people’s vulnerabilities, and creating and nurturing digital addictions. In a 2023 study by Amnesty International, which used a batch of fresh accounts posing as 13-year-old children, the platform was found to mercilessly push these accounts into ‘rabbit holes’ of potentially harmful content, including depressive thinking, self-harm and suicide.


This matters because of who TikTok’s users are. With over half of German teenagers on the platform, and a third of British children between the ages of five and seven using the app, the problem is one of epic proportions. TikTok is letting down the consumers who are most vulnerable and need the most protection: children.

Consumer law applies in full

Users of online services often do not pay with their money but with their attention and engagement. The recent decision by the Italian competition authority (AGCM) tackles this issue head-on with an in-depth analysis of TikTok’s business model, showing how important user activity is to the company’s earnings.

Hitting TikTok with a EUR 10 million fine for threatening the mental and physical safety of children and for unfairly conditioning users in order to exploit their vulnerability is a good first step.

The Italian authority also discusses how the algorithm ‘conditions’ users, for instance by enabling personalised recommender systems by default and by amplifying the harmful effects of content through repetition and personalisation. Despite being aware of these effects, TikTok was found to prioritise its own interest in generating and monetising digital addictions, particularly among minors, to increase business revenue.

The decision is also one of the first to recognise that consumers whose time and attention are monetised are taking part in a transaction. As such, they have a right to be protected under consumer law just as if money had been exchanged, in line with the European Commission’s guidance on unfair commercial practices.

Europe taking action: a new hope

The Italian decision arrives shortly after the launch of a European Commission investigation into TikTok’s obligations under the Digital Services Act and the European Parliament’s resolution on addictive design of digital services, calling for a ban on addictive techniques and an obligation to develop digital products that are fair and ethical by design.

These developments may mark the end of a long period in which unethical and illegal social media business models successfully evaded public scrutiny, even though their harms were already well known. BEUC alerted authorities to TikTok’s worrying practices as early as 2021, filing a complaint with consumer protection authorities (the Consumer Protection Cooperation network) that flagged numerous violations, including harms to children.

BEUC visual developed in 2021 to launch our complaint against the video-sharing platform.

After a long wait, things now seem to be moving. It is high time social media platforms like TikTok were held to account for their harmful and unethical practices. Children are vulnerable consumers and need stronger protection on TikTok. With so much at stake for the health and wellbeing of the next generation, holding platforms to account will hopefully become the new global trend.

Posted by Kasper Drazewski and Linn Hogasen