TikTok, the popular Chinese-owned social media app, has been hit with a massive fine by the European Union for violating its privacy laws regarding the processing of children’s personal data. The €345 million ($370 million) fine is the result of a two-year investigation by Ireland’s Data Protection Commission (DPC), the lead regulator for TikTok in the bloc.
The DPC found that TikTok had breached several provisions of the General Data Protection Regulation (GDPR), the EU’s landmark data protection law that came into force in 2018. The GDPR sets strict rules for how companies can collect, use, and share personal data of EU citizens and imposes hefty fines for non-compliance.
According to the DPC, TikTok failed to protect the privacy and safety of its underage users in several ways. For example, it set the accounts of users aged 13 to 17 to public by default, allowing anyone to view and comment on their content. Features such as Duet and Stitch, which let users combine their videos with those of other TikTok users, were also switched on by default for this age group. In addition, TikTok did not verify that the adults given access to children’s accounts through its “Family Pairing” feature were actually their parents or guardians.
The DPC also found that TikTok had not adequately assessed the risks to children under 13 who gained access to the platform despite its minimum age requirement. The regulator concluded that TikTok had not taken adequate measures to prevent or detect such cases and had exposed these children to potential harm by making their accounts public by default.
The DPC said that its decision was based on an extensive investigation that involved multiple audits, inspections, and consultations with other EU data protection authorities. It said that it had taken into account TikTok’s cooperation and remedial actions, as well as the gravity and duration of the infringements.
TikTok said it respectfully disagreed with the decision and the level of the fine imposed. It noted that the investigation examined its privacy settings as they stood between July and December 2020, and that the company has since made significant changes to improve the protection of children’s data. Since 2021, it said, all new and existing accounts for 13- to 15-year-olds have been set to private by default, and additional safeguards such as age verification and parental controls have been put in place.
This is not the first time TikTok has faced scrutiny over its handling of children’s data. In April 2023, it was fined £12.7 million (about $16 million) by the UK’s Information Commissioner’s Office for unlawfully processing the data of up to 1.4 million children under 13 without parental consent. In February 2019, the company was fined $5.7 million by the U.S. Federal Trade Commission for similar violations.
The EU fine is one of the largest ever imposed on a tech company for breaching data protection laws. It reflects growing concern over the impact of social media platforms on children’s well-being and privacy, especially since the COVID-19 pandemic increased their time online. The EU has also recently proposed new rules to regulate digital services and platforms, including stricter measures to protect children from harmful content and exploitation online.