TikTok was fined roughly $370 million on Friday by European Union regulators for having weak safeguards to protect the personal information of children using the platform, a sign of increased scrutiny facing the social media service.
TikTok’s default settings did not adequately protect children’s privacy, nor was the company transparent in explaining what it was doing with the data of users age 17 and younger, according to Ireland’s Data Protection Commission, which issued the penalty on behalf of the European Union. The fine of 345 million euros is the first issued against TikTok by the 27-nation bloc for violating data protection laws.
TikTok is becoming a more frequent target of parents, policymakers and regulators who are wary of the company’s data-collection practices and the platform’s effect on the mental health of young people. In a 2022 survey, 67 percent of American teens said they use TikTok, with 16 percent saying they use it “almost constantly,” according to the Pew Research Center.
The concerns are intensified by TikTok’s links to China, where its parent company, ByteDance, is based. Irish regulators are separately investigating whether TikTok is unlawfully sending the data of users in the European Union to China, an inquiry that is slated to finish by the end of the year.
In the United States, state and federal policymakers have been wrestling with how to regulate TikTok. Numerous government agencies have prohibited the use of TikTok on work devices out of concern that the app could give Beijing access to sensitive user data. Montana has gone further, passing a law that bans the app altogether.
In the European Union, where TikTok has more than 150 million monthly users, regulators said on Friday that the company was not doing enough to protect children. Although the service is open to users 13 and older, the company violated data protection rules by making videos and posts public by default, thus exposing the information and data of its youngest users.
“The profile settings for child user accounts were set to public by default, meaning anyone (on or off TikTok) could view the content posted by the child user,” the regulators said.
The investigation covered the period from July 31, 2020, to Dec. 31, 2020. The regulators also said that TikTok did not adequately prevent its youngest users from sidestepping age restrictions on the service, including limits on sending and receiving direct messages. One such workaround involved the “family pairing” feature, through which a user not verified as a parent or guardian could turn off those limits for a child.
TikTok also used so-called dark patterns, design techniques that nudge users toward more privacy-intrusive options, during the sign-up process and when users posted videos to the service, regulators said.
TikTok said the penalty concerned policies it had already changed in 2021, including setting accounts to private by default for users ages 13 to 15 and giving young people more information about how their data is collected and used.
“We respectfully disagree with the decision, particularly the level of the fine imposed,” TikTok said in a statement.
This is not the first time TikTok has been punished for its handling of children’s data. In April, British regulators fined the company 12.7 million pounds, worth about $15.8 million today, for not preventing children under the age of 13 from signing up for the service. In 2019, Musical.ly, the service that would later become TikTok, agreed to pay $5.7 million to settle charges by the Federal Trade Commission for violating U.S. data protection rules for children.