Short-form video sharing app TikTok has been fined $5.7m (£4.3m) for knowingly hosting content published by underage users, according to the Federal Trade Commission (FTC).
The company is also required to implement new measures to handle users who say they are under 13.
The FTC said TikTok was fined for failing to adhere to the basic principles of the Children’s Online Privacy Protection Act (COPPA).
TikTok has been ordered to delete the data in question as of Wednesday, and US users will be required to verify their age when they open the app.
The Musical.ly app, which was later acquired and incorporated into TikTok, has an estimated 1 billion users worldwide and 65 million in the US, a “large percentage” of whom were underage.
“For the first three years [of its existence], Musical.ly didn’t ask for the user’s age,” the FTC’s statement read.
“Since July 2017, the company has asked about age and prevents people who say they’re under 13 from creating accounts. But Musical.ly didn’t go back and request age information for people who already had accounts.”
The settlement does not constitute an admission of guilt, but it is understood that the company does not plan to contest any of the FTC’s allegations.
There is no estimate on how long it will take to delete the data in question, and regulations in the UK and elsewhere will remain the same as the settlement only applies to the US.
"We care deeply about the safety and privacy of our users," TikTok said. "This is an ongoing commitment, and we are continuing to expand and evolve our protective measures in support of this."