#KidsPrivacy Trending for TikTok: Top Takeaways from the New COPPA Enforcement Action
There are hundreds of hot hashtags on TikTok, the viral video-sharing platform with over 1 billion users worldwide, but it’s safe to say that #kidsprivacy (or even #privacy) isn’t among them. The top hashtags in the U.S. for the past week (which change almost daily) collectively generated 286K posts over seven days, while #childprivacy and its variants have a grand total of 182 posts from all over the world, for all (TikTok) time. Still, children’s privacy is a trending topic for the platform, which has been facing global scrutiny over its children’s data privacy practices.
To date, TikTok has paid out roughly half a billion dollars in children’s privacy suits brought by regulators (and private plaintiffs) in the United States, as well as the United Kingdom and the European Union. Last week, TikTok’s privacy woes exploded when the U.S. Department of Justice (DOJ), acting on behalf of the Federal Trade Commission (FTC), filed a complaint in a federal court in California against TikTok, its Chinese parent, ByteDance Ltd., and several related entities (collectively, TikTok) alleging “unlawful massive-scale invasions of children’s privacy” affecting millions of children under the age of 13.
As expected, the government alleged that TikTok “flagrantly violat[ed]” the Children’s Online Privacy Protection Act (COPPA) and the COPPA Rule. The government also alleged that TikTok violated a settlement agreement with the FTC over an earlier COPPA lawsuit that arose from the FTC’s 2019 investigation of TikTok’s predecessor company, Musical.ly.
The FTC’s original 2019 complaint alleged that the video-sharing platform shared extensive personal information from children under the age of 13 without verifiable parental consent (VPC) as required by COPPA. User accounts were public by default, which meant that other users could see a child’s personal information, including their profile bio, username, picture, and videos. Although the app allowed users to change their default setting from public to private so that only approved users could follow them, kids’ profile pictures and bios remained public, and strangers could still send them direct messages.
TikTok ultimately entered into a consent order with the FTC, forking over $5.7 million in civil monetary penalties to resolve the action, the largest COPPA fine at that time. (Since then, the FTC has obtained much larger monetary settlements in COPPA cases against Google/YouTube, at $170 million, and Epic Games, at $275 million.) The 2019 order also required TikTok, among other things, to destroy all personal information collected from users under age 13 or obtain parental consent for those accounts.
The main claims in the new lawsuit are that TikTok: (1) knowingly created accounts for children and collected data from those children without first notifying their parents and obtaining VPC; (2) failed to honor parents’ requests to delete their children’s accounts and information; and (3) failed to delete the accounts and information of users it knows are children. As the FTC put it in its press release announcing the new case, TikTok was “aware of the need to comply with the COPPA Rule and the 2019 consent order and knew about . . . compliance failures that put children’s data and privacy at risk. Instead of complying . . . TikTok spent years knowingly allowing millions of children under 13 on their platform . . . in violation of COPPA . . . .”
Unlike the 2019 case, the new TikTok action is not a settlement, and the government will need to prove its allegations in court to prevail on its claims. TikTok has made clear that it disagrees with the complaint’s allegations, stating that many “relate to past events and practices that are factually inaccurate or have been addressed.” What will happen next, though, is unclear.
Although we expect that TikTok will file a motion to dismiss the complaint, TikTok is facing much larger stakes than COPPA’s $51,744-per-violation civil penalty. (Even counting just one violation per child, the exposure is astronomical given that children were “ubiquitous” on the platform: at $51,744 each, one million underage users alone would add up to more than $51 billion, and the complaint alleges millions.) The COPPA case is playing out alongside TikTok’s existential tangle with the U.S. government over Congress’ “ban or divest” law. TikTok has challenged the constitutionality of that law, which requires ByteDance to divest its U.S. TikTok assets by January 19, 2025, or face a ban on the app.
Regardless of what happens, the government’s complaint provides insights into the FTC’s views on what companies can and can’t do with kids’ data under COPPA. Here are five attention-grabbing takeaways that should be a part of any company’s #COPPA reel:
1) You Can’t Use Kids’ Information for Profiling and Marketing Under the “Internal Operations” Exception: Following the 2019 settlement, TikTok used an age gate (in this case, a date of birth prompt) to identify U.S. users under the age of 13 and created “TikTok for Younger Users” (what the complaint calls “Kids Mode”), a limited experience that allows kids to view videos but does not allow them to create or upload videos, post information publicly, or message other users. Although TikTok touted its “safety and privacy protections designed specifically for an audience that is under 13 years old,” according to the complaint, it still collected and used “extensive” personal information – “far more data than it needed” – from Kids Mode account holders without first providing parental notice or obtaining VPC.
The information collected included username, password, and date of birth, along with persistent identifiers like IP address and unique device identifiers. According to the complaint, TikTok combined this information with app activity data, device information, mobile carrier information, and app information to amass profiles on children and share them with third parties. In one outrageous example, the complaint alleges that TikTok shared kids’ profiles with the analytics and marketing measurement platform AppsFlyer and with Facebook so they could “retarget” (lure back) users whose engagement had declined.
As the complaint makes clear, TikTok’s use of persistent identifiers like device IDs from Kids Mode users does not comport with the “internal operations” exception, which permits companies to use such identifiers without VPC only if they collect no other personal information and use the identifiers “for the sole purpose” of providing support for an online service’s internal operations. There is some room for companies to collect and use kids’ information for internal operations without VPC, but they cannot stretch the exception to cover the collection and use of persistent identifiers for profiling and marketing.
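To make that line concrete, here’s a minimal sketch, in TypeScript, of what gating persistent-identifier use on age status and consent might look like. The purpose labels and helper names are hypothetical, not drawn from the complaint or from TikTok’s systems:

```typescript
// Hypothetical purpose taxonomy; note that the COPPA Rule's exception also
// requires that no *other* personal information be collected from the child.
type Purpose =
  | "security"          // supports internal operations
  | "crash-reporting"   // supports internal operations
  | "ad-attribution"    // NOT internal operations
  | "retargeting";      // NOT internal operations

const ALLOWED_WITHOUT_VPC: ReadonlySet<Purpose> = new Set<Purpose>([
  "security",
  "crash-reporting",
]);

interface User {
  under13: boolean;
  hasVerifiableParentalConsent: boolean;
}

// Returns whether a persistent identifier (e.g., a device ID) may be used
// for the given purpose under this sketch's reading of the exception.
function mayUseDeviceIdentifier(user: User, purpose: Purpose): boolean {
  if (!user.under13) return true;                     // COPPA's VPC rules cover under-13 users
  if (user.hasVerifiableParentalConsent) return true; // VPC unlocks broader uses
  return ALLOWED_WITHOUT_VPC.has(purpose);            // otherwise: internal operations only
}

// mayUseDeviceIdentifier({ under13: true, hasVerifiableParentalConsent: false },
//                        "retargeting") === false
```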
2) You Can’t Allow Kids to Circumvent COPPA: Although COPPA does not require companies to validate users’ ages, you can’t allow users to circumvent COPPA by building “back doors that allowed users to bypass the age gate . . . .” In the complaint, the government alleges that by letting users sign in with login credentials from certain third-party online services, including Instagram and Google, TikTok allowed users to avoid the age gate altogether and set up regular accounts. These policies and practices led to the creation of millions of “unknown user” accounts that allowed children to gain access to adult content and features of the general TikTok platform. TikTok, in turn, collected and maintained vast amounts of personal information from the children who created and used these regular TikTok accounts without their parents’ consent.
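Here’s a minimal sketch, again in TypeScript with hypothetical type and function names, of the fix the complaint implies: every sign-up path, including third-party logins that arrive without a birth date, gets routed through the same neutral age gate before a general account is created.

```typescript
// Hypothetical sign-up model; the point is that an SSO login is never a
// shortcut around the age gate.
type SignupSource = "native" | "instagram" | "google";

interface SignupRequest {
  source: SignupSource;
  birthdate?: Date; // absent when an SSO flow never showed the age gate
}

type Outcome = "kids-mode" | "general" | "needs-age-gate";

function classifySignup(req: SignupRequest): Outcome {
  // No birth date on file? Don't default to a general account; send the
  // user through the age gate first, whatever the sign-up source.
  if (req.birthdate === undefined) return "needs-age-gate";

  const msPerYear = 1000 * 60 * 60 * 24 * 365.25;
  const age = (Date.now() - req.birthdate.getTime()) / msPerYear;
  return age < 13 ? "kids-mode" : "general";
}
```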
3) Make Sure Your Age Gates Work: The complaint alleges that kids could easily retry the age gate. TikTok did not prevent children who initially entered an under-13 birth date from restarting the account creation process and providing a new birth date that made them old enough to avoid Kids Mode. As the FTC’s COPPA FAQs have long recommended, you should use technical means, such as a cookie, to prevent children from back-buttoning to enter a different age.
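As one illustration, here’s a minimal client-side sketch in TypeScript of that recommendation. The cookie name is hypothetical, and a real flow would also need server-side enforcement, since a determined kid can clear cookies:

```typescript
// Persist the *first* age-gate answer so back-buttoning or restarting the
// sign-up flow can't be used to enter an older birth date.
const GATE_COOKIE = "age_gate_result"; // hypothetical cookie name

function readGateResult(): string | undefined {
  return document.cookie
    .split("; ")
    .find((entry) => entry.startsWith(`${GATE_COOKIE}=`))
    ?.split("=")[1];
}

function recordGateResult(under13: boolean): void {
  const oneYear = 60 * 60 * 24 * 365; // max-age is in seconds
  document.cookie =
    `${GATE_COOKIE}=${under13 ? "under13" : "ok"}; max-age=${oneYear}; path=/; SameSite=Lax`;
}

function handleBirthdateSubmit(birthdate: Date): "kids-mode" | "general" {
  if (readGateResult() === "under13") return "kids-mode"; // honor the first answer

  const msPerYear = 1000 * 60 * 60 * 24 * 365.25;
  const under13 = (Date.now() - birthdate.getTime()) / msPerYear < 13;
  recordGateResult(under13);
  return under13 ? "kids-mode" : "general";
}
```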
4) Don’t Make Deletion Difficult – and Do It!: Much of the complaint focuses on TikTok’s failure to delete accounts and information that “even their own employees and systems identify as belonging to children,” as well as its other failures to delete children’s personal data upon parental request. The government alleges, for example, that TikTok required parents to “navigate a convoluted process” to request the deletion of personal information collected from their children. TikTok often did not honor those requests, either by not responding at all or by deleting accounts only if there were “objective indicators” that the account holder was under 13 or the parent completed a form certifying under penalty of perjury that they were the parent or guardian of the account holder. The complaint further alleges that TikTok retained kids’ data in databases long after purportedly deleting their accounts, a failure mode the sketch below addresses.
- One interesting claim in the complaint is that TikTok should have deleted children’s personal information – such as photos and voice recordings – incorporated into other users’ videos and comments on other users’ posts. TikTok allegedly possessed identifiers linking the incorporated information to an account it had deleted because the account belonged to a child.
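On the engineering side, the lesson is that a deletion request isn’t really honored until it reaches every system holding the child’s data. Here is a minimal sketch, with hypothetical store names rather than TikTok’s actual architecture:

```typescript
// Fan a deletion out to every data store, and refuse to mark the request
// "done" if any copy might still be lingering.
interface DataStore {
  name: string; // e.g., "accounts", "analytics", "ad-attribution", "backups"
  deleteByUserId(userId: string): Promise<void>;
}

async function deleteChildData(userId: string, stores: DataStore[]): Promise<void> {
  const failures: string[] = [];
  for (const store of stores) {
    try {
      await store.deleteByUserId(userId);
    } catch {
      failures.push(store.name); // track misses so nothing is silently retained
    }
  }
  if (failures.length > 0) {
    // Surface partial deletions for retry and audit instead of reporting
    // success while copies linger in secondary databases.
    throw new Error(`Deletion incomplete for: ${failures.join(", ")}`);
  }
}
```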
5) Don’t Mislead Regulators: The government’s complaint also details the ways in which TikTok failed to maintain records and communications relating to its children’s privacy practices and compliance with the 2019 order. More critically, the complaint alleges that TikTok made false statements that it had removed child accounts and deleted the associated data. Instead, as the complaint states, TikTok retained and had been using data that it previously represented it “did not use,” was “not accessible” to it, and was “delet[ed],” including the data of child, teen, and adult users, such as IP addresses, device IDs, device models, and advertising IDs. If true, that’s cringe-worthy, even by TikTok standards.
- Despite this reference to teens’ data (and an earlier reference to the two-thirds of teens who report using TikTok), it’s notable that the government’s action does not include a claim under Section 5 of the FTC Act concerning TikTok’s privacy and marketing practices toward teens, similar to the claims it advanced in other recent COPPA actions.
We’re sure there’s lots more to learn from the complaint, but for now we’ll stick with these five takeaways. We’ll be following the case closely as it plays out in federal court and providing other pointers to ESRB Privacy Certified members. And maybe we’ll check out next week’s top hashtags to see if #kidsprivacy makes it into the top ten, unlikely as that seems.
• • •
As senior vice president of ESRB Privacy Certified (EPC), Stacy Feuer ensures that member companies in the video game and toy industries adopt and maintain lawful, transparent, and responsible data collection and privacy policies and practices for their websites, mobile apps, and online services. She oversees compliance with ESRB’s privacy certifications, including its “Kids Certified” seal, which is an approved Safe Harbor program under the Federal Trade Commission’s Children’s Online Privacy Protection Act (COPPA) Rule. She holds CIPP/US and CIPP/E certifications from the International Association of Privacy Professionals.