On February 27, 2019, the U.S. Federal Trade Commission announced a record-setting $5.7 million penalty against the popular short-form video sharing platform TikTok, formerly known as Musical.ly, as part of a consent order settling allegations that the company violated the Children's Online Privacy Protection Act (COPPA). The settlement is the largest COPPA penalty ever obtained by the FTC, surpassing the previous record of $3 million set in 2011. Beyond reinforcing the FTC's commitment to enforcement on children's privacy issues, the FTC here took an expansive view of which services qualify as "directed to children" under COPPA. Although that expansive view is expressed in a complaint rather than a judicial opinion, it nonetheless changes the risk landscape for online service providers, particularly in the social media and videogame industries. Many companies have so far felt comfortable not addressing COPPA in their sites and services, on the theory that they are not "directed to children." Following TikTok, however, it is time for such companies to re-evaluate their exposure under COPPA and the steps needed to comply with it.

COPPA

COPPA, passed in 1998 (with its implementing rule significantly amended in 2012), is one of the United States' older consumer-facing privacy laws. COPPA applies to the operator of any website or online service "directed to children" that collects personal information from children, and to any website or online service whose operator has actual knowledge that it is collecting personal information from children. Unless an exception applies, an operator subject to COPPA must obtain verifiable parental consent before collecting any personal information from a child in those circumstances. For an online service that is not directed at children but may nonetheless appeal to them, or where a substantial number of users are under 13, the FTC permits companies to implement an "age-screen" or "age-gate": users are asked to enter their age, and their experience and data are handled accordingly.

Although COPPA is a U.S.-specific law, it applies extraterritorially to companies outside the U.S. that collect personal information from children in the U.S. It has also served as a model for many international children's privacy protections, including those in the EU's GDPR. The FTC has repeatedly enforced COPPA in the online/games space; for example, against TinyCo and Yelp in 2014, and against LAI Systems and Retro Dreamer in 2015. The FTC has increased its focus in this area on the heels of the child-protection provisions of the GDPR and the California Consumer Privacy Act.

Factual Background

The TikTok/Musical.ly App (App) allows users to create and share short videos, typically of the user lip-syncing to popular music. To create an account, the App requires an email address, phone number, username, first and last name, a short biography and a profile picture. In addition to creating and sharing videos, users can interact by commenting on each other's videos and sending direct messages. User accounts are public by default.

Initially, the App did not request age information from its users. That changed in July 2017, but the App still did not request age information from existing users who had created their accounts before the change. And while Musical.ly's 2018 Privacy Policy stated that the platform "is not directed at children under the age of 13," according to the FTC complaint, Musical.ly was aware of the popularity of its platform with children, as shown in a number of ways:

  • Since 2014, Musical.ly received "thousands" of complaints from parents (over 300 in a single two-week period) that their children under 13 had created Musical.ly accounts. (Musical.ly closed the accounts in response to the parents' complaints, but left the children's videos publicly available on the platform.)
  • Press articles published between 2016 and 2018 highlighted the popularity of the App among tweens and younger children.
  • Musical.ly published guidance in 2016 and 2017 stating, "If you have a young child on Musical.ly, please be sure to monitor their activity on the App."
  • A third party met with Musical.ly's co-founder and noted that seven of the App's most popular users appeared to be children under 13; the company then performed an internal audit and found an additional 39 underage users among its most popular users.

A Broader Interpretation of "Directed to Children"

This settlement is notable because of the FTC's broad interpretation of what makes an online service "directed to children" under COPPA. For context, the amended COPPA Rule explains that the FTC will evaluate whether a site or service is "directed to children" based on the "subject matter of the site or service, its visual content, the use of animated characters or child-oriented activities and incentives, music or other audio content, age of models, presence of child celebrities or celebrities who appeal to children, language or other characteristics of the website or online service," or the content of advertising on the service. In addition to these factors, the FTC will also rely on other "competent and reliable empirical evidence regarding audience composition."

The FTC complaint, filed in February, weighed these factors and concluded that Musical.ly is a child-directed service, pointing to the activity involved (creating lip-syncing videos) and the presence of emojis like "cute animals and smiley faces," "simple tools" for sharing content, songs related to "Disney" and "school," and kid-friendly celebrities like Katy Perry, Selena Gomez, Ariana Grande, Meghan Trainor and others. However, this type of content (lip-syncing, approachable design, bright colors, emojis, pop music, etc.) can arguably be found on many sites and games that are not directed to children. (Take, for example, RuPaul's Drag Race and its associated App.)

Either the FTC is now adopting a more expansive definition than previously understood, or it may instead be relying less on the subject-matter factors and more heavily on "other competent and reliable empirical evidence" of audience composition; specifically, the complaints from parents, the press articles, Musical.ly's own admissions in its guidance to parents, and Musical.ly executives' actual knowledge of popular underage users on their platform.

Five Practical Tips for Reducing Your Risk under COPPA and Similar Laws

Absent further guidance from the FTC, service providers should view this settlement with caution and take steps to protect their site or service now, especially if there is a chance the site or service could be seen as child-directed.

  1. Be careful before using your next emoji: Re-analyze your site's or service's appeal to children. Given that the FTC seems to have relied heavily on empirical evidence of audience composition in this case, be on the lookout for evidence like that called out above. For example, talk to marketing and see if they have data on your app's target demographics. Keep up with the news and watch for signs the app is becoming popular with younger users. Determine whether your app is being featured on any "Children's" or "Families" lists. In any case, counsel should be involved in this investigation to preserve the attorney-client privilege.
  2. Reconsider whether you need an age-gate in light of Musical.ly. Even an app that explicitly states in its privacy policy that it is not directed to children may nonetheless be found child-directed by the FTC. Service providers do not want to be in a position of having to argue that their game does not appeal to kids. If you suspect children make up a significant portion of your audience, it's easier to add an age-gate sooner rather than later in development.
  3. Age-gate the right way – it's more than just asking about age. When implementing an age-gate, the FTC has stated that a service provider cannot encourage children to lie about their age or make it easy for a child to circumvent the gate (for instance, by clicking the "back" button and trying again). If you are implementing an age-gate for a service that is already live, make sure the gate is presented to existing users as well as new users. And once you have users' ages, delete any personal information you may have collected from underage users. (A minimal sketch of such a gate appears after this list.)
  4. Buyer beware: Perform privacy due diligence during and after M&A. Musical.ly was acquired by ByteDance in August 2018 and merged with the TikTok app under the TikTok name. If you acquire a company, do thorough diligence on any privacy issues your target might bring along, and conduct a post-close privacy assessment to evaluate and remediate any remaining risks. COPPA Safe Harbor programs are also slowly gaining popularity with some companies; consider whether one might be right for you.
  5. Train, train, train: Teach your customer service reps to handle COPPA. Customer service reps should be trained both to handle complaints from parents and to know what to do when it sounds like a child might be on the other end of a customer support line. Customer support should also keep in close contact with your legal team and flag any game that seems to be unexpectedly popular with kids.
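
To make tip 3 concrete, below is a minimal sketch of a neutral, single-attempt age-gate in Python. It is illustrative only: AgeGateStore and its methods are hypothetical names invented for this example, not TikTok's implementation, a real library's API, or an FTC-prescribed design.

```python
# Minimal age-gate sketch. All names here (AgeGateStore and its methods)
# are hypothetical, invented for illustration only.
from datetime import date

COPPA_AGE_THRESHOLD = 13  # COPPA covers children under 13


class AgeGateStore:
    """Hypothetical in-memory stand-in for server-side persistence."""

    def __init__(self) -> None:
        self._results: dict[str, str] = {}
        self._personal_info: dict[str, dict] = {}

    def get_gate_result(self, user_id: str) -> str | None:
        return self._results.get(user_id)

    def save_gate_result(self, user_id: str, result: str) -> None:
        self._results[user_id] = result

    def delete_personal_information(self, user_id: str) -> None:
        self._personal_info.pop(user_id, None)


def age_on(birth_date: date, today: date) -> int:
    """Whole-year age as of `today`, adjusted if the birthday hasn't passed."""
    return today.year - birth_date.year - (
        (today.month, today.day) < (birth_date.month, birth_date.day)
    )


def handle_age_gate(user_id: str, birth_date: date, store: AgeGateStore) -> str:
    """Record a user's age-gate answer once and route them accordingly.

    Persisting the result server-side means pressing "back" (or reinstalling
    the app) does not grant a second attempt with a different birth date.
    """
    existing = store.get_gate_result(user_id)
    if existing is not None:
        return existing  # the gate is answered once; later retries are ignored

    if age_on(birth_date, date.today()) < COPPA_AGE_THRESHOLD:
        store.save_gate_result(user_id, "under_13")
        # Per tip 3: delete personal information already collected from a
        # user who turns out to be under 13, then route them to a limited,
        # no-collection experience (or block the account).
        store.delete_personal_information(user_id)
        return "under_13"

    store.save_gate_result(user_id, "over_13")
    return "over_13"
```

Two design choices in this sketch are worth noting. The gate asks for a full birth date rather than a leading yes/no question like "Are you 13 or older?", since a leading prompt can be read as encouraging children to lie about their age. And because the result is stored server-side and checked before any new answer is accepted, an immediate "back-button" retry has no effect.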

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.