#KidsPrivacy Trending for TikTok: Top Takeaways from the New COPPA Enforcement Action https://www.esrb.org/privacy-certified-blog/kidsprivacy-trending-for-tiktok-top-takeaways-from-the-new-coppa-enforcement-action/ Fri, 09 Aug 2024 19:56:28 +0000 Here are five attention-grabbing takeaways that should be a part of any company’s #COPPA reel.

Photo credit: Kassandra Acuna, ESRB Privacy Certified

There are hundreds of hot hashtags on TikTok, the viral video-sharing platform with over 1 billion users worldwide, but it’s safe to say that #kidsprivacy (or even #privacy) isn’t among them. The top hashtags in the U.S. for the past week (which change almost daily) collectively generated 286K posts over seven days, while #childprivacy and its variants have a grand total of 182 posts from all over the world, for all (TikTok) time. Still, children’s privacy is a trending topic for the platform, which has been facing global scrutiny over its children’s data privacy practices.

To date, TikTok has paid out roughly half a billion dollars in children’s privacy suits brought by regulators (and private plaintiffs) in the United States, as well as the United Kingdom and the European Union. Last week, TikTok’s privacy woes exploded when the U.S. Department of Justice (DOJ), acting on behalf of the Federal Trade Commission (FTC), filed a complaint in a federal court in California against TikTok, its Chinese parent, ByteDance Ltd., and several related entities (collectively, TikTok) alleging “unlawful massive-scale invasions of children’s privacy” affecting millions of children under the age of 13.

As expected, the government alleged that TikTok “flagrantly violat[ed]” the Children’s Online Privacy Protection Act (COPPA) and the COPPA Rule. The government also alleged that TikTok violated a settlement agreement with the FTC that resolved an earlier COPPA lawsuit arising from the FTC’s 2019 investigation of TikTok’s predecessor company, Musical.ly.

The FTC’s original 2019 complaint alleged that the video-sharing platform collected and shared extensive personal information from children under the age of 13 without the verifiable parental consent (VPC) required by COPPA. User accounts were public by default, which meant that other users could see a child’s personal information, including their profile bio, username, picture, and videos. Although the app allowed users to change their default setting from public to private so that only approved users could follow them, kids’ profile pictures and bios remained public, and strangers could still send them direct messages.

TikTok ultimately entered into a consent order with the FTC, forking over $5.7 million in civil monetary penalties to resolve the action, the largest COPPA fine at that time. (Since then, the FTC has obtained much larger monetary settlements in COPPA cases against Google/YouTube ($170 million) and Epic Games ($275 million)). The 2019 order also required TikTok, among other things, to destroy all personal information collected from users under age 13 or obtain parental consent for those accounts.

The main claims in the new lawsuit are that TikTok: (1) knowingly created accounts for children and collected data from those children without first notifying their parents and obtaining VPC; (2) failed to honor parents’ requests to delete their children’s accounts and information; and (3) failed to delete the accounts and information of users it knows are children. As the FTC put it in its press release announcing the new case, TikTok was “aware of the need to comply with the COPPA Rule and the 2019 consent order and knew about . . . compliance failures that put children’s data and privacy at risk. Instead of complying . . . TikTok spent years knowingly allowing millions of children under 13 on their platform . . . in violation of COPPA . . . .”

Unlike the 2019 case, the new TikTok action is not a settlement, and the government will need to prove its allegations in court to prevail on its claims. TikTok has made clear that it disagrees with the complaint’s  allegations, stating that many “relate to past events and practices that are factually inaccurate or have been addressed.”  What will happen next, though, is unclear.

Although we expect that TikTok will file a motion to dismiss the complaint, TikTok is facing much larger stakes than COPPA’s $51,744-per-violation civil penalty. (Even if you only count violations per child, that’s an astronomical amount given that underage users were “ubiquitous” on the platform.) The COPPA case is playing out alongside TikTok’s existential tangle with the U.S. government over Congress’ “ban or divest” law. TikTok has challenged the constitutionality of that law, which requires ByteDance to divest its U.S. TikTok assets by January 19, 2025 or face a ban on the app.

Regardless of what happens, the government’s complaint provides insights into the FTC’s views on what companies can and can’t do with kids’ data under COPPA.  Here are five attention-grabbing takeaways that should be a part of any company’s #COPPA reel:

1)  You Can’t Use Kids’ Information for Profiling and Marketing Under the “Internal Operations” Exception: Following the 2019 settlement, TikTok used an age gate (in this case, a date of birth prompt) to identify U.S. users under the age of 13 and created “TikTok for Younger Users” (what the complaint calls “Kids Mode”), a limited experience that allows kids to view videos but does not allow them to create or upload videos, post information publicly, or message other users. Although TikTok touted its “safety and privacy protections designed specifically for an audience that is under 13 years old,” according to the complaint, it still collected and used “extensive” personal information – “far more data than it needed” –  from Kids Mode account holders without first providing parental notice or obtaining VPC.

The information collected included username, password, and date of birth along with persistent identifiers like IP address and unique device identifiers. According to the complaint, TikTok combined this information with app activity data, device information, mobile carrier information, and app information to amass profiles on children and share them with third parties. In one outrageous example, the complaint alleges that TikTok shared kids’ profiles with the analytics and marketing measurement platform AppsFlyer and with Facebook, so they could “retarget” (lure back) users whose engagement had declined.

As the complaint makes clear, TikTok’s use of persistent identifiers like device ID from Kids Mode users does not comport with the “internal operations” exception, which only permits companies to use such identifiers without VPC if they do not collect any other personal information and only “for the sole purpose” of providing support for an online service’s internal operations. Although there is some scope for companies to collect and use kids’ information for internal operations without VPC, companies cannot interpret the internal operations exception broadly to cover the collection and use of persistent identifiers for profiling and marketing.
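To make that boundary concrete, here’s a minimal TypeScript sketch of how a client might gate identifier use on consent status. Everything here is illustrative – the use categories, names, and the two-bucket model are our assumptions, not language from the Rule or the complaint:

```typescript
// Toy model of the "internal operations" exception: persistent identifiers may be
// used without VPC only to support internal operations, never for profiling
// or marketing. Categories and names are hypothetical.
type IdentifierUse =
  | "analytics"      // e.g., counting sessions, debugging
  | "security"       // e.g., fraud and abuse prevention
  | "behavioral_ads"
  | "retargeting"
  | "profiling";

const INTERNAL_OPERATIONS: ReadonlySet<IdentifierUse> = new Set<IdentifierUse>([
  "analytics",
  "security",
]);

function mayUsePersistentIdentifier(use: IdentifierUse, hasVPC: boolean): boolean {
  if (hasVPC) return true; // parent received notice and consented
  return INTERNAL_OPERATIONS.has(use); // without VPC: internal operations only
}

// Without VPC, retargeting is out of bounds:
console.assert(mayUsePersistentIdentifier("retargeting", false) === false);
```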

2) You Can’t Allow Kids to Circumvent COPPA: Although COPPA does not require companies to validate users’ ages, you can’t allow users to circumvent COPPA by building “back doors that allowed users to bypass the age gate . . . .” In the complaint, the government alleges that by allowing users to use login credentials from certain third-party online services, including Instagram and Google, TikTok allowed users to avoid the age gate altogether and set up regular accounts. These policies and practices led to the creation of millions of “unknown user” accounts that allowed children to gain access to adult content and features of the general TikTok platform. TikTok, in turn, then collected and maintained vast amounts of personal information from the children who created and used these regular TikTok accounts without their parents’ consent.

3) Make Sure Your Age Gates Work: The complaint alleges that kids could easily retry the age gate. TikTok did not prevent children who initially put in an under-13 birth date from restarting the account creation process and providing a new birth date that would make them old enough to avoid Kids Mode. As the FTC’s COPPA FAQs have long recommended, you should use technical means, such as a cookie, to prevent children from back-buttoning to enter a different age.
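The FAQs’ cookie suggestion is straightforward to prototype. Below is a minimal, hypothetical sketch (browser TypeScript; all names are ours) of a neutral age gate that remembers an under-13 answer so that back-buttoning or restarting signup doesn’t reset it. A cookie is only a deterrent – it can be cleared – so treat this as a floor, not age verification:

```typescript
// Hypothetical cookie name; any opaque flag works.
const AGE_GATE_COOKIE = "age_gate_locked";

function hasFailedAgeGate(): boolean {
  return document.cookie.split("; ").some((c) => c === `${AGE_GATE_COOKIE}=1`);
}

function lockAgeGate(): void {
  // Persist the under-13 result for a year so a retry can't simply pick a new birth date.
  document.cookie = `${AGE_GATE_COOKIE}=1; max-age=${60 * 60 * 24 * 365}; path=/; SameSite=Lax`;
}

// Decides which experience to route the user to after a neutral date-of-birth prompt.
function handleBirthDate(birthDate: Date, now: Date = new Date()): "kids" | "general" {
  if (hasFailedAgeGate()) return "kids"; // a prior under-13 answer sticks

  let age = now.getFullYear() - birthDate.getFullYear();
  const hadBirthdayThisYear =
    now.getMonth() > birthDate.getMonth() ||
    (now.getMonth() === birthDate.getMonth() && now.getDate() >= birthDate.getDate());
  if (!hadBirthdayThisYear) age -= 1;

  if (age < 13) {
    lockAgeGate();
    return "kids"; // limited, COPPA-compliant experience
  }
  return "general";
}
```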

4) Don’t Make Deletion Difficult – and Do It!: Much of the complaint focuses on TikTok’s failure to delete accounts and information that “even their own employees and systems identify as belonging to children” as well as its other failures to delete children’s personal data upon parental request. The government alleges, for example, that TikTok required parents to “navigate a convoluted process” to request the deletion of personal information collected from their children. TikTok often did not honor parents’ requests, either by not responding to their requests at all, or by only deleting accounts if there were “objective indicators” that the account holder was under 13 or the parent completed a form certifying under penalty of perjury that they were the parent or guardian of the account holder. Alongside these allegations, the complaint also alleges that TikTok retained kids’ data in databases long after purportedly deleting their accounts.

  • One interesting claim in the complaint is that TikTok should have deleted children’s personal information – such as photos and voice recordings – incorporated into other users’ videos and comments on other users’ posts. TikTok allegedly possessed identifiers linking the incorporated information to an account that they deleted because it belonged to a child.
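The mechanics behind that claim are easy to picture. Here’s a rough sketch – the record shape and field names are invented for illustration, not drawn from the complaint – of how linking identifiers could drive that kind of cascade deletion:

```typescript
// Hypothetical content record that tracks whose image or voice it incorporates.
interface MediaRecord {
  id: string;
  ownerAccountId: string;
  incorporatedAccountIds: string[]; // accounts linked to faces/voices in the media (assumed field)
}

// Everything owned by the deleted child's account, plus other users' content
// that incorporates the child's personal information, should be scrubbed.
function recordsToScrub(deletedChildAccountId: string, media: MediaRecord[]): MediaRecord[] {
  return media.filter(
    (m) =>
      m.ownerAccountId === deletedChildAccountId ||
      m.incorporatedAccountIds.includes(deletedChildAccountId),
  );
}
```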

5) Don’t Mislead Regulators: The government’s complaint also details the ways in which TikTok failed to maintain records and communications relating to its children’s privacy practices and compliance with the 2019 order. More critically, the complaint alleges that TikTok made false statements that it had removed child accounts and deleted the associated data. Instead, as the complaint states, TikTok retained and had been using data that it previously represented it “did not use,” was “not accessible” to it, and was “delet[ed]” – including IP addresses, device IDs, device models, and advertising IDs of child, teen, and adult users. If true, that’s cringe-worthy, even by TikTok standards.

  • Despite this reference to teens’ data (and an earlier reference to the number of teens – two-thirds – that report using TikTok), it’s notable that the government’s action does not include a claim under Section 5 of the FTC Act concerning TikTok’s privacy and marketing practices toward teens, similar to the claims it advanced in other recent COPPA actions.

We’re sure there’s lots more to learn from the complaint, but for now we’ll stick with these five takeaways. We’ll be following the case closely as it plays out in federal court and providing other pointers to ESRB Privacy Certified members. And maybe we’ll check next week’s top hashtags to see if #kidsprivacy makes the top ten, unlikely as that seems.

• • •

As senior vice president of ESRB Privacy Certified (EPC), Stacy Feuer ensures that member companies in the video game and toy industries adopt and maintain lawful, transparent, and responsible data collection and privacy policies and practices for their websites, mobile apps, and online services. She oversees compliance with ESRB’s privacy certifications, including its “Kids Certified” seal, which is an approved Safe Harbor program under the Federal Trade Commission’s Children’s Online Privacy Protection Act (COPPA) Rule. She holds CIPP/US and CIPP/E certifications from the International Association of Privacy Professionals.

Probing the FTC’s COPPA Proposals: Updates to Kids’ Privacy Rule Follow Agency’s Focus on Technological Changes https://www.esrb.org/privacy-certified-blog/probing-the-ftcs-coppa-proposals-updates-to-kids-privacy-rule-follows-agencys-focus-on-technological-changes/ Mon, 08 Jan 2024 21:04:24 +0000 As a longstanding FTC-authorized COPPA Safe Harbor program, we follow the agency’s COPPA work closely. We’ve delved into the Notice of Proposed Rulemaking (NPRM) to understand what it will mean for our member video game and toy companies – and for the millions of kids and teens (and their parents) that play games. Read our summary of the most important provisions from the 164-page NPRM document here.

Photo by Igor Starkov on Unsplash

With calls to strengthen kids’ online privacy and safety protections growing louder by the day, 2023 was supposed to be the year that Congress would pass new legislation. That didn’t happen. Enter the Federal Trade Commission (FTC).

The agency pursued several blockbuster children’s privacy enforcement actions in 2023, including two against video game companies, that resulted in hundreds of millions in fines and landmark legal remedies. Then, at the very end of the year, the agency issued long-awaited proposals for changes to the Children’s Online Privacy Protection Rule, a process it began in 2019.

The COPPA Rule, last updated in 2013, implements the Children’s Online Privacy Protection Act, which dates back even earlier — to 1999. Although the agency can’t change the Act itself (that’s Congress’ job), it can make far-reaching changes to the Rule. It’s still unclear what a final rule will look like and when (or whether) it will arrive, but the FTC’s cusp-of-the-year move means that 2024 will certainly be a consequential year for children’s privacy.

As a longstanding FTC-authorized COPPA Safe Harbor program, we follow the agency’s COPPA work closely. We’ve delved into the Notice of Proposed Rulemaking (NPRM) to understand what it will mean for our member video game and toy companies – and for the millions of kids and teens (and their parents) who play games. (Although the average age of a gamer is 32, 76% of people under the age of 18 play video games.) We plan to file a comment on the proposed rule changes within the 60-day comment period that will start to run once the NPRM is published in the Federal Register, most likely later this week.

Although we’re still considering our responses to the NPRM, we’re providing a summary of the most important provisions to spare you reading all 164 pages of the document. (LinkedIn estimated that it would take me 228 minutes to read the NPRM. Once. I’ve already read it multiple times.) So, if you don’t have four – or forty – hours to devote to COPPA Rule reform, read on. It shouldn’t take four hours, but this blog is on the longer side. For convenience, we’ve divided it into three categories: (1) Changes; (2) Emphasis; and (3) Status Quo.

CHANGES
First up, notable changes to definitions and substantive aspects of the Rule:

  • Personal Information: Currently, the COPPA Rule’s definition of personal information includes information collected from a child such as name, address, online contact information, screen or user names (when they function as contact information), phone numbers, social security numbers, geolocation information, and photographs, video, or audio files that contain a child’s image or voice. The Rule also includes “persistent identifiers” (such as IP addresses) that can be used to recognize users over time and across different web sites or online services in the definition of personal information.
    • Proposal: In the NPRM, the agency proposes expanding this definition to include biometric identifiers and all forms of government identification, not just SSNs. The FTC’s inclusion of biometric identifiers including “fingerprints or handprints; retina and iris patterns; genetic data, including a DNA sequence; or data derived from voice data, gait data, or facial data” as personal information is not surprising. In its May 2023 Biometric Policy Statement, the FTC articulated its concerns about the “new and increasing risks associated with the collection and use of biometric information” and FTC Commissioner Alvaro Bedoya has regularly sounded the alarm bell on how “companies are protecting children’s biometric data against breaches, fraud, and abuse.”
    • Questions: Beyond biometric information, the agency raises questions about two other categories of information – avatars and online screen or user names – that may be of interest to video game companies.
      First, the NPRM asks whether screen or user names should be treated as online contact information “even if the screen or user name does not allow one user to contact another user through the operator’s website or online service, when the screen or user name could enable one user to contact another by assuming that the user to be contacted is using the same screen or user name on another website or online service that does allow such contact?”
      Second, referring to the popularity of avatars in online services such as video games, the NPRM asks whether the Rule should explicitly designate avatars generated from a child’s image as personal information “even if the photograph of the child is not itself uploaded to the site or service and no other personal information is collected from the child.” The agency is interested in receiving specific feedback on these issues.
  • Target Audience: The target audience for a digital service is key to determining when an online service is “directed to children.”
    • Proposal: Although the FTC does not propose moving away from the multi-factor test it uses to determine whether a site is child-directed, it proposes adding a list of examples of evidence that the agency will consider in analyzing audience composition and intended audience. This will include “marketing or promotional materials or plans, representations to consumers or to third parties, reviews by users or third parties, and the age of users on similar websites or services.”
    • Questions: The NPRM also seeks feedback on whether the FTC should provide an exemption from designation as a child-directed service for companies that have empirical evidence that no more than a specific percentage of their users are likely to be children under the age of 13. It also asks a number of questions about the contours of such an exemption.
      • Mixed Audience: The NPRM also proposes adding an express definition of “mixed audience” sites to the Rule. As with the current Rule, mixed audience services are directed to children, but do not target children as their primary audience. Such services cannot collect, use, or disclose users’ information without verifiable parental consent unless they use a neutral method “that does not default to a set age or encourage visitors to falsify age information” to collect a user’s age or use another method “reasonably calculated to determine if the user is a child.” This would permit companies to apply COPPA protections only to users under the age of 13.

  • Verifiable Parental Consent: One of the fundamental features of the COPPA Rule is the requirement that companies obtain verifiable parental consent (VPC) from parents for the collection and use of children’s personal information.
    • Proposal: The NPRM focuses on the sharing of children’s information with third parties, especially with advertisers, by requiring companies to obtain a separate VPC for disclosures of a child’s personal information unless such disclosures are “integral to the nature of the website or online service.” (The NPRM cites an “online messaging forum” as an example of a situation where information disclosure would be “integral.”) As the FTC explains in its Business Blog, this means that “COPPA-covered companies’ default settings would have to disallow third-party behavioral advertising and allow it only when parents expressly opt in.” In addition, as the NPRM makes clear, this requirement is feature-specific. So, if a company implements a “chatbot or other feature that simulates conversation” it must obtain VPC.
    • Questions: Interestingly, although the NPRM states several times that COPPA permits contextual advertising without VPC, the FTC is seeking comment on this issue. Question 10 asks, “Operators can collect persistent identifiers for contextual advertising purposes without parental consent so long as they do not also collect other personal information. Given the sophistication of contextual advertising today, including that personal information collected from users may be used to enable companies to target even contextual advertising to some extent, should the Commission consider changes to the Rule’s treatment of contextual advertising?”
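To picture the proposal’s default-off posture, here’s a rough TypeScript sketch. The settings shape is our invention – the NPRM prescribes no schema – but it captures the idea: behavioral advertising and conversation-simulating features start disabled and flip on only with the relevant, separate VPC:

```typescript
// Hypothetical per-child settings under the proposed separate-VPC requirement.
interface ChildPrivacySettings {
  contextualAds: boolean;           // the NPRM repeatedly says contextual ads don't need VPC
  thirdPartyBehavioralAds: boolean; // would require its own, separate VPC
  chatbot: boolean;                 // feature-specific VPC for conversation-simulating features
}

const DEFAULTS: ChildPrivacySettings = {
  contextualAds: true,
  thirdPartyBehavioralAds: false, // off unless a parent expressly opts in
  chatbot: false,
};

// Only features a parent expressly consented to flip on; the rest keep protective defaults.
function applyParentalConsent(consents: Partial<ChildPrivacySettings>): ChildPrivacySettings {
  return { ...DEFAULTS, ...consents };
}
```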

  • Internal Operations and Notice: The COPPA Rule has long allowed companies to collect and use persistent identifiers without first getting VPC if they don’t collect any other personal information and use the persistent identifiers only to provide support for internal operations. In the NPRM, the agency expressly declined to provide a narrowed or expanded definition of “internal operations.” It also stated that it believes that the practice of ad attribution, which allows an advertiser to associate a consumer’s action with a particular ad, “currently falls within the support for the internal operations definition” except when it is used for behavioral advertising, amassing a profile on a specific individual, or directly contacting an individual.
    • Proposal: To increase transparency around the internal operations exception, however, the agency would require companies to specifically identify the way in which they will use a collected personal identifier in their online notices. In addition, the company must “describe the means it uses to ensure that it does not use or disclose the persistent identifier to contact a specific individual, including through behavioral advertising, to amass a profile on a specific individual, in connection with processes that encourage or prompt use of a website or online service, or for any other purpose, except as permitted by the support for the internal operations exception.”

  • Internal Operations and Engagement: As foreshadowed in the quoted language immediately above, the FTC is interested in issues, like “nudging,” that go beyond pure privacy concerns.
    • Proposal: The NPRM also proposes to expand the Rule’s restrictions on the internal operations exception to processes (including machine learning processes) that would “encourage or prompt” a child’s use of an online service. This would include “push notifications” that encourage kids to use their service more. Companies that use persistent identifiers to send these push notifications would also be required to flag that use in their direct and online notices. This would ensure parents are aware of, and have consented to, these processes.
    • Questions: Here, too, the agency seeks additional comment, asking how companies are currently using persistent identifiers to maximize user engagement and how it could distinguish between “user-driven” personalization versus personalization driven by a business. In a separate question, the NPRM also asks whether the Rule should address other engagement techniques, as well as whether the Rule should “differentiate between techniques used solely to promote a child’s engagement with the website or online service and those techniques that provide other functions, such as to personalize the child’s experience on the website or online service?”

  • Data security: Consistent with concerns that the FTC has raised about data security in the recent COPPA enforcement cases, the proposed Rule significantly expands the COPPA Rule’s existing data security requirement.
    • Proposal: The NPRM requires companies to have written comprehensive security programs that are proportional to the “sensitivity of children’s information and to the operator’s size, complexity, and nature and scope of activities.” It also sets out requirements for performing annual data security assessments, implementing and testing safeguards, and evaluating and modifying their information security programs on an annual basis. The proposed Rule would also require companies to obtain written assurances from third parties to whom they transfer personal information, such as service providers, to maintain the confidentiality, security, and integrity of information.

  • Safe Harbor oversight: Of particular interest to us are the additional reporting and transparency requirements for Safe Harbor programs. Several of the proposals reflect comments that we have made to the FTC and to members of Congress inviting additional oversight to ensure that all Safe Harbor programs fulfill their responsibilities under the COPPA Rule. Others may present operational challenges. We will provide detailed responses to these proposals in our public comment on the NPRM.

  • Online contact information: Recognizing the significant convenience and utility of text communications, the FTC also proposes adding mobile telephone numbers to the list of identifiers that constitute “online contact information” so that parents can provide consent via text message. The NPRM makes clear, however, that companies may only use a child’s number to send a text message, and that the agency will not permit companies to collect and use a child’s mobile telephone number to communicate with the child, unless it has obtained verifiable parental consent to do so.

EMPHASIS
Beyond these proposed changes, it’s worth noting what is staying the same, but with more emphasis. Two issues stand out:

    • Data minimization: The Rule has long prohibited companies from collecting more personal information than is reasonably necessary for a child to participate in a game, prize offering, or other activity. The NPRM reinforces this prohibition, making it clear that it applies even if a company has obtained VPC.

    • Data retention and deletion: The NPRM emphasizes the FTC’s focus on data retention in recent enforcement actions. Companies can only retain personal information for as long as necessary to fulfill the purpose for which it was collected: they cannot hold on to it indefinitely or use it for any secondary purpose. This means that a company that collects a child’s email address for account creation purposes cannot use it for marketing purposes without VPC. The proposal would also require companies to post a data retention policy for children’s personal information to enhance parents’ ability to make informed decisions about data collection.
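In engineering terms, a retention rule like this reduces to a purpose-bound deadline. A minimal sketch, with invented purposes and windows (a real program would set, publish, and enforce its own):

```typescript
// Hypothetical record of children's data tagged with its collection purpose.
interface StoredChildRecord {
  purpose: "account_creation" | "prize_fulfillment" | "support_ticket";
  collectedAt: Date;
}

// Illustrative, purpose-specific retention windows a posted policy might declare.
const RETENTION_DAYS: Record<StoredChildRecord["purpose"], number> = {
  account_creation: 365,
  prize_fulfillment: 90,
  support_ticket: 30,
};

function isExpired(record: StoredChildRecord, now: Date = new Date()): boolean {
  const ageInDays = (now.getTime() - record.collectedAt.getTime()) / 86_400_000;
  return ageInDays > RETENTION_DAYS[record.purpose];
}
// Expired records should be deleted on a schedule – not kept "just in case"
// or quietly reused for a secondary purpose like marketing.
```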

STATUS QUO
Finally, here’s what isn’t changing, at least not as part of the FTC’s rulemaking process:

    • Teens: First, despite making clear, in a variety of contexts (such as the Epic Games settlement and last year’s Advance Notice of Proposed Rulemaking on Commercial Surveillance and Lax Security Practices), that teens should benefit from privacy protections, the NPRM does not address raising the age of a “child” beyond 12, as urged by many commenters. This is because the agency does not have the authority to change the age of a child, which is established in the Act.

    • Knowledge Standard: Currently, COPPA only applies to “child-directed” services or when an operator has “actual knowledge.” Despite many comments urging the FTC to replace the “actual knowledge” standard with “constructive knowledge” or another less definite standard, the agency declined to do so. Instead, it includes a long discussion of the legislative history of the Act on this point, sending a strong signal to Congress that the ball is in its court on that issue – and other issues like teen privacy that would require Congressional amendment of the Act (as opposed to FTC modification of the Rule) — when it reconvenes for 2024.

    • Inferred Data: Similarly, the NPRM declines to include “inferred data” in the definition of personal information because the Act makes clear that COPPA applies to information collected from a child, not about a child.

    • Rebuttable presumption: The agency also declined to permit general audience platforms to rebut the presumption that all users of child-directed content are children, finding that the “reality of parents and children sharing devices, along with account holders remaining perpetually logged into their accounts, could make it difficult for an operator to distinguish reliably between those users who are children and those who are not.”

• • •

In announcing the NPRM, FTC Chair Lina Khan stated that, “The proposed changes to COPPA are much-needed, especially in an era where online tools are essential for navigating daily life . . . .” We agree that the COPPA Rule needs updating. As we have said in other comments, kids’ privacy rules should be modernized “to meet the challenges of social media, mobility, ad tech, and immersive technologies – issues that weren’t present when COPPA was enacted nearly 25 years ago.” As the FTC’s rulemaking unfolds, we’ll be following closely and providing guidance to our program members on complying with any new rules and implementing stronger protections for children’s privacy. To learn more about ESRB Privacy Certified’s compliance and certification program, please visit our website, find us on LinkedIn, or contact us at privacy@esrb.org.

As senior vice president of ESRB Privacy Certified (EPC), Stacy Feuer ensures that member companies in the video game and toy industries adopt and maintain lawful, transparent, and responsible data collection and privacy policies and practices for their websites, mobile apps, and online services. She oversees compliance with ESRB’s privacy certifications, including its “Kids Certified” seal, which is an approved Safe Harbor program under the Federal Trade Commission’s Children’s Online Privacy Protection Act (COPPA) Rule. She holds CIPP/US and CIPP/E certifications from the International Association of Privacy Professionals.

A New Season for Kids’ Privacy: Court enjoins California’s Landmark Youth Privacy Law — Protecting Children Online Remains a Prime Concern https://www.esrb.org/privacy-certified-blog/a-new-season-for-kids-privacy-court-enjoins-californias-landmark-youth-privacy-law-but-protecting-children-online-remains-a-prime-concern/ Tue, 19 Sep 2023 21:21:19 +0000 Read our analysis of the NetChoice decision and tips about what it might mean for your kids’ privacy program.

Summer is definitely over. With the autumnal equinox just days away (Saturday, September 23, to be exact), there’s been a definite shift in the air – and in the children’s privacy world. Just as the fastest sunsets and sunrises of the year happen at the equinoxes, kids’ privacy developments are piling up rapidly right now.

Since the beginning of September, we’ve seen the Irish Data Protection Commission issue a huge, €345 million ($367 million) fine against TikTok for using unfair design practices that violate kids’ privacy. Delaware’s governor just signed a new privacy law that bans profiling and targeted advertising for users under the age of 18 unless they opt in. And the Dutch data protection authority, just this week, announced an investigation into businesses’ use of generative AI in apps directed at young children.

As I was catching up with these matters yesterday, news broke that a federal district court judge in California had granted a preliminary injunction (“PI”) prohibiting the landmark California Age Appropriate Design Code Act (“CAADCA”) from going into effect on July 1, 2024. The judge ruled that the law violates the First Amendment’s free speech guarantees.

As ESRB Privacy Certified blog readers might recall, in September 2022, California enacted the CAADCA, establishing a far-reaching privacy framework that requires businesses to prioritize the “best interests of the child” when designing, developing, and providing online services. At the time, I wrote that the California law had the “potential to transform data privacy protections for children and teens in the United States.”

In particular, I pointed to the law’s coverage of children under the age of 18, its applicability to all online services “likely to be accessed by a minor,” and its requirement that businesses set default privacy settings that offer a “high level” of privacy protection (e.g., turning off geolocation and app tracking settings) unless the business can present a “compelling reason” that different settings are in the best interests of children. I also noted the Act’s provisions on age estimation/verification, data protection impact assessments (“DPIAs”), and data minimization as significant features.

In December 2022, tech industry organization NetChoice filed a lawsuit challenging the CAADCA on a wide range of constitutional and other grounds. In addition to a cluster of First Amendment arguments, NetChoice asserted that the Children’s Online Privacy Protection Act (“COPPA”), which is enforced primarily by the Federal Trade Commission (“FTC”), preempts the California law. The State of California, represented by the Office of the Attorney General, defended the law, arguing that the “Act operates well within constitutional parameters.”

Yesterday’s PI shifts the “atmospherics” of the kids’ privacy landscape dramatically. But the injunction doesn’t mean that businesses and privacy practitioners can ignore the underlying reasons for the CAADCA (which was passed overwhelmingly by the California legislature) or the practices and provisions it contains. Here’s a very rough analysis of the decision and some tips about what it might mean for your kids’ privacy program.

The Court’s Holding: In her 45-page written opinion, Judge Beth Labson Freeman held that “NetChoice has shown that it is likely to succeed on the merits of its argument that the provisions of the CAADCA intended to achieve [the purpose of protecting children when they are online] likely violates the First Amendment.” The Court held that the CAADCA is a regulation of protected expression, and not simply a regulation of non-expressive conduct, i.e., activity without a significant expressive element. Because she viewed the statute as implicating “commercial speech,” the Court analyzed the CAADCA under an “intermediate scrutiny standard of review.”

The Relevant Test: Under that standard (often referred to as the Central Hudson test, after the name of the Supreme Court case that formulated it), if the challenged regulation concerns lawful activity and speech that is not misleading, the government bears the burden of proving (i) that it has a “substantial interest,” (ii) that the regulation directly and materially advances that interest, and (iii) that the regulation is “narrowly tailored” to achieve it.

The Court recognized that California would likely succeed in establishing a substantial interest in protecting minors from harms to their physical and psychological well-being caused by lax data and privacy protections online. Reviewing the CAADCA’s specific provisions, however, it found that many of the provisions challenged by NetChoice did not meet the remaining prongs of the intermediate scrutiny test.

The Court’s Central Hudson Analysis: The Court made findings on each of the specific provisions challenged by NetChoice keyed to the Central Hudson factors. I highlight a few here:

  • Data Protection Impact Assessments (DPIAs): The Court held that California did not meet its burden to demonstrate that the requirement for businesses to assess their practices in DPIAs would alleviate any harms from the design of digital products, services, and features, to a material degree.
  • Age Estimation: Judge Freeman also found that the statutory requirement to estimate the age of child users with a “reasonable level of certainty” would likely fail the Central Hudson test: “[T]he CAADCA’s age estimation provision appears not only unlikely to materially alleviate the harm of insufficient data and privacy protections for children, but actually likely to exacerbate the problem by inducing covered businesses to require consumers, including children, to divulge additional personal information.”
    • The Court also found that the age estimation provision would likely fail to meet the Central Hudson test because the effect of a business choosing not to estimate age, but instead to apply privacy and data protections broadly, would impermissibly shield adults from that same content. In reaching this conclusion, Judge Freeman rejected California’s argument that the “CAADCA does not prevent any specific content from being displayed to a consumer, even if the consumer is a minor; it only prohibits a business from profiling a minor and using that information to provide targeted content.”
    • Notably, later in the decision, Judge Freeman held that the age estimation provision is the “linchpin” of most of the CAADCA’s provisions and therefore determined it is not “functionally severable” from the remainder of the statute.
  • High Default Privacy Settings: The Court found that the CAADCA’s requirement for “high default privacy settings” would be likely to cause at least some businesses to prohibit children from accessing their services and products altogether.
  • Profiling by Default: Here, Judge Freeman held that the provision banning profiling of children by default could deprive certain categories of children, e.g., pregnant teenagers, of the “beneficial aspects” of targeted information.
  • Dark Patterns: The Judge held that California did not meet its burden to establish that prohibitions on the use of dark patterns to lead or encourage children to provide unnecessary personal information would ameliorate a causally connected harm.

COPPA Preemption: Although the Court granted the injunction based on First Amendment considerations alone, it did, briefly, address NetChoice’s argument that the COPPA preempts the CAADCA. The Court rejected this argument at the PI stage, explaining: “In the Court’s view, it is not clear that the cited provisions of the CAADCA contradict, rather than supplement, those of COPPA. Nor is it clear that the cited provisions of the CAADCA would stand as an obstacle to enforcement of COPPA. An online provider might well be able to comply with the provisions of both the CAADCA and COPPA . . . . “

  • N.B. Judge Freeman’s decision to act cautiously on this claim makes sense. Recently, the Ninth Circuit Court of Appeals, in Jones v. Google, overturned her decision that COPPA preempted state law claims asserted in a class action alleging that Google/YouTube used persistent identifiers to collect data and track children’s online behavior surreptitiously and without their consent – conduct that also violates COPPA. Interestingly, in that case, the Ninth Circuit invited the FTC, which enforces COPPA, to express its views on the preemption issue. The FTC accepted, stating that “Congress did not intend to wholly foreclose state protection of children’s online privacy, and the panel properly rejected an interpretation of COPPA that would achieve that outcome.”


Takeaways:
The CAADCA litigation is far from over, and it is likely that the California Attorney General will seek an immediate interlocutory appeal. It is clear, though, that the district court’s decision will have consequences in the short term for state privacy laws that are scheduled to come into effect soon as well as for efforts underway in Congress on child-related online privacy and safety legislation. Here are a few takeaways:

  • Privacy Laws Can Still Pack a Punch: Regardless of whether the Court ultimately strikes down the CAADCA or not, many of the concepts in the design code are already embedded in other privacy laws that apply to game and toy companies’ activities, both outside and within the United States. On the U.S. front, there are newly enacted child privacy provisions in state laws that should be able to withstand constitutional challenge. Plus, the NetChoice ruling might loosen the California Congressional delegation’s resistance to bipartisan federal legislation. Although some may view the Court’s ruling as a reprieve, companies still need to meet other legal obligations.
    • For example, Connecticut recently passed child privacy amendments (scheduled to go into effect on October 1, 2024) to its privacy law that skirt some of the elements Judge Freeman found provisionally unconstitutional. Unlike the CAADCA, the Connecticut law does not require that companies estimate the age of their users; it applies only to companies that have “actual knowledge” of or “willfully disregard” the presence of minor users, and it does not regulate “potentially harmful” (as opposed to illegal) content. Instead of using the CAADCA “best interest of the child” standard, the Connecticut law establishes a duty to avoid a “heightened risk of harm” to minors and delineates potential harms.
  • DPIAs are still a “Must Do”: Most of the new state privacy laws passed in the last year contain requirements for data protection impact assessments, similar to those already required by the European Union’s General Data Protection Regulation (GDPR). At the beginning of September, the California Privacy Protection Agency published draft regulations that contain practical examples of how DPIAs should work under California’s comprehensive privacy law. Regardless of what happens with the CAADCA, statutory requirements for more focused DPIAs such as those in the California Consumer Privacy Act will likely remain.
    • Judge Freeman’s skepticism about the CAADCA’s DPIA provision aside, DPIAs can be a useful accountability tool for identifying privacy risks, working out when, where, and how likely they are to occur, and assessing the impact of such risks on your customers and business.
  • COPPA Continues to Be Relevant: It will probably take years for the court battle over the CAADCA to play out. In the meantime, if you know that children — or teenagers — are using your products, expect the FTC to enforce COPPA and other privacy protections aggressively. (For a quick review of the FTC’s recent COPPA cases, see my previous blog post COPPA Battlegrounds: The Quest to Uncover the Secrets of the FTC’s Kids’ Privacy Actions.)
    • Indeed, it’s likely the FTC will use both the substantive provisions of COPPA and the “unfairness” and “deception” prongs of Section 5 of the FTC Act to set requirements for child-friendly privacy disclosures, mandates for high privacy default settings, and prohibitions against manipulative dark patterns through its child-focused investigations and enforcement actions.
    • The NetChoice ruling – coupled with Congressional inaction – could also spur the FTC to complete its now-four-years-old COPPA Rule review and act on (at least parts of) last year’s privacy rulemaking proposal.

While this all unfolds, ESRB Privacy Certified will continue to help its program members comply with existing laws and adopt and implement best practices for children’s privacy. As privacy protections for kids and teens continue to evolve, we’ll be following closely and providing guidance to our program members on all of the moving parts of the complex children’s privacy landscape. To learn more about ESRB Privacy Certified’s compliance and certification program, please visit our website, find us on LinkedIn, or contact us at privacy@esrb.org.

• • •

As senior vice president of ESRB Privacy Certified (EPC), Stacy Feuer ensures that member companies in the video game and toy industries adopt and maintain lawful, transparent, and responsible data collection and privacy policies and practices for their websites, mobile apps, and online services. She oversees compliance with ESRB’s privacy certifications, including its “Kids Certified” seal, which is an approved Safe Harbor program under the Federal Trade Commission’s Children’s Online Privacy Protection Act (COPPA) Rule.

Former FTC Regulator Stacy Feuer Joins ESRB as Senior Vice President, Privacy Certified https://www.esrb.org/blog/former-ftc-regulator-stacy-feuer-joins-esrb-as-senior-vice-president-privacy-certified/ Tue, 04 Jan 2022 16:14:47 +0000

NEW YORK, Jan. 4, 2022 – The Entertainment Software Rating Board (ESRB) today announced that Stacy Feuer has joined the organization as Senior Vice President, Privacy Certified, a leading online and mobile privacy compliance program. Established in 1999, the ESRB Privacy Certified program helps members navigate privacy protection laws in the U.S. and internationally, and was one of the first of its kind to be authorized by the Federal Trade Commission as a Safe Harbor under the Children’s Online Privacy Protection Act (COPPA).

Feuer brings more than two decades of experience in consumer protection and privacy policy and enforcement to the ESRB. In her past role as the Assistant Director for International Consumer Protection at the Federal Trade Commission (FTC), she represented the U.S. and the FTC internationally on consumer-related advertising, marketing, and data privacy issues involving new and emerging digital technologies. She also investigated and litigated advertising cases and coordinated the FTC’s work on the U.S. SAFE WEB Act, legislation that enhances the FTC’s cross-border cooperation powers.

“The ESRB Privacy Certified program continues to set a high bar with its self-regulatory standards and commitment to best practices,” said Feuer. “As a result, consumers, parents, and caregivers can be assured that their and their children’s personal data will be protected whenever they see Privacy Certified seals displayed. I am thrilled to join ESRB at this pivotal moment for data privacy to help Privacy Certified members meet ongoing and future compliance challenges creatively.”

“Stacy’s deep expertise in navigating the domestic and global regulatory landscape for privacy, consumer protection and e-commerce makes her a perfect choice to lead the Privacy Certified program,” said ESRB President Patricia Vance. “Stacy will bring enormous value to our member companies, helping guide them on compliance with an ever-increasingly complex array of consumer privacy regulations on the state, federal and global levels.”

Before joining the FTC, Stacy practiced international law at a Washington, DC firm, and served as a law clerk for a federal district court judge. Stacy graduated from Cornell University and the New York University School of Law. She holds a CIPP-US accreditation from the International Association of Privacy Professionals.


About ESRB

The ESRB is a non-profit, self-regulatory body that independently assigns age and content ratings for video games and mobile apps so parents can make informed choices. It also enforces advertising guidelines adopted by the video game industry and helps companies implement responsible online, mobile and internet-connected device privacy practices under its Privacy Certified program. Visit www.esrb.org for more information.

About Privacy Certified

ESRB’s Privacy Certified program, an authorized Safe Harbor under the Children’s Online Privacy Protection Act (COPPA), helps companies comply with online and mobile privacy protection laws in the United States and beyond. Privacy Certified protects consumer privacy and is consistent with ESRB’s mission to help interactive entertainment companies conduct business responsibly while assuring consumers, especially parents, that their personal data is collected and managed responsibly. Look for the Privacy Certified seal. For more information, visit esrb.org/privacy.

Contact:

Johner Riehl
858.220.5626
johner@zebrapartners.net

The UK Age Appropriate Design Code: Childproofing the Digital World https://www.esrb.org/privacy-certified-blog/the-uk-age-appropriate-design-code-childproofing-the-digital-world/ Thu, 21 Jan 2021 15:47:36 +0000

“A generation from now, I believe we will look back and find it peculiar that online services weren’t always designed with children in mind. When my grandchildren are grown and have children of their own, the need to keep children safer online will be as second nature as the need to ensure they eat healthy, get a good education or buckle up in the back of a car.”
– Information Commissioner Elizabeth Denham

In May 2018, the European Union’s General Data Protection Regulation (GDPR) went into effect, recognizing for the first time within the European Union (EU) that children’s personal data warrants special protection. The United Kingdom’s Data Protection Act 2018 implemented the GDPR domestically and, among other things, charged the Information Commissioner’s Office (ICO) with developing a code of practice to protect children’s personal data online. The result is the Age Appropriate Design Code (also referred to as the Children’s Code), an ambitious attempt to childproof the digital world.

The Internet was not built with children in mind, yet children are prolific users of the Internet. The Children’s Code, which comprises fifteen “Standards,” seeks to correct that incongruity by requiring online services that children are likely to use to be designed with their best interests in mind.

For over twenty years, the U.S. Children’s Online Privacy Protection Act (COPPA) has been the primary source of protection for children’s privacy online. COPPA protects the privacy of internet users under 13 years old, primarily by requiring informed, verifiable consent from a parent or guardian. The Children’s Code, however, has much grander aspirations. It protects all children under 18 years old, asking companies to reimagine their online services from the bottom up.

The foundational principle of the Children’s Code calls for online services likely to be accessed by children under 18 years old to be designed and developed with the best interests of the child as a primary consideration. The Children’s Code is grounded in the United Nations Convention on the Rights of the Child (UNCRC), which recognizes that children have several rights, including the rights to privacy and to be free from economic exploitation; to access information; to associate with others and play; and to have a voice in matters that affect them.

To meet the best interests of the child, online services must comply with each of the applicable fifteen Standards. Those Standards are distilled below.

1. Assessing and Mitigating Risks
Compliance with the Children’s Code begins with a Data Protection Impact Assessment (DPIA), a roadmap to compliance and a requirement for all online services that are likely to be accessed by children under 18 years old. The DPIA must identify the risks the online service poses to children, the ways in which the online service mitigates those risks, and how it balances the varying and sometimes competing rights and interests of children of different age groups. If the ICO conducts an audit of an online service or investigates a consumer complaint, the DPIA will be among the first documents requested.
The ICO suggests involving experts and consulting research to help with this process. This might not be feasible for all companies. At a minimum, however, small- and medium-sized companies with online services that create risks to children will be expected to keep up to date with resources that are publicly available. More will be expected of larger companies.
While the Internet cannot exist without commercial interests, the primary consideration must be the best interests of the child. If there is a conflict between the commercial interests of an online service and the child’s interests, the child’s interests must prevail.

2. Achieving Risk-Proportionate Age Assurance
To adequately assess and mitigate risk, an online service must have a level of confidence in the age range(s) of its users that is proportionate to the risks posed by the online service. The greater the risk, the more confidence the online service must have.
The ICO identifies several options to obtain what it calls “age assurance,” which can be used alone or in combination depending on the circumstances. Age assurance options include self-declaration by users (a/k/a age gates), artificial intelligence (AI), third-party verification services, and hard identifiers (e.g., government IDs). Less reliable options, like age gates, are only permitted in low-risk situations or when combined with other age assurance mechanisms.

Achieving an adequate level of confidence will be challenging. The Verification of Children Online (VoCO), a multi-stakeholder child online safety research project led by the U.K.’s Department for Digital, Culture, Media & Sport (DCMS), is attempting to address that challenge. The VoCO Phase 2 Report provided the following potential flow as an example:
[F]or a platform that needs a medium level of confidence, a user could initially declare their age as part of the onboarding process, and alongside this an automated age assurance method (such as using AI analysis) could be used to confirm the declared age. If this measure suggests a different age band than that stated, which reduces confidence in the initial assessment, a request could be made to validate the user’s age through a verified digital parent.

Ultimately, if an online service is unable to reach an appropriate level of confidence, it has two options: 1) take steps to adequately reduce the level of risk; or 2) apply the Children’s Code to all users, even adults.
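
Expressed as pseudologic, that VoCO-style escalation might look like the sketch below. Every helper function here is a hypothetical stand-in; the report describes a flow, not an API.

```python
from enum import Enum

class Confidence(Enum):
    LOW = 1
    MEDIUM = 2
    HIGH = 3

def estimate_band_with_ai(declared_band: str) -> str:
    # Hypothetical stand-in for an automated age-estimation signal.
    return declared_band

def parent_confirms_age(declared_band: str) -> bool:
    # Hypothetical stand-in for validation through a verified digital parent.
    return False

def assure_age(declared_band: str, required: Confidence) -> str:
    """Escalate age-assurance signals until the required confidence is met."""
    if required == Confidence.LOW:
        # Self-declaration alone (an age gate) suffices for low-risk services.
        return declared_band

    # Cross-check the declared age band against an automated estimate.
    estimate = estimate_band_with_ai(declared_band)
    confidence = Confidence.MEDIUM if estimate == declared_band else Confidence.LOW

    # If the signals disagree, escalate to a verified digital parent.
    if confidence.value < required.value and parent_confirms_age(declared_band):
        confidence = Confidence.HIGH

    if confidence.value >= required.value:
        return declared_band

    # Could not reach the required confidence: per the Code, either reduce
    # the service's risk or apply the Children's Code to every user.
    return "apply-code-to-all-users"
```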

3. Setting High Privacy by Default
For all children, high privacy must be the default setting. This means an online service may only collect the minimum amount of personal data needed to provide the core or most basic service. Additional, optional elements of the online service, for example to personalize offerings, would have to be individually selected and activated by the child. To illustrate this point, the ICO uses the example of a music download service.

[Image: An example of privacy settings that could apply to a music service]

High privacy by default also means that children’s personal information cannot be used in ways that have been shown to be detrimental. Based on specific Standards within the Children’s Code, this means the following must be turned off by default:

  • Profiling (for example, behavioral advertising);
  • Geolocation tracking;
  • Marketing and advertising that does not comply with The Committee of Advertising Practice (CAP) Code in the United Kingdom;
  • Sharing children’s personal data; and
  • Utilizing nudge techniques that lead children to make poor choices.

To turn these on, the online service must be able to demonstrate a compelling reason and adequate safeguards.
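
In configuration terms, high privacy by default means that every non-essential data use ships switched off and stays off until that demonstration is made. Here is a minimal sketch; the setting names are illustrative assumptions.

```python
from typing import Optional

# Hypothetical defaults for a child's account: everything beyond the
# core service starts OFF and must be individually switched on.
DEFAULT_CHILD_SETTINGS = {
    "profiling_for_behavioural_ads": False,
    "geolocation_tracking": False,
    "marketing_outside_cap_code": False,
    "sharing_personal_data": False,
    "nudge_techniques": False,
    "personalised_playlists": False,  # optional extra, selected by the child
}

def enable_setting(settings: dict, key: str,
                   compelling_reason: Optional[str] = None) -> None:
    """Turning a default on requires a documented, compelling reason."""
    if not compelling_reason:
        raise PermissionError(f"cannot enable {key!r} without a compelling reason")
    settings[key] = True  # the accompanying safeguards belong in the DPIA
```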

4. Making Online Tools Available
Children must be given the tools to exercise their privacy rights, whether it be opting into optional parts of a service or asking to delete or get access to their personal information. The tools should be highlighted during the start-up process and must be prominently placed on the user’s screen. They must also be tailored to the age ranges of the users that access the online service. The ICO encourages using easily identifiable icons and other age-appropriate mechanisms.

5. Communicating Age-Appropriate Privacy Information
The Children’s Code requires all privacy-related information to be communicated to children in a way they can understand. This includes traditional privacy policies, as well as bite-sized, just-in-time notices. To help achieve this daunting task, the ICO provides age-based guidance. For example, for children 6 to 9 years old, the ICO recommends providing complete privacy disclosures for parents, while explaining the basic concepts to the children. If a child in this age range attempts to change a default setting, the ICO recommends using a prompt to get the child’s attention, explaining what will happen and instructing the child to get a trusted adult. The ICO also encourages the use of cartoons, videos and audio materials to help make the information understandable to children in different age groups and at different stages of development.
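
One way to implement this kind of age-banded communication is a simple lookup from age band to presentation strategy. The bands and prompt wording below are illustrative assumptions, loosely modeled on the ICO’s example for 6-to-9-year-olds.

```python
# Hypothetical mapping from age band to how privacy information is shown.
NOTICE_STRATEGY = {
    "0-5":   {"full_disclosure_for_parents": True, "child_format": "audio and video cues"},
    "6-9":   {"full_disclosure_for_parents": True, "child_format": "basic concepts"},
    "10-12": {"full_disclosure_for_parents": True, "child_format": "cartoons plus short text"},
    "13-17": {"full_disclosure_for_parents": False, "child_format": "plain-language text"},
}

def prompt_on_default_change(age_band: str) -> str:
    """Return the just-in-time prompt shown when a user changes a default."""
    if age_band == "6-9":
        # Per the ICO's example: explain what will happen and direct the
        # child to a trusted adult before proceeding.
        return ("Changing this setting lets more people see things about you. "
                "Please ask a trusted adult before you continue.")
    return "This changes how your information is used. Do you want to continue?"
```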

For connected toys and devices, the Children’s Code requires notice to be provided at the point of purchase, for example, a disclosure or easily identifiable icon on the packaging of the physical product. Disclosures about the collection and use of personal data should also be provided prior to setup (e.g., in the instructions or a special insert). Anytime a connected device is collecting information, it should be obvious to the user (e.g., a light goes on), and collection should always be avoided when in standby mode.

6. Being Fair
The Children’s Code expects online services to act fairly when processing children’s personal data. In essence, this means online services must say what they do, and do what they say. This edict applies not just to privacy disclosures, but to all published terms, policies and community standards. If, for example, an online service’s community standards prohibit bullying, the failure to enforce that standard could result in a finding that the online service unfairly collected a child’s personal data.

Initial implementation of the Children’s Code will be a challenge. User disruption is inevitable, as are increased compliance and engineering costs. The return on that initial investment, however, will hopefully make it all worthwhile. If Commissioner Denham’s vision is realized, the digital world will become a safe place for children to socialize, create, play, and learn.

This article has been published in PLI Chronicle, https://plus.pli.edu.

If you have more questions about the Age Appropriate Design Code or want to learn more about our program, please reach out to us through our Contact page. Be sure to follow us on Twitter and LinkedIn for more privacy-related updates.

5 Privacy Tips for Parents with Children on Mobile Devices https://www.esrb.org/blog/5-privacy-tips-for-parents-with-children-on-mobile-devices/ Mon, 30 Mar 2020 16:00:27 +0000

1. Review the Product Detail Information in the App Stores

App storefronts contain important information about the mobile apps they offer, including:

  • The company or individual that is offering or selling the app;
  • Whether the app has been updated recently or if it has been months or even years since its last update;
  • The permissions the app requires; and
  • A link to the app’s privacy policy, which should include a clear explanation of the information the app collects, how that information is used, and how it is shared.

Parents should not skip over this information.

2. Download the App with Your Child

The first time a child opens an app, a number of things might happen.  The child might be asked to give the app various permissions, agree to a privacy policy and terms of service, provide age information, and possibly create an account.  Parents should be part of this process.

3. Review the App’s Privacy Policy

No one expects parents to review every word of a privacy policy, but they should look through them for some key information.  Most importantly, many privacy policies specifically identify how the app developer treats children’s information.  Run a search for the words “child” and “kid,” which should help you identify any sections of the privacy policy that specifically address how children’s information is handled.  If the privacy policy says the app is not intended for children under 13 years old, this means it likely does not abide by restrictions required by the Children’s Online Privacy Protection Act, which is a federal law intended to protect the online privacy of children under 13.

4. Enter Accurate Age Information

Many apps that are intended for children ask for age information.  Be sure your child’s correct age is entered.  Apps may utilize this information to determine how to treat your child’s information.  For example, under U.S. law, children under 13 are entitled to additional privacy protections.  Specifically, if your child is under 13, the app developer must either restrict the information it collects from your child or provide you with notice and get your consent before it collects the information.  If, however, you or your child enter an age of 13+, the app will treat your child as an adult from a privacy standpoint.  This means the app will collect and use your child’s information the same way it would collect and use your information.  It may also activate features of the app that are only available to users 13 and over, such as chat and account creation.

5. Look for a Safe Harbor Seal

[Image: The Privacy Certified Kids Seal]

The Children’s Online Privacy Protection Act empowers the Federal Trade Commission to designate Safe Harbor providers to certify that apps (and other online services, like websites) meet the requirements of the law.  This means an independent third party has already taken the time to vet the app from a privacy standpoint.  ESRB is one of the approved Safe Harbors, and we take our responsibility very seriously.  If you see our Privacy Certified Kids Seal, you can feel confident your children’s privacy will be protected.

2019 – Another Historic Year for Children’s Privacy https://www.esrb.org/privacy-certified-blog/2019-another-historic-year-for-childrens-privacy-coppa/ Mon, 24 Feb 2020 21:19:40 +0000

2019 was truly a historic year for children’s privacy. Regulatory enforcement activity in the United States hit an all-time high. The Federal Trade Commission (FTC) surprised industry and the privacy community by embarking on a review of its Children’s Online Privacy Protection (COPPA) Rule three years ahead of schedule. U.S. lawmakers introduced legislation that, if passed, would reshape COPPA. Outside the United States, the United Kingdom’s Information Commissioner’s Office (ICO) has been driving the conversation with its game-changing Age Appropriate Design Code (Code), which is awaiting Parliament’s approval. And the world’s largest mobile storefronts—Apple and Google Play—have taken steps to be more protective of children’s privacy.

FTC Sets COPPA Record, Then Breaks It
The FTC remains the world’s top cop when it comes to children’s privacy. It began 2019 by breaking the record for the largest monetary penalty in a COPPA case when it agreed to a settlement with the operators of TikTok (f/k/a Musical.ly) for $5.7 million. The TikTok settlement demonstrated a more aggressive approach by the FTC, both in how it defines an online service “directed to children” and in how it applies COPPA’s “actual knowledge” standard.

The record set in TikTok, however, did not last long. In September 2019, the FTC and the New York Attorney General announced a COPPA settlement with Google and YouTube, which included a $136 million penalty paid to the FTC and $34 million penalty paid to New York—either of which on its own would have been the largest ever monetary penalty in a COPPA case by a significant margin.

More significantly, the settlement required YouTube to materially change its business. Going forward, all YouTube channel owners must specify whether their channels are directed to children. If a channel is not child-directed, the channel owner must still identify individual videos that are child-directed. When content is identified as child-directed, YouTube turns off several features of the platform, including (i) the collection of personal data for behavioral advertising, and (ii) the ability for users to leave public comments.
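
Schematically, the settlement turned a content designation into a feature gate. The sketch below illustrates that pattern only; it is not YouTube’s actual code or API.

```python
from dataclasses import dataclass

@dataclass
class Video:
    title: str
    channel_is_child_directed: bool        # set once by the channel owner
    video_is_child_directed: bool = False  # per-video override

    @property
    def made_for_kids(self) -> bool:
        # Child-directed if the whole channel is, or if this video is flagged.
        return self.channel_is_child_directed or self.video_is_child_directed

def allowed_features(video: Video) -> dict:
    """Child-directed content loses behavioural ads and public comments."""
    return {
        "behavioural_advertising": not video.made_for_kids,
        "public_comments": not video.made_for_kids,
    }
```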

The FTC also settled a third COPPA case in 2019 against the operators of i-Dressup.com, a dress-up website that had already been shut down. While that case settled for the relatively modest sum of $35,000, it has important symbolic value insofar as it shows even small companies can land on the FTC’s radar.

FTC Solicits COPPA Comments
In addition to its enforcement activities, the FTC was extremely busy in 2019 soliciting public comments on its application of the COPPA Rule and hosting a COPPA workshop. The request for comments, which was published in July 2019, surprised the privacy community and industry because it came three years ahead of the FTC’s typical 10-year review cycle. The FTC received over 175,000 submissions by the December deadline, including one submission from us.

U.S. Lawmakers Seek to Update COPPA
In March 2019, Senators Markey and Hawley introduced what became known as COPPA 2.0. The bill would, among other things:

  • Extend COPPA protections to minors 13 to 15 years old;
  • Extend COPPA to operators of online services that have “constructive knowledge” they are collecting personal information from children or minors; and
  • Prohibit the use of children’s personal information for targeted marketing and place limits on the use of minors’ personal information for that purpose.

(Bonus material: In January 2020, lawmakers introduced two bills that would amend COPPA: The Protect Kids Act, which was introduced by Congressmen Tim Walberg and Bobby Rush, would, among other things, extend COPPA’s protections to all children under 16 years old. The PRIVCY Act, introduced by Congresswoman Kathy Castor, would essentially re-write COPPA. Among other things, it would extend protections to children under 18 years old and remove the concept of “child-directed” online services in favor of an “actual and constructive knowledge” standard.)

Outside the U.S., the ICO is Trying to Re-Define How Online Services Approach Children’s Privacy
In 2019, the ICO released its long-awaited proposal for an Age Appropriate Design Code—a set of 15 “standards” aimed at placing the best interests of the child above all other considerations for online services “likely to be accessed” by children under 18 years old. The standards would require, among other things, high-privacy default settings, communicating privacy disclosures and choices in ways that are appropriate to the ages of the children likely to access the online service, and eliminating uses detrimental to children’s wellbeing.

Following the initial consultation period, in November 2019, the ICO revised the Code and submitted the final version to the Secretary of State. The Code now awaits approval by Parliament.

Storefronts Take Steps to Strengthen Children’s Privacy
Apple and Google Play also took steps to strengthen children’s privacy. In May 2019, for example, the FTC announced Apple and Google Play removed three dating apps from their storefronts after the FTC warned the apps were violating COPPA.

In addition, both Apple and Google Play revised their developer policies. On Google Play, developers must now identify the target audience for each of their apps. Apps targeted to children have certain restrictions, including with respect to the types of adverts served and networks used. For its part, Apple has placed restrictions on the use of third-party analytics and advertising in apps directed to children.

Have more questions about recent developments in the area of children’s privacy? Feel free to reach out to us through our Contact page to learn more about our program. Be sure to follow us on Twitter and LinkedIn for more privacy-related updates.

ICO Publishes the Final Version of the Age Appropriate Design Code https://www.esrb.org/privacy-certified-blog/ico-publishes-the-final-version-of-the-age-appropriate-design-code/ Thu, 23 Jan 2020 22:16:50 +0000

On January 21, the UK’s Information Commissioner’s Office (ICO) published the final version of its Age Appropriate Design Code (the Code). The Code, which was first released for comment in April 2019, comprises 15 Standards that will impact the way in which companies assess the age of and risks to their users; the types of personal data they collect; how that data is used and shared; how they present privacy disclosures, tools, and choices to their users; and the overall design of their products and services. The overarching principle of the Code is that online products and services “likely to be accessed by children” under 18 years old must be designed with the best interests of those children in mind.

The Standards set forth in the final version of the Code are largely unchanged from the initial draft in April. However, after a long consultation period, the final version of the Code does reflect some important compromises by the ICO.

First, while the ICO makes clear that compliance with the Standards will be required, it has clarified that the additional 100+ pages of guidance in the Code are just that: guidance. Companies will have some flexibility to come up with their own methods to comply with the Standards. That said, companies would be shortsighted not to give proper weight to the ICO’s guidance.

Second, the initial draft of the Code essentially placed the burden on companies to prove their online products and services were not likely to be accessed by children. In the final version, the ICO clarifies that the analysis will likely depend on:

  • the nature and content of the service and whether [it] has particular appeal for children; and
  • the way in which the service is accessed and any measures [the company] put[s] in place to prevent children gaining access.

These factors allow companies far more flexibility than the presumptive approach taken in the initial draft, which will hopefully reduce the amount of unnecessary data collection done solely to confirm a user’s age.

Third, and related, the final version of the Code takes a risk-based, proportionate approach to age verification. Whether and how a company verifies a user’s age will depend on (i) the age range(s) of the users; (ii) the level of certainty the company has about the age range(s); and (iii) the risks the online products and services pose to those users. Under certain low-risk circumstances, for example, a traditional age gate, where a user’s self-declared age is accepted without verification, might be appropriate. In contrast, the initial draft of the Code seemingly banned traditional age gates, which would have required companies to employ more intrusive verification methods.
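
A traditional age gate in this sense is just a neutrally presented, self-declared date of birth. Below is a minimal sketch under that assumption; a production gate would also avoid hinting at the cutoff age and would block immediate retries.

```python
from datetime import date
from typing import Optional

def age_in_years(dob: date, today: Optional[date] = None) -> int:
    """Whole-year age computed from a self-declared date of birth."""
    today = today or date.today()
    return today.year - dob.year - ((today.month, today.day) < (dob.month, dob.day))

def passes_age_gate(dob: date, threshold: int = 18) -> bool:
    # Self-declaration only: appropriate under the Code solely in low-risk
    # situations, or when combined with other age-assurance signals.
    return age_in_years(dob) >= threshold

# Example: a user declaring a 2010 birthday fails an 18+ gate while under 18.
print(passes_age_gate(date(2010, 6, 1)))
```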

The Code still has some final hurdles to overcome, including approval by Parliament, and will begin with a 12-month transition period. Companies, however, will likely need all that time, and possibly more!

Have more questions about the Age Appropriate Design Code? Feel free to reach out to us through our Contact page to learn more about our program. Be sure to follow us on Twitter and LinkedIn for more privacy-related updates.

Does an E or E10+ Rating Mean a Game or App is “Directed to Children” for Purposes of COPPA? https://www.esrb.org/privacy-certified-blog/does-an-e-or-e10-rating-mean-a-game-or-app-is-directed-to-children-for-purposes-of-coppa/ Wed, 18 Dec 2019 15:16:23 +0000

The Federal Trade Commission (FTC) recently requested comments on its implementation of the regulations under the Children’s Online Privacy Protection Act (COPPA). As the self-regulatory body that independently assigns age ratings for video games and mobile apps, and an FTC-approved Safe Harbor program under COPPA, ESRB was one of many to submit comments to the FTC. Our comments addressed several issues, none more important than the need to maintain the distinction between online services that are “child friendly” or “child ready” and those that are directed to children for purposes of COPPA.

Taking a step back, COPPA is triggered when either:

(i) the operator of an online service, like a website or app, has actual knowledge it has collected personal information from a child under 13 years old; or
(ii) the online service is directed to children under 13 years old (“Children”).

To determine whether an online service is directed to Children, the FTC considers “its subject matter, visual content, use of animated characters or child-oriented activities and incentives, music or other audio content, age of models, presence of child celebrities who appeal to children, language or other characteristics . . . , as well as whether the advertising promoting or appearing . . . is directed to children” and “empirical evidence.” These factors are considered in their totality.

Too often we have heard the argument that an ESRB age rating, specifically an E or E10+ rating, means a game or app is directed to Children. As we did in our comments to the FTC, we write now to dispel that fallacy.

The ESRB E (Everyone) rating indicates the content of a game or app is generally suitable for all ages, and may contain minimal cartoon, fantasy or mild violence, or infrequent use of mild language. The ESRB E10+ (Everyone 10+) rating indicates the content is generally suitable for ages 10 and up, and may contain more cartoon, fantasy or mild violence, or minimal suggestive themes. Neither of these ratings, nor any of the other age ratings assigned by ESRB, however, indicate that a game or app is directed to Children for purposes of COPPA.

Indeed, ESRB rates numerous games and apps E or E10+ that are directed to a general or older audience. These include, for example, travel apps, maps and navigation apps, ride-sharing apps, sports news and entertainment apps, retailer apps, card games, puzzle games, flight simulators, and many others. Although those online services do not contain content that would be inappropriate for Children, and indeed in some cases many Children do use them, no one would argue they are directed to Children for purposes of COPPA.

To be clear, when assigning ratings, ESRB does not consider the intended or even actual audience of a video game or mobile app. Nor does ESRB directly consider any of the factors the FTC weighs under COPPA. Rather, when assigning age rating information, ESRB considers the presence, type, degree, and context of violence, blood, gore, language, suggestive themes, crude humor, simulated or actual gambling, drugs, alcohol, tobacco, nudity, and sexual content that may be included in a game or mobile app. ESRB’s Ratings Guide makes this abundantly clear.

Simply put, ESRB’s E and E10+ rating categories are not evidence an online service is directed to Children.

ESRB Privacy Certified is a privacy-focused compliance and certification program. We work with our member companies to navigate these sometimes-difficult issues, including to identify online services that are “directed to Children” for purposes of COPPA and to ensure those services comply with COPPA’s requirements. If you have any questions about our program, please reach out to us at privacy@esrb.org.
