Privacy Certified Blog Archives | ESRB Ratings

#KidsPrivacy Trending for TikTok: Top Takeaways from the New COPPA Enforcement Action https://www.esrb.org/privacy-certified-blog/kidsprivacy-trending-for-tiktok-top-takeaways-from-the-new-coppa-enforcement-action/ Fri, 09 Aug 2024 19:56:28 +0000 https://www.esrb.org/?p=6038 Here are five attention-grabbing takeaways that should be a part of any company’s #COPPA reel.

Photo credit: Kassandra Acuna, ESRB Privacy Certified

There are hundreds of hot hashtags on TikTok, the viral video-sharing platform with over 1 billion users worldwide, but it’s safe to say that #kidsprivacy (or even #privacy) isn’t among them.  The top hashtags in the U.S. for the past week (which change almost daily) collectively generated 286K posts over seven days while #childprivacy and variants have a grand total of 182 posts from all over the world, for all (TikTok) time. Still, children’s privacy is a trending topic for the platform, which has been facing global scrutiny over its children’s data privacy practices.

To date, TikTok has paid out roughly half a billion dollars in children’s privacy suits brought by regulators (and private plaintiffs) in the United States, as well as the United Kingdom and the European Union. Last week, TikTok’s privacy woes exploded when the U.S. Department of Justice (DOJ), acting on behalf of the Federal Trade Commission (FTC), filed a complaint in a federal court in California against TikTok, its Chinese parent, ByteDance Ltd., and several related entities (collectively, TikTok) alleging “unlawful massive-scale invasions of children’s privacy” affecting millions of children under the age of 13.

As expected, the government alleged that TikTok “flagrantly violat[ed]” the Children’s Online Privacy Protection Act (COPPA) and the COPPA Rule. The government also alleged that TikTok violated a settlement agreement with the FTC that resolved an earlier COPPA lawsuit arising from the FTC’s 2019 investigation of TikTok’s predecessor company, Musical.ly.

The FTC’s original 2019 complaint alleged that the video-sharing platform collected extensive personal information from children under the age of 13 without the verifiable parental consent (VPC) that COPPA requires. User accounts were public by default, which meant that other users could see a child’s personal information, including their profile bio, username, picture, and videos. Although the app allowed users to change their default setting from public to private so that only approved users could follow them, kids’ profile pictures and bios remained public, and strangers could still send them direct messages.

TikTok ultimately entered into a consent order with the FTC, forking over $5.7 million in civil monetary penalties to resolve the action, the largest COPPA fine at that time. (Since then, the FTC has obtained much larger monetary settlements in COPPA cases against Google/YouTube ($170 million) and Epic Games ($275 million)). The 2019 order also required TikTok, among other things, to destroy all personal information collected from users under age 13 or obtain parental consent for those accounts.

The main claims in the new lawsuit are that TikTok: (1) knowingly created accounts for children and collected data from those children without first notifying their parents and obtaining verifiable parental consent (VPC); (2) failed to honor parents’ requests to delete their children’s accounts and information; and (3) failed to delete the accounts and information of users it knows are children. As the FTC put it in its press release announcing the new case, TikTok was “aware of the need to comply with the COPPA Rule and the 2019 consent order and knew about . . . compliance failures that put children’s data and privacy at risk. Instead of complying . . . TikTok spent years knowingly allowing millions of children under 13 on their platform . . . in violation of COPPA . . . .”

Unlike the 2019 case, the new TikTok action is not a settlement, and the government will need to prove its allegations in court to prevail on its claims. TikTok has made clear that it disagrees with the complaint’s  allegations, stating that many “relate to past events and practices that are factually inaccurate or have been addressed.”  What will happen next, though, is unclear.

Although we expect that TikTok will file a motion to dismiss the complaint, TikTok is facing much larger stakes than COPPA’s $51,744 per-violation civil penalty. (Even counting just one violation per child, that’s an astronomical amount given that under-13 users were “ubiquitous” on the platform.) The COPPA case is playing out alongside TikTok’s existential tangle with the U.S. government over Congress’ “ban or divest” law. TikTok has challenged the constitutionality of that law, which requires ByteDance to divest its U.S. TikTok assets by January 19, 2025 or face a ban on the app.
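
For a rough sense of the exposure, the arithmetic is simple. Here is a hedged sketch: the per-violation figure is COPPA’s current statutory maximum, but the child-user count is purely illustrative, not a number taken from the complaint.

```typescript
// Back-of-the-envelope COPPA penalty exposure. $51,744 is the current
// inflation-adjusted statutory maximum per violation; the child-user
// count below is hypothetical, not a figure from the complaint.
const PENALTY_PER_VIOLATION = 51_744;
const illustrativeChildUsers = 1_000_000;

// Counting just one violation per child:
const exposure = PENALTY_PER_VIOLATION * illustrativeChildUsers;
console.log(`$${exposure.toLocaleString("en-US")}`); // $51,744,000,000
```

Even under that conservative one-violation-per-child assumption, a million child users would imply tens of billions of dollars in theoretical exposure.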

Regardless of what happens, the government’s complaint provides insights into the FTC’s views on what companies can and can’t do with kids’ data under COPPA.  Here are five attention-grabbing takeaways that should be a part of any company’s #COPPA reel:

1)  You Can’t Use Kids’ Information for Profiling and Marketing Under the “Internal Operations” Exception: Following the 2019 settlement, TikTok used an age gate (in this case, a date of birth prompt) to identify U.S. users under the age of 13 and created “TikTok for Younger Users” (what the complaint calls “Kids Mode”), a limited experience that allows kids to view videos but does not allow them to create or upload videos, post information publicly, or message other users. Although TikTok touted its “safety and privacy protections designed specifically for an audience that is under 13 years old,” according to the complaint, it still collected and used “extensive” personal information – “far more data than it needed” –  from Kids Mode account holders without first providing parental notice or obtaining VPC.

The information collected included username, password, and date of birth along with persistent identifiers like IP address and unique device identifiers. According to the complaint, TikTok combined this information with app activity data, device information, mobile carrier information, and app information to amass profiles on children and share them with third parties. In one outrageous example, the complaint alleges that TikTok shared kids’ profiles with the analytics and marketing measurement platform AppsFlyer and with Facebook, so they could “retarget” (lure back) users whose engagement had declined.

As the complaint makes clear, TikTok’s use of persistent identifiers like device ID from Kids Mode users does not comport with the “internal operations” exception, which only permits companies to use such identifiers without VPC if they do not collect any other personal information and only “for the sole purpose” of providing support for an online service’s internal operations. Although there is some scope for companies to collect and use kids’ information for internal operations without VPC, companies cannot interpret the internal operations exception broadly to cover the collection and use of persistent identifiers for profiling and marketing.

2) You Can’t Allow Kids to Circumvent COPPA: Although COPPA does not require companies to validate users’ ages, you can’t allow users to circumvent COPPA by building “back doors that allowed users to bypass the age gate . . . .” In the complaint, the government alleges that by allowing users to use login credentials from certain third-party online services, including Instagram and Google, TikTok allowed users to avoid the age gate altogether and set up regular accounts. These policies and practices led to the creation of millions of “unknown user” accounts that allowed children to gain access to adult content and features of the general TikTok platform. TikTok, in turn, then collected and maintained vast amounts of personal information from the children who created and used these regular TikTok accounts without their parents’ consent.

3) Make Sure Your Age Gates Work: The complaint alleges that kids could easily retry the age gate. TikTok did not prevent children who initially put in an under-13 birth date from restarting the account creation process and providing a new birth date that made them old enough to escape Kids Mode. As the FTC’s COPPA FAQs have long recommended, you should use technical means, such as a cookie, to prevent children from back-buttoning to enter a different age.
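
For developers, here is a minimal sketch of what that cookie technique might look like in a browser-based sign-up flow. The cookie name, routing labels, and flow are our own illustration, not anything drawn from the complaint:

```typescript
// Sketch of an age gate that persists an under-13 result in a cookie so a
// child can't back-button and re-enter an older birth date. All names are
// illustrative.
const AGE_GATE_COOKIE = "age_gate_under13"; // hypothetical cookie name

function ageOn(date: Date, dob: Date): number {
  let age = date.getFullYear() - dob.getFullYear();
  const hadBirthday =
    date.getMonth() > dob.getMonth() ||
    (date.getMonth() === dob.getMonth() && date.getDate() >= dob.getDate());
  return hadBirthday ? age : age - 1;
}

function routeSignup(dob: Date): "kids-mode" | "general-signup" {
  // A prior under-13 answer wins, even if the user retries with a new date.
  if (document.cookie.includes(`${AGE_GATE_COOKIE}=1`)) return "kids-mode";

  if (ageOn(new Date(), dob) < 13) {
    // Remember the result so the gate can't simply be re-run.
    document.cookie = `${AGE_GATE_COOKIE}=1; path=/; max-age=31536000; SameSite=Lax`;
    return "kids-mode";
  }
  return "general-signup";
}
```

To avoid the “back door” problem described in the previous takeaway, the same check would also have to run on every account-creation path, including third-party login flows.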

4) Don’t Make Deletion Difficult – and Do It!: Much of the complaint focuses on TikTok’s failure to delete accounts and information that “even their own employees and systems identify as belonging to children” as well as its other failures to delete children’s personal data upon parental request. The government alleges, for example, that TikTok required parents to “navigate a convoluted process” to request the deletion of personal information collected from their children. TikTok often did not honor parents’ requests, either by not responding to their requests at all, or by only deleting accounts if there were “objective indicators” that the account holder was under 13 or the parent completed a form certifying under penalty of perjury that they were the parent or guardian of the account holder. Alongside these allegations, the complaint also alleges that TikTok retained kids’ data in databases long after purportedly deleting their accounts.

  • One interesting claim in the complaint is that TikTok should have deleted children’s personal information – such as photos and voice recordings – incorporated into other users’ videos and comments on other users’ posts. TikTok allegedly possessed identifiers linking the incorporated information to an account that it had deleted because the account belonged to a child.
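
In engineering terms, that claim suggests deletion needs to cascade through any stored identifiers linking a child’s content to other users’ posts. A hypothetical sketch (the data model and store interface are invented for illustration):

```typescript
// Hypothetical cascading deletion: removing a child's account also removes
// their photos, voice clips, and comments embedded in other users' posts,
// located via stored linking identifiers. Interfaces are invented.
interface LinkedContent {
  contentId: string;
  hostPostId: string;      // the other user's post it appears in
  sourceAccountId: string; // the deleted child's account
}

interface ContentStore {
  findBySourceAccount(accountId: string): Promise<LinkedContent[]>;
  deleteContent(contentId: string): Promise<void>;
  deleteAccount(accountId: string): Promise<void>;
}

async function deleteChildAccount(accountId: string, store: ContentStore): Promise<void> {
  // Deleting only the account record would leave the child's personal
  // information live inside other users' videos and comments.
  const linked = await store.findBySourceAccount(accountId);
  await Promise.all(linked.map((c) => store.deleteContent(c.contentId)));
  await store.deleteAccount(accountId);
}
```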

5) Don’t Mislead Regulators: The government’s complaint also details the ways in which TikTok failed to maintain records and communications relating to its children’s privacy practices and compliance with the 2019 order. More critically, the complaint alleges that TikTok made false statements that it had removed child accounts and deleted the associated data. Instead, as the complaint states, TikTok retained and had been using data – including IP addresses, device IDs, device models, and advertising IDs of child, teen, and adult users – that it previously represented it “did not use,” was “not accessible” to it, and was “delet[ed].” If true, that’s cringe-worthy.

  • Despite this reference to teens’ data (and an earlier reference to the two-thirds of teens who report using TikTok), it’s notable that the government’s action does not include a claim under Section 5 of the FTC Act concerning TikTok’s privacy and marketing practices toward teens, similar to the claims it advanced in other recent COPPA actions.

We’re sure there’s lots more to learn from the complaint, but for now we’ll stick with these five takeaways. We’ll be following the case closely as it plays out in the federal court and providing other pointers to ESRB Privacy Certified members.  And maybe we’ll check out next week’s top hashtags to see if #kidsprivacy makes it to the top ten, unlikely as that seems.

• • •

As senior vice president of ESRB Privacy Certified (EPC), Stacy Feuer ensures that member companies in the video game and toy industries adopt and maintain lawful, transparent, and responsible data collection and privacy policies and practices for their websites, mobile apps, and online services. She oversees compliance with ESRB’s privacy certifications, including its “Kids Certified” seal, which is an approved Safe Harbor program under the Federal Trade Commission’s Children’s Online Privacy Protection Act (COPPA) Rule. She holds CIPP/US and CIPP/E certifications from the International Association of Privacy Professionals.

Proposed American Privacy Rights Act (APRA) signals new phase of privacy regulation: ESRB Privacy Certified welcomes COPPA-style compliance mechanism https://www.esrb.org/privacy-certified-blog/proposed-american-privacy-rights-act-apra-signals-new-phase-of-privacy-regulation-esrb-privacy-certified-welcomes-coppa-style-compliance-mechanism/ Thu, 18 Apr 2024 15:28:51 +0000 https://www.esrb.org/?p=5891 Photo credit: NASA/Jordan Salkin Last week, on the eve of the total solar eclipse, a bipartisan, bicameral “discussion draft” for a comprehensive federal privacy law – the American Privacy Rights Act of 2024 (APRA) – landed in Congress. Accompanied by almost as much excitement as the celestial event itself (at least among privacy professionals, a […]

Photo credit: NASA/Jordan Salkin

Last week, on the eve of the total solar eclipse, a bipartisan, bicameral “discussion draft” for a comprehensive federal privacy law – the American Privacy Rights Act of 2024 (APRA) – landed in Congress. Accompanied by almost as much excitement as the celestial event itself (at least among privacy professionals, a cohort as intense as eclipse-chasers (umbraphiles)), the APRA would create a long-elusive national data privacy regime and reduce the ever-growing patchwork of state comprehensive privacy laws.

Top Provisions

The discussion draft introduced by Senate Committee on Commerce, Science and Transportation Chair Maria Cantwell (D-Wash.) and House Energy & Commerce Chair Cathy McMorris Rodgers (R-Wash.) on April 7 (since updated slightly) would markedly remake the landscape of data privacy in the United States. Substantively, it would:

  • Cover a wide swath of businesses (including non-profits and “common carriers” currently excluded from Federal Trade Commission (FTC) jurisdiction) defined as “covered entities” and impose additional requirements on data brokers, “large data holders” and “high-impact social media companies.” (Conversely, it would exempt small businesses from many requirements.);
  • Adopt a broad definition of covered data (i.e., data that “identifies or is linked or reasonably linkable” to an individual or device, either alone or in combination with other information);
  • Impose strong transparency and strict data minimization by default requirements, prohibiting the processing of consumers’ personal data unless it meets general data minimization principles or one of 15 specified purposes;
  • Provide heightened protections for a long list of types of “sensitive personal data,” including the data of minors under the age of 17, and require express affirmative consent for transfers of such data. (The APRA does not contain extensive provisions on children’s and teens’ data, with many observers positing that the drafters plan to port provisions from pending online privacy and safety bills into the legislation. Indeed, at yesterday’s House Innovation, Data, and Commerce Subcommittee’s hearing on the APRA and several kids’ privacy and safety bills, many lawmakers and witnesses expressed concerns about kids’ privacy, especially targeted advertising.); and
  • Grant consumers the right to access, correct, delete, and export their data, including the right to opt out of targeted advertising, consistent with many of the state laws enacted over the past two years.

And, although not featured in most analyses of the bill’s provisions to date, the draft bill would lay the foundation for a “regulated, self-regulatory” (or co-regulatory) approach to privacy compliance and enforcement – a critical feature from our privacy compliance and certification perspective. (There’s much, much more, including provisions on algorithms and civil rights, dark patterns, data security, enforcement (including a limited private right of action for certain provisions), executive responsibility, and pre-emption of most, but not all, state privacy laws. For a helpful overview, see this “top takeaways” summary from the International Association of Privacy Professionals (IAPP) as well as their other APRA resources, including this cheat sheet.)

This isn’t Congress’ first go at comprehensive federal privacy legislation. Members of both chambers have introduced literally dozens and dozens of bills over the years. Not one has passed. Two years ago, another bipartisan, bicameral group of legislators introduced the American Data Protection and Privacy Act (ADPPA) during the “hot privacy summer” of ’22. The ADPPA advanced out of the House Energy & Commerce committee but never made it to the House floor, due, in part, to strong opposition by the California congressional delegation and Senator Cantwell. The APRA appears to be the successor to that legislation, although there are some significant differences, especially on the flash-point issues of federal preemption and private rights of action.

Co-regulatory Compliance Codes

Will APRA face the same fate as its predecessor? It’s too early to know. There are a number of factors that could doom this attempt as well. Still, from ESRB Privacy Certified’s perspective as an FTC-authorized COPPA Safe Harbor program, it’s encouraging that the APRA – like the ADPPA two years ago – provides for business or industry-developed “compliance codes” based on the Act’s requirements as a mechanism for companies to comply with the APRA’s highly-complex privacy rules. Critically, the APRA would provide code participants with a “rebuttable presumption” of compliance if they adhere to an FTC-approved compliance code rooted in the APRA’s provisions and submit to compliance assessments by an independent organization.

Sound familiar? Undoubtedly, for participants in COPPA Safe Harbor programs like ESRB Privacy Certified, the concept of a privacy compliance and certification scheme for a federal privacy law echoes the COPPA framework. Although the APRA’s provisions are not precisely the same as those required by COPPA, they follow a similar model. In short, the APRA would permit “covered entities” – other than data brokers and large data holders – to submit compliance guidelines “governing the collection, processing, retention, and transfer of covered data” to the FTC for approval. Once approved by the FTC, companies would be required to publicly self-certify compliance with the approved guidelines subject to oversight by an independent organization.

Compliance Codes Criteria

To obtain FTC approval, a covered entity would need to submit:

  • a description of how the proposed guidelines will “meet or exceed” the requirements of the APRA;
  • a description of the entities or activities the proposed guidelines are designed to cover;
  • a list of the covered entities, to the extent known at the time of application, that intend to adhere to the proposed guidelines;
  • a description of an independent organization, not associated with any of the participating covered entities, that will administer the proposed guidelines; and
  • a description of how the independent organization would assess participating covered entities for adherence to the proposed guidelines.

In turn, the draft requires the FTC to approve an application for proposed compliance guidelines, including the independent organization that will administer the guidelines, if the applicant demonstrates that the proposed guidelines:

  • “Meet or exceed” the APRA’s requirements;
  • provide for the “regular review and validation” by an independent organization; and
  • include a means of enforcement if a covered entity does not meet or exceed the requirements in the guidelines, which may include referral to the FTC or an appropriate state attorney general for enforcement.

The discussion draft further provides for a “rebuttable presumption” of compliance for covered entities that participate in and are assessed to be in compliance with such guidelines. The APRA framework differs slightly from the ADPPA model, which contained separate sections on “technical compliance programs” (Sec. 303) and “Commission approved compliance guidelines” (Sec. 304). For the former, the draft legislation directed the FTC and other enforcement agencies to consider a covered entity’s history of compliance with any technical compliance program approved by the FTC, as well as any action taken by the covered entity to remedy noncompliance with the program, before commencing an investigation or enforcement action and in determining liability or a penalty. For the latter, the bill stated that a covered entity that participates in Commission-approved compliance guidelines would “be deemed in compliance with the relevant provisions of this Act if such covered entity is in compliance with such guidelines.” Although these differences suggest that some fine-tuning of the APRA “safe harbor” provision might be necessary, the practical effect of the APRA’s provision would largely be the same.

Regulated Self-Regulation in Modern Privacy Frameworks

We welcome APRA’s recognition that compliance and certification mechanisms should be part of a modern privacy law, especially at a time when some observers have criticized “pure” privacy self-regulation. Last fall, FTC Bureau of Consumer Protection Director Sam Levine made headlines when he stated bluntly that “self-regulation around digital privacy is not working,” a criticism he repeated more recently when addressing the privacy challenges posed by artificial intelligence. But Levine’s statements were much more nuanced than reported.

Although Levine bemoaned Congress’ failure to pass comprehensive privacy legislation and instead leave the development of privacy rules to “a handful of tech giants,” he offered a model of successful self-regulation that essentially is “regulated self-regulation.” Levine explained that,

“[S]elf-regulation can be successful when there are clear, meaningful policy objectives; a dedicated, independent institutional structure to develop and enforce rules; and, most importantly, when there is a clear legal framework underlying the scheme and an external enforcer – like the FTC – able to act as a cop on the beat to enforce the law effectively.”

In sum, the FTC’s chief privacy enforcer recognized that self-regulation can be an important and effective complement to government regulation and agency oversight when (i) underpinned by explicit legislative authorization, (ii) implemented by independent organizations, and (iii) backstopped by a privacy enforcement authority.

The FTC followed this approach in the Notice of Proposed Rulemaking it issued in December to update the COPPA Rule. There, the agency recognized that the COPPA Safe Harbor program “serves an important function in helping companies comply with COPPA . . . .” As part of its review, the agency made several recommendations for “enhanced oversight and transparency” to “further strengthen the COPPA safe harbor program.” ESRB Privacy Certified filed a comment agreeing, with a few exceptions, with the FTC’s proposed changes. We also offered some additional proposals to improve the Safe Harbor program, for example, by adding to the COPPA Rule minimum expectations for Safe Harbor programs’ technological capabilities and assessment mechanisms.

COPPA, of course, is not the only privacy law that incorporates robust self-regulatory mechanisms to ensure accountability. The European Union’s General Data Protection Regulation (“GDPR”) provides for codes of conduct and certification mechanisms such as seals and marks in Articles 40 and 42 of the GDPR, respectively, noting, in Recital 100, that certification schemes allow consumers to “quickly assess the level of data protection of relevant products and services” and “enhance transparency.” The European Data Protection Board and EU member state data protection authorities have approved several industry codes (including EU-wide cloud computing codes) as well as data privacy seal programs under these provisions.

And the United Kingdom, which has long featured codes of conduct and certification schemes as part of its consumer protection laws, has embraced this type of self-regulation under the UK GDPR. The UK’s data protection authority, the Information Commissioner’s Office (“ICO”), has encouraged the development of sector-specific codes of conduct and certification schemes. Although the ICO has not yet approved any codes of conduct, it has authorized five certification schemes, including the Age Appropriate Design Code Certification Scheme (AADC certification), which aims to help businesses comply with the United Kingdom’s Age Appropriate Design Code, aka the “Children’s Code.” (ESRB Privacy Certified has an arrangement with the UK Age Check Certification Scheme, an accredited conformity assessment body, which administers the AADC certification. Contact us here for more information about the program.)

There will undoubtedly be a lot to consider as the APRA makes its way through Congress from policy, enforcement, and operational perspectives. The text of the discussion draft will likely change as legislators and a wide array of stakeholders debate any resulting bill. We don’t know if the APRA will be enacted this year or ever. (We are taking bets, though, on whether Congress will pass comprehensive privacy legislation before the next total solar eclipse over North America. It’s in 2044.) What we do know is that APRA’s inclusion of a regulated self-regulatory mechanism is an important feature that can help companies in the video game industry and beyond comply with a comprehensive federal privacy law and, in turn, help provide greater privacy protections for consumers.

• • •

As senior vice president of ESRB Privacy Certified (EPC), Stacy Feuer ensures that member companies in the video game and toy industries adopt and maintain lawful, transparent, and responsible data collection and privacy policies and practices for their websites, mobile apps, and online services. She oversees compliance with ESRB’s privacy certifications, including its “Kids Certified” seal, which is an approved Safe Harbor program under the Federal Trade Commission’s Children’s Online Privacy Protection Act (COPPA) Rule. She holds CIPP/US and CIPP/E certifications from the International Association of Privacy Professionals.

ESRB Privacy Certified partners with leading media and entertainment law firm Frankfurt Kurnit https://www.esrb.org/privacy-certified-blog/esrb-privacy-certified-partners-with-leading-media-and-entertainment-law-firm-frankfurt-kurnit/ Mon, 01 Apr 2024 12:00:32 +0000 https://www.esrb.org/?p=5878 ESRB Privacy Certified is excited to partner with Frankfurt Kurnit’s Interactive Entertainment and Data Strategy, Privacy & Security Groups to enhance cooperation between our program and the firm on privacy issues in the video game industry. Read about it here.

Probing the FTC’s COPPA Proposals: Updates to Kids’ Privacy Rule Follow Agency’s Focus on Technological Changes https://www.esrb.org/privacy-certified-blog/probing-the-ftcs-coppa-proposals-updates-to-kids-privacy-rule-follows-agencys-focus-on-technological-changes/ Mon, 08 Jan 2024 21:04:24 +0000 https://www.esrb.org/?p=5791 As a longstanding FTC-authorized COPPA Safe Harbor program, we follow the agency’s COPPA work closely. We’ve delved into the Notice of Proposed Rulemaking (NPRM) to understand what it will mean for our member video game and toy companies – and for the millions of kids and teens (and their parents) that play games. Read our summary of the most important provisions from the 164-page NPRM document here.

Photo by Igor Starkov on Unsplash

With calls to strengthen kids’ online privacy and safety protections growing louder by the day, 2023 was supposed to be the year that Congress would pass new legislation. That didn’t happen. Enter the Federal Trade Commission (FTC).

The agency pursued several blockbuster children’s privacy enforcement actions in 2023, including two against video game companies, that resulted in hundreds of millions in fines and landmark legal remedies. Then, at the very end of the year, the agency issued long-awaited proposals for changes to the Children’s Online Privacy Protection Rule, a process it began in 2019.

The COPPA Rule, last updated in 2013, implements the Children’s Online Privacy Protection Act, which dates back even earlier — to 1998. Although the agency can’t change the Act itself (that’s Congress’ job), it can make far-reaching changes to the Rule. It’s still unclear what a final rule will look like and when (or whether) it will arrive, but the FTC’s cusp-of-the-year move means that 2024 will certainly be a consequential year for children’s privacy.

As a longstanding FTC-authorized COPPA Safe Harbor program, we follow the agency’s COPPA work closely. We’ve delved into the Notice of Proposed Rulemaking (NPRM) to understand what the NPRM will mean for our member video game and toy companies – and for the millions of kids and teens (and their parents) that play games. (Although the average age of a gamer is 32, 76% of people under the age of 18 play video games.) We plan to file a comment on the proposed rule changes within the 60-day comment period that will start to run once the NPRM is published in the Federal Register, most likely later this week.

Although we’re still considering our responses to the NPRM, we’re providing a summary of the most important provisions to spare you reading all 164 pages of the document. (LinkedIn estimated that it would take me 228 minutes to read the NPRM. Once. I’ve already read it multiple times.) So, if you don’t have four – or forty – hours to devote to COPPA Rule reform, read on. It shouldn’t take four hours, but this blog is on the longer side. For convenience, we’ve divided it into three categories: (1) Changes; (2) Emphasis; and (3) Status Quo.

CHANGES
First up, notable changes to definitions and substantive aspects of the Rule:

  • Personal Information: Currently, the COPPA Rule’s definition of personal information includes information collected from a child such as name, address, online contact information, screen or user names (when they function as contact information), phone numbers, social security numbers, geolocation information, and photo, video, or audio files that contain a child’s image or voice. The Rule also includes “persistent identifiers” (such as IP addresses) that can be used to recognize users over time and across different web sites or online services in the definition of personal information.
    • Proposal: In the NPRM, the agency proposes expanding this definition to include biometric identifiers and all forms of government identification, not just SSNs. The FTC’s inclusion of biometric identifiers including “fingerprints or handprints; retina and iris patterns; genetic data, including a DNA sequence; or data derived from voice data, gait data, or facial data” as personal information is not surprising. In its May 2023 Biometric Policy Statement, the FTC articulated its concerns about the “new and increasing risks associated with the collection and use of biometric information” and FTC Commissioner Alvaro Bedoya has regularly sounded the alarm bell on how “companies are protecting children’s biometric data against breaches, fraud, and abuse.”
    • Questions: Beyond biometric information, the agency raises questions about two other categories of information – avatars and online screen or user names – that may be of interest to video game companies:
      First, the NPRM asks whether screen or user names should be treated as online contact information “even if the screen or user name does not allow one user to contact another user through the operator’s website or online service, when the screen or user name could enable one user to contact another by assuming that the user to be contacted is using the same screen or user name on another website or online service that does allow such contact?”
      Second, referring to the popularity of avatars in online services such as video games, the NPRM asks whether the Rule should explicitly designate avatars generated from a child’s image as personal information “even if the photograph of the child is not itself uploaded to the site or service and no other personal information is collected from the child.” The agency is interested in receiving specific feedback on these issues.
  • Target Audience: The target audience for a digital service is key to determining when an online service is “directed to children.”
    • Proposal: Although the FTC does not propose moving away from the multi-factor test it uses to determine whether a site is child-directed, it proposes adding a list of examples of evidence that the agency will consider in analyzing audience composition and intended audience. This will include “marketing or promotional materials or plans, representations to consumers or to third parties, reviews by users or third parties, and the age of users on similar websites or services.”
    • Questions: The NPRM also seeks feedback on whether the FTC should provide an exemption from designation as a child-directed service, for companies that have empirical evidence that no more than a specific percentage of its users are likely to be children under the age of 13. It also asks a number of questions about the contours of such an exemption.
      • Mixed Audience: The NPRM also proposes adding an express definition of “mixed audience” sites to the Rule. As with the current Rule, mixed audience services are directed to children, but do not target children as their primary audience. Such services cannot collect, use, or disclose users’ information without verifiable parental consent unless they use a neutral method “that does not default to a set age or encourage visitors to falsify age information” to collect a user’s age or use another method “reasonably calculated to determine if the user is a child.” This would permit companies to apply COPPA protections only to users under the age of 13.


  • Verifiable Parental Consent: One of the fundamental features of the COPPA Rule is the requirement that companies obtain verifiable parental consent (VPC) from parents for the collection and use of children’s personal information.
    • Proposal: The NPRM focuses on the sharing of children’s information with third parties, especially with advertisers, by requiring companies to obtain a separate VPC for disclosures of a child’s personal information unless such disclosures are “integral to the nature of the website or online service.” (The NPRM cites an “online messaging forum” as an example of a service where disclosure would be “integral.”) As the FTC explains in its Business Blog, this means that “COPPA-covered companies’ default settings would have to disallow third-party behavioral advertising and allow it only when parents expressly opt in.” In addition, as the NPRM makes clear, this requirement is feature-specific. So, if a company implements a “chatbot or other feature that simulates conversation” it must obtain VPC. (For a sketch of what this default-off model could look like in code, see just below this list.)
    • Questions: Interestingly, although the NPRM states several times that COPPA permits contextual advertising without VPC, the FTC is seeking comment on this issue. Question 10 asks, “Operators can collect persistent identifiers for contextual advertising purposes without parental consent so long as they do not also collect other personal information. Given the sophistication of contextual advertising today, including that personal information collected from users may be used to enable companies to target even contextual advertising to some extent, should the Commission consider changes to the Rule’s treatment of contextual advertising?”
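
To make the “default off, separate opt-in” idea concrete, here is a minimal sketch of a child-account settings model. The field and function names are our own illustration, not terms from the Rule or the NPRM:

```typescript
// Illustrative settings model for the NPRM's separate-VPC proposal:
// third-party disclosure (e.g., behavioral advertising) defaults off for
// child accounts and flips on only with its own recorded parental consent.
interface ChildAccountSettings {
  collectionVpcAt: Date | null; // VPC covering collection and internal use
  disclosureVpcAt: Date | null; // separate VPC covering third-party sharing
  behavioralAdsEnabled: boolean;
}

function newChildAccountSettings(): ChildAccountSettings {
  // Defaults must disallow third-party behavioral advertising.
  return { collectionVpcAt: null, disclosureVpcAt: null, behavioralAdsEnabled: false };
}

function recordDisclosureOptIn(s: ChildAccountSettings, when: Date): ChildAccountSettings {
  // Behavioral ads turn on only via this separate, express parental opt-in,
  // never bundled into the original collection consent.
  return { ...s, disclosureVpcAt: when, behavioralAdsEnabled: true };
}
```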


  • Internal Operations and Notice: The COPPA Rule has long allowed companies to collect and use persistent identifiers without first getting VPC if they don’t collect any other personal information and use the persistent identifiers only to provide support for internal operations. In the NPRM, the agency expressly declined to provide a narrowed or expanded definition of “internal operations.” It also stated that it believes that the practice of ad attribution, which allows an advertiser to associate a consumer’s action with a particular ad, “currently falls within the support for the internal operations definition” except when it is used for behavioral advertising, amassing a profile on a specific individual, or directly contacting an individual.
    • Proposal: To increase transparency around the internal operations exception, however, the agency would require companies to specifically identify the way in which they will use a collected personal identifier in their online notices. In addition, the company must “describe the means it uses to ensure that it does not use or disclose the persistent identifier to contact a specific individual, including through behavioral advertising, to amass a profile on a specific individual, in connection with processes that encourage or prompt use of a website or online service, or for any other purpose, except as permitted by the support for the internal operations exception.”


  • Internal Operations and Engagement: As foreshadowed in the quoted language immediately above, the FTC is interested in issues that go beyond pure privacy concerns like “nudging.”
    • Proposal: The NPRM also proposes to expand the Rule’s restrictions on the internal operations exception to processes (including machine learning processes) that would “encourage or prompt” a child’s use of an online service. This would include “push notifications” that encourage kids to use their service more. Companies that use persistent identifiers to send these push notifications would also be required to flag that use in their direct and online notices. This would ensure parents are aware of, and have consented to, these processes.
    • Questions: Here, too, the agency seeks additional comment, asking how companies are currently using persistent identifiers to maximize user engagement and how it could distinguish between “user-driven” personalization versus personalization driven by a business. In a separate question, the NPRM also asks whether the Rule should address other engagement techniques, as well as whether the Rule should “differentiate between techniques used solely to promote a child’s engagement with the website or online service and those techniques that provide other functions, such as to personalize the child’s experience on the website or online service?”


  • Data security: Consistent with concerns that the FTC has raised about data security in the recent COPPA enforcement cases, the proposed Rule significantly expands the COPPA Rule’s existing data security requirement.
    • Proposal: The NPRM requires companies to have written comprehensive security programs that are proportional to the “sensitivity of children’s information and to the operator’s size, complexity, and nature and scope of activities.” It also sets out requirements for performing annual data security assessments, implementing and testing safeguards, and evaluating and modifying their information security programs on an annual basis. The proposed Rule would also require companies to obtain written assurances from third parties to whom they transfer personal information, such as service providers, to maintain the confidentiality, security, and integrity of information.


  • Safe Harbor oversight: Of particular interest to us are the additional reporting and transparency requirements for Safe Harbor programs. Several of the proposals reflect comments that we have made to the FTC and to members of Congress inviting additional oversight to ensure that all Safe Harbor programs fulfill their responsibilities under the COPPA Rule. Others may present operational challenges. We will provide detailed responses to these proposals in our public comment on the NPRM.


  • Online contact information: Recognizing the significant convenience and utility of text communications, the FTC also proposes adding mobile telephone numbers to the list of identifiers that constitute “online contact information” so that parents can provide consent via text message. The NPRM makes clear, however, that companies may only use a child’s number to send a text message, and that the agency will not permit companies to collect and use a child’s mobile telephone number to communicate with the child, unless it has obtained verifiable parental consent to do so.

EMPHASIS
Beyond these proposed changes, it’s worth noting what is staying the same, but with more emphasis. Two issues stand out:

    • Data minimization: The Rule has long prohibited companies from collecting more personal information than is reasonably necessary for a child to participate in a game, offering of a prize, or another activity. The NPRM reinforces this prohibition, making it clear that it applies even if a company has obtained VPC.


    • Data retention and deletion: The NPRM emphasizes the FTC’s focus on data retention in recent enforcement actions. Companies can only retain personal information for as long as necessary to fulfill the purpose for which it was collected: they cannot hold on to it indefinitely or use it for any secondary purpose. This means that a company that collects a child’s email address for account creation purposes cannot use it for marketing purposes without VPC. The proposal would also require companies to post a data retention policy for children’s personal information to enhance parents’ ability to make informed decisions about data collection. (A sketch of one way to enforce this purpose limitation in code follows below.)
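
One way to make purpose limitation concrete in an implementation is to tag each collected datum with its permitted purposes, so a secondary use fails unless a separate consent adds it. A hedged sketch with invented names:

```typescript
// Sketch of purpose-tagging: a child's email collected for account creation
// can't be reused for marketing unless a separate VPC adds that purpose.
// All names here are illustrative.
type Purpose = "account_creation" | "marketing";

interface CollectedDatum {
  value: string;
  permittedPurposes: Set<Purpose>;
}

function useFor(datum: CollectedDatum, purpose: Purpose): string {
  if (!datum.permittedPurposes.has(purpose)) {
    throw new Error(`Use for "${purpose}" requires separate verifiable parental consent`);
  }
  return datum.value;
}

const childEmail: CollectedDatum = {
  value: "child@example.com",
  permittedPurposes: new Set(["account_creation"]),
};
useFor(childEmail, "account_creation"); // fine
// useFor(childEmail, "marketing");     // throws until a marketing VPC is recorded
```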

STATUS QUO
Finally, here’s what isn’t changing, at least not as part of the FTC’s rulemaking process:

    • Teens: First, despite making clear, in a variety of contexts (such as the Epic Games settlement and last year’s Advance Notice of Proposed Rulemaking on Commercial Surveillance and Lax Security Practices), that teens should benefit from privacy protections, the NPRM does not address raising the age of a “child” beyond 12, as urged by many commenters. This is because the agency does not have the authority to change the age of a child, which is established in the Act.


    • Knowledge Standard: Currently, COPPA only applies to “child-directed” services or when an operator has “actual knowledge.” Despite many comments urging the FTC to move from “actual knowledge” to “constructive knowledge” or another less definite standard, the agency declined to do so. Instead, it includes a long discussion of the legislative history of the Act on this point, sending a strong signal to Congress that the ball is in its court on that issue – and other issues like teen privacy that would require Congressional amendment of the Act (as opposed to FTC modification of the Rule) — when it reconvenes for 2024.


    • Inferred Data: Similarly, the NPRM declines to include “inferred data” in the definition of personal information because the Act makes clear that COPPA applies to information collected from a child, not about a child.


    • Rebuttable presumption: The agency also declined to permit general audience platforms to rebut the presumption that all users of child-directed content are children, finding that the “reality of parents and children sharing devices, along with account holders remaining perpetually logged into their accounts, could make it difficult for an operator to distinguish reliably between those users who are children and those who are not.”

• • • • •
In announcing the NPRM, FTC Chair Lina Khan stated that, “The proposed changes to COPPA are much-needed, especially in an era where online tools are essential for navigating daily life . . . .” We agree that the COPPA Rule needs updating. As we have said in other comments, kids’ privacy rules should be modernized “to meet the challenges of social media, mobility, ad tech, and immersive technologies – issues that weren’t present when COPPA was enacted nearly 25 years ago.” As the FTC’s rulemaking unfolds, we’ll be following closely and providing guidance to our program members on complying with any new rules and implementing stronger protections for children’s privacy. To learn more about ESRB Privacy Certified’s compliance and certification program, please visit our website, find us on LinkedIn, or contact us at privacy@esrb.org.

As senior vice president of ESRB Privacy Certified (EPC), Stacy Feuer ensures that member companies in the video game and toy industries adopt and maintain lawful, transparent, and responsible data collection and privacy policies and practices for their websites, mobile apps, and online services. She oversees compliance with ESRB’s privacy certifications, including its “Kids Certified” seal, which is an approved Safe Harbor program under the Federal Trade Commission’s Children’s Online Privacy Protection Act (COPPA) Rule. She holds CIPP/US and CIPP/E certifications from the International Association of Privacy Professionals.

A New Season for Kids’ Privacy: Court enjoins California’s Landmark Youth Privacy Law — Protecting Children Online Remains a Prime Concern https://www.esrb.org/privacy-certified-blog/a-new-season-for-kids-privacy-court-enjoins-californias-landmark-youth-privacy-law-but-protecting-children-online-remains-a-prime-concern/ Tue, 19 Sep 2023 21:21:19 +0000 https://www.esrb.org/?p=5631 Read our analysis of the NetChoice decision and tips about what it might mean for your kids’ privacy program.

Summer is definitely over. With the autumnal equinox just days away (Saturday, September 23, to be exact), there’s been a definite shift in the air – and in the children’s privacy world. Just as the fastest sunsets and sunrises of the year happen at the equinoxes, kids’ privacy developments are piling on rapidly right now.

Since the beginning of September, we’ve seen the Irish Data Protection Commission issue a huge, €345 million ($367 million) fine against TikTok for using unfair design practices that violate kids’ privacy. Delaware’s governor just signed a new privacy law that bans profiling and targeted advertising for users under the age of 18 unless they opt-in. And the Dutch data protection authority, just this week, announced an investigation into businesses’ use of generative AI in apps directed at young children.

As I was catching up with these matters yesterday, news broke that a federal district court judge in California had granted a preliminary injunction (“PI”) prohibiting the landmark California Age Appropriate Design Code Act (“CAADCA”) from going into effect on July 1, 2024. The judge ruled that the law violates the First Amendment’s free speech guarantees.

As ESRB Privacy Certified blog readers might recall, in September 2022, California enacted the CAADCA, establishing a far-reaching privacy framework that requires businesses to prioritize the “best interests of the child” when designing, developing, and providing online services. At the time, I wrote that the California law had the “potential to transform data privacy protections for children and teens in the United States.”

In particular, I pointed to the law’s coverage of children under the age of 18, its applicability to all online services “likely to be accessed by a minor,” and its requirement that businesses set default privacy settings that offer a “high level” of privacy protection (e.g., turning off geolocation and app tracking settings) unless the business can present a “compelling reason” that different settings are in the best interests of children. I also noted the Act’s provisions on age estimation/verification, data protection impact assessments (“DPIAs”), and data minimization as significant features.

In December 2022, tech industry organization NetChoice filed a lawsuit challenging the CAADCA on a wide range of constitutional and other grounds. In addition to a cluster of First Amendment arguments, NetChoice asserted that the Children’s Online Privacy Protection Act (“COPPA”), which is enforced primarily by the Federal Trade Commission (“FTC”), preempts the California law. The State of California, represented by the Office of the Attorney General, defended the law, arguing that the “Act operates well within constitutional parameters.”

Yesterday’s PI shifts the “atmospherics” of the kids’ privacy landscape dramatically. But the injunction doesn’t mean that businesses and privacy practitioners can ignore the underlying reasons for the CAADCA (which was passed overwhelmingly by the California legislature) or the practices and provisions it contains. Here’s a very rough analysis of the decision and some tips about what it might mean for your kids’ privacy program.

The Court’s Holding: In her 45-page written opinion, Judge Beth Labson Freeman held that “NetChoice has shown that it is likely to succeed on the merits of its argument that the provisions of the CAADCA intended to achieve [the purpose of protecting children when they are online] likely violates the First Amendment.” The Court held that the CAADCA is a regulation of protected expression, and not simply a regulation of non-expressive conduct, i.e., activity without a significant expressive element. Because she viewed the statute as implicating “commercial speech,” the Court analyzed the CAADCA under an “intermediate scrutiny standard of review.”

The Relevant Test: Under that standard (often referred to as the Central Hudson test based on the name of the Supreme Court case that formulated it), if the challenged regulation concerns lawful activity and speech that is not misleading, the government bears the burden of proving that (i) it has a “substantial interest” in the regulation advanced, (ii) that the regulation directly and materially advance the government’s substantial interest, and (iii) that the regulation is “narrowly tailored” to achieve that interest.

The Court recognized that California would likely succeed in establishing a substantial interest in protecting minors from harms to their physical and psychological well-being caused by lax data and privacy protections online. Reviewing the CAADCA’s specific provisions, however, it found that many of the provisions challenged by NetChoice did not meet the remaining prongs of the intermediate scrutiny test.

The Court’s Central Hudson Analysis: The Court made findings on each of the specific provisions challenged by NetChoice keyed to the Central Hudson factors. I highlight a few here:

  • Data Protection Impact Assessments (DPIAs): The Court held that California did not meet its burden to demonstrate that the requirement for businesses to assess their practices in DPIAs would alleviate any harms from the design of digital products, services, and features, to a material degree.
  • Age Estimation: Judge Freeman also found that the statutory requirement to estimate the age of child users with a “reasonable level of certainty” would likely fail the Central Hudson test: “[T]he CAADCA’s age estimation provision appears not only unlikely to materially alleviate the harm of insufficient data and privacy protections for children, but actually likely to exacerbate the problem by inducing covered businesses to require consumers, including children, to divulge additional personal information.”
    • The Court also found that the age estimation provision would likely fail to meet the Central Hudson test because the effect of a business choosing not to estimate age, but instead to apply privacy and data protections broadly, would impermissibly shield adults from that same content. In reaching this conclusion, Judge Freeman rejected California’s argument that the “CAADCA does not prevent any specific content from being displayed to a consumer, even if the consumer is a minor; it only prohibits a business from profiling a minor and using that information to provide targeted content.”
    • Notably, later in the decision, Judge Freeman held that the age estimation provision is the “linchpin” of most of the CAADCA’s provisions and therefore determined it is not “functionally severable” from the remainder of the statute.
  • High Default Privacy Settings: The Court found that the CAADCA’s requirement for “high default privacy settings” would be likely to cause at least some businesses to prohibit children from accessing their services and products altogether.
  • Profiling by Default: Here, Judge Freeman held that the provision banning profiling of children by default could discard “beneficial aspects” of targeted information to certain categories of children, e.g., pregnant teenagers.
  • Dark Patterns: The Judge held that California did not meet its burden to establish that prohibitions on the use of dark patterns to lead or encourage children to provide unnecessary personal information would ameliorate a causally connected harm.

COPPA Preemption: Although the Court granted the injunction based on First Amendment considerations alone, it did, briefly, address NetChoice’s argument that the COPPA preempts the CAADCA. The Court rejected this argument at the PI stage, explaining: “In the Court’s view, it is not clear that the cited provisions of the CAADCA contradict, rather than supplement, those of COPPA. Nor is it clear that the cited provisions of the CAADCA would stand as an obstacle to enforcement of COPPA. An online provider might well be able to comply with the provisions of both the CAADCA and COPPA . . . . “

  • N.B. Judge Freeman’s decision to act cautiously on this claim makes sense. Recently, the Ninth Circuit Court of Appeals, in Jones v. Google, overturned her decision that COPPA preempted state law claims asserted in a class action alleging that Google/YouTube used persistent identifiers to collect data and track children’s online behavior surreptitiously and without their consent – conduct that also violates COPPA. Interestingly, in that case, the Ninth Circuit invited the FTC, which enforces COPPA, to express its views on the preemption issue. The FTC accepted, stating that “Congress did not intend to wholly foreclose state protection of children’s online privacy, and the panel properly rejected an interpretation of COPPA that would achieve that outcome.”


Takeaways:
The CAADCA litigation is far from over, and it is likely that the California Attorney General will seek an immediate interlocutory appeal. It is clear, though, that the district court’s decision will have consequences in the short term for state privacy laws that are scheduled to come into effect soon as well as for efforts underway in Congress on child-related online privacy and safety legislation. Here are a few takeaways:

  • Privacy Laws Can Still Pack a Punch: Regardless of whether the Court ultimately strikes down the CAADCA or not, many of the concepts in the design code are already embedded in other privacy laws that apply to game and toy companies’ activities, both outside and within the United States. On the U.S. front, there are newly enacted child privacy provisions in state laws that should be able to withstand constitutional challenge. Plus, the NetChoice ruling might loosen the California congressional delegation’s resistance to bipartisan federal legislation. Although some may view the Court’s ruling as a reprieve, companies still need to meet other legal obligations.
    • For example, Connecticut recently passed child privacy amendments (scheduled to go into effect on October 1, 2024) to its privacy law that skirt some of the elements Judge Freeman found provisionally unconstitutional. Unlike the CAADCA, the Connecticut law does not require that companies estimate the age of their users; it applies only to companies that have “actual knowledge” of or “willfully disregard” the presence of minor users, and it does not regulate “potentially harmful” (as opposed to illegal) content. Instead of using the CAADCA “best interest of the child” standard, the Connecticut law establishes a duty to avoid a “heightened risk of harm” to minors and delineates potential harms.
  • DPIAs are still a “Must Do”: Most of the new state privacy laws passed in the last year contain requirements for data protection impact assessments, similar to those already required by the European Union’s General Data Protection Regulation (GDPR). At the beginning of September, the California Privacy Protection Agency published draft regulations that contain practical examples of how DPIAs should work under California’s comprehensive privacy law. Regardless of what happens with the CAADCA, statutory requirements for more focused DPIAs such as those in the California Consumer Privacy Act will likely remain.
    • Judge Freeman’s skepticism about the CAADCA’s DPIA provision aside, DPIAs can be a useful accountability tool for identifying privacy risks, working out when, where, and how likely they are to occur, and assessing the impact of such risks on your customers and business. (A minimal sketch of what a DPIA entry might capture appears after this list.)
  • COPPA Continues to Be Relevant: It will probably take years for the court battle over the CAADCA to play out. In the meantime, if you know that children — or teenagers — are using your products, expect the FTC to enforce COPPA and other privacy protections aggressively. (For a quick review of the FTC’s recent COPPA cases, see my previous blog post COPPA Battlegrounds: The Quest to Uncover the Secrets of the FTC’s Kids’ Privacy Actions.)
    • Indeed, it’s likely the FTC will use both the substantive provisions of COPPA and the “unfairness” and “deception” prongs of Section 5 of the FTC Act to set requirements for child-friendly privacy disclosures, mandates for high privacy default settings, and prohibitions against manipulative dark patterns through its child-focused investigations and enforcement actions.
    • The NetChoice ruling – coupled with Congressional inaction – could also spur the FTC to complete its now-four-years-old COPPA Rule review and act on (at least parts of) last year’s privacy rulemaking proposal.
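DPIA formats vary, but at their core most are structured risk registers. Here is a minimal Python sketch of what a single DPIA entry might capture; the field names are our own illustration, not a regulator’s template.

from dataclasses import dataclass

@dataclass
class DPIAEntry:
    # One row of a lightweight DPIA register; all fields are illustrative.
    processing_activity: str  # e.g., "in-game voice chat"
    data_categories: list     # e.g., ["audio", "player id"]
    purpose: str              # why the data is processed
    risks: list               # identified privacy risks
    likelihood: str           # "low" / "medium" / "high"
    impact: str               # severity if a risk materializes
    mitigations: list         # safeguards that reduce the risk
    residual_risk: str        # risk remaining after mitigations

example = DPIAEntry(
    processing_activity="in-game voice chat",
    data_categories=["audio", "player id"],
    purpose="real-time player communication",
    risks=["exposure of minors' voices to strangers"],
    likelihood="medium",
    impact="high",
    mitigations=["off by default for minors", "short retention period"],
    residual_risk="low",
)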

While this all unfolds, ESRB Privacy Certified will continue to help its program members comply with existing laws and adopt and implement best practices for children’s privacy. As privacy protections for kids and teens continue to evolve, we’ll be following closely and providing guidance to our program members on all of the moving parts of the complex children’s privacy landscape. To learn more about ESRB Privacy Certified’s compliance and certification program, please visit our website, find us on LinkedIn, or contact us at privacy@esrb.org.

• • •

As senior vice president of ESRB Privacy Certified (EPC), Stacy Feuer ensures that member companies in the video game and toy industries adopt and maintain lawful, transparent, and responsible data collection and privacy policies and practices for their websites, mobile apps, and online services. She oversees compliance with ESRB’s privacy certifications, including its “Kids Certified” seal, which is an approved Safe Harbor program under the Federal Trade Commission’s Children’s Online Privacy Protection Act (COPPA) Rule.

COPPA Battlegrounds: The Quest to Uncover the Secrets of the FTC’s Kids’ Privacy Actions https://www.esrb.org/privacy-certified-blog/coppa-battlegrounds-the-quest-to-uncover-the-secrets-of-the-ftcs-kids-privacy-actions/ Wed, 05 Jul 2023 17:02:32 +0000 https://www.esrb.org/?p=5573 At ESRB, the non-profit, self-regulatory body for the video game industry, kids’ privacy is serious business. We do take breaks, though, from reviewing privacy policies, preparing compliance assessments, and absorbing the onslaught of privacy developments. Some of us even play and design video games when we’re not working. We are the Entertainment Software Rating Board […]

At ESRB, the non-profit, self-regulatory body for the video game industry, kids’ privacy is serious business. We do take breaks, though, from reviewing privacy policies, preparing compliance assessments, and absorbing the onslaught of privacy developments. Some of us even play and design video games when we’re not working. We are the Entertainment Software Rating Board after all!

So, for a little fun, we decided to create an imaginary video game – COPPA Battlegrounds. Join the ESRB Privacy Certified team as we dive deeply into the ongoing saga of the Federal Trade Commission’s kids’ privacy enforcement actions – cases that have resulted in hundreds of millions of dollars in fines and landmark legal remedies. Venture into new privacy territory, unlocking the mysteries of “personal information,” “privacy by default,” “data retention,” and more! Collect XPs as you explore strategies and best practices to protect young gamers’ privacy.

The Players

The “COPPA Controller”: The Federal Trade Commission (FTC) is the U.S. government agency charged with protecting consumers and competition. It is the chief federal agency that works to protect consumer privacy. Over the years, it has brought hundreds of privacy and data security cases to protect consumers and their data.

The “Digital Defendants”: Several well-known tech companies have been hit with FTC actions alleging violations of children’s privacy law in the past half year. Two – Epic Games and Microsoft Xbox – are popular video game publishers. Amazon, Meta, and the edtech company Edmodo are also in the line-up.

The Weapons and Equipment

The “Sword of COPPA”: The Children’s Online Privacy Protection Act of 1998 (COPPA) and its implementing COPPA Rule (updated in 2013) provide the FTC with a powerful weapon to protect the privacy of children under the age of 13. The law and rule (together, COPPA) require companies that offer services “directed to children,” or that have knowledge that kids under 13 are using their services, to provide notice of their data practices. They must also obtain verifiable parental consent (VPC) from parents before collecting personal information from children. COPPA also contains strong substantive protections, mandating that companies minimize the data they collect from children, honor parents’ data deletion requests, and implement strong security safeguards. To date, the FTC has brought nearly 40 COPPA enforcement actions.

The “Section 5 Superweapon”: The FTC’s true superweapon comes from Section 5 of the Federal Trade Commission Act, which prohibits unfair or deceptive practices in the marketplace. Since the advent of the internet, the FTC has used Section 5 to address a wide range of issues that affect people online, including the privacy of people purchasing and playing video games.

Policy Statement “Power-ups”: From time to time, the FTC releases policy statements that explain how the agency applies the laws it enforces. These potent statements put companies on notice that they will face legal action if they ignore the FTC’s prescriptions. In May, the FTC issued a Policy Statement on Biometric Information, which sets out a list of unfair practices relating to the collection and use of such data. Earlier, the FTC issued a Policy Statement on COPPA and EdTech that emphasized COPPA’s limits on companies’ ability to collect, use, and retain children’s data.

The Backstory

The FTC’s quest to secure a safer online environment for kids and their personal information has been ongoing since Congress passed COPPA in 1998. Previous blockbuster titles in the COPPA franchise include the FTC’s landmark 2019 settlement with Google/YouTube and the 2018 VTech and 2019 Musical.ly/TikTok actions.

COPPA has been extremely effective in giving parents information about and control over their kids’ data. There’s been an emerging consensus, however, that the legal framework for children’s privacy should be updated to include teenagers and meet the challenges of social media, mobility, ad tech, and immersive technologies – issues that weren’t present when Congress enacted the law 25 years ago. Despite the introduction of several bills in Congress to update COPPA, none have yet become law. The FTC therefore has proposed several new ideas to protect the privacy of not only children under the age of 13 but teens too. These are now playing out in the FTC’s enforcement actions.

Multiplayer Actions

During the past half year or so, the FTC has announced four new COPPA actions, plus an order against Meta/Facebook relating to a previous settlement. For video game companies, two stand out: the Epic Games/Fortnite settlement (see our earlier blog) and the Microsoft/Xbox Live settlement, announced in June. The FTC’s settlements with Amazon/Alexa and Edmodo also provide some clues to unlocking the secrets of the FTC’s COPPA enforcement mode. Consistent with ESRB Privacy Certified’s focus on privacy compliance in video games, we’ll focus our analysis on the two gaming cases. But we’ll add some insights from the NPCs (here, nonplayable “cases”), too.

Epic Games/Fortnite

Late last year, the FTC filed a two-count complaint and proposed settlement order against Epic Games. It alleged that Epic knew its massively popular game Fortnite was “directed to children” and unlawfully collected personal data from them without VPC. The FTC also charged Epic with violating the FTC Act by using unfair “on by default” voice and text chat settings that led to children and teens being bullied, threatened, and harassed within Fortnite. Epic settled with the FTC, agreeing to pay a $275 million civil penalty and to standard injunctive relief. (In the privacy area, this includes monitoring, reports, a comprehensive privacy plan, and regular, independent audits.) The final court Order entered in February also required Epic to implement privacy-protective default settings for children and teens. It also required the company to delete personal information previously collected from children in Fortnite unless the company obtains parental consent to retain such data or the user identifies as 13 or older.

Microsoft/Xbox Live

In the beginning of June, the FTC filed a one-count complaint and proposed settlement order against Microsoft alleging that its Xbox Live online service violated COPPA in three ways: (i) by collecting personal information (i.e., email address, first and last name, date of birth, and phone number) from kids under 13 before notifying their parents and getting VPC; (ii) by failing to provide clear and complete information about its data practices in COPPA’s required notices, i.e., that it didn’t tell parents that it would disclose Xbox’s customer unique persistent identifier to third-party game and app developers; and (iii)  by holding on to kids’ data for years even when parents did not complete the account creation process.

Microsoft, which has long had a comprehensive privacy program, settled with the FTC for $20 million. It agreed to implement new business practices to increase privacy protections for Xbox users under 13. For example, the Order requires Microsoft to tell parents that a separate child account will provide significant privacy protections for their child by default. The company also must maintain a system to delete, within two weeks from the collection date, all personal information collected from kids for the purpose of obtaining parental consent. In addition, Microsoft must honor COPPA’s data deletion requirements by deleting all other personal data collected from children once it no longer needs it for the purpose for which it was collected.
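To make the Order’s two-week deletion requirement concrete, here is a minimal Python sketch of a recurring retention sweep. This is an illustration under our own assumptions – the record fields and function name are hypothetical, not Microsoft’s actual system.

from datetime import datetime, timedelta, timezone

# Hypothetical retention window modeled on the Order's two-week deletion
# requirement for data collected in order to obtain parental consent.
CONSENT_RETENTION = timedelta(days=14)

def sweep_pending_consent_records(records):
    """Drop child sign-up records if verifiable parental consent (VPC)
    has not been granted within the retention window. `records` is a
    list of dicts with hypothetical keys 'collected_at' (an aware
    datetime) and 'vpc_granted' (a bool)."""
    now = datetime.now(timezone.utc)
    retained = []
    for record in records:
        expired = now - record["collected_at"] > CONSENT_RETENTION
        if expired and not record["vpc_granted"]:
            continue  # a real system would hard-delete and log the deletion
        retained.append(record)
    return retained

A job like this would run on a schedule (daily, for example), and a production version would also need to propagate deletions to backups and to any third parties that received the data.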

Unearthing the Seven COPPA Revelations

Beyond the allegations and remedies of the enforcement actions, there’s a wealth of information about the FTC’s kids’ privacy priorities and practices you might want to adopt – or avoid – if you want to stay out of the sights of the COPPA Controller. Here are COPPA Battlegrounds’ seven lessons for COPPA compliance based on the FTC’s recent kids’ privacy actions:

1. Sequence your game play to obtain VPC before you collect ANY personal information from a child: The FTC’s complaint in the Xbox action emphasized that – even though Microsoft had a VPC program in place – it violated COPPA by not obtaining parental consent before it collected any personal information from kids besides their date of birth. Xbox did require children to involve their parents in the registration process, but the FTC found that Microsoft’s initial collection of kids’ email addresses, their first and last name, and phone number before obtaining consent violated COPPA’s VPC requirements. The FTC also blasted Microsoft for requiring kids to agree to the company’s service agreement, which, until 2019, included a pre-checked box allowing Microsoft to send them promotional messages and to share user data with advertisers. The FTC’s approach indicates that it will look closely at companies’ verifiable parental consent sequences, and that it will strictly enforce COPPA’s prohibition on collecting any personal information before obtaining VPC (unless an exception to VPC exists). (A minimal sketch of this sequencing appears after this list.)

2. The FTC views COPPA’s “actual knowledge” standard broadly and so should you: When the FTC announced its Epic Games settlement, we reminded companies that you can’t disclaim COPPA by declaring that you don’t process children’s information or by ignoring evidence that children are playing your games. Now, with the Xbox Live settlement, the FTC has affirmed that it will enforce COPPA against any company with “actual knowledge” that the company is handling children’s personal information, regardless of whether that company has directed its service to children intentionally. Significantly, the settlement requires Microsoft – when it discloses personal information about children to other video game publishers – to tell them that the user is a child. The FTC’s requirement for Microsoft to share information about children on its platform with third parties is a game-changing move. In the FTC’s words, “[I]t will put [third-party] publishers on notice that they, too, must apply COPPA protections to that child.”

3. Your COPPA notices must be clear, understandable, and complete: The FTC emphasized that it’s not enough under COPPA’s notice provisions to summarize your collection, use, and disclosure practices generally. Instead, your direct notice must be complete. The FTC faulted Microsoft for failing to tell parents about its collection of personal information children shared through their profile or Xbox Live usage, such as their “gamertags,” photos, which kids used to create avatars, and voice recordings from video messages. The agency also alleged that Microsoft’s notice failed to inform parents that it created persistent identifiers for children, which it combined with other information, and shared with third-party game and app developers. Going forward, it’s important for companies to specify, in a clear and complete way, their practices in the notices required by COPPA, and not just provide parents with a link to a densely worded privacy policy.

4. Privacy by default is not a fad: In Epic Games, the FTC focused for the first time not just on “privacy by design” but on “privacy by default,” finding that Epic did not have “privacy-protective” default settings in Fortnite that limited kids’ contact with strangers and otherwise protected their privacy. The FTC went further in Xbox Live, emphasizing that, even though Xbox had default settings that only allowed a child to disclose their activity feed or otherwise communicate with parent-approved “friends,” Microsoft configured other defaults in a way that did not protect children sufficiently. As the FTC emphasized in a blog about the Amazon case, “[C]ompanies that ignore consumers’ rights to control their data do so at their peril . . . The upshot is clear: Any company that undermines consumer control of their data can face FTC enforcement action.”

5. Take your data minimization and retention/deletion obligations seriously: The FTC’s recent cases also highlight COPPA’s substantive provisions on data minimization and data retention. The COPPA Rule prohibits conditioning a child’s participation in a game on the child “disclosing more personal information than is reasonably necessary to participate in such activity” and allows companies to keep it “for only as long as is reasonably necessary to fulfill the purpose for which the information was collected.” In the Edmodo complaint, for example, the agency said that Edmodo violated COPPA by using the personal information it collected for advertising instead of limiting it to educational purposes.

In the Xbox Live case, the agency chided Xbox for holding onto kids’ data when the parental verification process was incomplete, sometimes for years. Although Microsoft described this as a “technical glitch,” and explained that this data “was never used, shared, or monetized,” the FTC doubled down on its concerns with company data retention practices that violate COPPA. Indeed, in the Amazon Alexa case, the FTC charged that Amazon made it difficult for parents to exercise their right, under COPPA, to delete their children’s voice recording data. It further alleged that Amazon disregarded parents’ deletion requests, retained kids’ voice recordings indefinitely, and misled parents about its data deletion practices (e.g., by retaining copies of transcripts of voice recordings). The FTC is wielding the “Sword of COPPA” to press for meaningful data minimization, purpose limitation, and data retention/deletion practices.

6. Be especially careful when dealing with kids’ biometric data, algorithms, and machine learning: The FTC’s Xbox Live settlement covers biometric information like avatars generated from a child’s image and emphasizes COPPA’s strict limitations on the retention of this type of data from kids. In the Amazon case, the agency was clearly troubled by Amazon’s indefinite retention of kids’ voice recordings, which count as biometric information. One of the FTC Commissioners emphasized this point, stating that “Claims from businesses that data must be indefinitely retained to improve algorithms do not override legal bans on indefinite retention of data.” Consider yourself warned!

7. Privacy Innovation Can Help You Comply with COPPA: Not all the privacy-protective action in COPPA Battlegrounds comes from the FTC. Even before the settlement, Epic Games announced that it was creating “Cabined Accounts” to provide safe, tailored experiences for younger players. Following the FTC’s action, Microsoft unveiled its plans to test “next-generation identity and age validation” methods to create a “convenient, secure, one-time process for all players that will allow us to better deliver customized, safe, age-appropriate experiences.” Xbox explained that the entire games industry can benefit from advancing safe and innovative digital experiences that are accessible, simple to use, and benefit all players. We agree! Many ESRB Privacy Certified members are developing new strategies and tools to enhance kids’ privacy. Achievement unlocked!
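Lesson #1’s sequencing point lends itself to a simple illustration. The Python sketch below collects only a birth date at a neutral age gate and refuses any other personal information from an under-13 user until VPC is on file. It is a simplified sketch under our own assumptions – the function and exception names are invented, and it is not any company’s actual flow.

from datetime import date

class VPCRequiredError(Exception):
    """Raised if personal information is requested before consent."""

def age_from_birth_date(birth_date: date) -> int:
    today = date.today()
    # Subtract one if this year's birthday hasn't happened yet.
    return today.year - birth_date.year - (
        (today.month, today.day) < (birth_date.month, birth_date.day)
    )

def register(birth_date: date, vpc_on_file: bool = False) -> dict:
    """Step 1 of a COPPA-aware sign-up: a neutral age gate collects only
    a birth date. Step 2 (email, name, phone, etc.) is blocked for
    under-13 users until verifiable parental consent exists."""
    if age_from_birth_date(birth_date) < 13 and not vpc_on_file:
        raise VPCRequiredError("Send the parent to the VPC flow first.")
    return {"can_collect_pii": True}

Note that “neutral” is as much a user-interface obligation as a coding one: the gate shouldn’t nudge players toward entering an age that skips the consent flow.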

The Final Conquest

Congratulations on completing the breakout version of COPPA Battlegrounds! You can now take your kids’ privacy program to the next level. Contact us at privacy@esrb.org if you’d like to discuss how your company can prevail in COPPA Battlegrounds – and its inevitable sequels.



As senior vice president of ESRB Privacy Certified (EPC), Stacy Feuer ensures that member companies in the video game and toy industries adopt and maintain lawful, transparent, and responsible data collection and privacy policies and practices. She oversees compliance with ESRB’s privacy certifications, including its “Kids Certified” seal, which is an approved Safe Harbor program under the Federal Trade Commission’s Children’s Online Privacy Protection Act (COPPA) Rule, and the general “Privacy Certified” seal.

IAPP or “AI”-PP?: Generative AI, Games, and the Global Privacy Summit https://www.esrb.org/privacy-certified-blog/iapp-or-ai-pp-generative-ai-games-and-the-global-privacy-summit/ Wed, 26 Apr 2023 13:47:12 +0000 https://www.esrb.org/?p=5488 As videogame companies increasingly embrace generative AI, privacy pros will need to drill down on regulatory enforcement and best practices.

Image generated by Canva Text-to-Image AI.

With over 5,000 attendees, seemingly hundreds of panels and speakers, and a brilliant opening talk by South African comedian and philanthropist, Trevor Noah, the recent International Association of Privacy Professionals (IAPP) Global Privacy Summit (#GPS23) was a terrific opportunity for ESRB Privacy Certified’s videogame and toy-focused team to connect with the wider privacy world. Despite the vast array of privacy issues and resources on offer, there was one topic that topped everything – Artificial Intelligence. AI.

Especially generative AI, made famous by viral chatbot, ChatGPT. The incredible advances in generative AI that have catapulted into games and everything else in the last few months were top of mind. Almost every panel, even when not directly about AI, touched on it. I found myself counting the minutes, even seconds, it took for someone to mention ChatGPT in a hallway conversation between sessions. (The average was just under three minutes.) The takeaway? Privacy practitioners must understand and plan for AI-related privacy issues.

That’s especially true for privacy pros at game companies. Videogame companies are increasingly embracing technology’s possibilities to revolutionize the way we learn, work, and play. Already, videogame companies are using generative AI to speed up game development, reduce costs, and help players interact with characters in new interactive and immersive ways.

Generative AI’s use of gargantuan amounts of data – including personal data – however, raises complex privacy issues. For example, even if some of the underlying data is technically public (at least in the U.S.), generative AI models could combine and use this information in unknown ways. OpenAI, Inc., the company behind ChatGPT, acknowledges that it scoops up “publicly available personal information.” There are also privacy issues around transparency, bias, and consumers’ rights to access, correct, and delete information used by the models. And yes, ChatGPT records all of your “prompts.”

All this “underline[s] the urgent need for robust privacy and security measures in the development and deployment of generative AI technologies,” asserts IAPP Principal Technology Researcher, Katharina Koerner. Many large videogame companies have already developed principles for what’s been variously called “trustworthy” or “ethical” or “responsible” AI. Most address consumer privacy and data security at a high level. Still, as videogame companies increasingly embrace generative AI and roll out new products, privacy pros will need to drill down on regulatory enforcement and best practices in this area. So here, to get you started, are three top takeaways from IAPP GPS23, aka the “AI-PP”:

  1. Get Ready for Federal Trade Commission (FTC) Generative AI Action
    FTC Commissioner Alvaro Bedoya, in an entertaining DALL-E illustrated keynote speech titled “Early Thoughts on Generative AI,” emphasized that the FTC can regulate AI today. Taking on what he called a “powerful myth out there that ‘AI is unregulated,’” Commissioner Bedoya said:
    Unfair and deceptive trade practices laws apply to AI. At the FTC, our core section 5 jurisdiction extends to companies making, selling, or using AI. If a company makes a deceptive claim using (or about) AI, that company can be held accountable. If a company injures consumers in a way that satisfies our test for unfairness when using or releasing AI, that company can be held accountable. (Footnotes omitted.)

    Commissioner Bedoya also pointed to civil rights laws as well as tort and product liability laws. “Do I support stronger statutory protections?” he asked. “Absolutely. But AI does not, today, exist in a law-free environment.”

    A recent FTC Business Center blog emphasizes Bedoya’s point. The agency explained that new AI tools present “serious concerns, such as potential harms to children, teens, and other populations at risk when interacting with or subject to these tools.” It warned that, “Commission staff is tracking those concerns closely as companies continue to rush these products to market and as human-computer interactions keep taking new and possibly dangerous turns.” And just yesterday, the FTC, along with the Department of Justice and several other federal agencies, released a joint statement announcing their “resolve to monitor the development and use of automated systems . . . [and] vigorously use our collective authorities to protect individuals’ rights regardless of whether legal violations occur through traditional means or advanced technologies.”

    My reaction? Commissioner Bedoya’s “early thoughts speech” should be seen as a current heads-up. Especially in light of the Center for AI and Digital Policy’s recent complaint to the FTC. The group urged the agency to investigate OpenAI and GPT-4 and prevent the release of further generative AI products before the “establishment of necessary guardrails to protect consumers, businesses, and the commercial marketplace.”

  2. The Italian Data Protection Authority’s ChatGPT Injunction Is Just the Beginning of Worldwide Scrutiny
    Even though the FTC hasn’t yet acted on Commissioner Bedoya’s warning, other privacy authorities have already done so. GPS23 was filled with chatter about the action by the Italian Data Protection Authority (the Garante) against ChatGPT owner, OpenAI, under the General Data Protection Regulation (GDPR), temporarily banning ChatGPT in Italy. Since then, the agency has required OpenAI to comply with specific privacy requirements before lifting the ban. These include requiring the company to ask users for consent or establish a legitimate interest for using consumers’ data, to verify users’ ages to keep children off the platform, and to provide users with access, correction, and deletion rights. Whether and how OpenAI can do so is an open, high-stakes question.

    Meanwhile, more scrutiny is on the way. The U.K.’s Information Commissioner, John Edwards, and the President of the French CNIL (Commission Nationale Informatique & Libertés), Marie-Laure Denis, spent most of their session on Regulator Insights From Today to Help Design Privacy Rules for Tomorrow talking about the challenges of AI and the GDPR’s roadmap for compliance and enforcement. Last Thursday, the European Data Protection Board announced that it had launched a new task force to discuss a coordinated European approach to ChatGPT. And just this Monday, the Baden-Württemberg data protection authority announced it was seeking information from the company on behalf of Germany’s 16 state-run data protection authorities.

    In case you think only European agencies are investigating ChatGPT, Canadian Privacy Commissioner Philippe Dufresne announced his agency’s investigation into ChatGPT on the first morning of GPS23. There aren’t many details yet, but like the Italian Garante’s action, the Office of the Privacy Commissioner’s investigation appears to be focused on the product’s lack of transparency and failure to obtain consent from users for the data that powers the chatbot, which is trained on data collected from the open web.
  3. AI Governance and Risk Mitigation Are Key
    Although not as splashy as the main stage presentation by author and generative AI expert, Nina Schick, the panels that focused on the practical aspects of AI were invaluable. They also provided pointers on how to build a sturdy foundation for AI use, including by:

    • Adopting documented principles, policies, and procedures;
    • Establishing cross-functional teams;
    • Inventorying models, data and use cases;
    • Updating procurement and vendor oversight processes;
    • Providing employee training and awareness; and
    • Assessing risks.

    (Sounds a lot like a “how to” for building a solid privacy program, no?) They also discussed the slew of AI legislation currently underway (e.g., the EU’s AI Act, California and other state bills) that will ultimately clarify the compliance landscape.

    At another session, panelists emphasized that there’s no one silver bullet for privacy issues in AI. Instead, practitioners will need to use some combination of privacy-enhancing technologies (PETs), like differential privacy, and frameworks like the National Institute of Standards and Technology’s (NIST) AI Risk Management Framework and its Privacy Framework to help address the privacy challenges of generative AI.
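To make one of those PETs less abstract, here is a toy Python example of the Laplace mechanism – the workhorse of differential privacy – applied to a simple count query. The scenario and numbers are invented for illustration.

import numpy as np

def dp_count(true_count: int, epsilon: float) -> float:
    """Return an epsilon-differentially-private version of a count.
    A counting query has sensitivity 1, so Laplace noise with scale
    1/epsilon satisfies epsilon-differential privacy."""
    return true_count + np.random.laplace(loc=0.0, scale=1.0 / epsilon)

# e.g., report roughly how many players enabled voice chat without
# revealing whether any individual player is in the dataset
noisy_total = dp_count(true_count=1042, epsilon=0.5)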

*****
ChatGPT and other generative AI products can’t predict the future. Yet. Or, as ChatGPT itself told me, “[I]t is not capable of predicting the future with certainty.” But as IAPP GPS23 made clear, generative AI will certainly be part of the privacy discussion going forward.

• • •

If you have more questions about AI-related privacy issues or you want to learn more about our program, please reach out to us through our contact page. Be sure to follow us on LinkedIn for more privacy-related updates.

• • •

As senior vice president of ESRB Privacy Certified (EPC), Stacy Feuer ensures that member companies in the video game and toy industries adopt and maintain lawful, transparent, and responsible data collection and privacy policies and practices for their websites, mobile apps, and online services. She oversees compliance with ESRB’s privacy certifications, including its “Kids Certified” seal, which is an approved Safe Harbor program under the Federal Trade Commission’s Children’s Online Privacy Protection Act (COPPA) Rule.

Ready Player Go: Getting Your Privacy Program Metaverse-Ready in 2023 https://www.esrb.org/privacy-certified-blog/ready-player-go-getting-your-privacy-program-metaverse-ready-in-2023/ Thu, 26 Jan 2023 13:55:29 +0000 https://www.esrb.org/?p=5359 Even though there’s no consensus on exactly what the metaverse is, or how, when, and whether it will transform our lives, now’s the time – from our privacy compliance perspective – for companies and consumers alike to get ready.

As ESRB Privacy Certified celebrates Data Privacy Week 2023, the hype about the metaverse – a single, immersive, persistent and three-dimensional space that enables people to socialize, play, shop, and work in ways that transcend the limits of the physical world – is reaching a crescendo. It’s easy to ignore some of it as baseless buzz. But, from a practical standpoint, the video game industry has long created metaverse-like experiences, building expansive virtual worlds with players using their custom-designed digital avatars to connect, socialize and play with one another. Some companies are experimenting with Web 3.0 features like extended reality (XR), an umbrella term for virtual (VR), augmented (AR), and mixed reality (MR). Others are considering the use of the blockchain to prove “ownership” of virtual goods and property and non-fungible tokens (NFTs) to enable purchases. So, even though there’s no consensus on exactly what the metaverse is, or how, when, and whether it will transform our lives, now’s the time – from our privacy compliance perspective – for companies and consumers alike to get ready.

How much time is up for debate. A recent study by Pew Research Center and Elon University’s Imagining the Internet Center surveyed over 600 experts about the trajectory and impact of the metaverse. More than half of the experts (54%) predicted that the metaverse will be part of daily life for a half billion people globally by 2040. Slightly less than half (46%) disagreed. They predicted that even though more people will embrace XR tools and experiences by then, the fully immersive world that people imagine as “the metaverse” will take more than 20 years to come to fruition.

Whichever group is right, it’s certain that privacy (and data security) issues will loom large. The array of XR technologies that enable the metaverse will create vast new troves of digital data and real-world privacy concerns. Companies will be able to collect enormous amounts of biometric data, such as users’ eye movements and hand positions, and bodily data, such as blood pressure and respiration levels. Through the emerging area of inferential biometrics, XR technologies, combined with AI, could be used to make inferences about users’ emotions, mental and physical health, and personality traits, among other things.

Even if users create virtual life identities without providing real-world personal information or using their personal characteristics such as gender, race, or age in avatars, they will likely share information with other digital avatars. As with today’s smart phones and IoT devices, this may allow others to piece together users’ real-world identities or obtain sensitive information from them. Companies may sweep up data from bystanders who happen to be in the range of an XR user’s sensors. And if companies choose to incorporate blockchain technologies and NFTs into their metaverse plans, those technologies will present their own privacy and security challenges.

It’s critical for companies in the video game industry and beyond to start addressing these challenges now. In a KPMG survey of 1,000 U.S. adults conducted last fall, 80% of respondents said that privacy is their top concern in the metaverse, while 79% said that the security of their personal information is their biggest worry. So, while we’re waiting for the real metaverse to stand up, you can make sure your company is using today’s XR technologies in privacy-protective ways and getting ready for the next iteration(s) of the metaverse.

Photo credit: Julien Tromeur via Unsplash

Here are three ways to start:

  1. Incorporate global laws and “best practices” into your current privacy compliance strategy: The metaverse is likely to be more fully global than even today’s internet. This makes it unlikely that any one data privacy regime will apply clearly to metaverse platforms or companies that operate on those platforms. There aren’t any metaverse-specific privacy rules or standards, and there likely won’t be for a long time. Companies should therefore prepare by analyzing and adopting responsible and transparent “best practices” from existing data protection and privacy frameworks. Instead of complying only with the current law in any one jurisdiction or trying to avoid other laws through a “choice of law” clause in your terms of use, you should look to a variety of laws, international standards, and global best practices to provide a high level of privacy protection for your metaverse users. (You can’t just choose your favorite law, though: You’ll need to continue complying with privacy laws that do exist in your jurisdiction.)

    Informational privacy principles, contained in global guidelines such as the OECD (Organization for Economic Cooperation and Development) Guidelines Governing the Protection of Privacy and Transborder Flows of Personal Data, can form the core of your metaverse data protection strategy. These concepts, such as data minimization, purpose limitation, use specification, data security, individual participation, and accountability, can guide your implementation in a specific metaverse application or environment. Indeed, most modern data protection laws, such as the European Union’s General Data Protection Regulation (GDPR), California’s Consumer Privacy Act (CCPA), and the proposed bipartisan American Data Protection and Privacy Act (ADPPA) from the last Congress, all incorporate these concepts. Of course, there may be situations when laws conflict or these principles will simply be inadequate to deal with new technological developments. By considering how you can use laws, standards, and best practices in your privacy program now, though, you’ll have a head start on compliance.
  2. Do the “Ds” – Deploy Privacy by Design and Default Principles and Data Protection Impact Assessments: The concept of the metaverse as an open, multi-layered universe means that existing methods of privacy protection that rely on privacy disclosures and user consent may be difficult to deploy. But privacy by design – the idea that companies should design products, processes, and policies to proactively manage and avoid privacy risks – seems tailor-made for this new medium. (And it’s long been a core part of our program’s compliance approach.) Privacy by default, a closely related concept, may be even more salient. It requires companies to protect their users’ personal data automatically, embedding privacy into technology and systems from the beginning, not after-the-fact. (The UK Information Commissioner’s Office has helpful guidance and a checklist that address these principles in the context of the GDPR.)

    An important piece of privacy by design and default is assessment. Many modern data protection laws, such as the GDPR and California’s Age Appropriate Design Code Act, require companies to conduct data protection impact assessments (DPIAs) to identify, analyze, and minimize privacy risks. Even if you’re not required to conduct DPIAs now, you should start to do them (if you’re not doing so already) for technologies like XR and features like NFTs that may be part of your metaverse offerings. (The International Association of Privacy Professionals (IAPP) maintains an extremely useful resource page on DPIAs.)
  3. Don’t forget children and teens: As complex as data privacy in the metaverse will be for adults, the challenge of protecting the privacy of kids and teens in the metaverse will be even greater. Companies will need to follow a mélange of rules and laws such as the Children’s Online Privacy Protection Act (COPPA) in the U.S., and the newer Age-Appropriate Design Codes in the UK, Ireland, and California. They will also need to follow related laws and rules on safety, advertising and marketing, and digital wellness to protect children and teens from real and perceived risks in the metaverse. As the Federal Trade Commission’s recent settlement with Epic Games for COPPA and other privacy violations involving Fortnite makes clear, poor privacy practices in virtual worlds can lead to real-life harms. One of the FTC commissioners explained how the company’s alleged practices, such as opting children into voice and text communications with players around the world, exposed children to bullying, threats, and harassment, and even coerced or enticed them into sharing sexually explicit images and meeting offline for sexual activity.
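One concrete way to express privacy by default for minors is to key default settings to age and start everyone at the most protective configuration. The Python sketch below is a simplified illustration with invented setting names, not a prescribed standard.

from dataclasses import dataclass

@dataclass
class PrivacySettings:
    voice_chat: str          # "off", "friends", or "everyone"
    profile_visibility: str  # "private" or "public"
    motion_telemetry: bool   # e.g., eye- and hand-tracking analytics

def default_settings(age: int) -> PrivacySettings:
    """Privacy by default: minors start fully locked down; adults start
    conservatively and may opt into broader sharing later."""
    if age < 18:
        return PrivacySettings(voice_chat="off",
                               profile_visibility="private",
                               motion_telemetry=False)
    return PrivacySettings(voice_chat="friends",
                           profile_visibility="private",
                           motion_telemetry=False)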

Companies must double down on privacy by design and default for children and teens, build sophisticated privacy and parental controls, implement multi-layered age verification methods, and develop mechanisms to obtain parental consent (when required). Some companies may want to build out child-and-teen friendly metaverse spaces and experiences. Given the complexities of doing so, it’s a good thing that a Ready Player One-like universe that crosses over physical and digital realms doesn’t really exist. Yet.

• • •

If you have more questions about kids’ privacy in the metaverse or you want to learn more about our program, please reach out to us through our contact page. Be sure to follow us on LinkedIn for more privacy-related updates.

• • •

As senior vice president of ESRB Privacy Certified (EPC), Stacy Feuer ensures that member companies in the video game and toy industries adopt and maintain lawful, transparent, and responsible data collection and privacy policies and practices for their websites, mobile apps, and online services. She oversees compliance with ESRB’s privacy certifications, including its “Kids Certified” seal, which is an approved Safe Harbor program under the Federal Trade Commission’s Children’s Online Privacy Protection Act (COPPA) Rule.


Featured image credit: BrianPenny on Pixabay.

Wrapping Up 2022 with A Huge (Epic) Fortnite Privacy Case https://www.esrb.org/privacy-certified-blog/wrapping-up-2022-with-a-huge-epic-fortnite-privacy-case/ Wed, 21 Dec 2022 20:58:11 +0000 https://www.esrb.org/?p=5254 The Fortnite settlement gives insight into the FTC’s thinking on kids' and teens' privacy. Here are 7 takeaways from a case that will likely reverberate far past the New Year.

With 2022 almost behind us, we’d planned on easing out of work mode and into festive celebrations this week for the end of this hectic and challenging privacy year. But Stacy’s former employer, the Federal Trade Commission (FTC), had other ideas. So, instead of wrapping presents, we’re wrapping up the year with an analysis of the FTC’s record-breaking $520 million settlements with Epic Games (Epic) for privacy and consumer protection violations in its wildly popular Fortnite video game.

The “s” in settlements is not a typo: On Monday, the FTC announced two separate enforcement actions against Epic. Consistent with ESRB Privacy Certified’s focus on privacy compliance, though, we’ll limit our analysis to the FTC’s privacy-related case. In short, the FTC (represented by the Department of Justice) filed a two-count Complaint and a Stipulated Order in federal court alleging that Epic violated the Children’s Online Privacy Protection Act (COPPA) and the related COPPA Rule. COPPA protects the personal information of children under the age of 13. The FTC asserted that Epic knew that Fortnite was “directed to children” and unlawfully collected personal data from them without verifiable parental consent (VPC).

The FTC also charged Epic with violating the FTC Act, which prohibits unfair and deceptive practices, by using unfair “on by default” voice and text chat settings in Fortnite that led to children and teens being bullied, threatened, and harassed within the game, including sexually. It charged that Epic’s privacy and parental controls did not meaningfully alleviate these harms or empower players to avoid them. If approved, this settlement will require Epic to pay $275 million in civil penalties. (The other $245 million is for the other case and is allotted for consumer refunds.)

Apart from the epic fine, the Fortnite action provides insight into the FTC’s thinking on children’s and teens’ privacy. Here are seven takeaways from a case that will likely reverberate far past the New Year:

  1. Declaring that your services are not directed to children is not enough: The FTC’s action makes clear that you can’t disclaim COPPA. In a paragraph that appeared on the next-to-last page of Epic’s lengthy global privacy policy, the company stated that it does not direct its websites, games, game engines, or applications to children or intentionally collect personal information from them. Although many companies make this claim in their privacy policies, it won’t help you if the facts show that your product is, in fact, child directed. (Remember, a mixed-audience product is one that targets children but not as the primary audience.)
  2. COPPA’s “actual knowledge” standard doesn’t allow you to ignore evidence that children are using your services – especially internal and empirical evidence: While many advocates and lawmakers have criticized COPPA’s “actual knowledge” standard, seeking to replace it with “constructive knowledge,” the Fortnite action shows the FTC will construe the standard broadly. The agency cited several of the standard COPPA Rule factors – subject matter, use of animation, child-oriented activities and language, music content, evidence of intended audience, and empirical evidence about the game’s player demographics – to determine that Fortnite is directed to children. The key evidence, though, came from empirical data and Epic’s own internal documents, including:

    • Demographic data: The FTC provided examples of public survey data, which Epic had reviewed, to demonstrate it knew a considerable portion of Fortnite players were under the age of 13. It pointed to publicly available survey results from a 2019 report showing that 53% of U.S. children aged 10-12 played Fortnite weekly, compared to 33% of U.S. teens aged 13-17, and 19% of the U.S. population aged 18-24. The agency alleged that these results also matched Epic’s internal data.
    • Advertising and marketing: The FTC homed in on Epic’s product licensing deals with a wide variety of companies for Fortnite-branded costumes, toys, books, youth-sized apparel, and “back to school” merchandise, many of which were targeted to the under-13 crowd. As in the FTC’s previous record-breaking COPPA matter, Google/YouTube ($170 million fine), the agency cited numerous internal statements and documentation that Epic had generated to emphasize Fortnite’s appeal to children to potential advertising and marketing partners.
    • Internal statements and events: The FTC also cited “ordinary course of business” communications such as consumer complaints and conversations among Epic employees that acknowledged explicitly that many of its users skewed younger. The FTC strung a number of them together (perhaps unfairly) but the phrases – “a large portion of our player base” consists of “underage kids,” / “high penetration among tweens/teens,” / “Fortnite is enjoyed by a very young audience at home and abroad” – convey, unmistakably, that Epic knew that it had a large user base of tweens and younger kids.
  3. Implement VPC and age gates from the get-go or make sure you apply them retroactively: The FTC faulted Epic for failing to obtain VPC for the personal information it collected from child users. In addition to data like name and email, the agency pointed to Epic’s broadcast of “display names” that put children and teens in direct, real-time contact with others through voice and text communication, as personal information that required parental consent. It also charged that even after Epic deployed age gates, it failed to deploy them retroactively to most of the hundreds of millions of Fortnite players who already had accounts. This is pretty much the same conduct that got TikTok (then Musical.ly) in trouble in an earlier FTC COPPA case. (The $5.7 million civil penalty there was the largest-ever COPPA fine at the time the case settled in 2019.) Like TikTok, Epic didn’t go back and request age information for people who already had accounts and adjust their default social features and privacy controls to comply with COPPA. (A rough sketch of retroactive age-gating appears after this list.)
  4. Privacy by default is not just a catchphrase: Although the FTC has long emphasized privacy by design, the FTC hadn’t previously focused on “privacy-protective” default settings in games and other online services. Now it has. The FTC alleged that Epic’s default settings, which enabled live text and voice communications for all users – including children and teens – constituted an unfair practice that led kids and teens to be bullied, threatened, and harassed, including sexually, through Fortnite. Moreover, the agency, citing evidence from Epic’s own employees, alleged that Epic’s parental controls were insufficient. Even when Epic eventually added a button allowing users to turn voice chat off, the company made it difficult for users to find, according to the FTC.
  5. Injunctive relief can be tough – and retroactive: In addition to the whopping $275 million civil penalty, the proposed Stipulated Order sets out the standard injunctive relief the FTC has long obtained in privacy cases – requirements for FTC monitoring, reports, a comprehensive privacy plan, and regular, independent audits. The Order also requires Epic to implement privacy-protective default settings for children and teens. Following the agency’s newer trend of using injunctions to remedy past harms, the Order requires Epic to delete personal information previously collected from Fortnite users in violation of the COPPA Rule’s parental notice and consent requirements unless the company obtains parental consent to retain such data or the user identifies as 13 or older through a neutral age gate.
  6. Real-world harms matter a lot: Commissioner Christine Wilson, the only Republican currently on the Commission, issued a concurring statement supporting the agency’s action. Although she has cautioned the agency’s majority against overly-expansive uses of the FTC’s unfairness authority, Commissioner Wilson noted that the “elements of the unfairness test are clearly satisfied — because Epic Games allegedly opted children into voice and text communications with players around the world, children were exposed to bullying, threats, and harassment, and were enticed or coerced into sharing sexually explicit images and meeting offline for sexual activity.” Wilson also approved of the “novel injunctive mechanisms, which require Epic Games to implement heightened privacy default settings” for children and teens because they “directly address the privacy harms fostered by the company’s alleged business practices.”
  7. Failing to comply with COPPA can be expensive: There’s a clear upward trajectory from the $5.7 million civil penalty in the FTC’s TikTok/Musical.ly action to the $170 million fine in Google/YouTube to the $275 million civil penalty that Epic will pay to resolve the FTC’s charges. That’s definitely something to remember as you make your plans for the New Year!
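Takeaway #3’s retroactive point can be sketched simply: treat every legacy account that has not passed an age gate as potentially under 13 and lock down its social defaults. The Python sketch below is our illustration – the field names are hypothetical, and it is not Epic’s or TikTok’s actual remediation.

def backfill_age_gate(accounts):
    """Apply an age gate retroactively: accounts created before the gate
    existed get the most protective social defaults until the user
    completes a neutral age gate ('age_verified' is a hypothetical flag)."""
    for account in accounts:
        if not account.get("age_verified"):
            account["settings"].update(
                voice_chat="off",
                text_chat="off",
                display_name_visible=False,
            )
    return accounts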

Following the FTC’s announcement, Epic explained that it had accepted the settlement agreements “because we want Epic to be at the forefront of consumer protection and provide the best experience for our players.” It set out – as a “helpful guide” to the industry – principles, policies, and recommendations that the company has instituted over the past few years to protect its players and meet regulators’ expectations globally. On the children’s privacy front, Epic recommended that game developers “proactively create age-appropriate ways for players to enjoy their games” – advice that mirrors our own. Maybe we can tie that up with a ribbon!

* * * * *

Wishing you and your loved ones a joyful and relaxing holiday season without any more blockbuster FTC announcements until 2023!


As senior vice president of ESRB Privacy Certified (EPC), Stacy Feuer ensures that member companies in the video game and toy industries adopt and maintain lawful, transparent, and responsible data collection and privacy policies and practices for their websites, mobile apps, and online services. She oversees compliance with ESRB’s privacy certifications, including its “Kids Certified” seal, which is an approved Safe Harbor program under the Federal Trade Commission’s Children’s Online Privacy Protection Act (COPPA) Rule.

What Parents Need to Know About Privacy in Mobile Games: Communicate with Your Kids https://www.esrb.org/privacy-certified-blog/what-parents-need-to-know-about-privacy-in-mobile-games-communicate-with-your-kids/ Fri, 28 Oct 2022 13:00:03 +0000 https://www.esrb.org/?p=4969 We’ve pulled together five tips to help protect your children’s privacy throughout this week. The final tip? Make sure you communicate with your kids about how they can protect their privacy online.

We’ve pulled together five tips to help protect your children’s privacy throughout this week. Catch up on the first four tips here. The final tip? Make sure you communicate with your kids about how they can protect their privacy online.

Our first four tips are privacy-specific while this last one applies to many parenting challenges: Communicate with your kids! Talk with them about what they should know and can do to protect their privacy online. If your kids are young, you can tell them to come to you or simply say no to all in-game requests for information. If your children are older, you can teach them how to use privacy settings and permissions.

You can also educate them in an age-appropriate way about the consequences of sharing too much personal information in a game. These can range from compromising the security of online accounts to attracting cyberbullies to damaging their personal reputation. Let them know that they can come talk to you if they’ve posted something online that they later realize is too personal (you can help them get it deleted) or if they’re receiving inappropriate advertisements, messages, or other communications. (You can report inappropriate ads to Apple and Google.)


Sometimes, in a rush to play a game, your child might simply click “yes” on permissions, or even falsify their age, but when they understand how their personal data and preferences may be used, or more importantly misused, most kids will become more interested in managing their own privacy online. Make sure they know they can turn to you for help, and that you know where to find answers and advice.

Protecting your kids’ privacy in mobile games may sound overwhelming, but the benefits of playing games far outweigh the risks. Our tips – together with ESRB’s Family Gaming Guide and our “What Parents Need to Know” blogs – can help you protect your kids’ privacy online.

• • •

If you have more questions about kids’ privacy in mobile apps or you want to learn more about our program, please reach out to us through our contact page. Be sure to follow us on LinkedIn for more privacy-related updates.

• • •

As senior vice president of ESRB Privacy Certified (EPC), Stacy Feuer ensures that member companies in the video game and toy industries adopt and maintain lawful, transparent, and responsible data collection and privacy policies and practices for their websites, mobile apps, and online services. She oversees compliance with ESRB’s privacy certifications, including its “Kids Certified” seal, which is an approved Safe Harbor program under the Federal Trade Commission’s Children’s Online Privacy Protection Act (COPPA) Rule.

What Parents Need to Know About Privacy in Mobile Games: Don’t Let Your Children Lie About Their Ages https://www.esrb.org/privacy-certified-blog/what-parents-need-to-know-about-privacy-in-mobile-games-dont-let-your-children-lie-about-their-ages/ Thu, 27 Oct 2022 13:00:52 +0000 https://www.esrb.org/?p=4968 We’ve pulled together five tips to help protect your children’s privacy and are rolling one out each day. Tip #4 is to prevent your children from lying about their ages online. It’s important that your child uses an accurate birthdate or age when signing up for a new game or mobile app. Learn why in our fourth privacy tip.

We’ve pulled together five tips to help protect your children’s privacy and are rolling one out each day this week. Yesterday, we covered what the ESRB Privacy Certified seals mean and where you should look for them. Our fourth tip is to prevent your children from lying about their ages online.

It’s important that your child uses an accurate birthdate or age when signing up for a new game or mobile app. When companies know that children under the age of 13 are playing their games, they are required by law to follow the federal Children’s Online Privacy Protection Act (COPPA). COPPA and its associated Rule, issued by the Federal Trade Commission (FTC), give parents control over what information companies can collect from kids under 13 years of age through their websites, apps, and other online services, including mobile games. Under COPPA, companies whose games, apps, and other services are “directed to children,” or that know kids under 13 are using their game, must:

  1. Notify you of how they collect and use your kid’s information;
  2. Get your express consent (known as “verifiable parental consent”) before collecting, using, or disclosing your child’s personal information; and
  3. Allow you to review and request deletion of your child’s information.

Under COPPA, a game company can’t condition a child’s participation in a game on the child disclosing more information than is reasonably necessary. Companies are also prohibited from using kids’ information for commercial purposes, such as targeted marketing and advertising, that are unrelated to gameplay. This is part of why it’s so important to make sure you or your kid enters an accurate birthdate or age when signing up for a new game!

Make sure your children enter their ages accurately so they can benefit from legal protections tailored to protect kids’ personal information.

Beyond COPPA, recently enacted privacy laws in states like California, Colorado, Connecticut, Utah, and Virginia give kids and their parents additional privacy rights, and some extend certain protections to teens. For example, several of these state laws prohibit companies from selling or sharing teenagers’ (typically ages 13-16) personal information without their consent or the consent of their parent or guardian. You can ask that a mobile game company not sell or share your child’s information by making a request using a form or email address available through the company’s app or website. Other laws, such as California’s recently passed Age-Appropriate Design Code Act, require companies to set privacy controls in games and other products to the most protective level for all users under the age of 18.

Companies that don’t follow these rules can get in a lot of trouble. The FTC and state law enforcers have imposed large fines and other penalties on mobile game companies that failed to comply with COPPA, and more enforcement is likely on the way. Along with our other tips, making sure that your children enter their ages accurately will help ensure that they benefit from legal privacy protections tailored for kids and teens.

Click here to continue to the final tip: Communicate with Your Kids.

• • •

If you have more questions about kids’ privacy in mobile apps or you want to learn more about our program, please reach out to us through our contact page. Be sure to follow us on LinkedIn for more privacy-related updates.

• • •

As senior vice president of ESRB Privacy Certified (EPC), Stacy Feuer ensures that member companies in the video game and toy industries adopt and maintain lawful, transparent, and responsible data collection and privacy policies and practices for their websites, mobile apps, and online services. She oversees compliance with ESRB’s privacy certifications, including its “Kids Certified” seal, which is an approved Safe Harbor program under the Federal Trade Commission’s Children’s Online Privacy Protection Act (COPPA) Rule.
