COPPA Battlegrounds: The Quest to Uncover the Secrets of the FTC’s Kids’ Privacy Actions
July 5, 2023

At ESRB, the non-profit, self-regulatory body for the video game industry, kids’ privacy is serious business. We do take breaks, though, from reviewing privacy policies, preparing compliance assessments, and absorbing the onslaught of privacy developments. Some of us even play and design video games when we’re not working. We are the Entertainment Software Rating Board after all!

So, for a little fun, we decided to create an imaginary video game – COPPA Battlegrounds. Join the ESRB Privacy Certified team as we dive deeply into the ongoing saga of the Federal Trade Commission’s kids’ privacy enforcement actions – cases that have resulted in hundreds of millions of dollars in fines and landmark legal remedies. Venture into new privacy territory, unlocking the mysteries of “personal information,” “privacy by default,” “data retention,” and more! Collect XPs as you explore strategies and best practices to protect young gamers’ privacy.

The Players

The “COPPA Controller”: The Federal Trade Commission (FTC) is the U.S. government agency charged with protecting consumers and competition. It is the chief federal agency that works to protect consumer privacy. Over the years, it has brought hundreds of privacy and data security cases to protect consumers and their data.

The “Digital Defendants”: Several well-known tech companies have been hit with FTC actions alleging violations of children’s privacy law in the past half year. Two – Epic Games and Microsoft Xbox – are popular video game publishers. Amazon, Meta, and edtech company Edmodo are also in the line-up.

The Weapons and Equipment

The “Sword of COPPA”: The Children’s Online Privacy Protection Act of 1998 (COPPA) and its implementing COPPA Rule (updated in 2013) provide the FTC with a powerful weapon to protect the privacy of children under the age of 13. The law and rule (together, COPPA) require companies that offer services “directed to children,” or that have knowledge that kids under 13 are using their services, to provide notice of their data practices. They must also obtain verifiable parental consent (VPC) from parents before collecting personal information from children. COPPA also contains strong substantive protections, mandating that companies minimize the data they collect from children, honor parents’ data deletion requests, and implement strong security safeguards. To date, the FTC has brought nearly 40 COPPA enforcement actions.

The “Section 5 Superweapon”: The FTC’s true superweapon comes from Section 5 of the Federal Trade Commission Act, which prohibits unfair or deceptive practices in the marketplace. Since the advent of the internet, the FTC has used Section 5 to address a wide range of issues that affect people online, including the privacy of people purchasing and playing video games.

Policy Statement “Power-ups”: From time to time, the FTC releases policy statements that explain how the agency applies the laws it enforces. These potent statements put companies on notice that they will face legal action if they ignore the FTC’s prescriptions. In May, the FTC issued a Policy Statement on Biometric Information, which sets out a list of unfair practices relating to the collection and use of such data. Earlier, the FTC issued a Policy Statement on COPPA and EdTech that emphasized COPPA’s limits on companies’ ability to collect, use, and retain children’s data.

The Backstory

The FTC’s quest to secure a safer online environment for kids and their personal information has been ongoing since Congress passed COPPA in 1998. Previous blockbuster titles in the COPPA franchise include the FTC’s landmark 2019 settlement with Google/YouTube and the earlier VTech and Musical.ly/TikTok actions.

COPPA has been extremely effective in giving parents information about and control over their kids’ data. There’s been an emerging consensus, however, that the legal framework for children’s privacy should be updated to include teenagers and meet the challenges of social media, mobility, ad tech, and immersive technologies – issues that weren’t present when Congress enacted the law 25 years ago. Despite the introduction of several bills in Congress to update COPPA, none have yet become law. The FTC therefore has proposed several new ideas to protect the privacy of not only children under the age of 13 but teens too. These are now playing out in the FTC’s enforcement actions.

Multiplayer Actions

During the past half year or so, the FTC has announced four new COPPA actions, plus an order against Meta/Facebook relating to a previous settlement. For video game companies, two stand out: the Epic Games/Fortnite settlement (see our earlier blog) and the Microsoft/Xbox Live settlement, announced in June. The FTC’s settlements with Amazon/Alexa and Edmodo also provide some clues to unlocking the secrets of the FTC’s COPPA enforcement mode. Consistent with ESRB Privacy Certified’s focus on privacy compliance in video games, we’ll focus our analysis on the two gaming cases. But we’ll add some insights from the NPCs (here, nonplayable “cases”), too.

Epic Games/Fortnite

Late last year, the FTC filed a two-count complaint and proposed settlement order against Epic Games. It alleged that Epic knew its massively popular game Fortnite was “directed to children” and unlawfully collected personal data from them without VPC. The FTC also charged Epic with violating the FTC Act by using unfair “on by default” voice and text chat settings that led to children and teens being bullied, threatened, and harassed within Fortnite. Epic settled with the FTC, agreeing to pay a $275 million civil penalty and to standard injunctive relief. (In the privacy area, this includes monitoring, reports, a comprehensive privacy plan, and regular, independent audits.) The final court Order entered in February also required Epic to implement privacy-protective default settings for children and teens. It also required the company to delete personal information previously collected from children in Fortnite unless the company obtains parental consent to retain such data or the user identifies as 13 or older.

Microsoft/Xbox Live

In early June, the FTC filed a one-count complaint and proposed settlement order against Microsoft alleging that its Xbox Live online service violated COPPA in three ways: (i) by collecting personal information (i.e., email address, first and last name, date of birth, and phone number) from kids under 13 before notifying their parents and getting VPC; (ii) by failing to provide clear and complete information about its data practices in COPPA’s required notices, for example, that it didn’t tell parents that it would disclose Xbox’s customer unique persistent identifier to third-party game and app developers; and (iii) by holding on to kids’ data for years even when parents did not complete the account creation process.

Microsoft, which has long had a comprehensive privacy program, settled with the FTC for $20 million. It agreed to implement new business practices to increase privacy protections for Xbox users under 13. For example, the Order requires Microsoft to tell parents that a separate child account will provide significant privacy protections for their child by default. The company also must maintain a system to delete, within two weeks of the collection date, all personal information collected from kids for the purpose of obtaining parental consent. In addition, Microsoft must honor COPPA’s data deletion requirements by deleting all other personal data collected from children once it no longer needs it for the purpose for which it was collected.
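To make the two-week requirement concrete, here is a minimal sketch of a scheduled purge job. The signup_store interface and field names are hypothetical illustrations of the obligation described above, not Microsoft’s actual system:

from datetime import datetime, timedelta, timezone

CONSENT_WINDOW = timedelta(days=14)

def purge_stale_signups(signup_store, now=None):
    # Delete personal data that was collected solely to obtain parental
    # consent once two weeks pass without that consent arriving.
    now = now or datetime.now(timezone.utc)
    for signup in signup_store.all_pending():
        if not signup.consent_granted and now - signup.collected_at > CONSENT_WINDOW:
            signup_store.delete(signup.id)  # removes email, name, phone, etc.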

Unearthing the Seven COPPA Revelations

Beyond the allegations and remedies of the enforcement actions, there’s a wealth of information about the FTC’s kids’ privacy priorities and practices you might want to adopt – or avoid – if you want to stay out of the sights of the COPPA Controller. Here are COPPA Battlegrounds’ seven lessons for COPPA compliance based on the FTC’s recent kids’ privacy actions:

1. Sequence your game play to obtain VPC before you collect ANY personal information from a child: The FTC’s complaint in the Xbox action emphasized that – even though Microsoft had a VPC program in place – it violated COPPA by not obtaining parental consent before it collected any personal information from kids beyond their date of birth. Xbox did require children to involve their parents in the registration process, but the FTC found that Microsoft’s initial collection of kids’ email addresses, first and last names, and phone numbers before obtaining consent violated COPPA’s VPC requirements. The FTC also blasted Microsoft for requiring kids to agree to the company’s service agreement, which, until 2019, included a pre-checked box allowing Microsoft to send them promotional messages and to share user data with advertisers. The FTC’s approach indicates that it will look closely at companies’ verifiable parental consent sequences, and that it will strictly enforce COPPA’s prohibition on collecting any personal information before obtaining VPC (unless an exception to VPC exists).
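In code terms, a compliant sign-up flow gates everything on age and consent. Here is a minimal, hypothetical sketch; the function names and return values are ours, not drawn from any FTC order:

from datetime import date

def age_from_dob(dob: date, today: date) -> int:
    years = today.year - dob.year
    if (today.month, today.day) < (dob.month, dob.day):
        years -= 1
    return years

def registration_next_step(dob: date, today: date) -> str:
    # Step 1: collect the date of birth only -- no name, email, or phone yet.
    if age_from_dob(dob, today) < 13:
        # Step 2: pause and obtain verifiable parental consent (VPC)
        # before requesting any further personal information from the child.
        return "await_parental_consent"
    # Users 13 and older proceed to the normal sign-up form.
    return "proceed_to_full_signup"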

2. The FTC views COPPA’s “actual knowledge” standard broadly and so should you: When the FTC announced its Epic Games settlement, we reminded companies that you can’t disclaim COPPA by declaring that you don’t process children’s information or by ignoring evidence that children are playing your games. Now, with the Xbox Live settlement, the FTC has affirmed that it will enforce COPPA against any company with “actual knowledge” that the company is handling children’s personal information, regardless of whether that company has directed its service to children intentionally. Significantly, the settlement requires Microsoft – when it discloses personal information about children to other video game publishers – to tell them that the user is a child. The FTC’s requirement for Microsoft to share information about children on its platform with third parties is a game-changing move. In the FTC’s words, “[I]t will put [third-party] publishers on notice that they, too, must apply COPPA protections to that child.”

3. Your COPPA notices must be clear, understandable, and complete: The FTC emphasized that it’s not enough under COPPA’s notice provisions to summarize your collection, use, and disclosure practices generally. Instead, your direct notice must be complete. The FTC faulted Microsoft for failing to tell parents about its collection of personal information children shared through their profile or Xbox Live usage, such as their “gamertags,” the photos kids used to create avatars, and voice recordings from video messages. The agency also alleged that Microsoft’s notice failed to inform parents that it created persistent identifiers for children, which it combined with other information and shared with third-party game and app developers. Going forward, it’s important for companies to specify, in a clear and complete way, their practices in the notices required by COPPA, and not just provide parents with a link to a densely worded privacy policy.

4. Privacy by default is not a fad: In Epic Games, the FTC focused for the first time not just on “privacy by design” but on “privacy by default,” finding that Epic did not have “privacy-protective” default settings in Fortnite that limited kids’ contact with strangers and otherwise protected their privacy. The FTC went further in Xbox Live, emphasizing that, even though Xbox had default settings that only allowed a child to disclose their activity feed or otherwise communicate with parent-approved “friends,” Microsoft configured other defaults in a way that did not protect children sufficiently. As the FTC emphasized in a blog about the Amazon case, “[C]ompanies that ignore consumers’ rights to control their data do so at their peril . . . The upshot is clear: Any company that undermines consumer control of their data can face FTC enforcement action.”
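For developers, “privacy by default” translates into how account settings are initialized. Here is a minimal, hypothetical sketch of child-account defaults informed by these two cases; the field names are ours, not any platform’s actual settings schema:

from dataclasses import dataclass

@dataclass
class ChildAccountDefaults:
    # Every social or data-sharing feature starts off and must be
    # deliberately enabled, typically by a parent.
    voice_chat_enabled: bool = False            # off by default (cf. Epic/Fortnite)
    text_chat_enabled: bool = False             # off by default
    activity_feed_visibility: str = "friends"   # parent-approved friends only
    share_data_with_third_parties: bool = False
    behavioral_ads_enabled: bool = False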

5. Take your data minimization and retention/deletion obligations seriously: The FTC’s recent cases also highlight COPPA’s substantive provisions on data minimization and data retention. The COPPA Rule prohibits conditioning a child’s participation in a game on the child “disclosing more personal information than is reasonably necessary to participate in such activity” and allows companies to keep it “for only as long as is reasonably necessary to fulfill the purpose for which the information was collected.” In the Edmodo complaint, for example, the agency said that Edmodo violated COPPA by using the personal information it collected for advertising instead of limiting it to educational purposes.

In the Xbox Live case, the agency chided Xbox for holding onto kids’ data when the parental verification process was incomplete, sometimes for years. Although Microsoft described this as a “technical glitch,” and explained that this data “was never used, shared, or monetized,” the FTC doubled down on its concerns with company data retention practices that violate COPPA. Indeed, in the Amazon Alexa case, the FTC charged that Amazon made it difficult for parents to exercise their right, under COPPA, to delete their children’s voice recording data. It further alleged that Amazon disregarded parents’ deletion requests, retained kids’ voice recordings indefinitely, and misled parents about its data deletion practices (e.g., by retaining copies of transcripts of voice recordings). The FTC is wielding the “Sword of COPPA” to press for meaningful data minimization, purpose limitation, and data retention/deletion practices.
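A deletion pipeline that honors these obligations has to reach derived data, too. A minimal sketch, assuming hypothetical voice_store and transcript_store interfaces:

def handle_parent_deletion_request(child_id, voice_store, transcript_store):
    # Honor the request fully: delete the recordings AND derived copies
    # such as transcripts (retaining transcripts after deletion requests
    # was among the FTC's allegations against Amazon).
    voice_store.delete_all(child_id)
    transcript_store.delete_all(child_id)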

6. Be especially careful when dealing with kids’ biometric data, algorithms, and machine learning: The FTC’s Xbox Live settlement covers biometric information like avatars generated from a child’s image and emphasizes COPPA’s strict limitations on the retention of this type of data from kids. In the Amazon case, the agency was clearly troubled by Amazon’s indefinite retention of kids’ voice recordings, which count as biometric information. One of the FTC Commissioners emphasized this point, stating that “Claims from businesses that data must be indefinitely retained to improve algorithms do not override legal bans on indefinite retention of data.” Consider yourself warned!

7. Privacy Innovation Can Help You Comply with COPPA: Not all the privacy-protective action in COPPA Battlegrounds comes from the FTC. Even before the settlement, Epic Games announced that it was creating “Cabined Accounts” to provide safe, tailored experiences for younger players. Following the FTC’s action, Microsoft unveiled its plans to test “next-generation identity and age validation” methods to create a “convenient, secure, one-time process for all players that will allow us to better deliver customized, safe, age-appropriate experiences.” Xbox explained that the entire games industry can benefit from advancing safe and innovative digital experiences that are accessible, simple to use, and benefit all players. We agree! Many ESRB Privacy Certified members are developing new strategies and tools to enhance kids’ privacy. Achievement unlocked!

The Final Conquest

Congratulations on completing the breakout version of COPPA Battlegrounds! You can now take your kids’ privacy program to the next level. Contact us at privacy@esrb.org if you’d like to discuss how your company can prevail in COPPA Battlegrounds – and its inevitable sequels.



As senior vice president of ESRB Privacy Certified (EPC), Stacy Feuer ensures that member companies in the video game and toy industries adopt and maintain lawful, transparent, and responsible data collection and privacy policies and practices. She oversees compliance with ESRB’s privacy certifications, including its “Kids Certified” seal, which is an approved Safe Harbor program under the Federal Trade Commission’s Children’s Online Privacy Protection Act (COPPA) Rule, and the general “Privacy Certified” seal.

P.S.R. Reinforces Fundamental Privacy Principles in a Changing World
October 20, 2022

After a busy few days in Austin, I’ve pulled together my key takeaways from last week’s International Association of Privacy Professionals’ (IAPP) Privacy. Security. Risk. 2022 conference (P.S.R.). P.S.R. is a one-of-a-kind conference that focuses on the intersection of privacy and technology. And there certainly was lots of tech, from content dealing with bias in AI to privacy engineering. But given the location in Texas, one of many states that now place significant restrictions on women’s reproductive rights, the effect of the U.S. Supreme Court’s recent decision in the Dobbs case on the constitutional right to privacy was a strong undercurrent throughout the event.

Starting with the keynote session (which you can watch here, if you’re an IAPP member) and going through sessions on geolocation, cybersecurity, and advertising, many speakers grappled with new privacy challenges arising from Dobbs. Much of the conversation, though, focused on applying privacy basics to new and emerging technologies. This year’s P.S.R. highlighted that it’s an important time for companies to be good and responsible stewards of data. Here are more details on three topics that came up repeatedly at the conference: (1) Kids and Teens; (2) Data Minimization; and (3) Deidentification.

Kids and Teens
It’s clear that the UK Children’s Code and its offshoot, the recently passed California Age Appropriate Design Code (CA AADC), are top of mind. Companies are looking for more guidance and best practices from regulators on how best to comply. Both the UK and California codes feature similar concepts, such as “the best interests of the child” and privacy by default, and both prohibit behavioral ads/profiling. There are some differences, of course, but they are more technical than conceptual. If you’re looking for further analysis, we recommend checking out our post on the CA AADC and reading through the Future of Privacy Forum’s excellent analysis here.

During the keynote session featuring Federal Trade Commission (FTC) Commissioner Rebecca Kelly Slaughter, the IAPP’s Chief Knowledge Officer, Caitlin Fennessy, asked her if there are questions from the FTC’s 95-question Advance Notice of Proposed Rulemaking (ANPR) on commercial surveillance and data security that people should focus on when submitting comments. Commissioner Slaughter mentioned issues of tech addiction and psychological harms to teens that traditionally aren’t thought of as privacy problems, but stem from the same data sets. While the Commissioner did not have any updates to share on the FTC’s review of the Children’s Online Privacy Protection Act (COPPA) Rule, she strongly encouraged the public to submit comments on the ANPR. Many attendees interpreted the Commissioner’s COPPA comment as yet another signal that the FTC has effectively abandoned the COPPA Rule Review in favor of the ANPR. The FTC just extended the comment period, so you have plenty of time to file your comment.

Sensitive Data and Data Minimization
With five new state privacy laws (California, Virginia, Colorado, Utah, Connecticut) coming into effect next year, there was a lot of discussion about privacy law basics. It’s no surprise, then, that the panels focused on defining personal data. In particular, sensitive data came up at nearly every session.

The state laws have similar definitions of sensitive data, but there are some key differences privacy professionals must pay attention to. For example, all states consider special category data like ethnic origin, religious beliefs, and sexual orientation to be sensitive data. Virginia, Colorado, and Connecticut all consider personal data collected from a known child to be sensitive information. Each of the state laws specifies precise geolocation as sensitive data, except for Colorado. Colorado, instead, plans to cover geolocation information under its proposed rules for processing “sensitive data inferences.” Sensitive data inferences are “inferences made by a [c]ontroller based on [p]ersonal [d]ata, alone or in combination with other data, which indicate an individual’s racial or ethnic origin; religious beliefs; mental or physical health condition or diagnosis; sex life or sexual orientation; or citizenship or citizenship status.”

And just about every time someone spoke about sensitive data, they stressed the importance of data minimization. This concept goes back to the Fair Information Practice Principles (FIPPs), first developed in the 1970s, which contained the collection limitation principle, designed to prevent the overcollection of information. As many speakers made clear (referring in part to the Dobbs decision and fears about the use of reproductive data), data can’t be breached, hacked, or turned over to law enforcement if it’s not collected in the first place.

Deidentification
The issue of deidentification also came up frequently, often in relation to data minimization. Deidentification refers to actions that organizations can take to remove identifying characteristics from their data.

Where can you look for deidentification standards? P.S.R. panelists mentioned governmental sources, such as the Health Insurance Portability and Accountability Act’s (HIPAA) deidentification standards in the medical privacy context and the FTC’s three-part test for deidentified data (pasted below from page 10 of this report) as good starting points. The FTC standard states that deidentified data is not:

“reasonably linkable” to the extent that a company: (1) takes reasonable measures to ensure that the data is de-identified; (2) publicly commits not to try to reidentify the data; and (3) contractually prohibits downstream recipients from trying to re-identify the data.

(The California Privacy Rights Act, which comes into effect in January 2023, uses a similar standard.) That said, deidentification may not last long as a privacy-enhancing tool. As one speaker noted, some data scientists predict that technological advances will allow most data sets to be identifiable within three to five years. Our takeaway: It’s best to err on the side of minimizing the data you collect, use, and share from the outset. This is a principle we’ve long preached to members of the ESRB Privacy Certified program.
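For teams that want to operationalize the FTC’s three-part test, a simple release gate can help. The test itself is legal, not technical, and the prong names below are our own illustrative shorthand:

from dataclasses import dataclass

@dataclass
class DeidentifiedRelease:
    reasonable_deid_measures_taken: bool      # prong 1: reasonable de-identification
    public_no_reidentification_pledge: bool   # prong 2: public commitment not to re-identify
    downstream_contracts_prohibit_reid: bool  # prong 3: contractual bans on re-identification

    def may_release(self) -> bool:
        # Data is treated as "not reasonably linkable" only if all three hold.
        return (self.reasonable_deid_measures_taken
                and self.public_no_reidentification_pledge
                and self.downstream_contracts_prohibit_reid)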

* * *

Although P.S.R. explored newer technologies from biometrics to data clean rooms, much of the conference focused on core privacy practices: Have you done your risk assessments and data protection impact assessments, and implemented mitigating factors? Do you apply best practices for cybersecurity and have documentation for how and why you might deviate from those best practices and standards? Are you keeping the FIPPs in mind? These, of course, are the types of questions we think about all of the time at ESRB Privacy Certified. Amidst all the changing laws and technologies, it’s reassuring to know that sticking to privacy fundamentals can boost your compliance efforts. And don’t forget, we’re here to help our members with the issues I summarized above – child and teen privacy, sensitive data and data minimization, deidentification – and more.

Photo credit: Meghan Ventura

The UK Age Appropriate Design Code: Childproofing the Digital World
January 21, 2021

“A generation from now, I believe we will look back and find it peculiar that online services weren’t always designed with children in mind. When my grandchildren are grown and have children of their own, the need to keep children safer online will be as second nature as the need to ensure they eat healthy, get a good education or buckle up in the back of a car.”
– Information Commissioner Elizabeth Denham

In May 2018, the European Union’s General Data Protection Regulation (GDPR) went into effect, recognizing for the first time within the European Union (EU) that children’s personal data warrants special protection. The United Kingdom’s Data Protection Act 2018 adopted GDPR within the United Kingdom and, among other things, charged the Information Commissioner’s Office (ICO) with developing a code of practice to protect children’s personal data online. The result is the Age Appropriate Design Code (also referred to as the Children’s Code), an ambitious attempt to childproof the digital world.

The Internet was not built with children in mind, yet children are prolific users of the Internet. The Children’s Code, which comprises fifteen “Standards,” seeks to correct that incongruity by requiring online services that children are likely to use to be designed with their best interests in mind.

For more than twenty years, the U.S. Children’s Online Privacy Protection Act (COPPA) has been the primary source of protection for children’s privacy online. COPPA protects the privacy of internet users under 13 years old, primarily by requiring informed, verifiable consent from a parent or guardian. The Children’s Code, however, has much grander aspirations. It protects all children under 18 years old, asking companies to reimagine their online services from the bottom up.

The foundational principle of the Children’s Code calls for online services likely to be accessed by children under 18 years old to be designed and developed with the best interests of the child as a primary consideration. The Children’s Code is grounded in the United Nations Convention on the Rights of the Child (UNCRC), which recognizes that children have several rights, including the rights to privacy and to be free from economic exploitation; to access information; to associate with others and play; and to have a voice in matters that affect them.

To meet the best interests of the child, online services must comply with each of the applicable fifteen Standards. Those Standards are distilled below.

1. Assessing and Mitigating Risks
Compliance with the Children’s Code begins with a Data Protection Impact Assessment (DPIA), a roadmap to compliance and a requirement for all online services that are likely to be accessed by children under 18 years old. The DPIA must identify the risks the online service poses to children, the ways in which the online service mitigates those risks, and how it balances the varying and sometimes competing rights and interests of children of different age groups. If the ICO conducts an audit of an online service or investigates a consumer complaint, the DPIA will be among the first documents requested.
The ICO suggests involving experts and consulting research to help with this process. This might not be feasible for all companies. At a minimum, however, small- and medium-sized companies with online services that create risks to children will be expected to keep up to date with resources that are publicly available. More will be expected of larger companies.
While the Internet cannot exist without commercial interests, the primary consideration must be the best interests of the child. If there is a conflict between the commercial interests of an online service and the child’s interests, the child’s interests must prevail.

2. Achieving Risk-Proportionate Age Assurance
To adequately assess and mitigate risk, an online service must have a level of confidence in the age range(s) of its users that is proportionate to the risks posed by the online service. The greater the risk, the more confidence the online service must have.
The ICO identifies several options to obtain what it calls “age assurance,” which can be used alone or in combination depending on the circumstances. Age assurance options include self-declaration by users (a/k/a age gates), artificial intelligence (AI), third-party verification services, and hard identifiers (e.g., government IDs). Less reliable options, like age gates, are only permitted in low-risk situations or when combined with other age assurance mechanisms.

Achieving an adequate level of confidence will be challenging. The Verification of Children Online (VoCO), a multi-stakeholder child online safety research project led by the U.K.’s Department for Digital, Culture, Media & Sport (DCMS), is attempting to address that challenge. The VoCO Phase 2 Report provided the following potential flow as an example:
[F]or a platform that needs a medium level of confidence, a user could initially declare their age as part of the onboarding process, and alongside this an automated age assurance method (such as using AI analysis) could be used to confirm the declared age. If this measure suggests a different age band than that stated, which reduces confidence in the initial assessment, a request could be made to validate the user’s age through a verified digital parent.

Ultimately, if an online service is unable to reach an appropriate level of confidence, it has two options: 1) take steps to adequately reduce the level of risk; or 2) apply the Children’s Code to all users, even adults.
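One way to picture “risk-proportionate” age assurance is as a lookup from assessed risk to acceptable methods. The mapping below is purely illustrative (the Code prescribes no such table), with the Code’s fallback as the default:

ACCEPTABLE_METHODS = {
    "low":    ["age_gate"],                      # self-declaration may suffice
    "medium": ["age_gate_plus_ai_estimation",    # declared age checked by AI analysis
               "third_party_verification"],
    "high":   ["third_party_verification",
               "hard_identifier"],               # e.g., a government ID
}

def age_assurance_options(risk_level: str) -> list:
    # The greater the risk, the more confidence required; if no option is
    # workable, reduce the risk or apply the Children's Code to all users.
    return ACCEPTABLE_METHODS.get(risk_level, ["apply_code_to_all_users"])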

3. Setting High Privacy by Default
For all children, high privacy must be the default setting. This means an online service may only collect the minimum amount of personal data needed to provide the core or most basic service. Additional, optional elements of the online service, for example to personalize offerings, would have to be individually selected and activated by the child. To illustrate this point, the ICO uses the example of a music download service.

[Image: An example of privacy settings that could apply to a music service]

High privacy by default also means that children’s personal information cannot be used in ways that have been shown to be detrimental. Based on specific Standards within the Children’s Code, this means the following must be turned off by default:

  • Profiling (for example, behavioral advertising);
  • Geolocation tracking;
  • Marketing and advertising that does not comply with The Committee of Advertising Practice (CAP) Code in the United Kingdom;
  • Sharing children’s personal data; and
  • Utilizing nudge techniques that lead children to make poor choices.

To turn these on, the online service must be able to demonstrate a compelling reason and adequate safeguards.

4. Making Online Tools Available
Children must be given the tools to exercise their privacy rights, whether it be opting into optional parts of a service or asking to delete or get access to their personal information. The tools should be highlighted during the start-up process and must be prominently placed on the user’s screen. They must also be tailored to the age ranges of the users that access the online service. The ICO encourages using easily identifiable icons and other age-appropriate mechanisms.

5. Communicating Age-Appropriate Privacy Information
The Children’s Code requires all privacy-related information to be communicated to children in a way they can understand. This includes traditional privacy policies, as well as bite-sized, just-in-time notices. To help achieve this daunting task, the ICO provides age-based guidance. For example, for children 6 to 9 years old, the ICO recommends providing complete privacy disclosures for parents, while explaining the basic concepts to the children. If a child in this age range attempts to change a default setting, the ICO recommends using a prompt to get the child’s attention, explaining what will happen and instructing the child to get a trusted adult. The ICO also encourages the use of cartoons, videos and audio materials to help make the information understandable to children in different age groups and at different stages of development.
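In practice, this means selecting a notice format based on the user’s age band. The sketch below loosely follows the ICO’s age-based guidance, but the band cutoffs and placeholder notice names are our own assumptions:

def notice_for_age(age: int) -> str:
    # Placeholder formats, not prescribed by the Code.
    if age <= 5:
        return "cartoon_audio_video_explainer"
    if age <= 9:
        return "basic_concepts_for_child_plus_full_disclosure_for_parents"
    if age <= 12:
        return "simple_written_notice_with_icons"
    return "detailed_notice_with_more_choices"   # older teens: more detail, more choices

def on_default_setting_change(age: int) -> str:
    # For 6-to-9-year-olds, the ICO recommends a prompt that explains what
    # will happen and instructs the child to get a trusted adult.
    if 6 <= age <= 9:
        return "prompt_child_to_get_trusted_adult"
    return notice_for_age(age)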

For connected toys and devices, the Children’s Code requires notice to be provided at the point of purchase, for example, a disclosure or easily identifiable icon on the packaging of the physical product. Disclosures about the collection and use of personal data should also be provided prior to setup (e.g., in the instructions or a special insert). Anytime a connected device is collecting information, it should be obvious to the user (e.g., a light goes on), and collection should always be avoided when in standby mode.

6. Being Fair
The Children’s Code expects online services to act fairly when processing children’s personal data. In essence, this means online services must say what they do, and do what they say. This edict applies not just to privacy disclosures, but to all published terms, policies and community standards. If, for example, an online service’s community standards prohibit bullying, the failure to enforce that standard could result in a finding that the online service unfairly collected a child’s personal data.

Initial implementation of the Children’s Code will be a challenge. User disruption is inevitable, as are increased compliance and engineering costs. The return on that initial investment, however, will hopefully make it all worthwhile. If Commissioner Denham’s vision is realized, the digital world will become a safe place for children to socialize, create, play, and learn.

This article has been published in PLI Chronicle, https://plus.pli.edu.

If you have more questions about the Age Appropriate Design Code or you want to learn more about our program, please reach out to us through our Contact page. Be sure to follow us on Twitter and LinkedIn for more privacy-related updates.

The UK’s Age Appropriate Design Code: 5 Steps to Get You Started
October 14, 2020

On September 2, 2020, the United Kingdom’s Age Appropriate Design Code (Code) went into effect. There is, however, a 12-month transition period to allow companies to bring their online services—including websites, mobile apps and connected toys and devices—into compliance. While 12 months might seem like a lot of time, it is not. There is much work to be done. To get started, we recommend taking 5 steps.

1. Begin Conducting a Data Protection Impact Assessment.
The Code applies to all online services that children under 18 years old are likely to access—in other words, most online services. And if the Code applies, then a Data Protection Impact Assessment (DPIA) is required. If you have not done a DPIA, this is the time. If you have done one, update it with the Code in mind.

Not sure if the Age Appropriate Design Code applies to your online service? Read more here.

The DPIA should be your road map to compliance with the Code. It should identify risks your online service poses to children, and then ways in which you plan to mitigate those risks. It should memorialize the varying and sometimes competing rights and interests of children of different age groups, and how you have balanced those rights and interests. Ultimately, the best interests of the children must be your primary consideration, even trumping your own commercial interests. Familiarize yourself with the UN Convention on the Rights of the Child and the General Comment on Children’s Rights in Relation to the Digital Environment.

The DPIA will take time to complete. While it should be started early, it will be a living document that is updated as new risks are identified and new solutions implemented.

It should be a multi-departmental effort, pulling from the design and development, marketing, data security, and legal teams, at a minimum. However, your Data Protection Officer should head the project.

Keep in mind that if the UK’s Information Commissioner’s Office (ICO) conducts an audit of your online service or investigates a complaint, its first ask will likely include a copy of your DPIA. If you have never done one and you are not sure where to get started, the ICO provides a helpful template on its Children’s Code Hub.

2. Take Steps to Know Your Users.
To conduct a proper DPIA, you will need to determine the level of confidence you have in the age ranges of your users. Specifically, what children are using or are likely to use your online service?

If you do not plan to apply the Code to your online service because you do not believe children under 18 years old are likely to access it, you must be prepared to defend that decision to the ICO. The ICO will expect evidence to support your decision. Do you have empirical data of your users’ ages? Have you conducted a user survey or done consumer research? If not, you may have work to do to satisfy the ICO.

Ultimately, the greater your uncertainty, the greater the risk and, therefore, the greater the need to mitigate. This might include eliminating elements of your online service especially risky to children or taking steps to limit children’s access. Please keep in mind, however, that the ICO does not want to see an age-gated Internet. In fact, according to the ICO, the use of age gates—i.e., where a user declares his or her age—is only appropriate in low-risk situations or where additional safeguards are in place.

3. Plan for “High Privacy” by Default.
The ICO seems to want “high privacy” to be the default setting for all users, but it is only required for users under 18 years old. High privacy by default means:
• Only collecting personal data needed to provide your “core” service;
• Allowing children to opt into optional elements of your service that require the additional collection and use of personal data, and minimizing the personal data you collect for those additional elements; and
• Turning off “detrimental uses,” like profiling, data sharing, and geolocation tracking, by default and only allowing them to be turned on when there is a compelling need and adequate protections in place.
The ICO illustrates this point with an example of privacy settings for a music download service.

4. Begin Developing Online Tools.
Children must be given tools within your online service to make choices and exercise rights. This should include, for example, the ability to opt into and opt out of optional elements of your service, request the deletion of their personal data, and obtain access to their personal data. These tools must be highlighted to the child during the start-up process, prominently placed on the screen, and age appropriate.

5. Work on Age Appropriate Privacy Notices.
In addition to your standard privacy policy intended for adults, the Code requires privacy disclosures that are understandable and accessible to your child users. If your online service is accessed by or likely to be accessed by children in different age groups, appropriate disclosures will need to be tailored to each of those age groups. For children 6 to 9 years old, for example, the ICO expects you to explain the basic concepts of your online service’s privacy practices and online tools. You are encouraged to use cartoons, and video and audio materials to make the disclosures child friendly. Older teens, in contrast, should be given more detail and more choices.
Moreover, you are expected to do more than just post a privacy policy. If, for example, a child attempts to opt into a lower privacy setting, you are expected to display an age appropriate notice. Children should be encouraged to get or speak with a parent or trusted adult. They should be told what personal data will be collected if the default setting is changed and how that information will be used. If the personal data will be shared with a third party, the child should be given a separate opt-in choice for the sharing or, at a minimum, a clear and age appropriate notice that the data will be shared. Any risks should be highlighted, and children should be encouraged to keep the default setting if they are unsure or do not understand. The ICO provides a sample notice illustrating this approach.

If you have more questions about the Age Appropriate Design Code or you want to learn more about our program, please reach out to us through our Contact page. Be sure to follow us on Twitter and LinkedIn for more privacy-related updates.
