#KidsPrivacy Trending for TikTok: Top Takeaways from the New COPPA Enforcement Action
There are hundreds of hot hashtags on TikTok, the viral video-sharing platform with over 1 billion users worldwide, but it’s safe to say that #kidsprivacy (or even #privacy) isn’t among them. The top hashtags in the U.S. for the past week (which change almost daily) collectively generated 286K posts over seven days while #childprivacy and variants have a grand total of 182 posts from all over the world, for all (TikTok) time. Still, children’s privacy is a trending topic for the platform, which has been facing global scrutiny over its children’s data privacy practices.
To date, TikTok has paid out roughly half a billion dollars in children’s privacy suits brought by regulators (and private plaintiffs) in the United States, as well as the United Kingdom and the European Union. Last week, TikTok’s privacy woes exploded when the U.S. Department of Justice (DOJ), acting on behalf of the Federal Trade Commission (FTC), filed a complaint in a federal court in California against TikTok, its Chinese parent, ByteDance Ltd., and several related entities (collectively, TikTok) alleging “unlawful massive-scale invasions of children’s privacy” affecting millions of children under the age of 13.
As expected, the government alleged that TikTok “flagrantly violat[ed]” the Children’s Online Privacy Protection Act (COPPA) and the COPPA Rule. The government also alleged that TikTok violated a settlement agreement with the FTC over an earlier COPPA lawsuit that arose from the FTC’s 2019 investigation of TikTok’s predecessor company, Musical.ly.
The FTC’s original 2019 complaint alleged that the video-sharing platform collected extensive personal information from children under the age of 13 without verifiable parental consent (VPC) as required by COPPA. User accounts were public by default, which meant that other users could see a child’s personal information, including their profile bio, username, picture, and videos. Although the app allowed users to change their default setting from public to private so that only approved users could follow them, kids’ profile pictures and bios remained public, and strangers could still send them direct messages.
TikTok ultimately entered into a consent order with the FTC, forking over $5.7 million in civil monetary penalties to resolve the action, the largest COPPA fine at that time. (Since then, the FTC has obtained much larger monetary settlements in COPPA cases against Google/YouTube ($170 million) and Epic Games ($275 million).) The 2019 order also required TikTok, among other things, to destroy all personal information collected from users under age 13 or obtain parental consent for those accounts.
The main claims in the new lawsuit are that TikTok: (1) knowingly created accounts for children and collected data from those children without first notifying their parents and obtaining verifiable parental consent (VPC); (2) failed to honor parents’ requests to delete their children’s accounts and information; and (3) failed to delete the accounts and information of users they know are children. As the FTC put it in its press release announcing the new case, TikTok was “aware of the need to comply with the COPPA Rule and the 2019 consent order and knew about . . . compliance failures that put children’s data and privacy at risk. Instead of complying . . . TikTok spent years knowingly allowing millions of children under 13 on their platform . . . in violation of COPPA . . . .”
Unlike the 2019 case, the new TikTok action is not a settlement, and the government will need to prove its allegations in court to prevail on its claims. TikTok has made clear that it disagrees with the complaint’s allegations, stating that many “relate to past events and practices that are factually inaccurate or have been addressed.” What will happen next, though, is unclear.
Although we expect that TikTok will file a motion to dismiss the complaint, TikTok is facing much larger stakes than COPPA’s civil penalty of $51,744 per violation. (Even counting just one violation per kid, that’s an astronomical amount, given that children under 13 were “ubiquitous” on the platform.) The COPPA case is playing out alongside TikTok’s existential tangle with the U.S. government over Congress’ “ban or divest” law. TikTok has challenged the constitutionality of that law, which requires ByteDance to divest its U.S. TikTok assets by January 19, 2025, or face a ban on the app.
Regardless of what happens, the government’s complaint provides insights into the FTC’s views on what companies can and can’t do with kids’ data under COPPA. Here are five attention-grabbing takeaways that should be a part of any company’s #COPPA reel:
1) You Can’t Use Kids’ Information for Profiling and Marketing Under the “Internal Operations” Exception: Following the 2019 settlement, TikTok used an age gate (in this case, a date of birth prompt) to identify U.S. users under the age of 13 and created “TikTok for Younger Users” (what the complaint calls “Kids Mode”), a limited experience that allows kids to view videos but does not allow them to create or upload videos, post information publicly, or message other users. Although TikTok touted its “safety and privacy protections designed specifically for an audience that is under 13 years old,” according to the complaint, it still collected and used “extensive” personal information – “far more data than it needed” – from Kids Mode account holders without first providing parental notice or obtaining VPC.
The information collected included username, password, and date of birth along with persistent identifiers like IP address and unique device identifiers. According to the complaint, TikTok combined this information with app activity data, device information, mobile carrier information, and app information to amass profiles on children and share them with third parties. In one outrageous example, the complaint alleges that TikTok shared kids’ profiles with the analytics and marketing measurement platform AppsFlyer and with Facebook, so they could “retarget” (lure back) users whose engagement had declined.
As the complaint makes clear, TikTok’s use of persistent identifiers like device ID from Kids Mode users does not comport with the “internal operations” exception, which only permits companies to use such identifiers without VPC if they do not collect any other personal information and only “for the sole purpose” of providing support for an online service’s internal operations. Although there is some scope for companies to collect and use kids’ information for internal operations without VPC, companies cannot interpret the internal operations exception broadly to cover the collection and use of persistent identifiers for profiling and marketing.
2) You Can’t Allow Kids to Circumvent COPPA: Although COPPA does not require companies to validate users’ ages, you can’t allow users to circumvent COPPA by building “back doors that allowed users to bypass the age gate . . . .” In the complaint, the government alleges that by allowing users to use login credentials from certain third-party online services, including Instagram and Google, TikTok allowed users to avoid the age gate altogether and set up regular accounts. These policies and practices led to the creation of millions of “unknown user” accounts that allowed children to gain access to adult content and features of the general TikTok platform. TikTok, in turn, then collected and maintained vast amounts of personal information from the children who created and used these regular TikTok accounts without their parents’ consent.
3) Make Sure Your Age Gates Work: The complaint alleges that kids could easily retry the age gate. TikTok did not prevent children who initially put in an under-13 birth date from restarting the account creation process and providing a new birth date that placed them outside Kids Mode. As the FTC’s COPPA FAQs have long recommended, you should use technical means, such as a cookie, to prevent children from back-buttoning to enter a different age.
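For illustration, here is a minimal sketch of that recommendation in a browser-based sign-up flow. This is our own assumption-laden example, not TikTok’s implementation: the storage key and function names are hypothetical, and it persists the first answer in localStorage (the FTC’s FAQs mention a cookie, which works the same way).

```ts
// Minimal sketch of a non-retryable age gate (hypothetical names throughout).
// Persisting the first answer keeps a child who entered an under-13 birth
// date from back-buttoning and trying an older one.
const AGE_GATE_KEY = "ageGateResult";

function ageFromDob(dob: Date, now: Date = new Date()): number {
  const hadBirthdayThisYear =
    now.getMonth() > dob.getMonth() ||
    (now.getMonth() === dob.getMonth() && now.getDate() >= dob.getDate());
  return now.getFullYear() - dob.getFullYear() - (hadBirthdayThisYear ? 0 : 1);
}

function checkAgeGate(dobInput: string): "underThirteen" | "thirteenPlus" {
  // If the gate was already answered "under 13", reuse that answer instead
  // of letting the user restart the flow with a different birth date.
  if (window.localStorage.getItem(AGE_GATE_KEY) === "underThirteen") {
    return "underThirteen";
  }
  const result =
    ageFromDob(new Date(dobInput)) < 13 ? "underThirteen" : "thirteenPlus";
  window.localStorage.setItem(AGE_GATE_KEY, result);
  return result;
}
```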
4) Don’t Make Deletion Difficult – and Do It!: Much of the complaint focuses on TikTok’s failure to delete accounts and information that “even their own employees and systems identify as belonging to children” as well as its other failures to delete children’s personal data upon parental request. The government alleges, for example, that TikTok required parents to “navigate a convoluted process” to request the deletion of personal information collected from their children. TikTok often did not honor parents’ requests, either by not responding to their requests at all, or by only deleting accounts if there were “objective indicators” that the account holder was under 13 or the parent completed a form certifying under penalty of perjury that they were the parent or guardian of the account holder. Alongside these allegations, the complaint also alleges that TikTok retained kids’ data in databases long after purportedly deleting their accounts.
5) Don’t Mislead Regulators: The government’s complaint also details the ways in which TikTok failed to maintain records and communications relating to its children’s privacy practices and compliance with the 2019 order. More critically, the complaint alleges that TikTok made false statements that it had removed child accounts and deleted the associated data. Instead, as the complaint states, TikTok retained and had been using data that it previously represented it “did not use,” was “not accessible” to it, and was “delet[ed],” including the data of child, teen, and adult users, including IP addresses, device IDs, device models, and advertising IDs. If true, that’s cringe-worthy, even by TikTok standards.
We’re sure there’s lots more to learn from the complaint, but for now we’ll stick with these five takeaways. We’ll be following the case closely as it plays out in federal court and providing other pointers to ESRB Privacy Certified members. And maybe we’ll check out next week’s top hashtags to see if #kidsprivacy makes it to the top ten, unlikely as that seems.
• • •
As senior vice president of ESRB Privacy Certified (EPC), Stacy Feuer ensures that member companies in the video game and toy industries adopt and maintain lawful, transparent, and responsible data collection and privacy policies and practices for their websites, mobile apps, and online services. She oversees compliance with ESRB’s privacy certifications, including its “Kids Certified” seal, which is an approved Safe Harbor program under the Federal Trade Commission’s Children’s Online Privacy Protection Act (COPPA) Rule. She holds CIPP/US and CIPP/E certifications from the International Association of Privacy Professionals.
Probing the FTC’s COPPA Proposals: Updates to Kids’ Privacy Rule Follow Agency’s Focus on Technological Changes
With calls to strengthen kids’ online privacy and safety protections growing louder by the day, 2023 was supposed to be the year that Congress would pass new legislation. That didn’t happen. Enter the Federal Trade Commission (FTC).
The agency pursued several blockbuster children’s privacy enforcement actions in 2023, including two against video game companies, that resulted in hundreds of millions in fines and landmark legal remedies. Then, at the very end of the year, the agency issued long-awaited proposals for changes to the Children’s Online Privacy Protection Rule, a process it began in 2019.
The COPPA Rule, last updated in 2013, implements the Children’s Online Privacy Protection Act, which dates back even earlier — to 1999. Although the agency can’t change the Act itself (that’s Congress’ job), it can make far-reaching changes to the Rule. It’s still unclear what a final rule will look like and when (or whether) it will arrive, but the FTC’s cusp-of-the-year move means that 2024 will certainly be a consequential year for children’s privacy.
As a longstanding FTC-authorized COPPA Safe Harbor program, we follow the agency’s COPPA work closely. We’ve delved into the Notice of Proposed Rulemaking (NPRM) to understand what the NPRM will mean for our member video game and toy companies – and for the millions of kids and teens (and their parents) who play games. (Although the average age of a gamer is 32, 76% of people under the age of 18 play video games.) We plan to file a comment on the proposed rule changes within the 60-day comment period that will start to run once the NPRM is published in the Federal Register, most likely later this week.
Although we’re still considering our responses to the NPRM, we’re providing a summary of the most important provisions to spare you reading all 164 pages of the document. (LinkedIn estimated that it would take me 228 minutes to read the NPRM. Once. I’ve already read it multiple times.) So, if you don’t have four – or forty – hours to devote to COPPA Rule reform, read on. It shouldn’t take four hours, but this blog is on the longer side. For convenience, we’ve divided it into three categories: (1) Changes; (2) Emphasis; and (3) Status Quo.
CHANGES
First up, notable changes to definitions and substantive aspects of the Rule:
EMPHASIS
Beyond these proposed changes, it’s worth noting what is staying the same, but with more emphasis. Two issues stand out:
STATUS QUO
Finally, here’s what isn’t changing, at least not as part of the FTC’s rulemaking process:
As senior vice president of ESRB Privacy Certified (EPC), Stacy Feuer ensures that member companies in the video game and toy industries adopt and maintain lawful, transparent, and responsible data collection and privacy policies and practices for their websites, mobile apps, and online services. She oversees compliance with ESRB’s privacy certifications, including its “Kids Certified” seal, which is an approved Safe Harbor program under the Federal Trade Commission’s Children’s Online Privacy Protection Act (COPPA) Rule. She holds CIPP/US and CIPP/E certifications from the International Association of Privacy Professionals.
A New Season for Kids’ Privacy: Court Enjoins California’s Landmark Youth Privacy Law — Protecting Children Online Remains a Prime Concern
Since the beginning of September, we’ve seen the Irish Data Protection Commission issue a huge, €345 million ($367 million) fine against TikTok for using unfair design practices that violate kids’ privacy. Delaware’s governor just signed a new privacy law that bans profiling and targeted advertising for users under the age of 18 unless they opt in. And the Dutch data protection authority, just this week, announced an investigation into businesses’ use of generative AI in apps directed at young children.
As I was catching up with these matters yesterday, news broke that a federal district court judge in California had granted a preliminary injunction (“PI”) prohibiting the landmark California Age Appropriate Design Code Act (“CAADCA”) from going into effect on July 1, 2024. The judge ruled that the law violates the First Amendment’s free speech guarantees.
As ESRB Privacy Certified blog readers might recall, in September 2022, California enacted the CAADCA, establishing a far-reaching privacy framework that requires businesses to prioritize the “best interests of the child” when designing, developing, and providing online services. At the time, I wrote that the California law had the “potential to transform data privacy protections for children and teens in the United States.”
In particular, I pointed to the law’s coverage of children under the age of 18, its applicability to all online services “likely to be accessed by a minor,” and its requirement that businesses set default privacy settings that offer a “high level” of privacy protection (e.g., turning off geolocation and app tracking settings) unless the business can present a “compelling reason” that different settings are in the best interests of children. I also noted the Act’s provisions on age estimation/verification, data protection impact assessments (“DPIAs”), and data minimization as significant features.
In December 2022, tech industry organization NetChoice filed a lawsuit challenging the CAADCA on a wide range of constitutional and other grounds. In addition to a cluster of First Amendment arguments, NetChoice asserted that the Children’s Online Privacy Protection Act (“COPPA”), which is enforced primarily by the Federal Trade Commission (“FTC”), preempts the California law. The State of California, represented by the Office of the Attorney General, defended the law, arguing that the “Act operates well within constitutional parameters.”
Yesterday’s PI shifts the “atmospherics” of the kids’ privacy landscape dramatically. But the injunction doesn’t mean that businesses and privacy practitioners can ignore the underlying reasons for the CAADCA (which was passed overwhelmingly by the California legislature) or the practices and provisions it contains. Here’s a very rough analysis of the decision and some tips about what it might mean for your kids’ privacy program.
The Court’s Holding: In her 45-page written opinion, Judge Beth Labson Freeman held that “NetChoice has shown that it is likely to succeed on the merits of its argument that the provisions of the CAADCA intended to achieve [the purpose of protecting children when they are online] likely violates the First Amendment.” The Court held that the CAADCA is a regulation of protected expression, and not simply a regulation of non-expressive conduct, i.e., activity without a significant expressive element. Because she viewed the statute as implicating “commercial speech,” the Court analyzed the CAADCA under an “intermediate scrutiny standard of review.”
The Relevant Test: Under that standard (often referred to as the Central Hudson test based on the name of the Supreme Court case that formulated it), if the challenged regulation concerns lawful activity and speech that is not misleading, the government bears the burden of proving that (i) it has a “substantial interest” in the regulation advanced, (ii) that the regulation directly and materially advance the government’s substantial interest, and (iii) that the regulation is “narrowly tailored” to achieve that interest.
The Court recognized that California would likely succeed in establishing a substantial interest in protecting minors from harms to their physical and psychological well-being caused by lax data and privacy protections online. Reviewing the CAADCA’s specific provisions, however, it found that many of the provisions challenged by NetChoice did not meet the remaining prongs of the intermediate scrutiny test.
The Court’s Central Hudson Analysis: The Court made findings on each of the specific provisions challenged by NetChoice keyed to the Central Hudson factors. I highlight a few here:
COPPA Preemption: Although the Court granted the injunction based on First Amendment considerations alone, it did, briefly, address NetChoice’s argument that the COPPA preempts the CAADCA. The Court rejected this argument at the PI stage, explaining: “In the Court’s view, it is not clear that the cited provisions of the CAADCA contradict, rather than supplement, those of COPPA. Nor is it clear that the cited provisions of the CAADCA would stand as an obstacle to enforcement of COPPA. An online provider might well be able to comply with the provisions of both the CAADCA and COPPA . . . . “
Takeaways: The CAADCA litigation is far from over, and it is likely that the California Attorney General will seek an immediate interlocutory appeal. It is clear, though, that the district court’s decision will have consequences in the short term for state privacy laws that are scheduled to come into effect soon as well as for efforts underway in Congress on child-related online privacy and safety legislation. Here are a few takeaways:
While this all unfolds, ESRB Privacy Certified will continue to help its program members comply with existing laws and adopt and implement best practices for children’s privacy. As privacy protections for kids and teens continue to evolve, we’ll be following closely and providing guidance to our program members on all of the moving parts of the complex children’s privacy landscape. To learn more about ESRB Privacy Certified’s compliance and certification program, please visit our website, find us on LinkedIn, or contact us at privacy@esrb.org.
• • •
As senior vice president of ESRB Privacy Certified (EPC), Stacy Feuer ensures that member companies in the video game and toy industries adopt and maintain lawful, transparent, and responsible data collection and privacy policies and practices for their websites, mobile apps, and online services. She oversees compliance with ESRB’s privacy certifications, including its “Kids Certified” seal, which is an approved Safe Harbor program under the Federal Trade Commission’s Children’s Online Privacy Protection Act (COPPA) Rule.
COPPA Battlegrounds: The Quest to Uncover the Secrets of the FTC’s Kids’ Privacy Actions
So, for a little fun, we decided to create an imaginary video game – COPPA Battlegrounds. Join the ESRB Privacy Certified team as we dive deeply into the ongoing saga of the Federal Trade Commission’s kids’ privacy enforcement actions – cases that have resulted in hundreds of millions of dollars in fines and landmark legal remedies. Venture into new privacy territory, unlocking the mysteries of “personal information,” “privacy by default,” “data retention,” and more! Collect XPs as you explore strategies and best practices to protect young gamers’ privacy.
The “COPPA Controller”: The Federal Trade Commission (FTC) is the U.S. government agency charged with protecting consumers and competition. It is the chief federal agency that works to protect consumer privacy. Over the years, it has brought hundreds of privacy and data security cases to protect consumers and their data.
The “Digital Defendants”: Several well-known tech companies have been hit with FTC actions alleging violations of children’s privacy law in the past half year. Two – Epic Games and Microsoft Xbox – are popular video game publishers. Amazon, Meta, and edtech company Edmodo are also in the line-up.
The “Sword of COPPA”: The Children’s Online Privacy Protection Act of 1998 (COPPA) and its implementing COPPA Rule (updated in 2013) provide the FTC with a powerful weapon to protect the privacy of children under the age of 13. The law and rule (together, COPPA) require companies that offer services “directed to children,” or that have knowledge that kids under 13 are using their services, to provide notice of their data practices. They must also obtain verifiable parental consent (VPC) from parents before collecting personal information from children. COPPA also contains strong substantive protections, mandating that companies minimize the data they collect from children, honor parents’ data deletion requests, and implement strong security safeguards. To date, the FTC has brought nearly 40 COPPA enforcement actions.
The “Section 5 Superweapon”: The FTC’s true superweapon comes from Section 5 of the Federal Trade Commission Act, which prohibits unfair or deceptive practices in the marketplace. Since the advent of the internet, the FTC has used Section 5 to address a wide range of issues that affect people online, including the privacy of people purchasing and playing video games.
Policy Statement “Power-ups”: From time to time, the FTC releases policy statements that explain how the agency applies the laws it enforces. These potent statements put companies on notice that they will face legal action if they ignore the FTC’s prescriptions. In May, the FTC issued a Policy Statement on Biometric Information, which sets out a list of unfair practices relating to the collection and use of such data. Earlier, the FTC issued a Policy Statement on COPPA and EdTech that emphasized COPPA’s limits on companies’ ability to collect, use, and retain children’s data.
The FTC’s quest to secure a safer online environment for kids and their personal information has been ongoing since Congress passed COPPA in 1998. Previous blockbuster titles in the COPPA franchise include the FTC’s landmark 2019 settlements with Google/YouTube and Musical.ly/TikTok, as well as the 2018 VTech action.
COPPA has been extremely effective in giving parents information about and control over their kids’ data. There’s been an emerging consensus, however, that the legal framework for children’s privacy should be updated to include teenagers and meet the challenges of social media, mobility, ad tech, and immersive technologies – issues that weren’t present when Congress enacted the law 25 years ago. Despite the introduction of several bills in Congress to update COPPA, none have yet become law. The FTC therefore has proposed several new ideas to protect the privacy of not only children under the age of 13 but teens too. These are now playing out in the FTC’s enforcement actions.
During the past half year or so, the FTC has announced four new COPPA actions, plus an order against Meta/Facebook relating to a previous settlement. For video game companies, two stand out: the Epic Games/Fortnite settlement (see our earlier blog) and the Microsoft/Xbox Live settlement, announced in June. The FTC’s settlements with Amazon/Alexa and Edmodo also provide some clues to unlocking the secrets of the FTC’s COPPA enforcement mode. Consistent with ESRB Privacy Certified’s focus on privacy compliance in video games, we’ll focus our analysis on the two gaming cases. But we’ll add some insights from the NPCs (here, nonplayable “cases”), too.
Late last year, the FTC filed a two-count complaint and proposed settlement order against Epic Games. It alleged that Epic knew its massively popular game Fortnite was “directed to children” and unlawfully collected personal data from them without VPC. The FTC also charged Epic with violating the FTC Act by using unfair “on by default” voice and text chat settings that led to children and teens being bullied, threatened, and harassed within Fortnite. Epic settled with the FTC, agreeing to pay a $275 million civil penalty and to standard injunctive relief. (In the privacy area, this includes monitoring, reports, a comprehensive privacy plan, and regular, independent audits.) The final court Order entered in February also required Epic to implement privacy-protective default settings for children and teens. It also required the company to delete personal information previously collected from children in Fortnite unless the company obtains parental consent to retain such data or the user identifies as 13 or older.
In the beginning of June, the FTC filed a one-count complaint and proposed settlement order against Microsoft alleging that its Xbox Live online service violated COPPA in three ways: (i) by collecting personal information (i.e., email address, first and last name, date of birth, and phone number) from kids under 13 before notifying their parents and getting VPC; (ii) by failing to provide clear and complete information about its data practices in COPPA’s required notices, i.e., that it didn’t tell parents that it would disclose Xbox’s customer unique persistent identifier to third-party game and app developers; and (iii) by holding on to kids’ data for years even when parents did not complete the account creation process.
Microsoft, which has long had a comprehensive privacy program, settled with the FTC for $20 million. It agreed to implement new business practices to increase privacy protections for Xbox users under 13. For example, the Order requires Microsoft to tell parents that a separate child account will provide significant privacy protections for their child by default. The company also must maintain a system to delete, within two weeks from the collection date, all personal information collected from kids for the purpose of obtaining parental consent. In addition, Microsoft must honor COPPA’s data deletion requirements by deleting all other personal data collected from children after it no longer needs it for the purpose collected.
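To make the two-week deletion requirement concrete, here is a rough sketch of the kind of time-boxed purge system the order contemplates. This is our illustration under stated assumptions, not Microsoft’s actual implementation; the `PendingConsentStore` interface and every name in it are hypothetical.

```ts
// A rough sketch of a scheduled purge job for the two-week deletion
// requirement described above. All names here are hypothetical.
interface PendingConsentRecord {
  id: string;
  collectedAt: Date; // when the child's sign-up data was first collected
}

interface PendingConsentStore {
  listCollectedBefore(cutoff: Date): Promise<PendingConsentRecord[]>;
  deleteRecord(id: string): Promise<void>;
}

const TWO_WEEKS_MS = 14 * 24 * 60 * 60 * 1000;

// Run on a schedule (e.g., daily): delete any sign-up data older than two
// weeks where the parent never completed the consent process.
async function purgeUnconsentedSignups(store: PendingConsentStore): Promise<void> {
  const cutoff = new Date(Date.now() - TWO_WEEKS_MS);
  for (const record of await store.listCollectedBefore(cutoff)) {
    await store.deleteRecord(record.id);
  }
}
```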
Beyond the allegations and remedies of the enforcement actions, there’s a wealth of information about the FTC’s kids’ privacy priorities and practices you might want to adopt – or avoid – if you want to stay out of the sights of the COPPA Controller. Here are COPPA Battlegrounds’ seven lessons for COPPA compliance based on the FTC’s recent kids’ privacy actions:
1. Sequence your game play to obtain VPC before you collect ANY personal information from a child: The FTC’s complaint in the Xbox action emphasized that – even though Microsoft had a VPC program in place – it violated COPPA by not obtaining parental consent before it collected any personal information from kids besides their date of birth. Xbox did require children to involve their parents in the registration process, but the FTC found that Microsoft’s initial collection of kids’ email addresses, their first and last name, and phone number before obtaining consent violated COPPA’s VPC requirements. The FTC also blasted Microsoft for requiring kids to agree to the company’s service agreement, which, until 2019, included a pre-checked box allowing Microsoft to send them promotional messages and to share user data with advertisers. The FTC’s approach indicates that they will look closely at companies’ verifiable parental consent sequences, and that they will strictly enforce COPPA’s prohibition on collecting any personal information before obtaining VPC (unless an exception to VPC exists).
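As a rough illustration of that sequencing, here is a sketch with hypothetical helper names (not any company’s actual flow): collect only a date of birth at the gate, then only a parent’s contact email to seek consent, and nothing else until consent is verified.

```ts
// A sketch of VPC sequencing with hypothetical helpers: nothing beyond a
// date of birth is collected before the age gate, and nothing beyond the
// parent's contact email is collected before consent is verified.
function isUnder13(dateOfBirth: string): boolean {
  const ageMs = Date.now() - new Date(dateOfBirth).getTime();
  return ageMs / (365.25 * 24 * 60 * 60 * 1000) < 13;
}

async function sendDirectNotice(parentEmail: string): Promise<void> {
  // Email the COPPA direct notice: what will be collected, how it will be
  // used, and how the parent can give (or refuse) consent.
}

async function consentVerified(parentEmail: string): Promise<boolean> {
  // Check whether the parent completed a verifiable consent method.
  return false; // stub
}

async function deletePendingData(parentEmail: string): Promise<void> {
  // Remove the parent's contact info if consent never arrives.
}

async function signUpChild(dateOfBirth: string, parentEmail: string): Promise<void> {
  if (!isUnder13(dateOfBirth)) return; // regular 13+ flow happens elsewhere

  // Before consent, collect ONLY the parent's online contact information,
  // which the COPPA Rule permits for the purpose of seeking consent.
  await sendDirectNotice(parentEmail);

  if (await consentVerified(parentEmail)) {
    // Only now collect the remaining account data (name, email, etc.).
  } else {
    await deletePendingData(parentEmail);
  }
}
```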
2. The FTC views COPPA’s “actual knowledge” standard broadly and so should you: When the FTC announced its Epic Games settlement, we reminded companies that you can’t disclaim COPPA by declaring that you don’t process children’s information or by ignoring evidence that children are playing your games. Now, with the Xbox Live settlement, the FTC has affirmed that it will enforce COPPA against any company with “actual knowledge” that the company is handling children’s personal information, regardless of whether that company has directed its service to children intentionally. Significantly, the settlement requires Microsoft – when it discloses personal information about children to other video game publishers – to tell them that the user is a child. The FTC’s requirement for Microsoft to share information about children on its platform with third parties is a game-changing move. In the FTC’s words, “[I]t will put [third-party] publishers on notice that they, too, must apply COPPA protections to that child.”
3. Your COPPA notices must be clear, understandable, and complete: The FTC emphasized that it’s not enough under COPPA’s notice provisions to summarize your collection, use, and disclosure practices generally. Instead, your direct notice must be complete. The FTC faulted Microsoft for failing to tell parents about its collection of personal information children shared through their profile or Xbox Live usage, such as their “gamertags,” photos, which kids used to create avatars, and voice recordings from video messages. The agency also alleged that Microsoft’s notice failed to inform parents that it created persistent identifiers for children, which it combined with other information, and shared with third-party game and app developers. Going forward, it’s important for companies to specify, in a clear and complete way, their practices in the notices required by COPPA, and not just provide parents with a link to a densely worded privacy policy.
4. Privacy by default is not a fad: In Epic Games, the FTC focused for the first time not just on “privacy by design” but on “privacy by default,” finding that Epic did not have “privacy-protective” default settings in Fortnite that limited kids’ contact with strangers and otherwise protected their privacy. The FTC went further in Xbox Live, emphasizing that, even though Xbox had default settings that only allowed a child to disclose their activity feed or otherwise communicate with parent-approved “friends,” Microsoft configured other defaults in a way that did not protect children sufficiently. As the FTC emphasized in a blog about the Amazon case, “[C]ompanies that ignore consumers’ rights to control their data do so at their peril . . . The upshot is clear: Any company that undermines consumer control of their data can face FTC enforcement action.”
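What might “privacy by default” look like in practice? Here is a minimal sketch of default settings for an under-13 account. The setting names are hypothetical, drawn from the kinds of features discussed in the complaints, not from any order’s actual text.

```ts
// Hypothetical privacy-protective defaults for an under-13 account. The
// point from Epic and Xbox Live: the protective value is the default, and
// loosening it should take a deliberate, parent-approved change.
interface ChildAccountSettings {
  voiceChat: "off" | "friendsOnly" | "everyone";
  textChat: "off" | "friendsOnly" | "everyone";
  profileVisibility: "private" | "friendsOnly" | "public";
  activityFeedSharing: "off" | "parentApprovedFriends" | "everyone";
  personalizedAds: boolean;
}

const CHILD_DEFAULTS: ChildAccountSettings = {
  voiceChat: "off",
  textChat: "off",
  profileVisibility: "private",
  activityFeedSharing: "parentApprovedFriends",
  personalizedAds: false,
};
```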
5. Take your data minimization and retention/deletion obligations seriously: The FTC’s recent cases also highlight COPPA’s substantive provisions on data minimization and data retention. The COPPA Rule prohibits conditioning a child’s participation in a game on the child “disclosing more personal information than is reasonably necessary to participate in such activity” and allows companies to keep it “for only as long as is reasonably necessary to fulfill the purpose for which the information was collected.” In the Edmodo complaint, for example, the agency said that Edmodo violated COPPA by using the personal information it collected for advertising instead of limiting it to educational purposes.
In the Xbox Live case, the agency chided Xbox for holding onto kids’ data when the parental verification process was incomplete, sometimes for years. Although Microsoft described this as a “technical glitch,” and explained that this data “was never used, shared, or monetized,” the FTC doubled down on its concerns with company data retention practices that violate COPPA. Indeed, in the Amazon Alexa case, the FTC charged that Amazon made it difficult for parents to exercise their right, under COPPA, to delete their children’s voice recording data. It further alleged that Amazon disregarded parents’ deletion requests, retained kids’ voice recordings indefinitely, and misled parents about its data deletion practices (e.g., by retaining copies of transcripts of voice recordings). The FTC is wielding the “Sword of COPPA” to press for meaningful data minimization, purpose limitation, and data retention/deletion practices.
6. Be especially careful when dealing with kids’ biometric data, algorithms, and machine learning: The FTC’s Xbox Live settlement covers biometric information like avatars generated from a child’s image and emphasizes COPPA’s strict limitations on the retention of this type of data from kids. In the Amazon case, the agency was clearly troubled by Amazon’s indefinite retention of kids’ voice recordings, which count as biometric info. One of the FTC Commissioners emphasized this point, stating that “Claims from businesses that data must be indefinitely retained to improve algorithms do not override legal bans on indefinite retention of data.” Consider yourself warned!
7. Privacy Innovation Can Help You Comply with COPPA: Not all the privacy-protective action in COPPA Battlegrounds comes from the FTC. Even before the settlement, Epic Games announced that it was creating “Cabined Accounts” to provide safe, tailored experiences for younger players. Following the FTC’s action, Microsoft unveiled its plans to test “next-generation identity and age validation” methods to create a “convenient, secure, one-time process for all players that will allow us to better deliver customized, safe, age-appropriate experiences.” Xbox explained that the entire games industry can benefit from advancing safe and innovative digital experiences that are accessible, simple to use, and benefit all players. We agree! Many ESRB Privacy Certified members are developing new strategies and tools to enhance kids’ privacy. Achievement unlocked!
Congratulations on completing the breakout version of COPPA Battlegrounds! You can now take your kids’ privacy program to the next level. Contact us at privacy@esrb.org if you’d like to discuss how your company can prevail in COPPA Battlegrounds – and its inevitable sequels.
As senior vice president of ESRB Privacy Certified (EPC), Stacy Feuer ensures that member companies in the video game and toy industries adopt and maintain lawful, transparent, and responsible data collection and privacy policies and practices. She oversees compliance with ESRB’s privacy certifications, including its “Kids Certified” seal, which is an approved Safe Harbor program under the Federal Trade Commission’s Children’s Online Privacy Protection Act (COPPA) Rule, and the general “Privacy Certified” seal.
Wrapping Up 2022 with A Huge (Epic) Fortnite Privacy Case
The “s” in settlements is not a typo: On Monday, the FTC announced two separate enforcement actions against Epic. Consistent with ESRB Privacy Certified’s focus on privacy compliance, though, we’ll limit our analysis to the FTC’s privacy-related case. In short, the FTC (represented by the Department of Justice) filed a two-count Complaint and a Stipulated Order in federal court alleging that Epic violated the Children’s Online Privacy Protection Act (COPPA) and the related COPPA Rule. COPPA protects the personal information of children under the age of 13. The FTC asserted that Epic knew that Fortnite was “directed to children” and unlawfully collected personal data from them without verifiable parental consent (VPC).
The FTC also charged Epic with violating the FTC Act, which prohibits unfair and deceptive practices, by using unfair “on by default” voice and text chat settings in Fortnite that led to children and teens being bullied, threatened, and harassed within the game, including sexually. It charged that Epic’s privacy and parental controls did not meaningfully alleviate these harms or empower players to avoid them. If approved, this settlement will require Epic to pay $275 million in civil penalties. (The other $245 million is for the other case and is allotted for consumer refunds.)
Apart from the epic fine, the Fortnite action provides insight into the FTC’s thinking on children’s and teens’ privacy. Here are seven takeaways from a case that will likely reverberate far past the New Year:
Following the FTC’s announcement, Epic explained that it had accepted the settlement agreements “because we want Epic to be at the forefront of consumer protection and provide the best experience for our players.” It set out – as a “helpful guide” to the industry – principles, policies, and recommendations that the company has instituted over the past few years to protect its players and meet regulators’ expectations globally. On the children’s privacy front, Epic recommended that game developers “proactively create age-appropriate ways for players to enjoy their games” – advice that mirrors our own. Maybe we can tie that up with a ribbon!
* * * * *
Wishing you and your loved ones a joyful and relaxing holiday season without any more blockbuster FTC announcements until 2023!
As senior vice president of ESRB Privacy Certified (EPC), Stacy Feuer ensures that member companies in the video game and toy industries adopt and maintain lawful, transparent, and responsible data collection and privacy policies and practices for their websites, mobile apps, and online services. She oversees compliance with ESRB’s privacy certifications, including its “Kids Certified” seal, which is an approved Safe Harbor program under the Federal Trade Commission’s Children’s Online Privacy Protection Act (COPPA) Rule.
What Parents Need to Know About Privacy in Mobile Games: Communicate with Your Kids
Our first four tips are privacy-specific while this last one applies to many parenting challenges: Communicate with your kids! Talk with them about what they should know and can do to protect their privacy online. If your kids are young, you can tell them to come to you or simply say no to all in-game requests for information. If your children are older, you can teach them how to use privacy settings and permissions.
You can also educate them in an age-appropriate way about the consequences of sharing too much personal information in a game. These can range from compromising the security of online accounts to attracting cyberbullies to damaging their personal reputation. Let them know that they can come talk to you if they’ve posted something online that they later realize is too personal (you can help them get it deleted) or if they’re receiving inappropriate advertisements, messages, or other communications. (You can report inappropriate ads to Apple and Google.)
Make sure your kids know they can turn to you for help in protecting their personal data and preferences, and that you know where to find answers and advice.
Sometimes, in a rush to play a game, your child might simply click “yes” on permissions, or even falsify their age. But when they understand how their personal data and preferences may be used – or, more importantly, misused – most kids will become more interested in managing their own privacy online. Make sure they know they can turn to you for help, and that you know where to find answers and advice.
Protecting your kids’ privacy in mobile games may sound overwhelming, but the benefits of playing games far outweigh the risks. Our tips – together with ESRB’s Family Gaming Guide and our “What Parents Need to Know” blogs – can help you protect your kids’ privacy online.
• • •
If you have more questions about kids’ privacy in mobile apps or you want to learn more about our program, please reach out to us through our contact page. Be sure to follow us on LinkedIn for more privacy-related updates.
• • •
As senior vice president of ESRB Privacy Certified (EPC), Stacy Feuer ensures that member companies in the video game and toy industries adopt and maintain lawful, transparent, and responsible data collection and privacy policies and practices for their websites, mobile apps, and online services. She oversees compliance with ESRB’s privacy certifications, including its “Kids Certified” seal, which is an approved Safe Harbor program under the Federal Trade Commission’s Children’s Online Privacy Protection Act (COPPA) Rule.
What Parents Need to Know About Privacy in Mobile Games: Don’t Let Your Children Lie About Their Ages
It’s important that your child uses an accurate birthdate or age when signing up for a new game or mobile app. When companies know that children under the age of 13 are playing their games, they are required by law to follow the federal Children’s Online Privacy Protection Act (COPPA). COPPA and its associated Rule issued by the Federal Trade Commission (FTC) give parents control over what information companies can collect from kids under 13 years of age through their websites, apps, and other online services, including mobile games. Under COPPA, companies with games, apps, and other services “directed to children,” or that know that kids under 13 are using their games, must:
Under COPPA, a game company can’t condition participation in a game on a child disclosing more information than is necessary. They’re also prohibited from using information for commercial purposes such as targeted marketing and advertising that are unrelated to gameplay. This is part of why it’s so important to make sure you or your kid enters an accurate birthdate or age when signing up for a new game!
Make sure your children enter their ages accurately so they can benefit from legal protections tailored to protect kids’ personal information.
Beyond COPPA, recently enacted privacy laws in states like California, Colorado, Connecticut, Utah, and Virginia give kids and their parents additional privacy rights. Some extend certain privacy rights to teens. For example, several of these state laws prohibit companies from selling or sharing teenagers’ (typically ages 13-16) personal information without their consent or the consent of their parent or guardian. You can ask that a mobile game company not sell or share your child’s information by making a request using a form or email address available from the company’s app or website. Other laws, such as California’s recently passed Age Appropriate Design Code Act, require companies to set privacy controls in games and other products to the most-protective level for all users under the age of 18.
Companies that don’t follow these rules can get in a lot of trouble. The FTC and state law enforcers have slammed mobile game companies that failed to comply with COPPA with large fines and other penalties. And more enforcement is likely on the way. Along with our other tips, making sure that your children enter their ages accurately will help ensure that they benefit from legal privacy protections tailored for kids and teens.
Remember to check back tomorrow for our next tip!
Click here to continue to the final tip: Communicate with Your Kids.
The post What Parents Need to Know About Privacy in Mobile Games: Don’t Let Your Children Lie About Their Ages appeared first on ESRB Ratings.
The post What Parents Need to Know About Privacy in Mobile Games: Look for the ESRB Privacy Certified Seal appeared first on ESRB Ratings.
You’re probably already familiar with ESRB’s content ratings for video games and apps, but did you know that ESRB also has special icons certifying a company’s compliance with ESRB’s privacy requirements? ESRB Privacy Certified is a membership-based program that works mostly with companies in the toy and video game industries. We review our members’ products for compliance with federal and state privacy laws, including the federal Children’s Online Privacy Protection Act (COPPA), as well as global rules, platform standards, and best practices.
We have two seals: (1) the ESRB Privacy Certified Kids Seal, which covers games directed or targeted to children, and (2) the ESRB Privacy Certified Seal for games that are not primarily directed to and do not target children.
The Kids Seal links to a confirmation page on the ESRB website, which verifies that a company is a member of our program, shows the seal(s) the member is approved to use, and provides a link to the member’s online privacy policy. The Federal Trade Commission (FTC), the United States’ leading privacy agency, has approved our Kids Seal requirements, and we provide the FTC with a confidential annual report detailing our compliance work with our members on children’s privacy.
When you see one of our seals in a mobile app (often in or near the app’s privacy policy), you can be assured that we’ve reviewed the company’s privacy practices and policies thoroughly. We conduct an initial assessment to make sure the company’s product complies with applicable laws and the company’s actual practices are described accurately and fully. We also conduct two comprehensive reviews annually of each participant’s policies, practices, and products to help members remain compliant.
You can also find ESRB’s Privacy Certified seals on websites and connected toys. For more information about ESRB Privacy Certified, check out our website and blog. You can also follow us on LinkedIn and Twitter.
Click here to continue to Tip #4: Don’t Let Your Children Lie About Their Ages.
The post What Parents Need to Know About Privacy in Mobile Games: Look for the ESRB Privacy Certified Seal appeared first on ESRB Ratings.
The post What Parents Need to Know About Privacy in Mobile Games: Use Parental Controls and Permissions appeared first on ESRB Ratings.
You can set privacy controls and permissions for the mobile games and apps your children play and download, just like the parental controls you use on your kids’ video game consoles. Some game companies let you enable privacy settings (such as limiting which players can see your child’s game activity) that restrict features that would otherwise share identifying information.
Both Apple and Google also have settings for families that help you protect your children’s privacy by allowing you to restrict sharing of your child’s location and block targeted advertising. Even if you block targeted ads, your child may still receive contextual advertising. Although you can’t stop advertising completely, you can limit inappropriate ads by buying the paid version of the game (if available), putting your child’s phone in airplane mode for simple games that don’t require an online connection, or using a third-party ad blocker.
You can also take advantage of family controls and other privacy-protective features that the app stores offer, like Apple’s “Families” settings and Google’s “Family Link.” You can access Apple’s complete controls here and Google’s here.
You can also teach your kids to check with you, or simply say no, when permission requests pop up in-game asking for data. That includes requests for your precise location, the contacts in your phone, pictures, or anything else that could identify you or your child. Explain that they should always decline permissions that ask for access to anything involving health, money, or changes to the phone’s hardware.
One of the most publicized developments is Apple’s App Tracking Transparency feature. If an app collects users’ data to track them across other companies’ apps and websites, the developer must notify you and receive your permission before it can track and share your activity. Apple activates this feature automatically if your children have an iPhone set up with a child account, and you can also toggle a setting so that these requests are denied automatically.
Late last year, Google introduced a similar feature that prevents apps from collecting a user’s advertising ID, which is used for ad tracking, when the user has opted out of personalized ads on Android 12. Google also won’t allow developers to transmit the advertising ID of children in apps targeting children, such as those in its Designed for Families program. If you opt out of tracking, your kids will still be able to play most games, although some features might not be available. Additionally, Google announced in August that it would block ad targeting based on the age, gender, or interests of users under 18, and turn off Location History for users under 18. Google plans to “start rolling out these updates across our products globally over the coming months,” so we are hopeful that they’ll implement this fully by the end of 2022.
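If you’re curious how an app actually sees that opt-out, here’s a small sketch using the Google Play services ads-identifier API. It assumes the play-services-ads-identifier library is present, and it’s an illustration rather than any specific game’s code; on Android 12 and later, an opted-out user’s advertising ID is returned as all zeros:

```kotlin
import android.content.Context
import com.google.android.gms.ads.identifier.AdvertisingIdClient

// Call this off the main thread: getAdvertisingIdInfo does blocking I/O
// and can throw if Google Play services is missing or out of date.
fun isOptedOutOfPersonalizedAds(context: Context): Boolean {
    val info = AdvertisingIdClient.getAdvertisingIdInfo(context)
    // The "limit ad tracking" flag and the zeroed-out ID both signal the opt-out.
    return info.isLimitAdTrackingEnabled ||
        info.id == "00000000-0000-0000-0000-000000000000"
}
```

A game that respects the opt-out would check this before initializing any personalized-ad SDK.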
You can also just say no. Before you grant permissions to any new app your kid is using or wants to download, review which permissions the app requires. See if they correspond to gameplay. For example, if a simple alphabet game for preschoolers wants access to your phone number or contact list, just say no. Your kid may still be able to play the game, although the developer may limit its functionality. You can also look for a privacy-friendly alternative.
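There’s no parent-facing tool for this, but if you’re comfortable with a little code, Android’s standard PackageManager API exposes exactly what an installed app declares. This sketch simply lists those permissions so you can judge whether each one corresponds to gameplay; the Play Store’s app listing surfaces roughly the same information if you’d rather not go near a compiler:

```kotlin
import android.content.Context
import android.content.pm.PackageManager

// Returns every permission the app's manifest declares, e.g.
// android.permission.ACCESS_FINE_LOCATION or android.permission.READ_CONTACTS.
fun requestedPermissions(context: Context, packageName: String): List<String> {
    val info = context.packageManager.getPackageInfo(
        packageName,
        PackageManager.GET_PERMISSIONS
    )
    return info.requestedPermissions?.toList() ?: emptyList()
}
```

A long list of contacts, camera, and location permissions on a simple alphabet game is the kind of mismatch worth saying no to.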
Take some time to explore privacy controls and permissions that make sense for you and your family. And remember to review and update them periodically as game companies roll out new features and your kids mature and change.
Click here to continue to Tip #3: Look for the ESRB Privacy Certified Seal.
The post What Parents Need to Know About Privacy in Mobile Games: Use Parental Controls and Permissions appeared first on ESRB Ratings.
The post What Parents Need to Know About Privacy in Mobile Games: Look for Privacy Labels and Policies appeared first on ESRB Ratings.
Protecting your children’s privacy starts with knowing what their games collect. Whether your children are using your device or their own, it likely contains lots of sensitive information like personal contacts, location data, photos, and browsing history. And, like most app developers, many mobile game companies make money by selling data and serving ads to their users. That’s especially the case for free-to-play games that don’t cost anything up front.
Whether they’re free or not, mobile games and apps use tracking technologies to collect information from and about players. Developers use gameplay data to improve their games, customize the experience, measure progress, iron out bugs, serve personalized ads, detect cheats, comply with laws, and more. To do so, they collect all sorts of data – everything from your child’s birthday and location to what in-game purchases they made.
There are, of course, laws that govern the collection and use of kids’ information, and app stores and game developers offer many privacy tools for both children and adults. Even so, it can be hard for parents and caregivers to navigate the maze of privacy laws, settings, and features. So, here are some tips to help protect your children’s privacy.
Tip #1: Look for Privacy Labels and Policies
Most of the games your children play will be ones you or they have downloaded from either the Google Play Store (Android) or the Apple App Store (iOS). As you’ve heard before, you should always check for age and content ratings before your children play games to ensure that the game is age appropriate. ESRB ratings are displayed for all games in the Google Play store.
What you might not know is that the app stores have also introduced “labels” for privacy, modeled on the nutrition labels on products in the grocery store. Instead of information about calories and nutrients, they contain information about a game’s privacy policy and data collection practices. Apple’s “Privacy Information” labels and Google’s “Data Safety” labels differ somewhat, but both link to the game’s full privacy policy, explain the types of data the game collects, what the data will be used for, and whether the game shares information with third parties. The labels aim to be user-friendly and written in plain English, but they can still be difficult to understand. When looking at a privacy label in Google Play or the Apple App Store, focus on a few things: what types of data the game collects, whether any of it is sensitive (like precise location), whether the data is used for advertising, and whether it is shared with third parties or linked to your child’s identity.
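Neither store exposes these labels through a public API; they’re store-listing information you read by eye. But as a mental model, each label boils down to a small record with a handful of fields worth scanning. Here’s a hypothetical sketch (the field names are illustrative, not Apple’s or Google’s actual schema):

```kotlin
// Hypothetical model of a store privacy label, for illustration only.
data class PrivacyLabel(
    val dataTypesCollected: List<String>, // e.g. "Precise location", "Contacts"
    val purposes: List<String>,           // e.g. "Advertising or marketing"
    val sharedWithThirdParties: Boolean,
    val linkedToIdentity: Boolean
)

// The points a parent might scan for first.
fun redFlagsForKids(label: PrivacyLabel): List<String> = buildList {
    if (label.sharedWithThirdParties) add("shares data with third parties")
    if ("Advertising or marketing" in label.purposes) add("uses data for advertising")
    if ("Precise location" in label.dataTypesCollected) add("collects precise location")
    if (label.linkedToIdentity) add("links data to your child's identity")
}
```

The fewer of these flags a game raises, the less there is to worry about before handing over the tablet.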
Of course, the labels aren’t perfect. Not all game developers have posted privacy labels, Google and Apple don’t verify companies’ self-reported information, and some labels contain errors, usually unintentional ones caused by developer mistakes or misunderstandings. But they are an important starting point for understanding what kind of data a game collects from its users (and your kids specifically) and how companies use and share that information.
Another source of information is a company’s privacy policy. Many mobile game companies now have simple short-form policies or dashboards summarizing key privacy facts such as what information is gathered in a game, where it goes, how it gets used, and whom to contact if you have a problem or question. You can view an example of a short form privacy policy in the ESRB Rating Search app (Android | iOS). And if you want more information, you can always review a company’s full-length privacy policy, which provides much greater detail.
It’s best to look at a game’s privacy label and privacy policy before your kids start playing. If you want more information afterwards, you can check out Google’s Privacy Dashboard on your Android device to see which apps accessed your child’s data and when. You can also check out the App Privacy Report in your iOS device’s settings to see how often your child’s location, photos, camera, microphone, and contacts have been accessed during the last seven days. Together with the privacy labels, these features can give you a more complete picture of how the apps your children use treat their privacy.
Almost all video game apps have to collect some personal information to function. But if you want to minimize the amount of personal data that is collected, used, and shared about your child, you can look for games that make clear that they won’t use kids’ personal information for marketing or online advertising, won’t employ third-party tracking that could directly identify a child, and won’t collect or share precise location information. The privacy labels and other features offered by the app stores, such as family programs, can help you figure that out.
Click here to continue to Tip #2: Use Parental Controls and Permissions.
The post What Parents Need to Know About Privacy in Mobile Games: Look for Privacy Labels and Policies appeared first on ESRB Ratings.