Kathryn Montgomery and Jeff Chester
Creating a Quality Digital Media Culture in the Big Data Era

In September 2019, the U.S. Federal Trade Commission (FTC) and the New York Attorney General fined Google $170 million for the failure of its YouTube service to comply with the Children’s Online Privacy Protection Act, commonly known as COPPA. We spearheaded the national campaign that led to passage of COPPA during the 1990s. The law requires commercial websites and other digital media that target children under 13 to limit the collection of personal information; mandates a mechanism for parental involvement; and obligates companies to minimize the data they collect and to ensure its security. In 2013, we successfully convinced the FTC to update its COPPA regulations to address contemporary and emerging practices. The new rules include restrictions on the use of “cookies” and other “persistent identifiers” that enable behavioral targeting, personalized advertising, and location-based marketing.

In 2018, our coalition of privacy, consumer-protection, and child-advocacy groups filed a complaint with the FTC against Google. Its YouTube platform, launched in 2005, quickly became the number one online destination for children in the U.S., and a boon to advertisers seeking to cash in on this market. Yet, even as it unleashed a growing torrent of programming and marketing designed to appeal to kids, the tech giant disingenuously claimed that YouTube was intended only for those aged 13 and older. This cynical behavior sent a message that any powerful and well-connected corporation could ignore U.S. privacy law, even when that law was specifically designed to protect young people.

In its landmark settlement agreement with regulators, Google has now promised to make a number of changes to YouTube’s business practices, which will affect both its U.S. and global operations. As of January 2020, YouTube no longer allows personalized, “behavioral” marketing on programming that targets children. In order to trigger these new digital marketing safeguards, Google requires video producers and distributors to self-identify that their content is aimed at kids. It has also committed to “use machine learning to find videos that clearly target young audiences, for example those that have an emphasis on kids characters, themes, toys, or games” to supplement the information received from YouTube content creators. Google has also announced that it will apply marketing and other safeguards currently in place on its YouTube Kids app to all child-directed content on its main YouTube platform. These policies include banning not only ads that feature “sexually suggestive, violent or dangerous content,” but also all food and beverage advertising.

In addition to these internal policy changes on its main platform, the company has committed to make substantial investments in its YouTube Kids service. YouTube Kids was initially launched in 2015 as a separate app designed exclusively for young children. But the app never rivaled the main YouTube platform’s hold on children, and was plagued with a number of problems (including exposing kids to indecent and other harmful content). Now, as a result of the FTC investigation, Google has announced that it will bring “the YouTube Kids experience to the desktop,” increase its promotion of the service to parents, and more effectively curate programming that will appeal to a broader range of young people—with new tiers of content suitable for “Preschool (ages 4 & under); Younger (ages 5-7); and Older (ages 8-12).” Google has also created a $100 million fund for a three-year program designed for “the creation of thoughtful, original children’s content on YouTube and YouTube Kids globally.”

It remains to be seen how well these promised changes will be implemented, and whether the quality of content for children on YouTube will improve. The FTC is also now conducting an unusual early review of its rules implementing COPPA, which Google and other digital media companies may see as an opportunity to significantly weaken the law’s enforcement. For the growing number of commercial companies seeking to generate revenues from the expanding and highly lucrative children’s digital media marketplace, privacy and data protection policies such as COPPA present an obstacle to the kind of friction-free online marketplace they have perfected.

The Google settlement comes at a time when the media system is at a critical crossroads. The digital media and advertising technology (“ad-tech”) industry—led principally by platforms controlled by Google, Facebook, and now Amazon—has fueled the development of a complex, far-reaching, global media, marketing, and sales apparatus. Today, digital marketing utilizes technologies that track and analyze our every move on every device, whether we are at home, at school, at work, or in the store. Vast databases filled with personal details on our lives can be accessed in the “cloud” in seconds, and fed into a digital profile containing ever-more-detailed information about us. These profiles are continually updated and regularly sold or given to powerful “programmatic” advertising engines that enable marketers to buy and sell us in milliseconds. An expanding array of innovative techniques and applications use artificial intelligence, machine learning, neuromarketing, virtual reality, branded entertainment, influencer marketing, and more to both predict and influence our behaviors—including whom we choose to vote for in elections. Our dependence on—or addiction to—the digital world ensures that a torrent of personal and other information continually flows into the databases of Google, Facebook, leading brands, device manufacturers, mobile app developers, marketing data clouds, and others.

Children and adolescents are at the epicenter of these developments. Their role as “early adopters” with deep connections to digital media from their youngest ages, combined with their spending power and their ability to influence family expenditures, has made them a key target for tech companies and brands. The children’s digital marketplace is booming worldwide, fueled by an explosion of new technologies that are swiftly moving into every aspect of young people’s daily experiences. While all of these innovations create opportunities for enhancing children’s lives, they are also part of a powerful and growing Big Data system with far-reaching implications for safety, privacy, and health. For example, the internet-connected toy market is expected to reach $25 billion in the next five years, with more and more products designed to react to a child’s behavior in real time and “grow” with them as they become older. Many of these new products have serious security flaws: their voice-recognition software monitors not only the individual child user but can also connect to playmates and others, and the sensitive personal information it collects can be shared with third parties. “Smart speakers,” such as Amazon’s Alexa, and the emerging business based on “voice search” also gather extensive amounts of “home life data” from family interactions and activities, raising serious privacy and security issues for children and their families. New streaming-video services, many of which target children, are engaged in the same kinds of data-gathering practices pioneered by online platforms and apps. Virtual reality and AI are among the latest tools used by the food industry to promote unhealthy products to young people across multiple digital platforms. And an explosion of new messaging apps and online video gaming platforms threatens to increase young people’s exposure to sexual exploitation, violent and hateful content, and cyberbullying.

While policies such as COPPA and the EU’s General Data Protection Regulation (GDPR) offer some privacy safeguards for children, their protections are limited, particularly in the face of this rapidly emerging, next-generation, highly commercialized media culture, in which young people serve as a generation of digital “guinea pigs” for perfecting an always-on, aware, and interactive digital system whose principal goals are monetization and mass personalized influence.

We believe it is time for civil society, child advocates, educators, consumer groups, industry, parents, and policy makers to build on this work and forge an international movement on behalf of young people in the Big Data era. Children and teens must be guaranteed the right to grow up in a digital media environment that supports their healthy development, fosters personal and collective growth, promotes cooperation and harmony, and strives to engender democratic values. Our collective efforts should build on several global initiatives currently underway, including the United Nations’ current review of its Convention on the Rights of the Child, which aims to incorporate policies addressing the rights children should have in the digital era, and UNICEF’s work promoting such issues as the internet’s impact on children’s health and privacy. In the U.S., the EU, and elsewhere, we should work together to strengthen privacy and data protection legal frameworks for children. The GDPR should be better enforced when it comes to minors, and the U.S. should legislate an update of COPPA. Protections should also be extended beyond the youngest children to include adolescents, giving them greater control over how their information can be gathered and used. Regulatory policies must address the ways that digital marketing impacts the lives of youth, so that unfair practices—such as paid influencer marketing—are not permitted to target children directly.

Such an intervention is particularly timely, as the technology industry finds itself under unprecedented public and government scrutiny worldwide. The controversy over how Facebook and Cambridge Analytica used data gathering, analytics, and targeting tools to spread misinformation and manipulate voters has spurred numerous hearings and legislative initiatives in Congress and in the European Union. Amid calls for “platform accountability” from civil rights organizations and other groups concerned about racial, economic, and health justice, tech companies have been forced to make significant adjustments to their internal content moderation and advertising policies. There are ongoing investigations of Google, Facebook, and Amazon by antitrust agencies in the U.S. and abroad. Federal and state privacy legislation, either enacted or proposed, as well as the 2018 implementation of the EU’s landmark data protection law, is forcing companies and the digital industry to begin making changes to their business practices, including those that impact children.

The tech industry is also experiencing pressures from within its own ranks, as leading advertisers have mobilized to institute new codes of conduct and other related “brand-safety” regimes, designed to ensure that their ads do not appear alongside hate speech, fake news, and other inappropriate content. Facebook, Google, and other major platforms and publishers are being required to revise their business practices to ensure the interests of their most important global marketing partners are taken into account. Advocates should take advantage of these developments to ensure that concerns about the ways marketers and platforms manipulate and otherwise potentially harm young people are part of the brand-safety debate. Global marketers should be pressed to adopt new codes of conduct that require them to engage in more responsible practices when it comes to children.

As we move into the third decade of the 21st century, we must seize this unique historical moment to establish a quality digital media culture not only for today’s children, but also for future generations of young people.

Kathryn Montgomery, PhD, is Research Director and Senior Strategist for the Center for Digital Democracy (CDD). In the early 1990s, she and Jeff Chester co-founded the Center for Media Education (CME), the predecessor organization to CDD, where she served as President until 2003. Dr. Montgomery has written and published extensively about the role of media in society, addressing a variety of topics, including youth engagement with digital media and contemporary advertising and marketing practices. Jeff Chester is Executive Director of CDD, a Washington, DC non-profit organization that is one of the leading U.S. NGOs advocating for citizens, consumers, and other stakeholders on digital privacy and consumer protections online. A former investigative reporter, filmmaker, and Jungian-oriented psychotherapist, he received his M.S.W. in Community Mental Health from U.C. Berkeley. He is the author of Digital Destiny: New Media and the Future of Democracy (The New Press, 2007), as well as articles in both the scholarly and popular press. Since its founding in 2001 (and prior to that through CME), CDD has been at the forefront of research, public education, and advocacy protecting consumers in the digital age.