Professor Sonia Livingstone OBE
“It’s None of Their Business!” Children’s Understanding of Privacy in the Platform Society
Facebook’s advertising campaign, launched in August 2019 to recover public trust1 following the Cambridge Analytica scandal,2 announced:
“We all have our own privacy settings. So, when it comes to your privacy on Facebook, we think you should have the same controls.”
It pictured a screenshot of privacy options (public, friends, close friends, only me) with the last option ticked. The implication is that, now, Facebook gives the public what it wants and deserves. But choosing ‘only me’ makes no sense in a networked world: who wants privacy at the cost of social isolation? Anyway, ‘only me’ does not solve the Cambridge Analytica problem, where people’s personal data were used for commercial and political purposes without their meaningful consent. For whatever you tick, none of your actions are private from Facebook itself.
The advert’s phrasing illustrates, and even perpetuates, a wider confusion in society between interpersonal privacy and what is being called data privacy.3 Parents, teachers, government and businesses tend to talk to children as if privacy only means privacy from other people. When children are accused of lacking a sense of privacy because they share their personal information with all and sundry, when parents worry about cyberbullying or grooming,4 even when the media panic about accidental data leaks from the ‘internet of toys’ or smart home devices, the focus is on children’s interpersonal privacy and its safety implications. Policy responses centre, respectively, on better e-safety education, parental awareness and responsibility, and the regulation of product security. These are all important and urgent.
But adults say little to children about how to protect their privacy in relation to institutions (such as their school, doctor or police) or businesses (most of which now collect personal data online in one way or another). Yet much of what a child does online – their searches, posts, likes or views – is immediately shared within a lucrative global data ecology. So if a child uses Instagram or WhatsApp, their data will be shared with dozens of Facebook’s partners, since user profiling is the currency for the real-time advertising auctions5 that target users.6 Attending to one’s privacy settings has no effect on data privacy, where there is no real ‘only me’ option.
Privacy from whom?
Privacy is not a singular property that an individual ‘has’ or controls. It must be understood in context, depending on whom one wants privacy from. Historically, interpersonal contexts have been the most important for children. But under today’s conditions of intense datafication, privacy contexts include not only interpersonal but also institutional and commercial contexts. The unmet challenge to children’s privacy stems from the widespread and carefully planned collection and use of children’s personal data, with consequences now and into the future. So now one must ask critical questions about children’s privacy from both (usually trusted) institutions and commercial enterprises of diverse kinds (many of the third parties involved are unknown to, and thus practically inaccessible to, users). Our surveillance society has been remarkably slow to start worrying about organisational uses and abuses of people’s data, including children’s.7
Even in the interpersonal sphere, privacy is always relational and contextual.8 It is shaped by a host of cultural norms and expectations, often locally negotiated. If we don’t attend to the context, as experienced by those involved, we won’t understand what privacy means to people. For instance, a child may seek privacy in the (public) street if there are too many people at home. A few years ago, a teenager told me she felt private on Twitter (where tweets are formally ‘public’) but not on Facebook (where her privacy settings were high), because her many ‘friends’ only used the latter.9
As we argued in our recent ‘Children’s Data and Privacy Online’ project,10 it is vital not to confuse interpersonal with institutional and commercial contexts for privacy, for these contexts differ hugely in who or what one might seek privacy from. And these contexts are changing in complex ways. The privacy to walk down a street unobserved is now undermined by the mass introduction of surveillance cameras, though a child seeking escape from nosy siblings may not realise this. Whether your friends see what you do on Twitter or Facebook is unrelated to the data collected from you by the platforms, though neither company explains that clearly to their users.
This isn’t children’s confusion but ours. As a society, we conceptualise privacy first and foremost in interpersonal terms.11 Our visceral response to privacy intrusions derives from a perceived affront to our personal agency and dignity in relation to others that we know or can imagine. People with reason to distrust the state extend this visceral grasp of privacy to institutional contexts, demanding from the state the fairness and accountability that they expect in interpersonal dealings. But most people in modern democratic countries trust the authorities (government, police, health, school, transport, etc.) with their personal information and anticipate no real institutional risk to their privacy. Until recently, our interactions with businesses, too, were built on interpersonal trust (you could talk to the shopkeeper, visit your bank manager, see for yourself what the market traders did). Hence the recent dramatic drop in public trust,12 and explosion of policy concern, now that global and proprietary digital platforms underpin both our interpersonal relations (where we expect to exercise agency) and our relations with institutions and business (where we are obliged to place our trust).
So, when adults talk to children about privacy, they assume an interpersonal context. For instance, to manage their online privacy, children are advised to choose who can see particular posts, and to delete messages that they regret or that might upset others. These are tactics for interpersonal privacy only, and they are ineffective for managing their privacy in institutional and commercial contexts. From Instagram or Snapchat or Amazon (and, probably, from their school or health provider) there is no realistic option to choose, to consent, or to delete.13
Children’s understanding of their data privacy
The assumption of interpersonal privacy is reflected in children’s understanding of the digital ecology. When we held participatory workshops with 11- to 16-year-olds around the UK,14 we saw how children tended to (over)extend what they know of interpersonal relations to the operation of platforms. For example, they might talk trustingly of Instagram because so-and-so’s father works in technology, and he would surely play fair. They assume ethical reciprocity: if they would never track someone without their knowledge or keep images against someone’s will, why would a company? Or they assume that the tactics that keep their activities hidden from their parents or enemies (pseudonyms, ghost mode, incognito search, clearing one’s history) also keep their data private from companies.
Inevitably, children’s experience of the operation, regulation and norms of institutions and businesses is relatively limited, especially when they are young. Children’s tendency to trust these organisations is also down to us. Who does not teach their child to trust their school or doctor or even the shopkeepers and other commercial services with which they have early dealings? Is the solution to privacy in a datafied world really to teach children to distrust? And who does teach children, including in school, about business practices, including the global nature and complex proprietary practices of the digital ecology?15 We found few children who know what Oracle or Experian do with their personal data or how this might shape their future.16 Should we be teaching even primary school children about platform business models? Would it enhance their agency if we did?
In our workshops, when we encouraged children to think not only about e-safety or how their parents shared embarrassing photos, but also about how their data are processed by their school, doctor, search engine, social media platforms and more, the conversation turned. Their confident expressions of agency and expertise would falter, and they would say, outraged: it’s creepy, platforms shouldn’t be poking around in their online contacts, I want to control who they share my data with and, most tellingly, it’s none of their business!17
If only.
Shifting the burden of privacy protection from user to service provider
Of course, we need earlier and better digital education.18 But the challenge of protecting privacy in a digital world goes beyond expecting children to understand and manage their personal data. Increasingly, the challenge is one of redesigning the conditions under which their data are collected, inferred, profiled and used by others. These conditions are, currently, systematically opaque to users. How can we expect children to be responsible for their data privacy when their parents, teachers or even policy experts don’t understand it? Even if transparency were dramatically increased, what use would it be if not linked to granular, meaningful and easily implemented choices about what to share, with whom, and for what?
When a service’s Terms and Conditions state that users’ data will be shared with hundreds of data brokers and other third parties, yet no realistic alternative to using the service is provided, we must conclude that the burden of privacy protection must shift from the user to the service provider. Others in this volume have proposed legislative, regulatory and business solutions, and doubtless these will be hotly debated in what Forbes magazine has announced as “the year of digital human rights.”19
It is a particular challenge for child rights that, in a datafied world, individuals tend to be addressed algorithmically in the aggregate (as students, patients, customers, the public) rather than according to their differential needs and rights. Even when digital services are ‘personalised,’ this tends to serve commercial or bureaucratic logics rather than those determined by citizens and users. It may not even be in the provider’s interest to distinguish its treatment of adults’ and children’s data, impeding any chance of realising children’s best interests.20
Children cannot learn to act as agents and make wise choices, nor to have their voices heard as is their right, when adult society systematically talks to them about their data and privacy online in purely interpersonal terms. Society cannot expect to protect children’s right to privacy21 if it confuses interpersonal and data privacy and fails to critically examine the conditions for each and the relations between them. We must stop advising children and parents that they can and should control the flow of their data in circumstances when they cannot, or when the result would be exclusion. We should call out businesses which claim that they respect people’s privacy when they do not.22
We have created a situation in which children learn that they don’t matter, that they have no agency, that their competence is misguided. In our workshops, children told us of their irrelevance – why would companies care what they did or thought? They referred to a dystopian Black Mirror world in which the machine has taken over. This sense of inevitability, in turn, reduces the pressure on service providers to develop user tools which provide children with meaningful choices about how their data are used. It is time to demand that institutions and businesses redesign their digital offer in ways that serve children’s best interests. And for society to hold them to account.
Notes

1. Trust in tech is wavering and companies must act, Edelman, 8 April 2019.
2. Revealed: 50 million Facebook profiles harvested for Cambridge Analytica in major data breach, The Guardian, 17 March 2018.
3. Hintz, A., Dencik, L. and Wahl-Jorgensen, K., Digital Citizenship in a Datafied Society, 2019.
4. These risks of online harm matter, of course, but are not my primary concern here. For evidence of the interpersonal risks, see Children’s online activities, risks and safety: A literature review by the UKCCIS Evidence Group, 2017.
5. In-house mediation with server-to-server bidding, Facebook for Developers.
6. Facebook shared user data with 60+ companies, Investopedia, 4 June 2018. Facebook may also monetise non-users’ data via the construction of ‘shadow profiles’ – see Shadow profiles: Facebook has information you didn’t hand over, CNET, 11 April 2018.
7. Zuboff, S., The Age of Surveillance Capitalism, 2019.
8. Nissenbaum, H., Privacy as contextual integrity, Washington Law Review, 2004.
9. Livingstone, S. and Sefton-Green, J., The Class: Living and Learning in the Digital Age, 2016.
10. The project is funded by the Information Commissioner’s Office: Children’s data and privacy online: Growing up in a digital age.
11. Laufer, R. S. and Wolfe, M., Privacy as a concept and a social issue: A multidimensional developmental theory, Journal of Social Issues, 1977.
12. 2019 Edelman trust barometer: Global report, Edelman, 2019.
13. The UK’s proposed Age-Appropriate Design Code (discussed elsewhere in this volume) should provide children with greater privacy in these contexts, as might a sterner application of the GDPR than yet witnessed, and the new California Consumer Privacy Act.
14. Livingstone, S., Stoilova, M. and Nandagiri, R., Children’s data and privacy online: Growing up in a digital age, London: London School of Economics and Political Science, 2019.
15. Deceived by design, Forbrukerrådet, 27 June 2018.
16. Christl, W., Corporate surveillance in everyday life, Cracked Labs, June 2017.
17. Livingstone, S., Stoilova, M. and Nandagiri, R., Children’s data and privacy online: Growing up in a digital age. Research findings, London: LSE, 2019.
18. Livingstone, S., Media literacy – everyone’s favourite solution to the problems of regulation, 2018.
19. Why 2020 will be the year of digital human rights, Forbes, 26 December 2019.
20. UN Convention on the Rights of the Child, 20 November 1989.
21. Article 16, UN Convention on the Rights of the Child.
22. Facebook claims it’s built privacy into its products ‘just like Apple’, The i, 8 January 2020.