Kade Crockford
A New Digital Divide? Protecting Lower-Income People from Hyper-Digitalisation

Human beings have long classified and categorized one another. It’s one of the ways we make sense of the world around us (that person is tall, that person is skinny). Classification and categorization are economically, socially, and politically determined, and they in turn shape our economics, society, and politics. Decisions about which kinds of classifiers and categories matter – and who decides – are enormously influential; they shape not only our understanding of other people but also of ourselves (he is white, she is black; they are dangerous, we are safe).

It’s always been difficult to understand oneself in relation to the world, and in relation to oneself. For centuries, human beings have turned to philosophy and religion to help them get closer to understanding. In the 21st century, this project is significantly more challenging, for reasons that are not immediately obvious. Among the central difficulties is that children today grow up in a world where both the boundaries of categories and classifiers and the people who determine those boundaries are increasingly hidden from view, behind algorithmic black boxes. At the same time, the decisions computerized systems make about how to categorize and classify us are too often viewed as neutral. What is data-driven, the mythology goes, is objective. The opacity shrouding algorithmic decision making combined with its (false) cultural imprimatur of neutrality together pose an unprecedented threat to self-determined human subjectivity. Worse still, as these crucial decisions about classification and categorization are hidden and obfuscated by increasingly complex technology like neural networks, it becomes more and more difficult, and in some cases impossible, to democratize the ability to understand oneself and one’s world.

I first became aware that classification systems don’t always fit the diverse tapestry of human experience when my elementary school teacher said, “Girls line up on the left, boys on the right.” I froze, suddenly acutely aware that my gender identity exists somewhere outside the categories provided. It was simple enough for me, then, to ask my teacher why she asked us to divide ourselves up into these categories. Even if her answer didn’t satisfy, it was straightforward enough for me to ask the question. I knew what question to ask. I knew whom to ask. I could ask it.

It is not simple, now, for children to ask unaccountable corporations why unnamed engineers, product designers, or executives have categorized or classified them as high risk, likely to succeed, or in need of extra tutoring. In many if not most cases it is impossible. It is not even likely that children will understand that a decision to classify or categorize them has been made by these unaccountable actors. The child may only see an output that ranks them on a supposedly neutral scale, or may not see the ranking at all but will nonetheless experience a restricted set of options as a result of their score, whether they realize it or not. The consequence of this opacity is a loss of control over one’s life; crucially, it also leaves behind gaping holes where prior generations of children had opportunities to learn how the adult world works, including what society values, how power works, which categories are privileged over others, and many other inquiries that are central to growing up.

It is likewise unrealistic to imagine that children are capable of interrogating what Alexa or Google Home tells them about their world, or what happens to the words the children speak into these devices when they drift up into the deceptively named “cloud.” Children, we know, ask big questions. And we know children’s brains are like sponges, constantly soaking up everything they see and hear. Recently my five-year-old nephew asked me, “Why is the world unstable?” He probably heard something about global instability on the radio and wanted to know what all the fuss was about. I explained, to the best of my ability, the concept of political instability, and why societies become unstable. I stressed that instability often results from inequity, because people don’t like to be treated unfairly. What would Amazon’s Alexa tell a child who asks this question? How about a child who asks about heaven and hell? What does Alexa say when a child asks why daddy hurts mommy?

Scholars have for some time now grappled with questions related to children’s privacy rights attendant to the rapid and alarming spread of so-called “smart” devices into the homes of millions of people across the planet. Other scholars are thinking and writing about how students are increasingly being monitored and nudged by “education technology” like ClassDojo and Google’s Apps for Education suite. Still others, including some who got very wealthy in Silicon Valley thanks to their involvement in building or selling addictive technologies, have spent the past few years loudly warning parents that children are becoming addicted to their smartphones and devices (like their parents), and fretting about the cost to social and intellectual development.

All of these are important subjects for research and public debate. Unfortunately, the question of how these technological systems of control reduce human agency while obscuring how power operates from the people it operates upon has not been as thoroughly studied or publicly debated.

Ultimately, we don’t need research to tell us that in most parts of the world, including my home country, the United States, the financial interests of powerful corporations and the direct advertising market currently take legal precedence over the privacy and self-determination interests of adults and children alike. Technology companies have managed to convince many people that exchanging our personal data for the use of their services is not only a good bargain, but the only plausible business model for the digital 21st century. They are wrong.

It’s no surprise that the world’s richest and most powerful technology scions are consciously shielding their children from digital technologies during their formative years. For years, researchers have spoken of a “digital divide” separating the urban and suburban wealthy, who have computers and fast internet access, from the rural poor, who don’t. But technology has changed our world quickly, and in many parts of the industrialized world the digital divide is no longer a simple question of who is “connected” and who isn’t. When we conceptualize the digital divide in the 21st century we must also interrogate who has the luxury of avoiding technological systems during their childhood, and who has no choice but to use – and therefore to be controlled by – them. Fighting for equal access to the internet is important. But a dedication to understanding and pushing back against the way power works to construct human beings and knowledge in the digital age must also be our focus as we do the hard work of recalibrating our law and social infrastructure to best advance democracy, human self-determination, and liberty for children in the 21st century.

Kade Crockford is the Director of the Technology for Liberty Program at the American Civil Liberties Union (ACLU) of Massachusetts. Kade’s work focuses on how systems of surveillance and control impact not just society at large, but specific targets, including children. The Technology for Liberty Program aims to use our society’s unprecedented access to information and communication to protect and enrich open society and individual rights, by implementing basic reforms to ensure our new tools do not create inescapable digital cages limiting what we see, hear, think, and do.