Elettra Ronchi, Andras Molnar and Lisa Robinson
Addressing the Needs of Children in the Digital Environment
Policy interventions to ensure a trusted digital environment for children increasingly demand international collaboration and whole-of-government coordination across traditional policy fields. Recent events indicate an urgent need for strong frameworks and guidelines that support all stakeholders in playing their part, both in protecting children from online risks and in ensuring that the benefits of the digital environment can be realised.
Since 2008, following a call made at the Ministerial meeting on the Future of the Internet Economy in Seoul (South Korea),1 the Organisation for Economic Co-operation and Development (OECD) has engaged governments and key stakeholders in anticipating change and implementing good practice and preventative solutions, rather than simply reacting to problems in this space.
Children spend more time online than ever before, using mobile devices (smartphones and tablets) with Internet connectivity to access the digital environment. This time spent online creates a number of real and important opportunities for children and young people, such as socialising with peers, expressing themselves through the creation of online content, and seeking information on just about any topic imaginable: essentially allowing them to exercise a number of their rights, such as freedom of expression, and rights to information, leisure and participation.2 Whilst it is important to ensure that such benefits can be realised, increased exposure to the digital environment also results in increased exposure to digital risks. Many digital risks are online versions of long-known offline risks (for instance, bullying, racism, and sexual predation) and, just as is the case in everyday life, a zero-risk digital environment is unattainable. Nonetheless, setting the conditions for a safer digital environment is feasible, and children must be provided with the (digital) skills and tools necessary to recognise and manage these risks, without unnecessarily limiting their online opportunities.
In 2012, OECD member countries adopted the ‘Recommendation on the Protection of Children Online’ (‘the Recommendation’).3 The Recommendation aims to support governments in setting the conditions for the protection of children online through better evidence-based policymaking and enhanced coordination between all stakeholders. Consistent with the United Nations Convention on the Rights of the Child (UNCRC), it defines children as all persons below the age of eighteen years. While not legally binding, OECD Recommendations carry a political commitment and, in other policy areas such as privacy, have proved highly influential in setting international standards and helping governments to design national legislation.
Today, however, the landscape that gave rise to the Recommendation has changed dramatically. Not only have advances in technology resulted in an almost constant capacity for children to be online through a wide range of mobile devices, but the reasons why children go online have also evolved. Children no longer go online simply to undertake discrete tasks, such as research or educational work, but for a wider range of reasons, including entertainment, as well as communicating and socialising with peers. The previously identified risks have also evolved and new risks have emerged. At the same time, a changing commercial landscape has resulted in increased “datafication” and rendered children important commercial subjects, a development that has significantly affected their right to privacy.4
Since 2017, the OECD has been examining whether the Recommendation remains relevant by surveying OECD member countries, undertaking an extensive review of the legal and policy environment, and holding expert consultations. Given the brief nature of this contribution, it is not possible to cover the full breadth of issues identified through this work. However, some of the concerns identified as central are briefly discussed below. These are: (i) the centrality of protecting children’s privacy and data; (ii) the need for proportional legislative and policy responses; and (iii) the role of online platforms and other digital service providers.
Privacy and datafication
The privacy space has evolved significantly since the adoption of the Recommendation in 2012. Today, children’s personal information and data comprise not merely the information that they knowingly share, but also information that can be gleaned from their online actions, as well as from disclosures that friends and parents may make. These are sure to follow children into adulthood. The information that children share online has been identified as falling into three categories: (i) data given – the data children themselves provide (e.g. name, date of birth); (ii) data traces – the data they leave behind online (e.g. through cookies, web beacons or device/browser fingerprinting, location data and other metadata); and (iii) inferred data – the data derived from analysing data given and data traces.5 At the same time, data can be interpersonal, institutional and commercial. Whilst most children have an understanding of their private space, interpersonal context, and personal data given (albeit depending on age), their understanding of the commercial use of data traces and inferred data is more limited.6
The use of children’s data, particularly the commercial use of inferred data, is a central issue for policymakers. A number of potential risks flow from the use and misuse of children’s data. They include: concerns that artificial intelligence algorithms may direct children towards harmful advertising content; that children’s personal information could be shared, leading to inappropriate contact; that data may be collected unknowingly and without consent, through apps or ‘smart’ connected toys;7 and that children’s data may be used by marketers to target them.8
Proportional legislative and policy responses
Legislative responses today are wide-ranging and largely made up of rules and norms addressing specific risks. Responsibility is siloed across government agencies and often uncoordinated, despite digital issues crossing traditional legislative boundaries. As an illustration, legal responses to sexting often fall to justice ministries, when the involvement of bodies responsible for health and education is also likely needed. Sexting is also an example of an area where responses exist in the absence of clear evidence of actual risk, and where isolated legislative actions in some countries have resulted in children being unnecessarily criminalised because their own pictures are treated as child sexual abuse material. This demonstrates that narrowly conceived laws and frameworks can in fact prove both ineffective and often counter-productive, if not outright harmful.9 Legislative and policy responses should be evidence-based, and should appropriately address the needs of children online.
Role of online platforms and digital service providers
Concern regarding the impact of risks such as sexting, cyberbullying, sextortion, and harmful online content has prompted calls to change legislation and put pressure on online service providers, platforms and social media sites to do more to protect children from data misuse and online abuse. In some countries steps have already been taken: for instance, the introduction of the Age Appropriate Design Code in the UK, which strengthens data protection rights for children; the decision in the United States, enacted in 2018, to modify the Communications Decency Act by including liability for websites that facilitate child sex trafficking; and Germany’s 2017 law, which among other measures provides for significant fines for online platforms that fail to remove hate speech. While multi-stakeholder dialogue and positive engagement with business are key to addressing a number of online concerns for children, requiring platforms to be more responsible and accountable may prove effective in promoting change.
Changes in technology have contributed to an evolving risk landscape which requires enhanced governmental action and international collaboration to ensure that children realise the benefits of the digital environment and are sufficiently protected from online risks. In light of these developments, this essay has examined three key issues:
- Advances in the technologies through which data can be collected, stored and used have resulted in new privacy risks that are highly complex. Today, children’s online activities are the focus of commercial interests and a multitude of monitoring and data-generating processes. There is a need to better recognise children as data subjects and content creators and, consequently, to consider how best to protect their privacy.
- The wide-ranging nature of legislative responses, the drawbacks of siloed legislative responsibilities, and the narrow conceptualising of laws and frameworks are major concerns. To address these issues, policy and legislative responses should be evidence-based and able to appropriately address the needs of children in the digital environment.
- Finally, positively engaging businesses and better capitalising on multi-stakeholder actions are key to addressing a number of concerns for children online.