Baroness Helena Kennedy QC
New Freedom? How the Digital Environment Poses Complex Legal Challenges for the Promotion of Children’s Rights

The year 1989 saw the adoption of the Convention on the Rights of the Child (UNCRC). Since then, it has become the most widely ratified international human rights treaty in history. In the same year, the computer code was released that would ultimately lead to the creation of the World Wide Web.1 In the 30 years since, access to technology has expanded rapidly, allowing all people, including children, greater access to information and to each other. This new freedom has improved life for many, but it has also posed complex legal challenges for countries around the world that wish to ensure and protect the rights of children.

The right to freedom of expression and access to information
Like all people, children have the right to freedom of expression under international instruments, including the UNCRC. Article 13 of the UNCRC provides that every child must be free to express their thoughts and opinions, and to seek and receive information of all kinds, so long as doing so respects the rights and reputations of others.2 Furthermore, in Article 17, the Convention guarantees the right to access information: a right to receive reliable information from a diversity of sources, with the goal of promoting the child’s social, spiritual, moral and physical wellbeing. It calls on states to develop guidelines to protect children from material that may be harmful to their wellbeing.3

These rights require a balancing act: allowing children to exercise freedom of expression and to access information, while offering protections to those who are vulnerable by virtue of their age. At times this balance is difficult to strike. For example, certain restrictions on free expression are intended to prevent children from being exposed to gratuitous violence or adult sexual imagery, yet these restrictions have also led to the inadvertent filtering out of LGBTQ+ videos on YouTube.4 In such cases, the balance must be struck carefully and efforts must be made to mitigate unintended consequences, ensuring at all times that the best interests of the child are served.

The sector’s current business model depends on targeted advertising, which allows many services to be ‘free.’ This leads some to argue that, without targeted advertising, child-focussed content would not be created, or would be significantly limited; or that requiring users to pay would restrict access for poorer communities. However, the volume of advertising can hinder children’s ability to express themselves and access information, or crowd media spaces, making it difficult to receive information without undue influence.5 These opposing views need to be seen in the context that children are less able to distinguish between general information and paid content.6

Managing the right to access information and to freedom of expression, in a way that also protects children, is at the forefront of current debates about content controls. Formal regulation of what may constitute harmful content poses an increased risk of censorship and abuse of content restriction.7 But equally, failing to enforce existing regulation online, or allowing online public spaces to be dominated by disinformation, misinformation or commercially driven information, satisfies neither social norms of information distribution nor children’s right to a broad range of information combined with protection from harmful content.

Equally central to the debate is whether platforms are publishers or mere conduits,8 but increasingly this binary does not hold the answer. Platforms are often presented as neutral, yet we increasingly understand that they are mediators of information: the content that users consume online is actively recommended by the platforms themselves. The automated decision-making systems that govern what we see are coming under growing scrutiny. As Tristan Harris, Co-Founder of the Center for Humane Technology, points out, over 70% of the content people watch on YouTube consists of videos recommended to them by YouTube’s algorithm.9 Regulators around the world are considering how to ensure greater transparency and accountability from platforms about the information they use to make these decisions, the information they distribute, and the way that they distribute it.

The right to privacy
Article 16 of the UNCRC provides that the law should protect a child from arbitrary or unlawful interference with their privacy, home, family life and correspondence; this includes protecting children from unlawful attacks that harm their reputation. The right to privacy has been greatly affected by the advent of the internet and by the mediation of technology in all areas of children’s lives. Because children are still developing, both mentally and physically, certain interferences with the right to privacy are justifiable, reflecting the limits of their cognitive and developmental capacity.10 However, their lack of autonomy also makes children more vulnerable to such interferences, many of which occur via technology.

One of the most significant concerns around a child’s right to privacy in the digital world is the collection of data by websites and service providers. Corporations collect significant amounts of data from their users. Some of it is required to provide a service, but the scale of personal data and information collected is vastly out of proportion to what is required. Commonly, children are not in a position to appreciate fully that information is being collected about them, and even where they have the opportunity to consent to the collection and sale of their information, they are unlikely to be fully aware of the long-term impact.11 This data is regularly used by third-party companies to profile and target advertisements at specific groups, or to direct user behaviour. Children are particularly susceptible to marketing. Increasingly, vast datasets relating to younger and younger children are used for commercial purposes, with little to no regard for their privacy or best interests.

To ensure that corporations do not encroach on children’s right to privacy, governments need to enact clear laws on the collection, use and sale of children’s data. An important step forward is requiring that privacy policies take the rights of children into account when they are drafted. The EU has taken steps under the General Data Protection Regulation (GDPR), which places the burden on companies to ensure that their privacy policies uphold the protection of children’s rights.12 The UK went further with the Age Appropriate Design Code, which sets out what this means in practice for users under 18.13

One issue affecting the protection of privacy rights is the regulatory variance between countries, which can allow companies that operate transnationally to ‘game’ data protection laws. An example of this is the blocking of certain websites in the EU following the enactment of the GDPR. Various companies were unable or unwilling to change their policies to meet the new data protection requirements, opting instead to block access to their content regionally for those living in the EU. When governments seek to enact such laws, corporations should cooperate to ensure ease of compliance without loss of access.14 Similarly, laws should be drafted in a manner that does not unduly deter the creation of content or diminish children’s access to the digital world.

Governments should also ensure that extra protections are put in place for children in relation to freedom of information requests. Specifically, children’s personal information should be exempt from freedom of information requests, and databases containing children’s information should be anonymised.15

The right to a reputation
Embedded within Article 16’s right to privacy is legal protection for children from unlawful attacks on their honour or reputation.16 The internet has created a unique challenge for anyone who seeks to manage their reputation, and this is particularly true for children. It is common for children, their peers and their parents to share personal information and images about themselves or one another. This sharing may be done with or without the child’s consent, often with harmless intentions, but not uncommonly for the purpose of harassment, bullying or exploitation.17 When information is posted by or about a child, it is often not understood or considered that the image or information may resurface at a later date.18 This could have significant consequences, both immediate and long-term.

To protect a child’s right to both privacy and reputation, amendments should be made to national law to protect children from the misuse of their personal information and the sharing of their images without consent.19 This needs to be done with significant care, so as not to criminalise children or stymie the expression of the child themselves, and potentially of their parents. Rather, it should be framed in a manner that prevents the abuse of children through the exploitation of their image or personal information.

Similarly, there should be easily accessible means for children to request the correction or deletion of data that has been collected or published about them without their consent, or that they believe could damage their reputation.20

Children’s rights are different from those of adults, and their age, maturity and evolving capacities must be given more thoughtful consideration when we create legislation, policy and guidance. We must ensure that we uphold all their rights, not merely one right at the expense of all others.

Baroness Helena Kennedy QC is one of Britain’s most distinguished lawyers and a Trustee for 5Rights Foundation. She gives a voice to those who have least power within the system, championing civil liberties and promoting human rights. She has used many public platforms – including the House of Lords, to which she was elevated in 1997 – to argue for social justice. She has written and broadcast on a wide range of issues, including on the rights of women and children. Most recently, she launched a three-part series called Forum Internum on BBC Radio 4, which explores freedom of thought, and why it needs protecting in the digital age.