Digital age assurance, age verification tools, and children's rights online across the globe: a discussion paper



Age assurance tools relate to a number of children's rights under the UN Convention on the Rights of the Child (CRC), including those listed below. This section presents some key principles for applying the CRC to age assurance tools:

Articles 7 & 8 - birth registration, and protection and preservation of identity. Age is an identity attribute, and as national birth registration systems are increasingly digitized, children's dates of birth will be recorded in national systems. There is a balance to be struck between promoting children's rights to birth registration and to an identity, and ensuring that the use of this data is regulated effectively to protect children's privacy.
Article 12 - respect for the views of the child. Children should be consulted on their views about which platforms are appropriate for them to access, and should have their views taken into account before being denied access to online spaces or content on the basis of their age.
Articles 13, 14, 15 & 17 - freedom of expression, freedom of thought, freedom of association and access to information. As much of children's lives has moved online, their rights to express themselves, to access information, and to meet with others and join groups should not be unduly restricted on the basis of their age.
Article 16 - right to privacy. Most age assurance tools with a high degree of accuracy rely on official data that can easily identify a child. It is important that children's right to privacy is respected as they continue to engage in online spaces, and that they are only identified where strictly necessary to prevent serious harm, and with their consent or the consent of their parents or caregivers.
Article 19 - protection from violence, abuse and neglect. Children have the right to protection from violence online as well as offline, including from cyberbullying and harassment. Because age assurance methods can detect adults who contact children online, they can play a role in tackling harms perpetrated by adults against children (or by older children against much younger children).
Article 28 - right to education. Many platforms, including online gaming platforms, provide children with opportunities for learning that should not be unduly restricted on the basis of age, recognizing that the capacity for learning does not always correspond with age, and nor does the ability to cope with risk.
Article 34 - sexual exploitation. Children have the right to protection from sexual exploitation online, and governments and platforms must take all measures to mitigate these risks, which may be easier to do if they know the age or age range of their users.
Article 36 - other forms of exploitation. Governments must protect children from exploitation of their data by companies. Requiring platforms to take steps to ascertain the age of their user base in order to comply with special data protection laws for children may be one method of accomplishing this. However, it is also important to ensure that age assurance processes respect the data minimization principle and children's right to privacy.




What is the evidence of risk and harm?

There are several different kinds of risks and harms that have been linked to children's exposure to pornography, but there is no consensus on the degree to which pornography is harmful to children. Prominent advocates point to research arguing that access to pornography at a young age is linked with poor mental health, sexism and objectification, sexual aggression and other negative outcomes. The evidence suggests that some children appear to be harmed by exposure to some kinds of pornography at least some of the time, but that the nature and extent of that harm vary.

There is conflicting evidence regarding how many children worldwide are accessing pornography online, and how often. Some studies have found that boys tend to be exposed to pornography earlier and more extensively, and are more likely to be exposed to violent or abusive images such as rape, whereas girls are more likely to be subject to involuntary or problematic exposure. The 2020 EU Kids Online study compared survey findings from 19 European countries and found that in most countries, most children who saw sexual images online were neither upset nor happy (ranging from 27 per cent in Switzerland to 72 per cent in Lithuania); between 4 per cent and 10 per cent were fairly or very upset; and between 3 per cent of children (in Estonia) and 39 per cent (in Spain) reported feeling happy after seeing such images.

It is worth considering the online pornography network and asking where children are most likely to come across or seek out pornography. There is evidence that pornography is increasingly shared on social media sites, and although several platforms have sought to remove all forms of sexual content, many pornography producers are able to channel 'cleaner' content through these platforms towards pornographic content hosted elsewhere, often by using bots.

Does the evidence warrant age restrictions?

As discussed above, the evidence is inconsistent, and there is currently no universal agreement on the nature and extent of the harm caused to children by viewing content classified as pornography. However, policymakers in several countries have deemed that children should not be able to access commercial pornography websites designed for users aged over 18. There would be additional challenges to designing a more nuanced age-rating system, as this would require establishing a clearer definition of pornography, as well as classifications within that definition that would be suitable for different age groups to view. In this context, differences in individual children's level of maturity and evolving capacities within the same age brackets would also come into play.

Are age assurance tools likely to be effective in this context?

In Australia, the House of Representatives Committee acknowledged that the age verification system would not be enforceable with regard to overseas websites, Google search results or social media platforms, and accepted that young people would be likely to bypass the system. However, the system was still thought to be likely to reduce harm, and therefore worth implementing, even if imperfect. The eSafety Commissioner in Australia notes that "age verification will never be the sole or even the principal firewall between pornography and our kids". She advises that the best firewall for children is their parents, complemented by the education system and the adoption of safety-by-design principles by platforms.

In relation to the UK Digital Economy Act, several civil society organizations raised alarms about the serious privacy risks of collecting sensitive user data tied to accessing pornography, and about the ease with which children could bypass the system by using a virtual private network. The Act has also been criticized for focusing on commercial pornography websites, overlooking the evidence regarding children's access to pornography elsewhere, such as via social media.

Although age verification tools may prevent children from accessing pornography from commercial websites, it is unlikely that they would prevent children from accessing pornography completely. Therefore, if the goal is to prevent children from viewing pornography online in any form, it is not clear that preventing children from visiting commercial pornography websites through age verification would be a successful strategy.

At a baseline, age assurance tools could be better suited to ensuring that younger children are not able to access commercial websites intended for adults, while mitigating broader privacy concerns. This could be done by checking whether the child in question appears to fall within a range of 14-18, which could be effective in excluding young children. However, this might simply lead children to seek out pornography elsewhere, such as on social media, and to share it with friends on messaging apps, rather than prevent them from accessing it altogether. There is still an argument to be made that mandating the use of age verification or assurance in law could contribute to changing social norms around children accessing pornography, and could hold the companies producing pornography more accountable for deploying online the same restrictions that are the norm offline in many contexts.

In the case of pornography accessed via social media, even if the platforms employed age assurance tools to tailor the user experience to the age of the user, it is unclear whether age assurance would protect child users of social media from bots designed to direct them to pornography sites.

From a rights perspective, extreme care would be needed to avoid excluding children from sexual and reproductive health information online: sexuality education, including resources for LGBTQ education, may be categorized as pornography in some contexts. Finally, it is questionable whether age assurance tools are an appropriate response to pornography that depicts extreme violence or violence against women, both of which can arguably be considered harmful for viewers of all ages.



Distinct from the contexts discussed above, there is another use for age assurance tools, namely the detection of child sexual abuse material online. [...]

There are significant concerns about the involvement of under-18s in the pornography industry, as this content constitutes child sexual exploitation under the CRC, whether the child consents to its production or not.

Source: Report 'Digital Age Assurance, Age Verification Tools, and Children's Rights Online across the Globe: A Discussion Paper'. The discussion paper series on children's rights and business in a digital world is managed by the UNICEF Child Rights and Business Unit. This paper was written by Emma Day. United Nations Children's Fund (UNICEF), April 2021.