Embed children’s best interests in product development, design and policy.
Having agency means children can decide freely how they want to engage with the digital environment. This includes being able to start and stop using digital products and services of their choice easily, without feeling they are losing out; knowing, and getting, precisely what they have signed up for; and not being tempted, manipulated or nudged into anything that undermines their safety, privacy, development and wellbeing.
This is one of 11 child rights principles applicable to the digital environment. Together they encompass the full range of child rights covered by the United Nations Convention on the Rights of the Child (UNCRC).
The more complex or opaque the digital environment, the more skilled a user must be to avoid being deceived or manipulated. The power of platforms often exceeds the capacity of even skilled young people to manage.
Platform algorithms are often ‘out of sync’ with, and insensitive to, a young person’s state of mind or ability to cope, leading to experiences of ‘triggering’.
There is growing concern that the digital environment is designed in risky ways that prioritise profit over children’s rights and best interests. We need to consider new and emerging dimensions of digital literacy, such as ‘data literacy’ (Pangrazio & Sefton-Green, 2020; Stoilova et al., 2021) and ‘algorithm literacy’ (Bucher, 2018; Selwyn, 2022).
“TikTok has a nice function – there are three dots, and you can click ‘Not interested’. It does something with the algorithm… You can take some control over some posts.” (teenager with mental health difficulties, Norway)
ySKILLS is an EC-funded research network aiming to identify the actors and factors that undermine or promote the wellbeing of children aged 12–17 in a digital age.