At Revealing Reality we pride ourselves on understanding what children are doing and seeing online, and exploring the effects this has on them. Over the last 10 years we’ve spent hundreds of hours interviewing children, and devoted days and days of analysis to exploring the data we’ve gathered through these interviews, diary tasks and consensual screen recording of their online experiences.
We do this work because we think it is essential that everyone, from parents to platforms, teachers to teenagers and regulators to reformers, is armed with appropriate and accurate knowledge with which to make informed choices, promote the benefits and reduce the risks of being online.
But this research has some limitations. We are careful not to introduce children to ideas or material they are not familiar with, and never to risk exposing them to harm. This caution, however, may mean we are not seeing or learning about everything they see. And we certainly can’t use real children to ‘test’ what a platform’s design allows younger users to come across or access online.
So over three years ago we started designing and developing a rigorous, safe, ethical method to use avatars to provide insight into what children may be exposed to online, based on real children’s interests and behaviours.
Think of it as a kind of mystery shopping: testing the efficacy of platforms’ systems and safety measures, and exploring the way platforms serve content to users based on their characteristics.
The first commissioned piece of work we did using this methodology was Pathways: How digital design puts children at risk, for 5Rights. It was the first work of its kind to use avatars to demonstrate, based on real teenagers’ demographics, interests and behaviours, the content and advertising they were being shown. This work revealed that in some cases platforms were using users’ age to show them advertising targeting young people, but not using their age to prevent them from seeing inappropriate content in their feeds.
We are now really pleased to have been commissioned by Ofcom, ahead of its anticipated extended regulatory responsibilities as set out in the Online Safety Bill, to conduct a pilot study to test the feasibility of the regulator using avatars in research to deepen its understanding of children’s online experiences across a variety of platforms.
This research focused on children under 13 – those who are not supposed to be on most social media platforms but who, as our own and other research shows, often are, in huge numbers. To be on these platforms, they have usually lied about their age, not only saying they are over 13, but often that they are over 18. Understanding the experiences of younger children and pre-teens is essential for effective regulation to keep them safe, but safeguarding during research is paramount. Using avatars offers a way to meet both requirements, while providing data that is not available through interviews or tracking alone.
The pilot study shows clearly that avatars can be a valuable, ethical research tool to understand experiences across the range of online platforms children use.
As with all research, the design of the methodology, and the ‘inputs’ that underpin the way the avatars are set up, determine how revealing and representative the findings are. The more ‘real’ data is used to inform an avatar’s ‘behaviour’, and the more nuance is added to the coding of content and experiences, the more useful the results become. Understanding what real children do and how they behave is still a prerequisite for valid use of avatars in research. Our deep and up-to-date knowledge of what children see and do online ensured we were able to identify trends, spot slang and shorthand, and join the dots in ways that others would have missed.
Similarly, the use of avatars is ethical insofar as the ‘rules’ for the avatars are designed to be ethical. In this work, for example, the avatars did not interact with other users, and did not interact with content that had engagement below a certain level, so as not to affect other users or inadvertently influence platforms’ algorithms.
The methodology, benefits, challenges and suggestions for further improvements are available in this short report.
We believe research using avatars will be an essential part of the toolkit to provide insight to the regulator and others about children’s (or adults’) experiences online and we look forward to conducting further work using this methodology in future.