Suicide, self-harm, depression and eating disorders: what children encounter online in a week

At Revealing Reality, we explore what life online looks like through children’s eyes – talking to them, analysing their social media feeds, and stepping into their digital worlds through avatars. And we’ve seen the types of content they encounter evolving dramatically.

Some of the content with the most obvious potential for harm – such as violent or sexually explicit content – appears to have become less visible on children’s most popular social media platforms. This may reflect the tech industry’s growing investment in moderation.

However, parents may be surprised to learn that, despite this, children continue to report encountering distressing or potentially harmful content online, including content you might assume is extremely rare on the popular social media sites they use.

Our recent research for Molly Rose Foundation

To understand what children are exposed to online, and to capture a benchmark before the Online Safety Act took effect, we worked with the Molly Rose Foundation to investigate children’s exposure to four types of content of known concern: suicide, self-harm, depression and mental health, and eating disorders.

We asked 1,897 children aged 13 to 17 about the content they had seen in the past week:

  • 5% reported seeing content that encourages or promotes suicide
  • 6% reported seeing content that encourages or promotes self-harm
  • 8% reported seeing content that makes self-harm seem normal, appealing, or cool
  • 17% reported seeing content that makes feeling depressed seem normal, appealing, or cool
  • 12% reported seeing content that encourages or promotes eating disorders

Although some of these percentages may appear small, they represent a meaningful level of risk. For instance, if you were a parent, would you feel comfortable with your child using a social media platform where one in twenty children reported seeing content encouraging suicide within the past week? Or where almost one in eight reported seeing content encouraging or promoting eating disorders?

These types of potentially harmful content are neither niche nor new. They have been recognised – and discussed – for years, with broad agreement across mainstream platforms that they should have no place on their sites. Yet children continue to report encountering them.

Not all children were equally likely to report seeing potentially harmful content. Girls, children with low wellbeing, and those with special educational needs or disabilities (SEND) were more likely to report seeing certain types of content in the last week. For example:

  • 20% of girls reported exposure to content encouraging or promoting eating disorders, compared to 6% of boys.
  • 14% of those with SEND reported exposure to content that encourages or promotes suicide, compared to 3% without SEND.
  • 18% of children with low wellbeing reported exposure to content encouraging or promoting self-harm, compared to 3% with high wellbeing.

For some children, this ‘higher-risk’ material appeared alongside high volumes of lower-risk content. For example, more than a quarter (28%) of children exposed to content promoting eating disorders had also seen dieting or calorie-restrictive content 10 or more times on at least one platform in the last week.

While this survey cannot tell us exactly what each child saw – and without access to platform data it’s difficult to corroborate these experiences – their accounts raise an important question:

In what other environment where children spend hours each day would we find it acceptable for them to encounter material they believe promotes suicide or eating disorders?

Since this survey was conducted, the Online Safety Act has become enforceable, and social media platforms now have a legal duty to protect children online. Time will tell whether it has an impact.

We hope future surveys will show reductions in rates of children’s reported exposure.