Violent online content is now “inevitable” for children in Britain, with many first exposed to it while at primary school, according to research from the media watchdog.
Every British child interviewed for the Ofcom study had viewed violent material on the internet, ranging from videos of local school and street fights shared in group chats, to explicit and extreme graphic violence, including gang-related content.
Children were aware that even more extreme material was available in the deeper reaches of the web, but had not sought it out themselves, the report concluded.
The findings prompted the NSPCC to accuse tech platforms of sitting back and “ignoring their duty of care to young users”.
Rani Govender, the charity’s senior policy officer for child safety online, said: “It is deeply concerning that children are telling us that unintentional exposure to violent content has become a normal part of their online lives.
“It is unacceptable that algorithms continue to spread harmful content that we know can have devastating mental and emotional consequences for young people.”
The research, conducted by the agency Family Kids and Youth, forms part of Ofcom’s preparation for its new responsibilities under the Online Safety Act, which was passed last year and gave the regulator the power to crack down on social networks that fail to protect their users, particularly children.
Gill Whitehead, director of Ofcom’s online safety group, said: “Children should not feel that seriously harmful content – including material that depicts violence or promotes self-harm – is an inevitable or unavoidable part of their online lives.
“Today’s research sends a strong message to tech companies that now is the time to take action so they are ready to meet their child protection duties under the new online safety laws. Later this spring, we will consult on how we expect the industry to make sure children can enjoy an age-appropriate, safer online experience.”
Almost every leading technology company was mentioned by the children and young people interviewed for the Ofcom study, but Snapchat and Meta’s Instagram and WhatsApp were cited most often.
“Children explained that there were private, often anonymous, accounts that existed solely to share violent content – usually local school and street fights,” the report says. “Nearly all of the children in this study who had interacted with these accounts reported finding them on Instagram or Snapchat.”
“There’s peer pressure to pretend it’s funny,” said an 11-year-old girl. “On the inside you feel uncomfortable, but on the outside you act like it’s funny.” Another 12-year-old girl described feeling “slightly traumatized” after being shown a video of animal abuse: “Everyone was joking about it.”
Many older children in the study “seemed to have become desensitized to the violent content they were exposed to.” Professionals also expressed particular concern about violent content that normalizes offline violence, reporting that children tended to laugh and joke about serious violent incidents.
On some social networks, exposure to explicit violence comes from the top down. On Thursday, a clip was reposted on Twitter, now known as X, by Musk himself, who shared it in response to a report from the news channel NBC accusing him and other rightwing influencers of spreading unverified claims about the chaos in the country.
Where social platforms do offer tools to help children avoid violent content, they are of little help. Many children, aged eight and over, told researchers that it was possible to report content they did not want to see, but said they had little confidence that the system would work.
In private chats, they feared that reporting content would see them labelled as “traitors”, leading to shame or punishment from their peers, and they did not trust platforms to impose meaningful consequences on those who posted violent content.
The rise of powerful algorithmic timelines, such as those on TikTok and Instagram, added a further twist: children shared a belief that if they spent any time on violent content (for example, while reporting it), the algorithm would be more likely to recommend similar material to them.
Professionals in the study expressed concern that violent content was affecting children’s mental health. In a separate report released on Thursday, the children’s commissioner for England revealed that more than 250,000 children and young people were waiting for mental health care after being referred to NHS services, meaning one in 50 children in England were on the waiting list. For those accessing support, the average wait was 35 days, but last year almost 40,000 children waited more than two years.
A Snapchat spokesperson said: “There is absolutely no place for violent content or threatening behavior on Snapchat. When we find this type of content, we take swift action to remove it and take appropriate action against the offending account.
“We have easy-to-use, confidential in-app reporting tools and work with police to support their investigations. We support the aims of the Online Safety Act to help protect people from harm online and continue to work constructively with Ofcom on the implementation of the law.”
Meta has been contacted for comment. X declined to comment.