YouTube to limit teens’ exposure to weight and fitness videos

YouTube will stop recommending videos that idealise specific fitness levels, body weights or physical features to teenagers. Experts warn that such content can be harmful if viewed repeatedly.

The platform will still allow 13- to 17-year-olds to view the videos, but its algorithms will no longer send young users down related content “rabbit holes” afterwards.

YouTube said such content does not violate its guidelines, but that repeated viewing may affect the well-being of some users.

Dr Garth Graham, global head of health at YouTube, said: “As teens develop ideas about who they are and what standards they hold for themselves, repeated consumption of content that sets idealised standards that begin to shape an unrealistic internal norm can lead some to form negative beliefs about themselves.”

According to YouTube, experts on its youth and family advisory committee have said that certain categories that may seem “innocuous” as a single video could be “problematic” if viewed repeatedly.

The new guidelines, which have now been introduced in the UK and around the world, apply to content that: idealises certain physical features over others, such as beauty routines to make your nose appear slimmer; idealises fitness or body weight, such as exercise routines that encourage striving for a certain appearance; or encourages social aggression, such as physical intimidation.

YouTube will no longer make repeated recommendations on those topics to teens who have registered their age with the platform as logged-in users. The safety framework has already been introduced in the US.

“A higher frequency of content that idealizes unhealthy norms or behaviors can amplify potentially problematic messages — and those messages can impact how some teens see themselves,” said Allison Briscoe-Smith, a clinician and YouTube consultant. “‘Guardrails’ can help teens maintain healthy patterns as they naturally compare themselves to others and assess how they want to show up in the world.”

In the UK, the recently introduced Online Safety Act requires tech companies to protect children from harmful content and to consider how their algorithms expose minors to harmful material. The law cites the potential for algorithms to cause harm by pushing large amounts of content to a child in a short period of time, and requires tech companies to assess any risk such algorithms may pose to children.


Sonia Livingstone, professor of social psychology at the London School of Economics, said a recent report by the charity the Children’s Society highlighted the importance of tackling the impact of social media on self-esteem. The charity’s Good Childhood Report found that almost one in four girls in the UK were unhappy with their appearance.

“At least there is a recognition here that changing algorithms is a positive action that platforms like YouTube can take,” Livingstone said. “This will be particularly beneficial for young people with vulnerabilities and mental health issues.”
