More than six years after the tragic death of Molly Russell, Instagram is finally hiding all posts that could seriously harm children.
The Meta-owned app will block posts about suicide, self-harm, eating disorders and other “types of age-inappropriate content” for users under 18.
Anyone between the ages of 13 and 17 will automatically get the block on Instagram – and also on Facebook – and won’t be able to turn it off, although it will be lifted once they turn 18.
It means they won’t see any of these posts in their Instagram home feed or Stories, even if the posts were shared by someone they follow.
It follows serious concerns about how teenagers are affected by social apps, heightened by the case of 14-year-old Instagram user Molly Russell, who committed suicide in 2017.
In a new blog post, Meta said that the blocking of age-inappropriate content on Instagram and Facebook will be rolled out globally in the coming months.
“We want teens to have safe, age-appropriate experiences with our apps,” the tech giant said.
“We’ve developed more than 30 tools and resources to support teens and their parents, and we’ve spent more than a decade developing policies and technology to address content that breaks our rules or could be considered sensitive.
“Today we are announcing additional protections targeting the types of content teens see on Instagram and Facebook.”
Instagram requires users to be at least 13 years old to create an account – a threshold experts and the public alike have criticized as far too young.
The new block on ‘age-inappropriate content’ applies only to users aged 13 to 17, so users aged 18 and over are unaffected.
Such content is automatically removed from the home feed, where posts from other users are displayed, as well as from Instagram Stories, the small snippets that disappear after 24 hours.
However, Instagram is making it harder for all users to find age-inappropriate content when they search for it – not just teens, but adults too.
If users of any age tap the search function – indicated by the magnifying glass in the menu – and search for terms related to suicide, self-harm and eating disorders, the results will be hidden.
Instead, users are directed to support options, including a local helpline they can text or call, and a prompt to message a friend for support.
Meta also announced that users aged 13 to 17 will receive new notifications prompting them to check their privacy settings regularly.
The notifications will encourage them to switch to a more private experience with a single tap, by enabling an option called ‘Enable recommended settings’.
This will automatically change their settings to limit who can repost, tag or mention their content, or include their content in the Reels Remixes tool.
“We also ensure that only their followers can message them and help hide offensive comments,” Meta said.
The updates build on Meta’s existing protections for teens, including preventing adults from sending messages to people under 18 who they don’t follow.
Instagram has already acknowledged that “young people can lie about their date of birth,” meaning some 13- to 17-year-olds may have told the app they are adults.
To counter this, it introduced new tools in 2022 to verify a user’s age, including asking them to upload a video selfie or have others vouch for their age.
Meta’s changes are in line with “expert guidance” from professional psychologists, including Dr. Rachel Rodgers of Northeastern University.
“Meta is evolving its policies around content that may be more sensitive to teens, which is an important step in creating social media platforms where teens can connect and be creative in age-appropriate ways,” said Dr. Rodgers.
“This policy reflects current understanding and expert guidance regarding the safety and wellbeing of teenagers.
“As these changes unfold, they provide great opportunities for parents to talk to their teens about how to navigate difficult topics.”