Instagram is introducing separate teen accounts for users under 18 in a bid to make the platform safer for children amid a growing backlash over the way social media influences the lives of young people.
Starting Tuesday in the US, UK, Canada and Australia, anyone under 18 who signs up for Instagram will be placed in a teen account, and existing accounts held by teens will be migrated over the next 60 days. Teens in the European Union will see their accounts adjusted later this year.
Meta acknowledges that teens can lie about their age and says they will be required to verify it in more cases, such as when they attempt to create a new account with an adult birthdate. The Menlo Park, Calif.-based company also said it is building technology that proactively finds teen accounts posing as adult accounts and automatically places them in the restricted teen settings.
Teen accounts are private by default. Private messages are restricted, so teens can receive them only from people they follow or are already connected to. "Sensitive content," such as videos of people fighting or posts promoting cosmetic procedures, will be limited, Meta said. Teens will also get notifications if they're on Instagram for more than 60 minutes, and a "sleep mode" that turns off notifications and sends automatic replies to direct messages will be enabled from 10 p.m. to 7 a.m.
While these settings are enabled for all teens, 16- and 17-year-olds can disable them. Children under 16 need parental permission to do so.
“The three concerns we hear from parents are that their teens are seeing content they don’t want to see, or they’re being contacted by people they don’t want to contact, or they’re spending too much money on the app,” said Naomi Gleit, Meta’s chief product officer. “So teen accounts is really focused on addressing those three concerns.”
The announcement comes as Meta faces lawsuits from dozens of US states that accuse it of harming young people and contributing to the youth mental health crisis by knowingly designing features on Instagram and Facebook that addict children to its platforms.
In the past, Meta's efforts to address teen safety and mental health on its platforms have been criticized for not going far enough. For example, kids are notified when they've spent 60 minutes on the app, but they can bypass the notification and keep scrolling unless their parents enable "parental controls" mode, which lets parents cap the time a teen spends on Instagram at a set amount, such as 15 minutes.
With the latest changes, Meta is giving parents more options to supervise their children's accounts. People under 16 need permission from a parent or guardian to change their settings to less restrictive ones, which they can do by setting up "parental controls" on their accounts and connecting them to a parent or guardian.
Nick Clegg, Meta's president of global affairs, said last week that parents do not use the parental controls the company has introduced in recent years.
Gleit said she thinks teen accounts will be a “great incentive for parents and teens to set up parental controls.”
"Parents can see who their teen is messaging through the Family Center and hopefully have a conversation with their teen," she said. "If there's bullying or harassment, parents have visibility into who's following their teen, who's messaged their teen in the last seven days, and hopefully they can have some of those conversations and help them navigate these really difficult situations online."
US Surgeon General Vivek Murthy said last year that tech companies are asking too much of parents when it comes to keeping children safe on social media.
“We’re asking parents to manage a technology that’s evolving rapidly and fundamentally changing how their kids think about themselves, how they build friendships, how they experience the world — and technology, by the way, that previous generations never had to manage,” Murthy said in May 2023.