TikTok to block teens from beauty filters over mental health concerns

Teens face extensive new restrictions on the use of beauty filters on TikTok amid concerns about rising anxiety and declining self-esteem.

In the coming weeks, young people under the age of 18 will be prevented from artificially enlarging their eyes, plumping their lips and smoothing or changing their skin color.

The restrictions will apply to filters – such as ‘Bold Glamour’ – that alter children’s facial features in ways that makeup cannot. Comedic filters that add bunny ears or dog noses are unaffected. The billion-user social media company announced the changes at a safety forum at its European headquarters in Dublin.

How effective the restrictions prove to be will depend on users having registered with their real age, which is not always the case.

There has been widespread concern that the beauty filters – some provided by TikTok, others created by users – put pressure on teens, especially girls, to present a polished appearance, with negative emotional consequences. Some young people have described finding their real faces ugly after using the filters.

TikTok also announced it will tighten its systems for blocking users under the age of 13, which could mean thousands of British children are removed from the platform. Before the end of the year, it will begin a trial of new automated systems that use machine learning to detect users who are below the age limit.

These measures come ahead of stricter regulation of social media use by minors in Britain, due to take effect in the new year under the Online Safety Act. The platform already deletes 20 million underage accounts worldwide every quarter.

Chloe Setter, TikTok’s child safety policy lead, said: “We hope this will give us the ability to detect and remove ever faster.”

People who have been wrongly blocked can appeal. “It could obviously be annoying for some young people,” Setter said, but she added that the platform will take a “safety approach.”

Ofcom reported last December that between June 2022 and March 2023, around 1% of TikTok’s total monthly active UK user base was removed for being underage.

The regulator has previously warned that the effectiveness of TikTok’s enforcement of the age restriction is “yet to be determined.” Stricter enforcement of the minimum age of 13 for social media users is scheduled to begin next summer, when “highly effective” age checks will be required.

The new ‘guardrails’ around beauty filters and age verification are part of a wave of safety tweaks being announced by social media platforms ahead of stricter online safety rules coming into force in the coming months, which carry potentially heavy fines for breaches.

Last week, Roblox, the gaming platform with 90 million daily users, announced it would prevent its youngest users from accessing the more violent, crude and frightening content on the platform, after warnings about child grooming, exploitation and the sharing of indecent images.

Instagram, which is operated by Meta, has launched “teen accounts” for those under 18 to give parents more control over their children’s activity, including the ability to prevent children from using the app at night.

Andy Burrows, the chief executive of the Molly Rose Foundation, which was set up to focus on suicide prevention, said: “It will not be lost on anyone that these shifts are largely being announced to comply with EU and UK regulations. This is a case for more ambitious regulation, not less.”

He called on TikTok to be fully transparent about how the age assurance measures will work and how effective they are at reducing the number of under-13s on the platform.

Burrows added: “TikTok must act quickly to fix the systemic weaknesses in its design that allow a flood of harmful content to be algorithmically recommended to young people aged 13 or older.”

The NSPCC described the age assurance measures as ‘encouraging’ but ‘just the tip of the iceberg’.

“Other social media sites need to go a step further and find effective ways to assess the ages of their users,” said Richard Collard, the charity’s head of online child safety policy. “Ofcom and the government also have an important role to play in forcing tech bosses to deliver age-appropriate experiences for all their users.”