‘Disturbing lack of moderation’: How eating disorder content on X is increasing

Debbie was scrolling through X in April when a series of unwanted posts appeared in her feed. One showed a photo of a visibly underweight person asking whether they were thin enough. In another, a user wanted to compare how few calories they ate per day.

Debbie, who did not want to give her last name, is 37 and was first diagnosed with bulimia when she was 16. She did not follow either account behind the posts, which belonged to a group with more than 150,000 members on the social media site.

Out of curiosity, Debbie clicked on the group. “If you scroll down, all you see are posts about eating disorders,” she said. “People asking for opinions on their bodies, people asking for advice on fasting.” A pinned post from an admin encouraged members to “remember why we starve.”

The Observer has discovered seven other groups, with a total of nearly 200,000 members, that openly share content promoting eating disorders. All of the groups were created after Twitter was bought by billionaire Elon Musk in 2022 and rebranded as X.

Eating disorder campaigners said the scale of harmful content showed serious failings in moderation by X. Wera Hobhouse MP, chair of the all-party parliamentary group on eating disorders, said: “These findings are deeply concerning… X should be held accountable for allowing this harmful content to be promoted on its platform, putting many lives at risk.”

The internet has long been a breeding ground for content promoting eating disorders – sometimes called “pro-ana” – from message boards to early social media sites like Tumblr and Pinterest. Both sites banned posts promoting eating disorders and self-harm in 2012 after a storm of outrage over their spread.

Debbie remembers the pro-ana internet forums of that era, “but you had to look hard to find them,” she said.

This kind of content is now more accessible than ever and, critics of social media companies argue, is being presented to users by algorithms that are feeding people more – and sometimes increasingly extreme – messages.

Social media companies have come under increasing pressure in recent years to improve protections following deaths linked to harmful content.

Molly Russell was 14 when she took her own life in 2017 after viewing online content depicting self-harm and suicide. A coroner later ruled that the content she saw contributed to her death.

Two years later, in 2019, Meta-owned Instagram said it would no longer allow content depicting graphic self-harm. The Online Safety Act, passed last year, will require tech companies to protect children from harmful content, including the promotion of eating disorders, or face hefty fines.

Baroness Parminter, who sits on the all-party parliamentary group, said that while the Online Safety Act was a “reasonable start”, it did not protect adults. “Social media providers’ duties only apply to content that children can see… And of course eating disorders don’t stop when you’re 18,” she said.

X’s user policy prohibits content that encourages or promotes self-harm, which explicitly includes eating disorders. Users can report posts that violate X’s policies, and can also mark content in their timeline as “not interested” to filter what is presented to them.

But concerns about a lack of moderation have grown since Musk took over the site. Just weeks later, in November 2022, he laid off thousands of employees, including moderators.

Those cuts significantly reduced the number of staff working on moderation, according to figures X provided to Australia’s eSafety Commissioner.

Musk also made changes that show users more content from accounts they don’t follow: the platform introduced the “For You” feed and made it the default timeline.

In a blog post last year, the company said that about 50% of the content appearing in this feed comes from accounts users don’t yet follow.

In 2021, Twitter launched “communities” as an answer to Facebook’s groups. They’ve become more prominent since Musk took over. In May, X announced: “Recommendations for communities you might like are now available on your timeline.”

In January, X’s competitor, Meta, which owns Facebook and Instagram, said it would still allow people to share content documenting their struggles with eating disorders, but it would no longer recommend it and it would be harder to find. While Meta has started pointing users to safety resources when they search for eating disorder groups, X allows users to search for such communities without displaying warnings.

Debbie said she found X’s tools for filtering and reporting harmful content ineffective. She shared with the Observer screenshots of posts from the group, which kept appearing in her feed even after she reported it and marked it as “not interested”.

Mental health activist Hannah Whitfield deleted all of her social media accounts in 2020 to help her recover from an eating disorder. She has since returned to a number of sites, including X, and said her For You feed had “thinspiration” posts glorifying unhealthy weight loss. “What I found with [eating disorder content] on X was that it was a lot more extreme and radical. It definitely felt a lot less moderated and it was a lot easier to find really graphic stuff.”

Eating disorder charities stress that social media does not cause eating disorders, and that users who post pro-ana content are often ill and not malicious. But social media can lead those already struggling with eating disorders down a dark path.

Researchers believe that online users may be drawn to pro-eating disorder communities through a process similar to radicalisation. One study, published last year by computer scientists and psychologists at the University of Southern California, found that “eating disorder-related content is easily found through tweets about ‘diet,’ ‘weight loss,’ and ‘fasting.’”

The authors, who analysed 2 million posts about eating disorders on X, said the platform offered “a sense of belonging” for people with the illness, but that unmoderated communities “can become toxic echo chambers that normalise extreme behaviour”.

Paige Rivers was first diagnosed with anorexia when she was 10 years old. Now 23 and training to be a nurse, she has seen content about eating disorders on her X feed.

Rivers said she tried settings that allow users to mute certain hashtags or phrases, but found them easy to bypass.

“People started using hashtags that were a little bit different, like anorexia spelled with numbers and letters, and it just slipped through,” she said.

Tom Quinn, director of external affairs at the eating disorder charity Beat, said: “The fact that these so-called ‘pro-ana’ groups are given the space to proliferate shows an extremely disturbing lack of moderation on platforms like X.”

For people in recovery, like Debbie, social media held a promise of support.

But the constant exposure to triggering content, which Debbie can’t seem to limit, has had the opposite effect. “It stops me from using social media, which is really sad because I struggle to find people who are in a similar situation, or people who can give advice about what I’m going through,” she said.

X did not respond to a request for comment.