Ofcom threatens to block pro-suicide website ‘linked to 50 deaths in Britain’ after forum refused to comply with new online safety laws that would see site’s operators jailed for 14 years
Ofcom has threatened to block a pro-suicide website linked to the deaths of at least 50 people in Britain after the site announced it would refuse to comply with new safety regulations introduced by the Online Safety Bill.
An investigation today found that British authorities failed to act on multiple warnings about the site, which is not being named for security reasons, including from victims’ families, police investigators and coroners.
Now the site could be blocked after its owners released a defiant statement saying they would refuse to make any changes to the forum to protect others.
At the top of the site there is a message: ‘Hello guest. We will not follow or comply with the Online Safety Bill that was recently signed into law in the UK.
‘This bill does not affect the site’s operations, nor are we present in Britain to receive any notices or fines that the UK government may impose.’
It adds that the law, which is designed to protect vulnerable people and especially children from harmful online content, is a ‘draconian’ measure.
The site’s members also include children, with the youngest known victim being just 17 years old.
Next month, Ofcom will set out its legally enforceable code of practice on how it will tackle content that is now illegal under the bill.
It will have the power to fine companies up to ten percent of their global turnover and even jail bosses who consistently refuse to take action to limit the harmful content viewable on their sites.
Although the company behind the site, believed to be American, is not based in Britain, this does not necessarily place it beyond the reach of British sanctions.
Ofcom will have the power to issue court orders against those it believes are in breach of the bill to prevent UK users from accessing sites.
But the site brazenly advises UK users on how to get around potential restrictions such as geo-blocking – despite operators potentially facing a prison sentence of up to 14 years if the content encourages or aids suicide.
Online, the site claims to provide a ‘safe space to discuss the topic of suicide without censorship’ – but explicit content about self-harm and suicide can be found in seconds, with no age checks on users.
An Ofcom spokesperson said: ‘We expect technology companies to be fully prepared to meet their new duties when the time comes. It is a serious concern when companies say they are going to ignore the law.’
One victim, Joe Nihill from Leeds, found the site in April 2020 and took his own life after being ‘coached’ by others for a month. He was only 23. In a note to his family, he warned of the dangers of the forum and pleaded: ‘Please do your best to close the website to anyone else.’
Callie Lewis, who struggled with chronic depression and suicidal thoughts, was also a member of the site. The 24-year-old, who had autism, was found dead in a tent in the Lake District in 2018.
‘Without these forums, I think my daughter would have struggled to find the information she was looking for about how to die,’ said Callie’s mother Sarah.
An investigation into Callie’s death highlighted the role the forum played.
‘Callie was enabled by the advice given through the forum to frustrate a mental health assessment and subsequently take her life,’ said senior East Kent coroner Patricia Harding when she sent the report to the Department for Culture, Media and Sport.
‘In my opinion, action must be taken to prevent future deaths and I believe you have the power to take such action.’
The Kent-based coroner is one of six who have written to the government in recent years demanding the forum be closed.
Zoe Lyalle, 18, who died in May 2020 and whose family say she was ‘utterly abandoned’ by Berkshire Healthcare NHS Foundation Trust, and Beth Matthews, 26, a talented sailor and mental health blogger who died in March 2022, are also believed to have visited the site before their deaths.
The BBC said the site has more than 40,000 members around the world and hosts more than two million messages – many of which are extremely explicit.
Authorities told the BBC that the website is hosted anonymously, meaning no one can determine who keeps the site running.
Deaf British TikTok star Imogen Nunn – known on the app as Deaf Immy – used her account to promote mental wellbeing and positivity, despite her own struggles with mental health. The 25-year-old joined the forum in November 2022, just three months before she took her own life. Her mother Louise asked: ‘When will something be done about it?’
The influencer had used a ‘suicide kit’ allegedly sent by Canadian chef Kenneth Law, who is accused of supplying people with poison. The National Crime Agency – Britain’s equivalent of the FBI – said last month that the products he sold were linked to 88 deaths in Britain, and that he may have shipped as many as 1,200 parcels to 40 countries.
An Ofcom spokesperson said: ‘The Online Safety Bill makes it clear that sites and apps must take steps to prevent users from encountering illegal material, including content that encourages or supports suicide. Platforms will have to act quickly to remove these types of videos or posts as soon as they become aware of them.
‘Very soon after the Bill becomes law, we will consult on our draft codes and guidelines, which set out the specific standards that technology companies can use to tackle illegal harm.
‘We expect technology companies to be fully prepared to meet their new duties when the time comes. It is a serious concern when companies say they are going to ignore the law. Where services fail to comply, we have a wide range of enforcement powers at our disposal to ensure they are held fully accountable for the safety of their users.’
A government spokesperson said: ‘Every suicide is a tragedy, and encouraging or assisting suicide is already a crime under the Suicide Act.
‘Our Online Safety Bill will create a new criminal offence of encouraging or assisting serious self-harm through communication, while also ensuring that the largest social media companies must proactively prevent users from encountering content that encourages or supports suicide.
‘We have also pledged to reduce the suicide rate in England within two and a half years with our new National Suicide Prevention Strategy, backed by more than 100 measures including a national alert system to combat emerging methods.’
For confidential support, call Samaritans on 116 123 or visit www.samaritans.org