My nude images were shared online when I was 14 and it inspired me to fight for other survivors – here’s why I believe Big Tech is to blame

When Leah Juliett was just 14, nude photos they sent to a boy on Facebook were shared online, first with other kids at school and later on anonymous internet forums.

Juliett, who identifies as non-binary and uses the pronouns they/them, told DailyMail.com: ‘It started on an iPhone. It then spread to Facebook.

‘Then my abuse images essentially found their permanent home on an anonymous imageboard called Anon IB. That website still exists today.’

The now 27-year-old advocate says the experience was devastating, inspiring them to become an activist and fight against what they see as Big Tech’s failure to prevent image-based child sexual abuse.

Leah Juliett, now 27, has become an advocate against abuse enabled by technology

They continued: ‘When this happened to me, when I was 14 years old, I wanted to die. I tried to die. I can’t stress that enough. I’ve made it my mission to fight for accountability from big tech and justice for survivors.’

In 2017, they organized the March Against Revenge Porn across the Brooklyn Bridge, kicking off a campaign against technology abuse that eventually propelled Juliett to the White House.

Juliett is now campaigning for the Heat Initiative, which aims to hold Apple accountable for the distribution of abusive images on the company’s iCloud.

They said, “I’ve really used my shame as a force for social good. But I’m only 27 years old. I didn’t want or expect this to be my life. When I was little, I wanted to be a singer.

“But because this is my life, and because it unfortunately still is for so many vulnerable teens and children in our country and around the world, I still carry my trauma with me.

“It’s a deeply ingrained part of who I am and an integral reason why I do the work that I do. But I’m stronger now. I’ve built a toolbox — a toolbox to reclaim the shame that I’ve experienced and use it for good.”

Juliett told DailyMail.com that since 2017 the language surrounding the issue has changed dramatically.

Juliett said: ‘The whole landscape of the [revenge porn] issue has changed since… when I first walked across the Brooklyn Bridge.

‘We don’t use that term anymore, because there is nothing I did to deserve revenge on my body. And non-consensual nudity [is] not pornography.

‘We say “image-based sexual abuse” and “child sexual abuse material”. Those are more accurate terms to describe the real crimes that happen to children every day across the country.’

They added that ‘millions’ of internet users worldwide are victims of similar abuse and that ‘the phone is the delivery mechanism’.

They told DailyMail.com that bipartisan legislation and education are essential to preventing image-based abuse.

But Big Tech is also part of the problem.

Juliett said, “It’s an important moment for us to look upstream and recognize that we can’t solve the problem at the well. We have to address it at the source. And in my work and in my experience over the past decade as a survivor and expert in this area, I’ve recognized that source as the iPhone.

“What people don’t realize is that these technology companies, including and especially Apple, are not just laboratories for innovation, as Apple likes to call itself; they are companies that offer products and services.”

But unlike supermarkets, for example, which are not allowed to sell products that poison people, Big Tech is subject to very little legislation, Juliett believes.

They added: ‘These are companies that provide services to people and people are seriously harmed by their products.

“I personally think there are many things they could do to prevent this kind of harm. And there’s a very clear reason why they don’t, and that is because they continually put profit over people.”

Data from the National Center for Missing and Exploited Children (NCMEC) shows that Apple reported just 267 cases of child sexual abuse material (CSAM) worldwide between April 2022 and March 2023.

By comparison, the number of iPhone users worldwide is estimated at over one billion.

When Juliett was 14, nude photos they sent to a boy were shared online

Juliett told DailyMail.com: ‘They could provide a more robust reporting mechanism on their platforms. For example, we know that Meta has a robust reporting record with the National Center for Missing and Exploited Children.

‘Apple, on the other hand, doesn’t have a significant reporting history, no matter how you slice it. But we know the abuse is happening in iCloud.’

Apple announced in 2021 that it would implement ‘NeuralHash’, an algorithm designed to detect known child sexual abuse material in iCloud.

But the program was shelved a few months later amid privacy concerns.

Juliett said: ‘The most basic thing they can do today is basic hash-matching detection in iCloud, which essentially takes a piece of known CSAM and turns it into a unique string of numbers through an algorithm. It kind of fingerprints that image and then compares it to a list of other digital fingerprints.

‘They could start today and save children’s lives by detecting known images of child sexual abuse.’
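In outline, the hash-matching approach Juliett describes can be sketched in a few lines of Python. This is a minimal, generic illustration, not Apple’s actual NeuralHash system; the fingerprint set and function names below are hypothetical, with placeholder entries standing in for the real hash databases maintained by organizations such as NCMEC.

import hashlib
from pathlib import Path

# Hypothetical set of digital 'fingerprints' (hashes) of known abuse images.
# Real lists are maintained by organizations like NCMEC and contain millions
# of entries; these placeholder strings are purely illustrative.
KNOWN_FINGERPRINTS = {
    "placeholder_fingerprint_1",
    "placeholder_fingerprint_2",
}

def fingerprint(image_path: Path) -> str:
    # Turn an image file into a unique string of numbers (a SHA-256 hash).
    return hashlib.sha256(image_path.read_bytes()).hexdigest()

def matches_known_image(image_path: Path) -> bool:
    # Compare this image's fingerprint against the set of known fingerprints.
    return fingerprint(image_path) in KNOWN_FINGERPRINTS

One caveat: deployed systems such as Microsoft’s PhotoDNA or Apple’s proposed NeuralHash use perceptual hashes, which still match after an image is resized or re-encoded, whereas the cryptographic hash above only matches byte-identical files.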

In the company’s response to the Heat Initiative, Erik Neuenschwander, Apple’s director of user privacy and child safety, said: ‘Child sexual abuse material is abhorrent, and we are committed to breaking the chain of coercion and influence that leaves children vulnerable to this type of abuse.’

However, he said that after consulting with privacy and security experts, digital rights groups and child safety advocates, Apple concluded it could not proceed with a mechanism for scanning iCloud for CSAM, even one designed specifically to protect privacy.

Neuenschwander wrote: ‘Scanning every user’s privately stored iCloud data would create new threat vectors for data thieves to find and exploit. It would also inject the potential for a slippery slope of unintended consequences.

‘Scanning for one type of content, for instance, opens the door for bulk surveillance and could create a desire to search other encrypted messaging systems across content types.’

Juliett, now 27, said the experience was devastating

DailyMail.com has reached out to Apple for comment and was referred to an earlier statement from Apple to the Heat Initiative.

The statement said: ‘Child sexual abuse material is abhorrent and we are committed to breaking the chain of coercion and influence that leaves children vulnerable to sexual abuse.

‘We are proud of the contributions we have made to date and are committed to continuing to collaborate with child safety organizations, technologists and governments to develop sustainable solutions that help protect the most vulnerable in our society.

‘When it comes to child safety, we have made a meaningful contribution to this goal by developing a number of innovative technologies.

‘As you note, we decided not to pursue the proposal we made a few years ago for a hybrid client-server approach to CSAM detection for iCloud Photos for a number of good reasons.

‘After consulting extensively with child safety advocates, human rights organizations, privacy and security engineers, and academics, and after considering the scanning technology from virtually every angle, we concluded that it was not practical to implement without ultimately compromising the safety and privacy of our users.

‘Scanning personal data in the cloud is frequently used by companies to monetize their users’ information. While some companies have justified this practice, we have chosen a very different path, one that prioritizes the security and privacy of our users. We believe that scanning every user’s privately stored iCloud content would have serious unintended consequences for our users.’

You can read the full statement here.

Juliett campaigns against image-based abuse in a number of ways, including through poetry

But Juliett said they will keep fighting.

They told DailyMail.com: ‘I tell a lot of stories through poetry. And I will continue to use my voice to tell my story and shout my poems… wherever the wind takes me until I see major technological reforms.

‘When I started the March Against Revenge Porn in 2016, it felt like a very lonely fight for me. But 10 years later, I realized I don’t have to be alone. I don’t have to be alone.

‘I am now part of an incredible group of survivors and allies. And if I were to lead that same march today, I know I would have hundreds of survivors and friends by my side. It has been incredibly difficult to go public with my story. But I know this is what I was born to do.’