Meta presentation ‘claimed 100,000 children who use Facebook and Instagram are sent photos of adults’ genitals or other sex abuse material EVERY DAY – with “People You May Know” feature helping connect predators to youngsters’

  • New documents released as part of New Mexico’s civil lawsuit against Meta allege an algorithm linked children to potential predators
  • Employees reportedly flagged flaws in the design of the ‘People You May Know’ algorithm, but executives rejected their proposed fix
  • One employee also highlighted how an Apple executive filed a complaint with the company after their 12-year-old child was solicited on Instagram

According to an internal Meta presentation, an estimated 100,000 minors received photos of adult genitalia or other content depicting sexual abuse every day.

That figure comes from newly unredacted material about the company’s child safety practices, detailed in a lawsuit filed by New Mexico last month.

In the documents, seen by the Wall Street Journal, Meta staff noted that one of the company’s recommendation algorithms had reportedly connected child users to potential predators.

The algorithm, called “People You May Know” (PYMK), had allegedly been brought to executives’ attention several years earlier, but they rejected a staff recommendation to change its design, the lawsuit said.

Commenting on the report, a Facebook employee said the algorithm had “contributed to 75 percent of all inappropriate contact between adults and minors.”


Meta CEO Mark Zuckerberg speaks during the tech giant’s Connect developer conference on September 27, 2023 in Menlo Park, California

Another employee added: “How on earth didn’t we just take out PYMK between adults and children? It’s really quite disturbing.”

The civil lawsuit alleges that “Meta knowingly exposes children to the dual dangers of sexual exploitation and harm to mental health.”

In a 2020 internal email, employees reported that the prevalence of “sex talk” directed at minors was 38 times greater on Instagram than on Facebook Messenger in the US.

According to the Wall Street Journal, the email also urged the company to implement more security measures on the platform.

One employee also highlighted how an Apple executive had complained after their 12-year-old child was solicited on Instagram.

The employee noted that “things like this make Apple so angry that they threaten to remove us from the App Store.”

A November 2020 presentation titled “Child Safety: State of Play” said Instagram used “minimal child safety protections.”

It also described the platform’s policies on “minor sexualization” as “immature,” and noted that the platform had a minimal focus on human trafficking.

New Mexico claims that Meta leaders did not take action to prevent adults from seeking sexual contact with children until late 2022.

Rather than stop recommending children’s accounts to adults altogether, Facebook and Instagram instead tried to block such suggestions only for adults who had previously shown suspicious behavior toward children.

Meta also acknowledged internally in 2021 that most minors on its platforms falsely claim to be adults, the lawsuit said.

A review of accounts disabled for grooming children also found that 99 percent of those adults had not reported their age.



The New Mexico Attorney General’s Office filed the suit after conducting an independent investigation into the matter.

Investigators set up test accounts on Instagram and Facebook, using artificial intelligence-generated photos to pose as teenagers or pre-teens.

A spokesperson for Meta told the WSJ that they would not comment on the newly released information.

DailyMail.com has reached out to Meta for comment.
