Mark Zuckerberg was ‘warned about platform’s potential for addiction and harm to children’
Meta CEO Mark Zuckerberg was warned about the harmful effects of his platforms on children and teens and decided to “turn a blind eye,” new lawsuits show.
The statements, part of a lawsuit filed in February 2023, were redacted and recently unsealed, revealing that workers and engineers were aware of issues, Bloomberg reports.
The filing states that Zuckerberg was personally warned: “We are not on track to succeed on our core wellness issues (problematic use, bullying and harassment, connections and SSI), and are at increased risk of regulatory and external criticism.
“These affect everyone, especially young people and creators; if not addressed, they will follow us into the Metaverse.”
The latest information also shows that Meta disbanded its mental health team, while Steve Satterfield, Facebook’s vice president of privacy and public policy, said in court in 2021 that the “safety and well-being of the teens on our platform is a top priority for the company.”
Redacted depositions from a lawsuit filed against Meta show Mark Zuckerberg was ‘personally’ warned about how Facebook and Instagram harmed children
DailyMail.com has reached out to Meta for comment.
A Meta spokesperson told Bloomberg that it is not true that the company failed to fund work to support people’s well-being.
“Because this is so important to our business, we’ve even increased funding, as evidenced by the more than 30 tools we offer to support teens and families,” the spokesperson said.
“Today, there are hundreds of employees throughout the company working to build features for this.”
The lawsuit, filed in Oakland, alleges that more than a third of 13- to 17-year-olds report using one of the defendants’ apps “almost constantly” and admit that this is “too much.”
The litigation is ongoing, first brought by parents who claim their children have suffered at the hands of Facebook and Instagram.
The latest filings also show that Meta disbanded its mental health team
The complaints, which were later consolidated into several class actions, alleged that Meta’s social media platforms were designed to be dangerously addictive, causing children and teens to consume content that increases the risk of sleep disorders, eating disorders, depression and suicide.
The case also argues that teens and children are more vulnerable to the adverse effects of social media.
“No one wakes up thinking they want to maximize the number of times they access Instagram that day,” a Meta employee wrote in 2021, according to the filing.
“But that’s exactly what our product teams are trying to do.”
Cecilia Tesch, of Pueblo, claims her daughter, who is referred to as “RF” in the court documents, became addicted to the social media site at the age of seven and that the fixation caused her to develop an eating disorder.
She filed a lawsuit in 2020 along with Oregon resident Brittney Doffing.
Doffing is suing Snap and Meta for allegedly turning her daughter into a violent cellphone addict who has developed an eating disorder and has undergone multiple psychiatric admissions in recent years.
Over the past year, Meta has made strides in supporting teens and kids on Facebook and Instagram, particularly in protecting them from predators.
In November 2022, the company made privacy changes for all users under the age of 16.
According to the company’s blog post about the new settings, a “suspicious” account is an adult account that has recently been blocked or reported by a youth.
As an additional safeguard, the company said it is also testing the removal of the message button on teens’ Instagram accounts when they are viewed by suspicious adults.
Meta also said it has developed tools to encourage teens to report accounts that make them uncomfortable on Facebook.
Instagram also warns users when they’ve been on the app too long, encouraging them to take a break.