US appeals court revives a lawsuit against TikTok over 10-year-old’s ‘blackout challenge’ death

PHILADELPHIA — A U.S. appeals court on Tuesday revived a lawsuit filed by the mother of a 10-year-old Pennsylvania girl who died while attempting a viral challenge she reportedly saw on TikTok that dared people to strangle themselves until they lost consciousness.

While federal law generally protects online publishers from liability for content posted by others, the court said TikTok could potentially be held liable for promoting the content or using an algorithm to tailor the content to children.

“TikTok makes choices about the content it recommends and promotes to specific users, and in doing so engages in its own first-party speech,” Judge Patty Shwartz of the 3rd U.S. Circuit Court of Appeals in Philadelphia wrote in the opinion issued Tuesday.

Lawyers for ByteDance, TikTok’s parent company, did not immediately respond to phone calls and emails seeking comment.

Lawyers for the mother, Tawainna Anderson, had argued that the so-called “blackout challenge,” which was popular in 2021, appeared on Nylah Anderson’s “For You” feed after TikTok determined she might watch it — even after other children had died trying it.

Nylah Anderson’s mother found her unconscious in a closet in their Chester, Pennsylvania, home and tried to revive her. The girl, described by her family as a cheerful “butterfly,” died five days later.

“I can’t stop replaying that day in my head,” her mother said at a 2022 news conference when she filed the lawsuit. “It’s time for these dangerous challenges to end so other families don’t have to experience the heartache we do every day.”

A district judge initially dismissed the lawsuit, citing Section 230 of the Communications Decency Act of 1996, which is often used to shield internet companies from liability for content posted on their sites by others.

The three-judge appeals court partially reversed that decision on Tuesday and sent the case back to the lower court for a new hearing.

“Nylah, still in the first year of her adolescence, likely had no idea what she was doing or that watching the images on her screen would kill her. But TikTok knew Nylah would watch because the company’s custom algorithm pushed the videos to her ‘For You Page,’” Judge Paul Matey wrote in partial concurrence with the opinion.

Jeffrey Goodman, the family’s attorney, said it’s “inevitable” that courts will take a closer look at Section 230 as technology touches every facet of our lives. He said the family hopes the ruling will help protect others, even if it doesn’t bring Nylah Anderson back.

“Today’s ruling is the clearest statement yet that Section 230 does not provide the blanket protections that social media companies claim it does,” Goodman said.
