This TikTok Lawsuit Could Change the Face of Social Media Forever, and It’s About Time

Social media and its Section 230 protections may have met their Waterloo. For the two-plus decades that we’ve been using social media platforms like X (formerly Twitter), Facebook, Instagram, TikTok, and others, they’ve been operating under protections designed in the 1990s to shield services like CompuServe and AOL.

Those protections, part of the Communications Decency Act of 1996, said that online computer services could not be held liable for content posted to their platforms by third parties. These services were like dumb, giant warehouses with shelves full of information posted by others. A warehouse doesn’t create what’s inside it; it just accepts content and gives consumers access to it.

This was back in the days of AOL, which controlled the pages you saw using keywords, a crude organizing principle for such a vast amount of information. In a sense, early platforms like Prodigy, CompuServe, and AOL were just a nice interface away from the Bulletin Board Systems that came before them.

Modern digital services, primarily social media, have one big difference: They no longer passively wait for you to discover content and make connections. Everything is tailored by custom algorithms. TikTok’s vaunted For You page, X’s For You page, Threads’ For You feed, Facebook’s feed, Instagram’s recommendations — they’re all powered by algorithms that learn your habits and then deliver other people’s content based on those presumed interests.

AOL wanted people to sign up and stay, but churn constantly ate into its numbers: almost as many people canceled each month as signed up. That’s why we all got so many disks and CDs in the mail, begging us to join.

Algorithms in control

Nowadays, the platforms are mostly free. Ads and affiliate deals pay the bills, so it’s crucial to keep eyes glued to each service. Hence the algorithms that do the dirty work of keeping us all engaged.

AOL, CompuServe and even internet service providers could rightly argue that they have no control over the content we see online and that responsibility still lies with the content creators. But algorithms make the picture for modern social media and perhaps even search engines like Google much bleaker.

Section 230 has been under attack for years. I always thought it protected online services fairly well: When you go looking for someone to blame for the unwanted violent, hateful, perverted, or even pornographic content in your feed, the ultimate responsibility lies with the creator of that content, not the host.

I no longer believe that, and from what I can tell, US courts may soon set a precedent on this point in a closely followed case.

A precedent could be set

In 2021, a 10-year-old girl, Nylah Anderson, found a viral meme in her TikTok feed. The video promoted something called “The Blackout Challenge.” Social media is full of these viral challenges, and the vast majority of them are harmless.

This one wasn’t. It encouraged viewers to strangle themselves until they passed out.

Tragically, according to the filing, Nylah died while attempting the challenge, and her family has been suing TikTok ever since. While the lower courts dismissed the case, a U.S. Court of Appeals ruled that Nylah’s family could sue TikTok, specifically finding that TikTok’s algorithm was not protected by Section 230.

From the ruling:

“TikTok makes choices about the content it recommends and promotes to specific users, and in doing so, engages in its own first-party speech.”

TikTok doesn’t have people hand-curating anyone’s feed, but it’s safe to say the algorithm is the deciding factor. That algorithm is programmed by TikTok, which is owned by the Chinese company ByteDance (which currently must sell TikTok to US entities or risk a ban in the US).

The Anderson case continues, and if Nylah’s family wins, it could mean a swift end to protections for every social media platform that uses algorithms to shape our feeds. A TikTok loss would mean social media companies could be held liable the next time you see hate speech, violent images, pornography, or suggestions for dangerous acts.

In a separate interview, Nylah’s family said they want big tech companies to be held accountable for their algorithms and to do more to protect their users.

The winds of change

Whatever the end result, any platform that programs an algorithm to analyze your interests and then serves up content based on that analysis has a responsibility to ensure that the algorithm cannot deliver dangerous content.

In my own social media use, particularly on TikTok, I’ve been amazed at the power and flexibility of the algorithm. It endlessly fills my For You page, keeping me engaged for hours. It also allows for personal curation, usually done by searching for things that interest me.

When I find something I like, I pay extra attention to it. I watch it more than once, pause the video, like it, share it, and then watch a few more videos in the same vein. If I do this a few times, I can shape my FYP to show more videos about people refurbishing old gadgets or making pasta.

However, these feeds have a needy side. They always throw in a “you might also like” topic that is popular with others. They try to prevent you from losing interest in your feed and the platform.

That’s how, I think, most people ultimately end up seeing things like violence and dangerous memes. You have to show the feed how much you don’t like that content before you can filter it out – assuming the algorithm allows that.

TikTok will fight this case, as other social media platforms have done, but I think the tide has turned and a loss is possible. If that happens, TikTok, X, Threads, Facebook, Instagram, and other social media platforms could be forced to tear up and overhaul all of their algorithms to ensure they don’t repeat the mistakes of the past. Otherwise, they could be buried under costly lawsuits — which they might lose again — until the platforms collapse and disappear for good.
