
A quick guide to the Section 230 Supreme Court hearings

Don't feel bad, it's all a bit confusing.
By Elizabeth de Luna
The Supreme Court's Section 230 hearings could have wide-reaching consequences for free speech. Credit: Robert Alexander/Getty Images and Franz-Peter Tschauner/picture alliance via Getty Images

The Supreme Court is currently reviewing the cases of Gonzalez vs. Google and Twitter vs. Taamneh to determine if YouTube and Twitter are liable for terrorism-related content hosted on their platforms.

Of course it's abhorrent that terrorists use YouTube and Twitter to recruit and plan their activities. But those sites are used by millions (and in YouTube's case, billions) of people, and host billions of pieces of content, most of which are not related to terrorism. And because of that, the law says YouTube and Twitter are not responsible for bad actors on their platforms. Here's how the plaintiffs in Gonzalez vs. Google and Twitter vs. Taamneh are attempting to change the Supreme Court's mind.

What is Section 230?

Section 230 preserves a free and open internet. In 1996, just as the then-new internet was gaining widespread acceptance, Congress committed to supporting its development in Section 230 of the Communications Decency Act.

In fewer than 800 words, Section 230 recognizes that the internet and the services on it give Americans access to a "diversity of political discourse...cultural development [and] intellectual activity." It states that the internet should remain free from government regulation so that it, and free speech, can flourish. Under it, services like YouTube and Twitter are free to moderate user content and speech according to their own guidelines.

Why are YouTube and Twitter in hot water?

Supreme Court cases Gonzalez vs. Google and Twitter vs. Taamneh allege that YouTube and Twitter should be liable for aiding and abetting terrorism because they recommended terrorism-related content (in the case of Gonzalez vs. Google) and hosted terrorism-related content (in the case of Twitter vs. Taamneh).

As of now, YouTube and Twitter are protected from that liability by a provision of Section 230 that states: "No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider."

Basically, you are responsible for what you do online. Services like YouTube and Twitter cannot be held responsible for the content posted to their platforms, and neither can fellow users of those platforms. In simpler terms: when someone posts something hateful online, the speaker is responsible, not the service that hosts the post.

Gonzalez vs. Google and Twitter vs. Taamneh allege that YouTube and Twitter should not be protected under Section 230 and are liable for promoting terrorism-related content, not just hosting it.

What does the Supreme Court have to decide?

The Supreme Court must break Section 230 into teeny, tiny pieces, down to the word, to determine if it will protect YouTube and Twitter in these cases.

Justices have quibbled over the definition of "aiding and abetting" and whether either platform could be considered as having aided and abetted terrorist organizations. They also discussed whether or not YouTube's recommendation algorithm and the platform's suggestions for what to "watch next" could be considered an endorsement of a piece of content or just a "neutral" tool for cataloguing YouTube's massive library.

The Supreme Court is also considering the long-term implications of its decision. Should it find YouTube and Twitter liable, and in doing so move to regulate parts of big tech that have previously been left untouched? Or would that open all internet services to liability and undoubtedly overwhelm the court systems with thousands, if not millions, of new lawsuits?

And what about free speech? Would finding YouTube and Twitter liable stifle a free and open internet and put individuals at risk for legal action every time they share a video or post in an online forum? Or would it be better to hold YouTube, Twitter, and other open platforms responsible for any terrorism-related activity on their sites?

What would the internet look like if Twitter and YouTube became responsible for the content on their sites?

The internet as we know it was shaped in the image of free speech. Making platforms responsible for what is said or hosted on their sites would open those platforms to countless lawsuits. It would also mean that you, as a user, would be liable for anything you say on those platforms that upsets somebody enough to pursue legal action under an amended Section 230.

To avoid being buried in legal fees, platforms would resort to significant, if not complete, censorship to restrict how individuals interact online. That could hinder innovation and communication, and generally make the world a much smaller place.

Elizabeth de Luna

Elizabeth is a culture reporter at Mashable covering digital culture, fandom communities, and how the internet makes us feel. Before joining Mashable, she spent six years in tech, doing everything from running a wifi hardware beta program to analyzing YouTube content trends like K-pop, ASMR, gaming, and beauty. You can find more of her work for outlets like The Guardian, Teen Vogue, and MTV News.

