A pair of cases currently pending at the Supreme Court may decide the future of tech giants like Google, Twitter, and YouTube. The Supreme Court granted review in these cases to determine whether internet companies are shielded from civil liability when they not only serve as platforms for user-posted content but also recommend certain content to users.
These cases deal with a law enacted at the dawn of the internet age: Section 230 of the Communications Decency Act of 1996. The law was designed to protect internet platforms from civil liability in certain circumstances, and it does so in two ways.
First, Section 230 shields internet platforms from civil liability for content posted by users on their platforms. If a defamatory statement is posted on Facebook, the person defamed can sue the person who posted the statement, but not Facebook. In this instance, Facebook is acting like a telephone company that has zero control over the content of conversations. In such circumstances, it would make little sense to sue AT&T for a defamatory conversation taking place over the phone lines. So, too, for suing Facebook.
Second, Section 230 modifies the common law to shield internet platforms from liability even when they exercise editorial control by removing or restricting access to certain material posted on their site. Thus, the fact that Facebook removes some content or enforces user guidelines does not turn the internet platform into a publisher (a publisher which could be liable under common law for knowingly failing to remove unlawful content) so long as the internet platform acts in good faith.
Big Tech’s marketing strategies have put pressure on Section 230. Today’s tech platforms rely on advertising dollars, which means they make money when you decide to stay online longer. Thus, they use carefully calibrated algorithms to analyze user preferences and data and to recommend additional content, hoping to entice users to spend a few more minutes on the site.
And Big Tech is good at pushing users both to stay online and toward more extreme content. The Plaintiffs, for instance, point to a recent study finding that 71% of violent, graphic, or hate speech videos on YouTube were viewed because the YouTube algorithm recommended them to users.
The cases discussed here stem from horrific terrorist attacks overseas. In 2015, twenty-three-year-old Nohemi Gonzalez was studying in Paris when she was murdered by ISIS terrorists at a bistro. Nohemi’s family alleges that YouTube (owned by Google) provided assistance and material support to ISIS by affirmatively “recommend[ing] ISIS videos to users,” targeting those whose online history indicated they would be interested in ISIS videos.
In the second case, the Plaintiffs similarly claim that recommendations made by Facebook led to several Hamas killings and attacks in Israel. Facebook allegedly recommended Hamas-related content to users and also recommended that users “friend” Hamas supporters or Hamas itself.
At issue in the two cases is what happens when internet platforms use computer algorithms to recommend posts. This question has divided the judges of the lower courts of appeals. Some judges have read the civil immunity granted by Section 230 broadly to protect recommendations. Other judges have concluded that Section 230’s civil liability shield is limited to traditional editorial functions, like taking down content.
The Plaintiffs argue that the text of Section 230 does not provide an immunity shield for a company’s own content but rather protects a company only when content is “provided by another information content provider.” In an influential opinion, Second Circuit Judge Katzmann wrote that Section 230 shielded platforms from liability only for traditional editorial functions like removing content. Recommending content, he concluded, was a message from the internet platform itself and not content “provided by another information content provider.” In the decision below, Judge Berzon of the Ninth Circuit agreed, finding that Section 230 shields the decision of whether to publish, withdraw, or alter content—but does not protect activities that promote or recommend content.
Of note, these cases are in the very early stages of litigation. So, even if the Supreme Court finds that Section 230 does not provide a civil liability shield, the internet platforms may be able to raise other defenses, including the First Amendment.
In an earlier case involving Section 230, Justice Thomas warned that “[e]xtending immunity [under Section 230] beyond the natural reading of the text can have serious consequences.” Before giving companies immunity, he urged the Court to be “certain that is what the law demands.” The Supreme Court will soon have an opportunity to determine that very question.