For a law whose central clause contains just twenty-six words, Section 230 of the Communications Decency Act of 1996 has generated an enormous amount of debate over the past few years, thanks in part to criticism from both sides of the political spectrum. Conservative politicians say the law—which shields online services from liability for the content they host—allows social networks like Twitter and Facebook to censor right-wing voices, while liberals say Section 230 gives the platforms an excuse not to remove offensive speech and disinformation. Donald Trump and Joe Biden have both spoken out against the law and have promised to change it. This week, the Supreme Court is hearing oral arguments in two cases that could alter or even dismantle Section 230.
On Tuesday, the court’s nine justices heard arguments in the first case, Gonzalez v Google. The family of Nohemi Gonzalez, a US citizen who was killed in an Isis attack in Paris in 2015, claims that YouTube violated the federal Anti-Terrorism Act by recommending videos featuring terrorist groups, and thereby helped cause Gonzalez’s death. On Wednesday, the court heard arguments in the second case, which also involves a terrorism-related death: the family of Nawras Alassaf, who was killed in a terrorist attack in 2017, claims that Twitter, Facebook, and YouTube recommended content related to terrorism and thus contributed to his death. After a lower court ruled that the companies could be held liable, Twitter asked the Supreme Court to decide whether Section 230 applies.
The clause at the heart of Section 230 states: “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” In practice, this has meant that services such as Twitter, Facebook, and YouTube are not held liable for things their users post, whether it’s links or videos or any other content (unless the content is illegal). The question before the Supreme Court is whether that protection extends to content these services recommend, or promote to users via their algorithms. Section 230, the plaintiffs argue in Gonzalez, “does not contain specific language regarding recommendations, and does not provide a distinct legal standard governing recommendations.”