
For the past several years, critics on both sides of the political spectrum have argued that Section 230 of the Communications Decency Act of 1996 gives social-media platforms such as Facebook, Twitter, and YouTube too much protection from legal liability for the content that appears on their networks. Right-wing critics argue that Section 230 lets social-media companies censor conservative thinkers and groups without recourse by removing their content (even though there is no evidence that this occurs), while liberal critics say the platforms use Section 230 as an excuse not to remove content they should be taking down, such as misinformation. Before the 2020 election, Joe Biden said he would abolish Section 230 if he became president, and he has made similar statements since taking office, saying the provision “should be revoked immediately.”
This week, the Supreme Court said it plans to hear two cases that seek to chip away at Section 230’s legal protections. One case claims that Google’s YouTube service violated the federal Anti-Terrorism Act by recommending videos featuring the ISIS terrorist group, and that these videos helped lead to the death of Nohemi Gonzalez, a 23-year-old US citizen who was killed in an ISIS attack in Paris in 2015. In the lawsuit, filed in 2016, Gonzalez’s family claims that while Section 230 protects YouTube from liability for hosting such content, it doesn’t protect the company from liability for promoting that content with its algorithms. The second case involves Twitter, which was also sued for violating the Anti-Terrorism Act; the family of Nawras Alassaf claims that ISIS-related content on Twitter contributed to his death in a terrorist attack in 2017.
In 2020, the Supreme Court declined to hear a similar case, which claimed that Facebook was responsible for attacks in Israel because the social network promoted posts about the terrorist group Hamas. In March, the court also refused to review a decision that found Facebook was not liable for helping a man traffic a woman for sex. While Justice Clarence Thomas agreed with the decision not to hear that case, he also wrote that the court should consider the issue of “the proper scope of immunity” under Section 230. “Assuming Congress does not step in to clarify Section 230’s scope, we should do so in an appropriate case,” Thomas wrote. “It is hard to see why the protection that Section 230 grants publishers against being held strictly liable for third parties’ content should protect Facebook from liability for its own ‘acts and omissions.’”
Note: This was originally published as the daily newsletter for the Columbia Journalism Review, where I am the chief digital writer.