The US Supreme Court is weighing fundamental liability privileges for social networks. Google warns that if these are curtailed, widespread censorship could follow.
Google warns: An expected ruling in a case before the US Supreme Court on liability exemptions for social networks such as YouTube, Facebook, Instagram, TikTok, or Twitter could "turn the internet upside down" and lead to widespread censorship. Should the justices restrict these privileges, large operators would be forced to block more potentially offensive or harmful content. Operators of smaller websites, on the other hand, would likely abandon their filtering efforts and fact checks altogether in order to avoid being held liable for moderating content in the first place.
Liability question: platforms recommended “harmful content”
The Wall Street Journal reports on Google's submission to the Supreme Court. The lawsuit was filed by the family of US student Nohemi Gonzalez, who was killed in the 2015 Paris terrorist attacks carried out by the so-called Islamic State (ISIS). Their allegation in the dispute, Gonzalez v. Google, is that YouTube, which belongs to the defendant company, supported ISIS by recommending the terrorist group's propaganda videos to users.
At the core of the dispute is Section 230 of the Communications Decency Act (CDA). It generally protects online platforms from being sued over harmful content that users post on their sites. The provision also gives them far-reaching latitude to filter and delete content on their own initiative without incurring liability for doing so. The Gonzalez family now alleges that the clause, introduced by the US Congress in 1996, has been expanded over time by case law to cover acts and circumstances that the legislature never intended. According to the plaintiffs, certain platform activities, such as recommending harmful content, should not be protected.
Limited liability for editorial work?
In accepting the case, the Supreme Court has agreed to answer the question of whether Section 230 exempts interactive computer services from liability "if they specifically draw attention to information made available by another information content provider". The justices are also to clarify whether the clause limits liability only when platforms perform traditional editorial tasks, i.e. decide whether or not to display information at all.
In the most recent of three Section 230 cases that US appeals courts have taken up, five appellate judges concluded that the section confers broad immunity, while three rejected such immunity. One appellate judge held that precedent precludes liability for such recommendations. Google argues that no meaningful distinction can be drawn between recommendation and search algorithms, so Section 230 should apply to all relevant ranking processes. Most recently, the US government, particularly under ex-President Donald Trump, made attempts to reword the clause.