Justices Seem Reluctant to Intervene, Point to Congress to Legislate
The U.S. Supreme Court is hearing arguments on two separate cases that could influence whether and how online platforms like YouTube and Twitter moderate incendiary content in the future.
The first case, heard on Tuesday, centers on Section 230 of the Communications Decency Act, which shields online service providers from liability for content posted by their users. The plaintiff is the family of Nohemi Gonzalez, a 23-year-old college student gunned down in a Paris restaurant by attackers allegedly radicalized by watching Islamic State videos on YouTube.
Eric Schnapper, who represents the family, argued YouTube should be responsible for its algorithm that “systematically recommended videos inciting violence and supporting terrorism.” YouTube’s effective promotion of the video content carries a different level of responsibility than merely allowing it to be posted, he said.
Lisa Blatt, the lawyer for Google, which owns YouTube, said Section 230 provides “complete protection” for online content. She likened YouTube’s algorithm to “editorial curation” similar to results from a Google search. “Without the ability to provide content of interest to users,” she said, “the internet would be a useless jumble.”
Based on their questioning, justices seemed wary of wading into the issue. “These are not the nine greatest experts on the internet,” Justice Elena Kagan said. Justice Brett Kavanaugh fretted over a decision that could “crash the digital economy.”
Kavanaugh’s comment reflected why Section 230 was adopted in 1996, when the internet was just emerging: to protect online providers from being sued over what users posted. The wording says, “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”
Section 230 also protects online providers from liability when they take down content, which ties in with the case the high court will hear today. The family of Nawras Alassaf, a victim of a 2017 terrorist attack in Istanbul, is suing Twitter under the Anti-Terrorism Act of 1990 for failing to take down video content posted by the Islamic State.
More Looming First Amendment Cases
Looming on the legal horizon are First Amendment questions related to statutes in Florida and Texas that prevent social media platforms from removing posts because of the views they express.
The current cases don’t easily fall into liberal or conservative legal narratives. Liberals complain Section 230 shields online providers from any responsibility to address disinformation, hate speech or violent content. Conservatives tend to argue legal immunity has enabled tech companies to become too powerful, with an ability to exclude some voices from their platforms.
An eventual ruling in both cases later this year could radically change the social media environment. Justice Amy Coney Barrett asked whether Twitter users could be sued for retweeting ISIS videos. Schnapper said yes, noting a retweet is newly created content. In response to a question from Justice Kagan, Schnapper said legal protection also might be lost or limited for algorithms used to generate user feeds or for search engines.
Malcolm Stewart, an attorney in the Biden administration who argued in favor of the Gonzalez family’s action, suggested lawsuits based on recommending content created by another party would be rare even if immunity were unavailable.
During the sprawling three-hour oral argument, Justice Kavanaugh said Congress was better suited to decide how much legal protection online platforms should enjoy. That argument coincides with new concerns over how artificial intelligence could alter the online landscape.
“This was a pre-algorithm statute,” Justice Kagan said, adding that it provided scant guidance “in a post-algorithm world.” Justice Neil Gorsuch added, “Artificial intelligence generates poetry and it generates polemics.”