We now know a bit about where some of the justices of the Supreme Court of the United States are leaning in one family’s landmark case against Google.

Google is being sued by the family of a woman killed in 2015 during an ISIS-led terrorist attack in Paris. The family has argued that YouTube, which is owned by Google, employs a recommendation system that led users to recruitment videos for the terrorist group.

Google countered the plaintiffs’ arguments by contending that, under Section 230 of the Communications Decency Act, tech companies cannot be held liable for content posted on their platforms by third-party users.

The issue, according to SCOTUSblog, is “Whether Section 230(c)(1) of the Communications Decency Act immunizes interactive computer services when they make targeted recommendations of information provided by another information content provider, or only limits the liability of interactive computer services when they engage in traditional editorial functions (such as deciding whether to display or withdraw) with regard to such information.”

Oral arguments were held on Tuesday, with each party attempting to convince the high Court to rule in its favor.

“The Supreme Court cases — there are two of them — it’s really asking whether the Antiterroris[m] Act trumps the liability shield,” said MRC Free Speech America & MRC Business Vice President Dan Schneider. “But the most important thing for people to understand is that Google and Facebook and Twitter, that they actually have intentionally and purposefully made money off of these terrorist organizations, while at the same time censoring conservatives, regular Americans, who are simply trying to voice their common sense values. So we’re the targets of these Big Tech platforms, not the terrorist organizations. This is outrageous.”

Perhaps most interestingly, Justice Clarence Thomas, the justice who will likely be a key player in future decisions involving “Big Tech” given his past comments on Section 230, actually appeared to defend algorithms in his questioning of the plaintiffs’ lawyer about their legal theory.

“If you’re interested in cooking … you don’t want thumbnails on light jazz,” he said, referring to the usefulness of algorithms. “I see these as suggestions and not really recommendations because they don’t really comment on them.”

In the past, Justice Thomas has argued that social media platforms are “sufficiently akin” to “common carriers,” and that no one should be denied access to the platforms because of their views.

Thomas wrote in a 2021 opinion that the Supreme Court has “no choice” but to “address how our legal doctrines apply to highly concentrated, privately owned information infrastructure such as digital platforms.”

At Tuesday’s hearing, Justice Elena Kagan acknowledged the European Union’s efforts to regulate speech online but added, “We’re a court. We really don’t know about these things. We are not the nine biggest experts on the internet. Isn’t this a case for Congress, not the Court?”

Eric Schnapper, a University of Washington School of Law professor and the attorney for the plaintiffs, argued that the platforms are responsible for how content is recommended to users.

“Third parties that post on YouTube don’t direct their videos to specific users,” he argued.

Justice Neil Gorsuch was skeptical of Schnapper’s argument and said that he was “not sure any algorithm is neutral.”  

“Most [algorithms] these days are designed to maximize profit,” Gorsuch said, indicating a position similar to that of Justice Thomas.

For his part, Justice Samuel Alito seemed frustrated at the confusion surrounding the issue. 

 “I don’t know where you’re drawing the line,” he told Schnapper. “That’s the problem.”

Google lawyer Lisa Blatt tried to convince the Court that a ruling against the Big Tech company would change the internet as we know it.

Chief Justice John Roberts pressed Blatt on that claim.

“Would Google collapse and the internet be destroyed if Google was prevented from posting what it knows is defamatory?” he asked.

Blatt said that Google would not be destroyed, but that smaller websites would suffer from an unfavorable decision.

Meanwhile, Justice Brett Kavanaugh expressed concern that the courts would be flooded with litigation if the high Court ruled against Google.

“Lawsuits will be nonstop,” Kavanaugh said, also suggesting that Congress should reform the law if a change is needed.

“Isn’t it better … to put the burden on Congress to change that, and they can consider the implications and make these predictive judgments?” he asked.

Schnapper rejected that argument, stating that most suits would be dismissed outright if the Court expanded platform liability.

“The implications are limited,” he argued, “because the kinds of circumstance in which a recommendation would be actionable are limited.” 

Conservatives are under attack. Contact your representatives and demand that Big Tech be held to account to mirror the First Amendment while providing transparency, clarity on so-called “hate speech” and equal footing for conservatives. If you have been censored, contact us at the Media Research Center contact form, and help us hold Big Tech accountable.