US Supreme Court shields Twitter from liability for terror-related content


The US Supreme Court handed Silicon Valley a massive victory on Thursday as it protected online platforms from two lawsuits that legal experts had warned could have upended the internet.

The twin decisions preserve social media companies' ability to avoid lawsuits stemming from terrorist-related content – and are a defeat for tech industry critics who say platforms are unaccountable.

In so doing, the court sided with tech industry and digital rights groups who had claimed exposing tech platforms to more liability could break the basic functions of many websites.

In one of the two cases, Twitter v. Taamneh, the Supreme Court ruled Twitter will not have to face accusations it aided and abetted terrorism when it hosted tweets created by the terror group ISIS.

The court also dismissed Gonzalez v. Google, another closely watched case about social media content moderation – sidestepping an invitation to narrow a key federal liability shield for websites, known as Section 230 of the Communications Decency Act.

Thursday's decision leaves a lower court ruling in place that protected social media platforms from a broad range of content moderation lawsuits.

The Twitter decision was unanimous and written by Justice Clarence Thomas, who said that social media platforms are little different from other digital technologies.

"It might be that bad actors like ISIS are able to use platforms like defendants' for illegal – and sometimes terrible – ends," Thomas wrote.

"But the same could be said of cell phones, email, or the internet generally."

The court held that Twitter's hosting of general terrorist speech does not create indirect legal responsibility for specific terrorist attacks, effectively raising the bar for future such claims.

"We conclude," Thomas wrote, "that plaintiffs' allegations are insufficient to establish that these defendants aided and abetted ISIS in carrying out the relevant attack."

He stressed that the plaintiffs have "failed to allege that defendants intentionally provided any substantial aid" to the attack at issue, nor did they "pervasively and systemically" assist ISIS in a way that would render them liable for "every ISIS attack."

Twitter v. Taamneh focused on whether social media companies can be sued under US antiterrorism law for hosting terror-related content that has only a distant relationship with a specific terrorist attack. 

The plaintiffs in the case, the family of Nawras Alassaf, who was killed in a 2017 ISIS attack in Istanbul, alleged that social media companies including Twitter had knowingly aided ISIS, in violation of federal antiterrorism law, by allowing some of the group's content to remain on their platforms despite policies intended to limit it.

"Countless companies, scholars, content creators and civil society organisations who joined with us in this case will be reassured by this result," said Halimah DeLaine Prado, Google's general counsel, in a statement.

"We'll continue our work to safeguard free expression online, combat harmful content, and support businesses and creators who benefit from the internet."

Twitter did not immediately respond to a request for comment.

Dismisses Google challenge, leaving Section 230 untouched

The court dismissed the case against Google in a brief opinion, leaving intact a lower court ruling that held Google is immune from a lawsuit accusing its subsidiary YouTube of aiding and abetting terrorism.

The outcome will likely come as a relief not only for Google but for the many websites and social media companies that urged the Supreme Court not to curtail legal protections for the internet.

The opinion was unsigned, and the court said: "We decline to address the application of Section 230 to a complaint that appears to state little, if any, plausible claim for relief. Instead, we vacate the judgment below and remand the case for the Ninth Circuit to consider plaintiffs' complaint in light of our decision in Twitter."

No dissents were noted.

The case involving Google zeroed in on whether it can be sued because of its subsidiary YouTube's algorithmic promotion of terrorist videos on its platform.

The family of Nohemi Gonzalez, who was killed in a 2015 ISIS attack in Paris, alleged that YouTube's targeted recommendations violated a US antiterrorism law by helping to radicalise viewers and promote ISIS's worldview.

The allegation sought to carve out content recommendations so that they do not receive protections under Section 230, potentially exposing tech platforms to more liability for how they run their services.

Google and other tech companies have said that such an interpretation of Section 230 would increase the legal risks associated with ranking, sorting and curating online content, a basic feature of the modern internet. Google claimed that in such a scenario, websites would seek to play it safe by either removing far more content than necessary, or by giving up on content moderation altogether and allowing even more harmful material onto their platforms.

Friend-of-the-court filings by Craigslist, Microsoft, Yelp and others suggested that the stakes were not limited to algorithms and could also end up affecting virtually anything on the web that might be construed as making a recommendation.

That might mean even average internet users who volunteer as moderators on various sites could face legal risks, according to a filing by Reddit and several volunteer Reddit moderators.

Oregon Democratic Sen. Ron Wyden and former California Republican Rep. Chris Cox, the original co-authors of Section 230, argued to the court that Congress' intent in passing the law was to give websites broad discretion to moderate content as they saw fit.

The Biden administration also weighed in on the case. In a brief filed in December, it argued that Section 230 does protect Google and YouTube from lawsuits "for failing to remove third-party content, including the content it has recommended."

But, the government's brief argued, those protections do not extend to Google's algorithms because they represent the company's own speech, not that of others.
