Supreme Court case could limit social media immunity from liability, by Christopher S. Yoo

Christopher S. Yoo, Los Angeles Times

This fall, the Supreme Court marked a turning point in internet history. The court agreed to hear Gonzalez v. Google, its first case interpreting Section 230 – a once arcane law that is now widely credited with enabling much of the current form of the internet and is being debated by politicians on both sides of the aisle.

Section 230 states that online businesses will not be “treated as the publisher” – like media such as television and newspapers – for content provided by a third party, such as people who post on their websites. Passed by Congress in 1996 as part of the otherwise ill-fated Communications Decency Act, the law provides some legal immunity to players such as Google, Twitter and Facebook for content shared on their platforms by users.

The law protects companies that provide a platform for the speech of others from the constant threat of libel suits, while allowing them to remove content they deem objectionable. This enabled the robust, often divisive discourse that defines the internet today. What could the Supreme Court’s intervention mean for its future?


The Gonzalez case, now before the court, arose after a young woman, Nohemi Gonzalez, was killed in an Islamic State attack in Paris. Her estate and family members allege that Google violated anti-terrorism law by allowing the terrorist organization to post content that furthered its mission on YouTube (which is owned by Google). They also claim that Google’s algorithms promoted the Islamic State by recommending its content to users.

The two courts that have considered the case so far have held that immunity under Section 230 covers the alleged violations of anti-terrorism law. But in other Section 230 decisions involving different statutes, the 9th Circuit Court of Appeals, which has jurisdiction over West Coast cases, has interpreted Section 230’s protections more narrowly than other courts have. The possibility that the same law could mean different things depending on where someone lives in the United States offends the rule of law. Reconciling such inconsistencies is a common motivation for the Supreme Court to take up a case and may explain the current court’s interest in Gonzalez, as well as the emerging issues around algorithmic recommendations. Justice Clarence Thomas has also expressed interest in Section 230 in past dissents.

The court could simply affirm the broad view of Section 230 protection for platforms, which reduces their incentives to review the content they carry. If the court instead takes a narrower view, the likely result would be more content moderation.

Proponents of the narrow position might argue that while broad liability protection was appropriate when the industry first emerged, it is less justifiable now that internet companies are large and dominant. Stricter regulation could place a greater responsibility on companies to exercise discretion over the content they host and bring to potentially millions of people.

On the other hand, those in favor of preserving broad immunity under Section 230 argue that limiting protections to certain types of content would force companies to preemptively remove anything potentially problematic rather than undertake the difficult and controversial task of deciding which content to block. The result would be the loss of a significant amount of online speech, including anything with even the most tenuous possibility of creating liability.

History provides good reason to fear that shrinking immunity could erode or stifle speech. Congress passed an amendment in 2018 stating that Section 230 does not apply to content that violates laws prohibiting sex trafficking. Two days after the law took effect, Craigslist removed its personals section rather than determine which content was actually related to prostitution. Other companies have followed suit.

The Supreme Court could approach the Gonzalez case in a completely different way, focusing less on content moderation and more on platform design. Section 230 clearly allows companies to remove certain types of objectionable content. What’s less clear is whether the law provides similar protection for algorithmic decisions to promote illegal content, which is the issue at the heart of the Gonzalez plaintiffs’ objection to YouTube’s algorithms. The justices could restrict platforms’ ability to use algorithms to promote content, a capability that is central to these companies’ business models.

If the Supreme Court sticks to its planned timeline, we’ll know by the end of June whether or not it decides to remake the future of the internet.

Christopher S. Yoo is a professor of law and founding director of the Center for Technology, Innovation & Competition at the University of Pennsylvania.
