Supreme Court Casts Doubt On States' Efforts To Restrict Social Media Content Moderation

Winston & Strawn LLP


Winston & Strawn LLP is an international law firm with 15 offices located throughout North America, Asia, and Europe. More information about the firm is available at www.winston.com.

KEY TAKEAWAYS

  • The First Amendment protects social media companies' efforts to moderate content posted to their platforms.
  • In challenging the constitutionality of statutes on their face rather than as applied, litigants must carefully define the statutes' scope.

SUMMARY

The U.S. Supreme Court has issued a decision confirming that the First Amendment protects companies' rights to moderate content on their online platforms. In NetChoice, LLC v. Paxton and Moody v. NetChoice, LLC, the Court held that two states' attempts to restrict social media platforms' efforts to moderate content posted on their websites must be reassessed by the lower courts. But before sending the cases back down, the Court explained that content moderation, even when conducted by algorithm, is expressive activity protected by the First Amendment, and a state's interest in promoting ideological balance online does not justify restricting that expression. Although these cases are not over, the decision no doubt comes as a great relief to social media companies, which will be able to continue to moderate content posted by third parties in accordance with their terms of use.

The NetChoice cases arose from a pair of laws in Texas and Florida prohibiting large social media platforms from removing posts and banning users based on content, viewpoint, or the identity of the speaker. Both states' legislatures enacted the laws in response to the perception that social media platforms were censoring users with conservative views.

Trade groups representing the interests of the platforms challenged the laws on the ground that they violate the platforms' First Amendment right to make editorial decisions about what messages can appear on their sites. The states disagreed, arguing that "content moderation" is not expressive activity but rather unprotected censorship. The Fifth Circuit upheld Texas's law, while the Eleventh Circuit concluded that Florida's law violated the First Amendment.

In both cases, the plaintiffs had argued that the statutes violated the First Amendment on their face, meaning (in the First Amendment context) that "a substantial number of the law[s'] applications are unconstitutional, judged in relation to [their] plainly legitimate sweep." But in practice, the litigation focused on the narrower question of how the statutes apply to large social media platforms like Facebook and YouTube, without addressing how the statutes would apply to other online activity.

The Supreme Court sent both cases back down to the courts of appeals with instructions to first examine the full scope of what the statutes would prohibit before making a final determination as to their constitutionality. But the majority did not stop there. Concerned that the Fifth Circuit had failed to appreciate the expressive value of content moderation, Justice Kagan, writing for the majority, included a discussion of the relevant First Amendment principles to guide the lower courts' analysis of the merits.

The majority made clear that, for First Amendment purposes, content moderation by social media companies should not be treated any differently from similar efforts by traditional media to decide which messages to include and which to exclude. The majority opinion directed the lower courts to a line of cases starting with Miami Herald Publishing Co. v. Tornillo (1974), which held that the government cannot override a media outlet's editorial choices by forcing it to include certain content. When social media companies moderate content in accordance with their terms of service—even if they use an algorithm to do it—they are making the same sort of editorial decisions and are therefore engaged in expressive activity protected by the First Amendment. This is so even though social media platforms do not write their own posts or make their own videos, and "convey the lion's share of posts submitted to them."

The majority rejected Texas's argument that its asserted interest in "improving, or better balancing, the marketplace of ideas" by limiting what it deemed online censorship justified stifling the platforms' content-moderation efforts. "Texas's law profoundly alters the platforms' choices about the views they will, and will not, convey," Justice Kagan explained. "And we have time and again held that type of regulation to interfere with protected speech."

Although all nine Justices agreed that the decisions should be vacated and remanded, several Justices submitted separate concurring opinions. Justice Jackson and Justice Barrett emphasized the pitfalls of facial challenges, as opposed to as-applied challenges targeting specific circumstances. Justice Barrett also suggested that while domestic platforms' content moderation implicates the First Amendment, foreign platforms might not enjoy the same protection (the most obvious unspoken example being TikTok, which is in the process of challenging a law forcing its divestiture). Justice Thomas cast doubt on the legitimacy of facial challenges altogether and, along with Justices Alito and Gorsuch, criticized the majority's First Amendment analysis as dicta based on an incomplete record that does not support the conclusion that platforms engage in expressive activity when they determine what content to publish or prioritize.

WHAT THIS MEANS

Although the Court returned the cases to the lower courts for further analysis, social media companies have good reason to be optimistic that they will be able to continue to moderate content consistent with their terms of use, free of state interference. The decision is also likely good news for many of the companies that advertise on these platforms and rely on them to moderate what content is displayed next to their advertisements.

Companies should continue to monitor these cases as they make their way back through the lower courts, paying special attention to the lower courts' analysis of the statutes' scope. While arguments before the Court focused on large social media platforms, the statutes arguably sweep much more broadly (for example, as Justice Barrett pointed out, to online service providers or marketplaces like Uber and Etsy).

Different forms of online activity likely raise different constitutional concerns that are difficult to address with a facial challenge to an entire statute. Litigants might be better served by heeding the Court's chorus of warnings and bringing future challenges piecemeal, as applied to their specific circumstances.


Washington, DC summer associate Molly Rose Gibson also contributed to this briefing.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.

