Moore – Winter 2024

Lone Star Misstep:
How Texas’s “Free Speech” Gambit Threatens First Amendment Principles and Tees up the Gutting of the Administrative State

Logan Moore


Introduction

Conservative commentators have long decried what they describe as “liberal bias” in the news media,[1] higher education,[2] and elsewhere. Now, too, they argue that social media companies are biased and discriminating against conservative viewpoints.[3] Lawmakers in several states have reacted by introducing legislation that threatens to undermine the scope of First Amendment protections and upset the carefully struck balance between the interests of the administrative state and private enterprise when it comes to state-mandated disclosures.

Texas is one such state. In response to fears of viewpoint discrimination, Republican legislators in Texas enacted House Bill 20 (H.B. 20) during the 87th Texas Legislature. According to the author of H.B. 20, State Representative Briscoe Cain, the bill “seeks to provide protections from censorship and other interference with digital expression….”[4] Specifically, the bill prohibits social media platforms from censoring a user, a user’s expression, or a user’s ability to receive the expression of another person based on viewpoint. The bill also imposes certain transparency requirements, requires social media platforms to establish a complaint system through which users can challenge a decision to remove their content, and requires platforms to notify any individual whose content is removed of the reason for removal. Notably, legislators did not empower state regulators to involve themselves in the enforcement of H.B. 20 through the administrative rulemaking process or through the imposition of administrative penalties. Enforcement, rather, is left to private users and to the state attorney general.

Litigation

Before the law could take effect, NetChoice, LLC and the Computer & Communications Industry Association, two trade organizations that represent some of the nation’s largest media organizations, filed suit in federal court seeking to prevent H.B. 20 from taking effect. Plaintiffs alleged in their complaint that H.B. 20 violated their members’ “First Amendment rights to engage in their own speech and to exercise editorial discretion over the speech published on their websites and applications.”[5] Plaintiffs further argued that H.B. 20 was unconstitutional under the Supremacy Clause—asserting that the bill was preempted by Section 230 of the Communications Act of 1934.[6] One key provision of Section 230 cited by Plaintiffs is Section 230(c)(2), which limits liability for good faith decisions to restrict access to certain objectionable materials.[7] According to the Congressional Research Service, Section 230 has been held by the courts to “protect online service providers like social media companies from lawsuits based on their decisions to transmit or take down user-generated content.”[8]

On December 1, 2021, the day before H.B. 20 was to take effect, U.S. District Court Judge Robert Pitman granted Plaintiffs’ motion for a preliminary injunction against enforcement of the law. In his order granting the motion, Judge Pitman held that key portions of H.B. 20 were “replete with constitutional defects, including unconstitutional content- and speaker-based infringement on editorial discretion.”[9]

On May 11, 2022, the Fifth Circuit Court of Appeals stayed the preliminary injunction pending appeal.[10] However, just two weeks later the U.S. Supreme Court vacated the stay over the dissent of Justices Alito, Thomas, and Gorsuch.[11] After assessing the merits of the district court’s decision, the Fifth Circuit reversed the preliminary injunction, with the plurality opinion noting that the court “reject[ed] the idea that corporations have a freewheeling First Amendment right to censor what people say.”[12] This ruling put the Fifth Circuit at odds with the Eleventh Circuit, which ruled that a similarly motivated Florida law was likely unconstitutional and prohibited the law from taking effect.[13] On September 29, 2023, the Supreme Court granted certiorari in both cases.[14]

Oral argument in NetChoice, LLC v. Paxton—the Texas case—and in Moody v. NetChoice, LLC—the Florida case—occurred on February 26, 2024. During the arguments, a majority of justices seemed skeptical of the assertion that the two laws in question were constitutional exercises of state authority.[15] For instance, in questioning Florida’s Solicitor General, Justice Kavanaugh, who seemed sympathetic to the view that social media platforms engage in speech when making moderating decisions, pointed to a line of Supreme Court rulings “which emphasize editorial control as being fundamentally protected by the First Amendment.”[16]

Chief Justice Roberts likewise seemed hostile to the arguments being put forward by the Texas Solicitor General that big social media companies are common carriers, and thus subject to additional regulatory control by the state. Specifically, the Chief Justice said that the First Amendment does not apply to the social media platforms but instead “restricts what the government can do, and what the government’s doing here is saying you must do, you must carry these people; you’ve got to explain if you don’t. That’s not the First Amendment.”[17] The Chief Justice continued, indicating that unlike the telegraph or railroads in decades past, users of social media platforms have alternative choices if they don’t like the way in which a particular platform is operating.[18] Accordingly, “[t]hey can discriminate against particular groups that they don’t like,” while the state “[has] different obligations.”[19]

Unsurprisingly given their earlier dissent from the decision to block H.B. 20 from taking effect while litigation proceeded, Justices Thomas and Alito were two of the voices on the Court seemingly most likely to vote to allow both respective laws to go into effect.[20]

Discussion

The benefits of social media content moderation policies extend far beyond the realm of digital governance—they shape the very fabric of our online interactions, influencing attitudes, behaviors, and societal norms. While far from perfect in their moderation efforts, social media companies have found success in combating the spread of certain harmful ideologies.[21] For instance, Twitter’s aggressive takedown strategy for pro-ISIS tweets resulted in ISIS propagandists fleeing the platform, thus decreasing “radicalization, recruitment, and attack planning opportunities.”[22] More recently, content moderation efforts have had success in combating COVID-19 conspiracy theories. In a study of moderation efforts undertaken by Facebook, Twitter, Reddit, and 4chan, researchers at the Technical University of Munich, Germany, found that removing or flagging COVID-19 misinformation “significantly reduced the spread of conspiracy theories in the ecosystem.”[23] Similarly, researchers at DePaul University conducted a study of Twitter’s so-called soft moderation techniques and found that displaying an interstitial cover over a misleading COVID-related tweet was effective in reducing the percentage of users who saw the tweet as accurate.[24]

If Texas has its way and the Supreme Court allows H.B. 20 to take effect, the social media landscape as we know it will be completely upended. Social media companies could find themselves handcuffed in their efforts to combat misinformation and other harmful ideologies just in time for the 2024 general election. A Texas victory in NetChoice could be dangerous even for those who are not social media users. As Rebecca Tushnet, a First Amendment scholar at Harvard Law School, put it: “[i]f the states can impose this kind of regulation on platforms, we will know that the past 70 years or so of First Amendment jurisprudence can no longer be relied on in any real way.”[25] Thankfully, it seems a majority of justices recognize that social media platforms’ authority to engage in legitimate content moderation is an important and valid exercise of First Amendment rights.

Despite the seeming existence of a majority ready to protect social media platforms’ ability to continue engaging in content moderation efforts, it is less clear what fate awaits the lesser-discussed aspect of H.B. 20—the “individualized-explanation requirements.” Daphne Keller, director of the Program on Platform Regulation at Stanford’s Cyber Policy Center, noted in an article discussing H.B. 20 that the Court could use the opportunity to review the law’s individualized-explanation requirements as a means to “address the relationship between the First Amendment and the administrative state.”[26] Specifically, the Court may use an opinion in this case to limit the applicability of the rule previously outlined in Zauderer, where the Court held that the government can require commercial actors to disclose certain information so long as the disclosure is “reasonably related to the State’s interest in preventing deception of consumers.”[27] While this is far from certain, Keller noted that state attorneys general were arguing that Zauderer as it is currently applied “provides, effectively, a blank check for expansive, state-mandated disclosures.”[28] Such a limitation could effectively “eliminate [Zauderer] as a basis for broader regulatory disclosure mandates, and shift vast edifices of regulatory law…” which may, in turn, “lead to rather restrictive interpretations of lawmakers’ power to compel disclosures to regulators like the FDA, EPA, SEC, or EEOC.”[29]

Although the Court’s more conservative justices signaled some interest in addressing Zauderer, the justices notably spent little time during oral argument discussing this aspect of the law, and in fact no time discussing Zauderer.[30] This may mean that the justices see no reason to wade into the murky waters of the administrative state’s authority to mandate regulatory disclosures. If that is the case, a key tool that regulators use for consumer protection purposes would live to see another day. It would not be surprising, however, for the Court’s business-friendly, conservative majority to find a way to use this case as an opportunity to tip the scales in favor of corporate speech interests and against government regulators by limiting the application of Zauderer.



[1] See David Greenberg, The idea of “the liberal media” and its roots in the civil rights movement, 1 Sixties: J. of Hist., Pol. & Culture 167, 168-69 (noting that “the critique of the national media’s purported bias congealed into a hard-and-fast belief on the right” in the late 1950s and early 1960s in response to the burgeoning civil rights movement in the South).

[2] See Edward Burmila, Liberal Bias in the College Classroom: A Review of the Evidence (or Lack Thereof), 54 PS: Pol. Sci. & Pol. 598 (2021) (noting that “[t]he roots of the liberal academe narrative in American political discourse arguably began with” William F. Buckley’s 1951 book God and Man at Yale).

[3] Nikolas Lanum, Twitter, Facebook, Google have repeatedly censored conservatives despite liberal doubts, Fox News (Mar. 29, 2022), https://www.foxnews.com/media/twitter-facebook-google-censored-conservatives-big-tech-suspension (quoting commentator Mollie Hemingway as saying “You can not possibly have been alive in the last five years and think that social media companies do anything other than amplify left-wing insanity and crush anything from the right that hurts the left.”).

[4] Bill Analysis, C.S.H.B. 20, Constitutional Rights & Remedies Select Committee Report, https://capitol.texas.gov/tlodocs/872/analysis/pdf/HB00020H.pdf (last visited Apr. 4, 2024).

[5] Complaint at ¶ 1, NetChoice, LLC v. Paxton, 573 F. Supp. 3d 1092, 1099 (W.D. Tex. 2021), vacated and remanded sub nom. NetChoice, L.L.C. v. Paxton, 49 F.4th 439 (5th Cir. 2022).

[6] Id. at ¶ 7.

[7] Id. at ¶ 156.

[8] Valerie C. Brannon & Eric N. Holmes, Cong. Res. Serv., Section 230: An Overview 2 (2024).

[9] NetChoice, 573 F. Supp. 3d at 1116.

[10] NetChoice, L.L.C. v. Paxton, No. 21-51178, 2022 WL 1537249, at *1 (5th Cir. May 11, 2022), vacated sub nom. NetChoice, LLC v. Paxton, 142 S. Ct. 1715 (2022).

[11] NetChoice, LLC, 142 S. Ct. 1715 (Justice Kagan also noted her dissent from the decision to grant the motion to vacate, but did not sign on to Justice Alito’s dissenting opinion.).

[12] NetChoice, L.L.C., 49 F.4th at 445 (5th Cir. 2022), cert. granted in part sub nom. NetChoice, LLC v. Paxton, 144 S. Ct. 477 (2023).

[13] In May 2021, the Florida Legislature enacted Senate Bill 7072, which “establishes a violation for social media deplatforming of a political candidate or journalistic enterprise and requires a social media platform to meet certain requirements when it restricts speech by users.” The Florida Senate Summary for SB 7072 is accessible at: https://www.flsenate.gov/Committees/billsummaries/2021/html/2345. As in Texas, NetChoice and the Computer & Communications Industry Association challenged this law on behalf of their clients. The Eleventh Circuit ultimately upheld a preliminary injunction against the law, holding that “it is substantially likely that social-media companies—even the biggest ones—are ‘private actors’ whose rights the First Amendment protects.” NetChoice, LLC v. Att’y Gen., Fla., 34 F.4th 1196, 1203 (11th Cir. 2022), cert. granted in part sub nom. Moody v. NetChoice, LLC, 144 S. Ct. 478 (2023), and cert. denied sub nom. NetChoice, LLC v. Moody, 144 S. Ct. 69 (2023) (internal citation omitted).

[14] NetChoice, LLC v. Paxton, 144 S. Ct. 477 (2023); Moody v. NetChoice, LLC, 144 S. Ct. 478 (2023).

[15] Amy Howe, Supreme Court skeptical of Texas, Florida regulation of social media moderation, SCOTUSblog (Feb. 27, 2024), https://www.scotusblog.com/2024/02/supreme-court-skeptical-of-texas-florida-regulation-of-social-media-moderation/.

[16] Id.; Transcript of Oral Argument at 44-45, Moody v. NetChoice, LLC (2023) (No. 22-277).

[17] Transcript of Oral Argument, supra note 16, at 52.

[18] Id. at 54.

[19] Id. at 53.

[20] See Howe, supra note 15.

[21] See generally Oliver L. Haimson et al., Disproportionate Removals and Differing Content Moderation Experiences for Conservative, Transgender, and Black Social Media Users: Marginalization and Moderation Gray Areas, 5 Proceedings of the ACM on Human-Computer Interaction 1 (2021); Ángel Díaz & Laura Hecht-Felella, Brennan Center for Justice, Double Standards in Social Media Content Moderation (2021), https://www.brennancenter.org/our-work/research-reports/double-standards-social-media-content-moderation.

[22] Maura Conway et al., Disrupting Daesh: Measuring Takedown of Online Terrorist Material and Its Impacts, 42 Studies in Conflict & Terrorism 141, 155 (2019).

[23] Orestis Papakyriakopoulos et al., The spread of COVID-19 conspiracy theories on social media and the effect of content moderation, The Harvard Kennedy School Misinformation Review, Volume 1, Special Issue on COVID-19 and Misinformation, at 5 (2020).

[24] Filipo Sharevski et al., Misinformation warnings: Twitter’s soft moderation effects on COVID-19 vaccine belief echoes, 114 Computers & Security 1 (2022).

[25] Rachel Reed, Compelling speech, Harvard Law Today (Feb. 21, 2024), https://hls.harvard.edu/today/supreme-court-preview-netchoice-v-paxton/.

[26] Daphne Keller, Platform Transparency and the First Amendment, 4 J. Free Speech L. 1, 68 (2023).

[27] Zauderer v. Office of Disciplinary Counsel of Supreme Court of Ohio, 471 U.S. 626 (1985); see generally Note, Repackaging Zauderer, 130 Harv. L. Rev. 972 (2017).

[28] Keller, supra note 26, at 69.

[29] Id. at 70.

[30] Megan Iorio et al., Four Key Takeaways from the Moody v. NetChoice and NetChoice v. Paxton Oral Arguments, Elec. Priv. Info. Center (Feb. 28, 2024), https://epic.org/four-key-takeaways-from-the-netchoice-v-moody-and-paxton-oral-arguments/.