MOODY v. NETCHOICE, LLC
Syllabus
court held that the obligation to explain “millions of [decisions] per
day” is “unduly burdensome and likely to chill platforms’ protected
speech.” 34 F. 4th, at 1230.
The Fifth Circuit disagreed across the board, and so reversed the
preliminary injunction of the Texas law. In that court’s view, the platforms’ content-moderation activities are “not speech” at all, and so do
not implicate the First Amendment. 49 F. 4th 439, 466, 494. But even
if those activities were expressive, the court determined the State
could regulate them to advance its interest in “protecting a diversity
of ideas.” Id., at 482. The court further held that the statute’s individualized-explanation provisions would likely survive, even assuming
the platforms were engaged in speech. It found no undue burden under Zauderer because the platforms needed only to “scale up” a “complaint-and-appeal process” they already used. 49 F. 4th, at 487.
Held: The judgments are vacated, and the cases are remanded, because
neither the Eleventh Circuit nor the Fifth Circuit conducted a proper
analysis of the facial First Amendment challenges to Florida and
Texas laws regulating large internet platforms. Pp. 9–31.
(a) NetChoice’s decision to litigate these cases as facial challenges
comes at a cost. The Court has made facial challenges hard to win. In
the First Amendment context, a plaintiff must show that “a substantial number of [the law’s] applications are unconstitutional, judged in
relation to the statute’s plainly legitimate sweep.” Americans for Prosperity Foundation v. Bonta, 594 U. S. 595, 615.
So far in these cases, no one has paid much attention to that issue.
Analysis and arguments below focused mainly on how the laws applied
to the content-moderation practices that giant social-media platforms
use on their best-known services to filter, alter, or label their users’
posts, i.e., on how the laws applied to the likes of Facebook’s News Feed
and YouTube’s homepage. They did not address the full range of activities the laws cover, nor did they measure the constitutional against the unconstitutional applications.
The proper analysis begins with an assessment of the state laws’
scope. The laws appear to apply beyond Facebook’s News Feed and its
ilk. But it is not clear to what extent, if at all, they affect social-media
giants’ other services, like direct messaging, or what they have to say
about other platforms and functions. And before a court can do anything else with these facial challenges, it must “determine what [the
law] covers.” United States v. Hansen, 599 U. S. 762, 770.
The next order of business is to decide which of the laws’ applications
violate the First Amendment, and to measure them against the rest.
For the content-moderation provisions, that means asking, as to every
covered platform or function, whether there is an intrusion on pro-