During nearly three hours of oral arguments on Feb. 21, the U.S. Supreme Court considered for the first time a case challenging Section 230 protections. The case examines the liability of social media platforms and search engines for speech hosted on their sites, and whether recommendation algorithms could be responsible for aiding terrorist activity.
The case before the court, Gonzalez v. Google, was brought by Reynaldo Gonzalez, the father of 23-year-old Nohemi Gonzalez, who was killed in a 2015 Islamic State group attack while studying abroad in Paris. Reynaldo claimed that YouTube's recommendation algorithms suggested and amplified ISIS videos, ultimately helping the group recruit members. He sued Google under the Anti-Terrorism Act, as amended by the Justice Against Sponsors of Terrorism Act.
Gonzalez focuses on Section 230 of the Communications Decency Act of 1996, which states that “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”
The question before the court asks whether Section 230 protects digital platforms that host online speech and use recommendation algorithms to target specific users, or whether it only limits the liability of digital platforms “when they engage in traditional editorial functions” like content moderation.
Throughout the arguments, the Supreme Court justices appeared skeptical of ruling against Google and wary of how their decision might shape the future of the internet and online speech.
Gonzalez’s attorney, Eric Schnapper, said their focus was on YouTube’s recommendation algorithms rather than its moderation policies. Schnapper argued that YouTube’s algorithms recommend certain videos to users based on their past searches. While Section 230 protects YouTube from liability for the content it hosts, Schnapper said, its curation of recommended content falls outside the scope of Section 230 protection.
But Google’s attorney, Lisa Blatt, argued that attempting to “circumvent” Section 230 protections “by pointing to features inherent in all publishing,” like recommendations, would expose websites and online platforms to liability and “threaten today’s internet.”
Justice Clarence Thomas was the first to raise the topic of algorithms. He asked what type of algorithm YouTube employs and whether it is applied differently depending on the content served. Schnapper said it’s “the same algorithm across the board.”
Justice Thomas: If it’s the same algorithm, I think you have to give us a clearer example of what your point is exactly. The same algorithm to present cooking videos to people who are interested in cooking, and ISIS videos to people who are interested in ISIS, racing videos to people who are interested in racing. Then I think you’re going to have to explain more clearly, if it’s neutral in that way, how your claim is set apart from that.
Schnapper: Surely. If I might turn to the practice of displaying thumbnails, which is a major part of what’s at issue here, and the issue is not the manner in which YouTube displays videos. It actually displays, as you doubtless know from having looked at, these little pictures, which are referred to as thumbnails. They are intended to encourage the viewer to click on them and go see a video. It’s the use of algorithms to generate these thumbnails that’s at issue, and the thumbnails, in turn, involve content created by the defendant.
Justice Thomas: The thumbnails, from what I understand, it’s based upon what the algorithm suggests the user is interested in. So, if you’re interested in cooking, you don’t want thumbnails on light jazz. It’s neutral in that sense. You’re interested in cooking. Say you get interested in rice — in pilaf from Uzbekistan. You don’t want pilaf from some other place, say, Louisiana. So I don’t see how that is any different from what is happening in this case. And what I’m trying to get you to focus on is … Are we talking about the neutral application of an algorithm that works generically for pilaf and it also works in a similar way for ISIS videos? Or is there something different?
Schnapper: No, I think that’s correct, but our view is that the fact that an algorithm is neutral doesn’t alter the application of the statute. The statute requires that one work through each of the elements of the defense and see if it applies … The statute says the claim must treat you as a publisher.
Chief Justice John Roberts: Well, the difference is that Google, YouTube, they’re still not responsible for the content of the videos or text that is transmitted. Your focus is on the actual selection and recommendations. They’re responsible that a particular item is there but not for what the item says. And I think it may be significant if the algorithm is the same across — as Justice Thomas was suggesting, across the different subject matters, because then they don’t have a focused algorithm with respect to terrorist activities… I think it might be harder for you to say that there’s selection involved for which they could be held responsible.
Schnapper: The statute, I think, doesn’t draw the distinction that way. The claim here is about the encouragement of users to go look at particular content … But, if that’s an actionable claim, then the conduct here would fit within it, because certain individuals would be shown these thumbnails, which would encourage them to go look at those videos.
Justice Elena Kagan: So I think you’re right, Mr. Schnapper, that the statute doesn’t make that distinction. This was a pre-algorithm statute. And, you know, everybody is trying their best to figure out how this statute applies, the statute which was a pre-algorithm statute applies in a post-algorithm world. But I think what was lying underneath Justice Thomas’s question was a suggestion that algorithms are endemic to the internet, that every time anybody looks at anything on the internet, there is an algorithm involved, whether it’s a Google search engine or whether it’s this YouTube site or a Twitter account or countless other things, that everything involves ways of organizing and prioritizing material… Does your position send us down the road such that 230 really can’t mean anything at all?
Schnapper: I don’t think so, Your Honor. The question — as you say, algorithms are ubiquitous, but the question is what does the defendant do with the algorithm. If it uses the algorithm to encourage people to look at ISIS videos, that’s within the scope of JASTA [Justice Against Sponsors of Terrorism Act].
Justice Roberts later asked Schnapper about the comparison between YouTube’s recommended videos and a bookstore with a table of sports books. Schnapper, after a bit of back and forth, concluded that the bookstore would not be liable for the content requested, but would be liable for the way the content was categorized. For example, if someone searched “ISIS videos,” the site would not be responsible for the third-party content provided in response. But the recommendation of other videos, Schnapper argued, is the issue at hand.
Justice Thomas continued to question Schnapper on how recommendations based on interest could constitute aiding terrorist activity.
Justice Thomas: I don’t understand how a neutral suggestion about something that you’ve expressed an interest in is aiding and abetting. I just don’t — I don’t understand it. And I’m trying to get you to explain to us how something that is standard on YouTube for virtually anything that you have an interest in suddenly amounts to aiding and abetting because you’re in the ISIS category.
Schnapper attempted to explain the concept in different ways, but the justices still didn’t understand the argument.
Justice Samuel Alito: I’m afraid I’m completely confused by whatever argument you’re making at the present time.
While trying to better understand Schnapper’s argument, the justices also expressed concern about whether they should be the ones deciding the scope of Section 230.
Justice Kagan: We’re a court. We really don’t know about these things. You know, these are not like the nine greatest experts on the internet.
Justice Brett Kavanaugh expressed apprehension about undermining Congress, which passed Section 230 in 1996 to foster the growth of the internet and to ensure that companies hosting user content would be protected from lawsuits. Justice Kavanaugh referenced amicus briefs filed on behalf of Google.
Justice Kavanaugh: To pull back now from the interpretation that’s been in place would create a lot of economic dislocation, would really crash the digital economy with all sorts of effects on workers and consumers, retirement plans and what have you, and those are serious concerns and concerns that Congress, if it were to take a look at this and try to fashion something along the lines of what you’re saying, could account for. We are not equipped to account for that. So are the predictions of problems overstated? If so, how? And are we really the right body to draw back from what had been the text and consistent understanding in courts of appeals?
Justice Amy Coney Barrett questioned whether Schnapper’s proposal to narrow Section 230 could expose average social media users to liability for retweeting a tweet.
Justice Barrett: Let’s say that I think you’re a user of Twitter if you go on Twitter and you’re using Twitter and you retweet or you like or you say check this out. On your theory, I’m not protected by Section 230.
Schnapper: That’s content you’ve created.
Justice Barrett: That’s content I’ve created. Okay. And on the content creation point, let’s imagine — it seems like you’re putting a whole lot of weight on the fact that these are thumbnails, and so it’s something that YouTube separately creates.
Justice Barrett: What if they just screenshot? They just screenshot the ISIS thing. They don’t do the thumbnail. Then are they —
Schnapper: That’s pure third-party content.
Justice Barrett: That’s pure third — so this is just about how YouTube set it up?
Schnapper: That’s correct in this context.
Malcolm Stewart, the U.S. deputy solicitor general, argued for the Justice Department, which sided in part with Gonzalez. He said that if YouTube is “applying neutral algorithms,” meaning it’s “simply showing more ISIS videos to people who’ve shown an interest in ISIS, just as it does more cat videos to people who’ve shown an interest in cats, that’s much less likely to give rise to liability under the Antiterrorism …”
Justice Kavanaugh interrupted Stewart’s statement, questioning its validity.
Justice Kavanaugh: And much less likely – I’m not sure based on what. You seem to be putting a lot of stock on the liability piece of this, rather than, as Justice Jackson was saying, the immunity piece. And I’m just not sure, you know, if we go down this road, I’m not sure that’s going to really pan out. Certainly, as Justice Kagan says, lawsuits will be non-stop.
Justice Roberts echoed Kavanaugh’s concerns about impending lawsuits.
Justice Roberts: We’re talking about the prospect of significant liability in litigation. And up to this point, people have focused on the ATA [Anti-Terrorism Act] because that’s the one point that’s at issue here. But I suspect there would be many, many times more defamation suits, discrimination suits, as some of the discussion has been this morning, infliction of emotional distress, antitrust actions.