
New York AG Spars With FIRE Over Social Media Moderation of ‘Hateful Content’

New York State Attorney General Letitia James speaks at a news conference to announce the filing of a federal lawsuit in partnership with at least 10 U.S. state attorneys general to stop a proposed $26 billion merger of mobile carriers Sprint and T-Mobile in New York, June 11, 2019. (Reuters/Mike Segar)

By Susanna Granieri

The Foundation for Individual Rights and Expression (FIRE), a free speech advocacy group, opened a new front in its ongoing legal battle with the state of New York this week over its attempts to regulate online speech.

FIRE sent a letter to New York Attorney General Letitia James on Wednesday demanding that she rescind her request that social media companies provide information about their content moderation policies for removing rhetoric that could incite violence against Jewish and Muslim people following the recent terrorist attacks in Israel.

“The First Amendment protects content creators and users from governmental burdens that are likely to chill their speech,” the letter reads.

The New York attorney general’s office did not respond to a request for comment, but on Friday Attorney General James rescinded her request to Rumble, which is represented by FIRE, stating that while she and her office “wholeheartedly disagree with [FIRE’s] analysis and the contentions in [FIRE’s] letter,” the office sought to avoid any “unnecessary dispute.”

FIRE attorneys remain concerned, however, that the First Amendment rights of the other five platforms are still being violated. FIRE attorney Daniel Ortner said Attorney General James’ request, in conjunction with the “very strongly worded” press release, suggests that “if you don’t create these policies and enforce them to remove this content, we are going to go after you.”

“That’s the veiled, or even really not-so-veiled threat that’s at the heart of what the attorney general is doing,” he said.

The escalating violence in the Middle East has also ignited tensions in the United States, leading to protests and demonstrations in major cities throughout the country, including in New York.

Last week, Attorney General James announced that her office was seeking information from six social media platforms about what “steps they are taking to stop the spread of hateful content encouraging violence against Jewish and Muslim people and institutions in the wake of terrorist attacks in Israel,” adding that these “calls for violence … have rapidly spread in the past week.”

She sent a letter, titled “Removing Calls for Violence,” to Google, Meta, X, TikTok, Reddit and Rumble, requesting that the platforms “describe in detail” their moderation policies for identifying and removing hateful and violent content and banning users who disseminate it.

“In the wake of Hamas’ unspeakable atrocities, social media has been widely used by bad actors to spread horrific material, disseminate threats, and encourage violence,” Attorney General James said in the release. “These platforms have a responsibility to keep their users safe and prohibit the spread of violent rhetoric that puts vulnerable groups in danger. I am calling on these companies to explain how they are addressing threats and how they will ensure that no online platform is used to further terrorist activities.”

FIRE is already engaged in a separate legal battle with the state over its attempts to regulate online speech.

In December 2022, FIRE challenged the state’s Online Hate Speech Law on behalf of Rumble and Locals, two social video sharing websites, and blogger Eugene Volokh of The Volokh Conspiracy.

The law requires that internet platforms publish their policies on responding to certain content and create an accessible “mechanism” through which users can file complaints of “hateful conduct” — defined as speech that aims to “vilify, humiliate, or incite violence against a group or class of persons” based on their race, religion, or gender, among other things — and receive a direct response.

Judge Andrew Carter of the U.S. District Court for the Southern District of New York issued a preliminary injunction barring enforcement of the law in February, stating it “both compels social media networks to speak about the contours of hate speech and chills the constitutionally protected speech of social media users, without articulating a compelling governmental interest or ensuring that the law is narrowly tailored to that goal.”

FIRE is now contending that Attorney General James’ recent letter to social media platforms violates the court’s injunction.

Just as the state’s law was “preliminarily enjoined as likely to violate the First Amendment—because it inhibits protected expression with viewpoint-discriminatory, overbroad, and vague speech regulations—so too does the State’s investigation impinge the free publication and creation of protected speech on [the platforms],” the letter reads.

The Supreme Court has consistently held that the First Amendment prohibits the government from punishing hate speech unless it crosses the line into an unprotected area such as incitement to violence, true threats, fighting words, or harassment.

Regarding incitement to violence, the Supreme Court ruled in Brandenburg v. Ohio (1969) that speech could not be punished unless it was “directed to inciting or producing imminent lawless action and is likely to incite or produce such action.”

FIRE argues that the attorney general’s letter fails to define what constitutes “calls for violence” and “other materials that may incite violence,” resulting in an overbroad and vague interpretation.

The lack of definition raises questions. “Would a video created by a pro-Israeli activist calling for bombing Gaza qualify as a ‘call for violence’?” the letter reads. “Is a news report including a quotation from a pro-Palestinian protestor defending Hamas attacks on Israeli military equal to ‘disseminating calls for violence and other materials that may incite violence’?”

Ortner said that even if the state has good intentions “to prevent some kind of violence,” it can’t do so by infringing on the First Amendment.

“They can’t demand [that] the social media platforms who are private, protected by the First Amendment, take down content [and] disclose their editorial judgments so that the state can then go after them if they don’t adequately police and take down this content,” he said. “Whatever the intention, the violation of the First Amendment is inappropriate.”
