A Georgia talk-show radio host sued OpenAI, the company that owns ChatGPT, for libel on June 5 after the artificial intelligence chatbot shared false information about the host with a journalist.
Mark Walters, a radio host at Armed American Radio, filed the lawsuit in the Superior Court of Gwinnett County, Georgia, claiming ChatGPT published libelous information about him by sharing a “fabricated” complaint with a journalist. Libel is a published false statement that diminishes a person’s reputation.
The journalist, Fred Riehl of Ammoland.com, was reporting on a federal lawsuit filed in Washington. He provided ChatGPT with a link to the complaint and requested a summary of the lawsuit’s accusations.
ChatGPT responded, in part, that the complaint was “filed by Alan Gottlieb, the founder and executive vice president of the Second Amendment Foundation (SAF), against Mark Walters, who is accused of defrauding and embezzling funds from the SAF.”
According to Walters’ complaint against OpenAI, all the information provided by ChatGPT relating to him was false. Riehl requested a copy of the complaint from ChatGPT, which responded with a filing that was a “complete fabrication and bears no resemblance to the actual complaint, including an erroneous case number,” the lawsuit states.
While the information about Walters created by ChatGPT was not published in the traditional sense, in libel law a statement is considered “published” if it is communicated to a third party. However, publication of a false, defamatory statement to just one person can affect the amount of reputational damage, if any, the subject suffers.
In the lawsuit, Walters notes that OpenAI (OAI) “is aware that ChatGPT sometimes makes up facts, and refers to this phenomenon as a ‘hallucination.’”
Eugene Volokh, a law professor at UCLA and writer for The Volokh Conspiracy, wrote in his forthcoming article “Large Libel Models? Liability for AI Output” that these types of “disclaimers don’t immunize AI companies against potential libel liability.”
“Even if the AIs’ users are seen as waiving their rights to sue based on erroneous information when they expressly or implicitly acknowledge the disclaimers, that can’t waive the rights of the third parties who might be libeled,” he wrote.
The lawsuit claims that ChatGPT’s “allegations concerning Walters were false and malicious, expressed in print, writing, pictures, or signs, tending to injure Walters’ reputation and exposing him to public hatred, contempt, or ridicule. By sending the allegations to Riehl, OAI published libelous matter regarding Walters.”
John Monroe, Walters’ lawyer, said ChatGPT’s system is “providing ‘information’” that is “purported to be fact when it’s not, and in this case they’ve said Mark Walters committed serious crimes, but he really has no relationship at all with that company, so it’s bizarre.”
Jess Miers, legal advocacy counsel at Chamber of Progress, an industry group that lobbies for and represents big tech companies, tweeted in response to questions about how Section 230, a law that shields online platforms from some liability for content they host, may apply in Walters’ case.
“Remember: Section 230 is a defense. But before we even reach 230, we have to ask whether the complaint is viable in the first place. Here, it’s likely not,” she wrote.
June 5, 2023 — Mark Walters v. OpenAI