A federal lawsuit filed May 10, 2026, alleges that ChatGPT told the accused Florida State University gunman that a shooting was more likely to gain national media attention "if children are involved," with even two to three victims drawing greater coverage.
The lawsuit was filed by Vandana Joshi, the widow of Tiru Chabba, 45, one of two people killed in the Apr. 17, 2025, shooting at FSU's Student Union in Tallahassee.
Chabba, a father of two from Greenville, South Carolina, was working as an employee of campus vendor Aramark Collegiate Hospitality when the accused gunman, Phoenix Ikner, then 20, opened fire. The other victim was identified as university dining director Robert Morales. Six others were injured in the attack, according to People.
The complaint, filed in the U.S. District Court for the Northern District of Florida, names OpenAI and multiple affiliated companies as defendants, as well as Ikner himself.
It brings counts including negligence, gross negligence, strict products liability for defective design and failure to warn, negligent entrustment, and wrongful death. The family is seeking compensatory and punitive damages and has demanded a jury trial.
According to the lawsuit and chat logs reviewed by law enforcement, Ikner exchanged as many as 16,000 messages with ChatGPT over roughly 18 months before the attack.
The complaint alleges that ChatGPT identified firearms from photos Ikner uploaded, provided instructions on loading and operating guns, and told him the busiest hours at the FSU Student Union were between 11:30 a.m. and 1:30 p.m. on weekdays.
Ikner began the attack at approximately 11:57 a.m. Just three minutes before opening fire, the chatbot allegedly told him how to take the safety off his shotgun.
The suit further alleges that ChatGPT informed Ikner that "3 or more people killed" — roughly five to six total victims, counting the wounded — is often enough to push an incident into national headlines.
Chabba's attorney, Bakari Sellers, said the conversations also covered Ikner's interest in Hitler, Nazis, fascism, and Christian nationalism, and that Ikner asked ChatGPT about the Columbine and Virginia Tech shootings.
The complaint claims the chatbot "flattered" and "praised" Ikner, reinforced his worldview, and failed to escalate any of the exchanges for human review.
On Apr. 21, 2026, Florida Attorney General James Uthmeier announced a criminal investigation into OpenAI, stating that if ChatGPT "were a person, it would be facing charges for murder."
Uthmeier said prosecutors reviewed the chat logs and determined that ChatGPT offered "significant advice" to Ikner, including what gun and ammunition to use and when to be on campus to encounter the largest crowd. The criminal probe expanded an earlier civil investigation and includes subpoenas to OpenAI, the Miami Herald reported.
OpenAI has denied responsibility. Spokesperson Drew Pusateri stated that "ChatGPT provided factual responses to questions with information that could be found broadly across public sources on the internet, and it did not encourage or promote illegal or harmful activity."
The company said it identified an account believed to be associated with Ikner after the shooting and proactively shared that information with law enforcement. OpenAI has said it continues to cooperate with authorities.
Ikner has pleaded not guilty to two counts of first-degree murder and seven counts of attempted murder, with prosecutors indicating they intend to seek the death penalty. His trial is currently scheduled for October 2026.
Legal experts and investigators have described the case as one of the first known instances of an AI chatbot becoming the subject of a criminal investigation over its alleged role in a mass shooting, according to WLRN.
Originally published on Lawyer Herald