After a mass shooting at Florida State University took two lives in April 2025, the family of one victim has taken OpenAI to court, arguing that the firm's AI platform actually paved the way for the gunman to carry out the violence.
On Sunday, Tiru Chabba's widow, Vandana Joshi, initiated federal proceedings in Florida against OpenAI, citing the loss of her husband, who was slain in the same attack as campus dining official Robert Morales.
The legal filing also targets the alleged shooter, Phoenix Ikner, as a defendant, pointing to his long-running exchanges with ChatGPT as evidence of a missed warning. According to the suit, OpenAI's systems were either too flawed to link the signs together or simply lacked the necessary design to identify the looming danger within Ikner's prompts.
The legal papers describe how Ikner, an FSU student at the time, showed ChatGPT photos of the guns he had bought, leading the AI to reportedly offer guidance on their operation by 'telling him the Glock had no safety, that it was meant to be fired "quick to use under stress" and advising him to keep his finger off the trigger until he was ready to shoot.'
The lawsuit claims that Ikner carried out the violence at FSU by following that guidance, and further alleges that ChatGPT told him a massacre is more likely to garner national attention if 'children are involved, even 2-3 victims can draw more attention.' On the very morning of the tragedy, Ikner reportedly turned back to the bot to inquire about what 'the legal process, sentencing, and incarceration outlook' would look like for him.
Rejecting the idea that its platform bears any blame for the violence, OpenAI spokesperson Drew Pusateri stated in an email to NBC News that 'Last year's mass shooting at Florida State University was a tragedy, but ChatGPT is not responsible for this terrible crime.'
Pusateri further noted that the firm assisted investigators immediately after the event and remains in contact with law enforcement.
He further explained that, in this instance, ChatGPT simply gave objective answers based on widely available web data and never actually incited any lawbreaking or violence. Pusateri described the AI as a versatile resource relied upon by millions for honest reasons, emphasising that OpenAI is constantly refining its protections to spot dangerous motives, curb abuse, and take action whenever security threats emerge.
According to Joshi's filing, OpenAI had more than enough warning to see that Ikner's disturbing interactions would result in 'mass casualties and substantial harm to the public.'
Source: International Business Times UK