Move comes after authorities disclosed that ChatGPT gave information to the shooter about what time and location would maximise victims

The widow of a man killed in last year’s mass shooting at Florida State University is suing ChatGPT maker OpenAI, blaming the company’s artificial intelligence chatbot for giving advice on how to carry out the rampage.

The lawsuit comes after state authorities disclosed that ChatGPT gave the shooter information about what time and location would maximise victims on campus, as well as the type of gun and ammunition to use. Authorities say the chatbot also told him that an attack could get more media attention if children were involved.

“OpenAI knew this would happen. It’s happened before and it was only a matter of time before it happened again,” Vandana Joshi, whose husband Tiru Chabba was one of two people killed, said in a statement on Monday. Six other people were wounded in the attack.

The lawsuit, filed on Sunday in federal court, says OpenAI should have built ChatGPT with safety measures to let someone know that police may need to investigate “to prevent a specific plan for imminent harm to the public”.

OpenAI has denied any wrongdoing in what it called a “terrible crime”.

“In this case, ChatGPT provided factual responses to questions with information that could be found broadly across public sources on the internet, and it did not encourage or promote illegal or harmful activity,” Drew Pusateri, a spokesman for the company, said in an email to The Associated Press.

Source: News - South China Morning Post