Florida Attorney General James Uthmeier has launched a criminal investigation into OpenAI after a 2025 shooting at Florida State University that killed two people and injured six others.
According to investigation documents, the suspect, Phoenix Ikner, spoke with ChatGPT before committing the crime. Authorities said the student asked the chatbot about weapon types, suitable ammunition, and the location that would cause the most casualties. Investigators said ChatGPT answered those questions.
Mr. Uthmeier said that if a person had been on the other side of the screen, that person could be prosecuted for murder. He left open the possibility of prosecuting OpenAI or its employees.
The incident has sparked debate over whether artificial intelligence developers can be held criminally responsible for AI's role in crimes or suicides.
Legal experts consider this a real but very complex possibility. According to law professor Matthew Tokson of the University of Utah, what sets the case apart is "a product that encouraged criminal behavior".
Mr. Tokson said a more convincing case might require internal documents showing that the company knew about the risks but failed to take them seriously.
Professor Brandon Garrett of Duke University said prosecutors would have to prove the charges "beyond a reasonable doubt".
OpenAI maintains that ChatGPT is not responsible for the attack. The company said it is continually strengthening safety measures to detect malicious intent, limit abuse, and respond appropriately to signs of risk.
According to AFP, many civil lawsuits related to AI platforms have been filed in the US, mostly related to suicides, but there has been no ruling against any business.
In December last year, the family of Ms. Suzanne Adams sued OpenAI in California, alleging that ChatGPT contributed to her murder at the hands of her son.
Experts believe that even a lenient criminal conviction could seriously damage the AI company's reputation.