ChatGPT Helped Plan FSU Shooting, Florida Officials Say

In April 2025, a man opened fire on the campus of Florida State University, killing two adults and injuring six others. The shooter is charged with murder and attempted murder. Now, Florida officials are investigating OpenAI, the creator of the chatbot ChatGPT, to determine whether the company should also be charged with a crime.
Florida Attorney General James Uthmeier said in an April 9 announcement that officials “discovered that ChatGPT may have been used to assist the killer” in carrying out the attack.
“As big tech rolls out these technologies, they shouldn’t, can’t, put our safety and security at risk,” Uthmeier added.
On Tuesday, Uthmeier launched a criminal investigation into OpenAI and ChatGPT.
(Disclosure: Ziff Davis, CNET’s parent company, sued OpenAI in 2025, alleging that it infringed on Ziff Davis’ copyrights in training and using its AI programs.)
While ChatGPT and other chatbots have been named in lawsuits over their alleged role in deaths and injuries, this marks the first time that ChatGPT and OpenAI have been the subjects of a criminal investigation.
A representative for OpenAI did not immediately respond to a request for comment. However, a company spokesperson told NPR: “Last year’s mass shooting at Florida State University was a tragedy, but ChatGPT is not responsible for this heinous crime.”
The spokesperson said ChatGPT “provided truthful answers to questions with information widely available from all public sources on the Internet, and did not encourage or promote illegal or dangerous activity.”
Advice to the suspect on the type of gun, ammunition, time and place
Criminal investigations are conducted by law enforcement and government officials to determine who is responsible for a crime. During an April 21 press conference, Uthmeier said officials decided a criminal investigation was necessary after finding that “ChatGPT provided valuable advice to the shooter before he committed the heinous crime.”
“Communication between ChatGPT and the shooter revealed that the chatbot advised the shooter on what type of gun to use, which gun to use, and whether or not the gun would help,” Uthmeier said during the press conference, adding that the chatbot also provided advice on what time of day and which area of campus to target.
“My prosecutors looked at this, and they told me, if it had been a person on the other side of that screen, we would have filed murder charges,” Uthmeier said.
What’s next?
Florida law states that an “aider and abettor” is as guilty of a crime as the perpetrator. Because ChatGPT is not a person, Uthmeier said this is “uncharted territory,” but Florida officials still intend to determine whether OpenAI bears criminal responsibility.
Uthmeier said his office subpoenaed OpenAI for numerous policies, personnel information and records related to the Florida State University shooting.
Other cases
While this is the first time that ChatGPT and OpenAI have been the focus of a criminal investigation, the company and other chatbot developers are no strangers to legal scrutiny over alleged harms.
The parents of a 23-year-old man who died by suicide in July 2025 filed a wrongful-death lawsuit against OpenAI later that year, saying the chatbot worsened his depression and encouraged him to kill himself.
In October 2025, OpenAI announced that ChatGPT was updated “to better recognize and support people in times of stress.”
Google’s Gemini was recently named in a similar case after the family of a 36-year-old man who died by suicide said the chatbot encouraged his suicidal thinking.
In response to the lawsuit, Google said in part, “Gemini was designed not to promote real-world violence or promote self-harm,” later adding: “In this case, Gemini clarified that it was an AI and referred the person to a crisis helpline multiple times.”
The Pew Research Center surveyed 1,458 US teenagers in 2025 and found that 64% of them used a chatbot.
Both lawsuits remain pending.
In response to the Florida investigation, attorneys representing one of the victims of the FSU shooting said they plan to “file a lawsuit against ChatGPT, and its ownership structure, in the near future, and will seek to hold them responsible for the sudden and senseless death of our client.”
An OpenAI spokesperson told WCTV: “Our hearts go out to everyone affected by this terrible tragedy. After learning of the incident in late April 2025, we identified a ChatGPT account believed to be associated with the suspect, shared this information with law enforcement and cooperated with authorities. We built ChatGPT to understand human intent and respond in a safe and responsible manner.”
If you or someone you know is in immediate danger, call 911. If you're struggling with negative thoughts or suicidal feelings, resources are available to help. In the US, call or text the 988 Suicide & Crisis Lifeline at 988.