
Authorities have launched a criminal probe into whether the artificial-intelligence platform ChatGPT helped a man plan a deadly mass shooting at Florida State University last year.
Florida Attorney General James Uthmeier said Tuesday that his office opened the probe after reviewing chats that accused gunman Phoenix Ikner had with one of the platform’s bots in the lead-up to the April 17, 2025, horror.
“ChatGPT offered significant advice to the shooter before he committed such heinous crimes,” Uthmeier said at a press conference in Tampa.
For example, the chatbot told Ikner what type of gun to use, what type of ammunition to purchase, what kind of firearm is best for a close-range shooting and what part of campus would be the most crowded at the time, the AG said.
“My prosecutors have looked at this, and they’ve told me if it was a person on the other end of the screen, we would be charging them with murder,” Uthmeier said.
Ikner, a student at the college, opened fire using his stepmother’s service pistol outside the student union of FSU’s Tallahassee campus, killing Robert Morales, 57, and Tiru Chabba, 45, both Aramark vendors, according to officials. The suspect’s stepmother is a deputy with the Leon County Sheriff’s Office.
Six students also were wounded before police shot Ikner, leaving his face disfigured.
The accused shooter’s motive is still unclear, with investigators saying Ikner didn’t know his victims. He is charged with first-degree murder, attempted murder and related crimes.
Uthmeier said his office has issued subpoenas to ChatGPT’s parent company OpenAI for its internal policies and training involving users who express intent to harm themselves or others.
The AG’s office is also seeking a slew of other information from the company, including the names of its management, all other employees and any press releases they put out related to the shooting.
Neama Rahmani, a former prosecutor now in private practice, said the AG will face many challenges in any potential prosecution of the corporation, including the difficulty of proving intent and causation and of overcoming the First Amendment arguments and legal immunity that tech companies usually benefit from when their users carry out crimes.
“It is unusual, and [Uthmeier] is venturing into uncharted legal waters,” Rahmani said.
“I’m not saying that this can’t be an important case that changes the way we think about AI,” the lawyer said. “But I think there are going to be significant challenges to a successful criminal prosecution.
“At the end of the day, you can’t put a corporation in jail anyway, so you’re talking about a fine,” he said.
The expert added that a civil lawsuit would be much easier to prove in this case. Morales’ relatives are planning to file suit against OpenAI, their lawyers announced earlier this month.
A rep for OpenAI said the company is cooperating with law enforcement.
The representative said all of the chatbot’s answers were “factual responses to questions with information that could be found broadly across public sources on the internet.
“It did not encourage or promote illegal or harmful activity,” the rep said. “ChatGPT is a general-purpose tool used by hundreds of millions of people every day for legitimate purposes.
“Last year’s mass shooting at Florida State University was a tragedy, but ChatGPT is not responsible for this terrible crime,” the representative said.
“After learning of the incident, we identified a ChatGPT account believed to be associated with the suspect and proactively shared this information with law enforcement.”
The company is working to bolster safeguards to help identify users who show “harmful intent” so that it can act appropriately, the rep said.
–Additional reporting by Thomas Barrabi


