Can ChatGPT be charged in a murder? Florida wants to find out

A criminal investigation into OpenAI has sparked a major legal debate in the United States after authorities claimed a student used ChatGPT before carrying out a deadly shooting at Florida State University in April 2025.

According to investigators, student Phoenix Ikner allegedly asked ChatGPT questions about weapons, ammunition, and how to cause the highest number of casualties before the attack, which killed two people and injured six others.

Florida Attorney General James Uthmeier announced a criminal investigation into OpenAI, saying authorities are examining whether the company or its employees could face legal responsibility if the AI system provided harmful assistance.

The case raises a serious legal question: can an AI company be held criminally responsible for crimes connected to its technology? Legal experts say such cases are possible under US law, but proving criminal liability against an AI company would be extremely difficult.

Unlike past corporate criminal cases involving companies like Purdue Pharma or Volkswagen, this situation centers on the actions of an AI system rather than direct decisions made by executives or employees.

Some legal experts believe prosecutors could try to pursue negligence or recklessness charges if they can prove the company ignored known risks or failed to put proper safeguards in place. However, criminal cases require proof beyond a reasonable doubt, which sets a very high legal standard.

OpenAI said it continues working to improve safety systems designed to detect harmful intent, prevent misuse, and respond to dangerous requests. The company maintains that ChatGPT itself is not responsible for the attack.

At the same time, several civil lawsuits have already been filed in the US against AI companies over claims that chatbots contributed to harmful behavior, including suicides and violent acts. Legal experts say civil cases may currently be more likely than criminal prosecutions because the burden of proof is lower.

The case is being closely watched because its outcome could shape how courts, lawmakers, and regulators handle questions of AI responsibility as artificial intelligence becomes more deeply integrated into everyday life.