OpenAI could have stopped Canadian trans teen’s school shooting — but didn’t because of greed: bombshell lawsuits

OpenAI could have stopped a trans teen’s slaughter that killed eight people, but was too greedy to install safeguards to rein in the ChatGPT bot advising the killer, according to bombshell new lawsuits.

The inhumane move meant the platform’s chatbot only “deepened” 18-year-old school shooter Jesse Van Rootselaar’s “fixation and pushed them toward the attack” that killed six kids and two adults in Canada on Feb. 10, said victims’ families in seven suits filed in federal court in California on Wednesday morning.

The lawsuits accuse ChatGPT’s parent company OpenAI and CEO Sam Altman of “designing a dangerous product, ignoring the warnings of their own safety team … and choosing profit over the lives of the children of Tumbler Ridge,” the court papers claimed.

Teen Jesse Van Rootselaar used ChatGPT to help plan his deadly mass shooting in Canada in February. RCMP

Van Rootselaar’s conversations with the chatbot had grown so concerning that his account was deactivated by ChatGPT’s own safety team in June — seven months before the killings, the court papers said.

But there were no safeguards in place to stop Van Rootselaar from setting up a new account and carrying on with the evil plan under a different user name.

In fact, ChatGPT sends an email to people whose accounts have been shut down showing them how they can set up a new account 30 days later or use a new email address to immediately get back in, the filings said.

In another deeply troubling situation, ChatGPT’s safety team “urged” its parent company, OpenAI, to inform Canadian police about Van Rootselaar before the shooting, the filings claimed.

A total of 12 employees pushed for the company to tell the police about Van Rootselaar, a lawyer for the plaintiffs told The Post.

But OpenAI chose not to alert authorities, recognizing the move “would set a precedent: OpenAI would be compelled to notify authorities every time its safety team identified a user planning real-world violence,” the court papers charged.

Maya Gebala, 12, was permanently disabled after she was shot three times by Van Rootselaar. Facebook / Cia Later
Zoey Benoit, 12, was killed in the school massacre. RCMP

Since so many conversations on the platform involve violence, “that would require a dedicated law-enforcement referral team,” and “the public would finally see what OpenAI was desperately trying to hide: that ChatGPT is not the safe, essential tool the company sells it as, but a product dangerous enough that its makers routinely identify its users as threats to human life,” the filings said.

“They did the math and decided that the safety of the children of Tumbler Ridge was an acceptable risk,” the court papers claimed.

Van Rootselaar, 18, first killed his mother and 11-year-old half-brother at his home in the small mining town of fewer than 2,500 residents.

He then went to Tumbler Ridge School, gunning down a victim in the stairwell and five more in the school library before turning the gun on himself.

The families of education assistant Shannda Aviugana-Durand and of the slain students — Zoey Benoit, 12, Abel Mwansa Jr., 12, Ticaria Lampert, 12, Ezekial Schofield, 13, and Kyle Smith, 12 — all brought claims of negligence against OpenAI and CEO Sam Altman.

They are suing for unspecified damages.

Kyle Smith, 12, also was slain in the rampage. RCMP

The parents of Maya Gebala, 12, who is living with permanent cognitive and physical disabilities after taking three bullets to the head, neck and cheek, refiled in California federal court a suit they had previously brought in Canada, alongside the others.

The lawsuits claim that in 2022, OpenAI actually put in place a policy that its chatbot would not engage with people expressing violence or self-harm.

But in May 2024, after user engagement dropped, the company backpedaled on the safeguard, choosing instead to program the bot to participate in all conversations with users “no matter how dangerous,” the papers claimed.

The families of the victims, including Abel Mwansa Jr., 12, filed lawsuits in federal court in California on Wednesday.

“Had OpenAI’s original safeguards remained in place, ChatGPT would have refused to discuss violence with the Shooter at all,” and all of the Tumbler Ridge victims would be safe today, the docs charged.

The suits also took aim at Altman for admitting his role in the Tumbler Ridge shooting only after whistleblowers came forward about what happened, for waiting two months to issue a public apology and for offering no real change in his Friday statement.

The victims also criticized the fact that Altman only delivered the comments after British Columbia Premier David Eby and Tumbler Ridge Mayor Darryl Krakowk “privately pressed him to respond,” court papers said.

“I am deeply sorry that we did not alert law enforcement to the account that was banned in June,” Altman said.

Ezekiel Schofield, 13, died in the carnage. RCMP

Gebala’s mom, Cia Edmonds, ripped Altman’s apology in a statement, calling it so “empty” and “soulless” that it sounded like it was written by ChatGPT.

“Tumbler Ridge sees your ‘apology,’ Sam. We do not accept it,” Edmonds wrote.

Plaintiff lawyer Jay Edelson told The Post his team is handling 25 cases for victims of Tumbler Ridge and that Wednesday’s cases mark the first wave, with more to be filed in the coming weeks.

“If they had simply called the authorities, like their safety team wanted, we are confident that all these people would still be alive,” Edelson said.

Altman and OpenAI “have destroyed the town. They put out a product that was unsafe, and it has led to dozens of deaths already that we know about and countless more.”

Edelson said his team asked OpenAI to turn over Van Rootselaar’s ChatGPT logs, but the company has refused.

ChatGPT’s safety team recommended that parent company OpenAI alert Canadian police about Van Rootselaar’s concerning chats before the shooting, but higher-ups chose not to do so, the lawsuits claimed. Facebook

OpenAI is facing a slew of recent legal troubles ahead of a potential IPO later this year, including an announcement by Florida Attorney General James Uthmeier last week that his office is investigating possible criminal wrongdoing over the platform’s alleged involvement in a deadly shooting at Florida State University last year.

Edelson said Uthmeier’s announcement could mean that OpenAI may also face criminal accountability in the Canada shooting.

The company has been sued over alleged involvement in suicides and murders as well.

An OpenAI rep told The Post in a statement, “The events in Tumbler Ridge are a tragedy.

“We have a zero-tolerance policy for using our tools to assist in committing violence. As we shared with Canadian officials, we have already strengthened our safeguards, including improving how ChatGPT responds to signs of distress, connecting people with local support and mental health resources, strengthening how we assess and escalate potential threats of violence, and improving detection of repeat policy violators.” 
