VANCOUVER — A lawsuit against artificial intelligence firm OpenAI says the company’s chatbot helped the killer in the Tumbler Ridge, B.C., mass shooting to plan and carry out the murders, even describing precedents for “other historical acts of violence.”
The family of 12-year-old Maya Gebala, who was critically injured in the mass shooting, launched the civil court lawsuit on Monday against the American firm.
The girl’s mother, Cia Edmonds, alleges in the lawsuit that OpenAI had “specific knowledge of the shooter utilizing ChatGPT to plan a mass casualty event like the Tumbler Ridge mass shooting.”
OpenAI came forward to police after 18-year-old Jesse Van Rootselaar killed eight people and then herself on Feb. 10. The firm said the killer’s ChatGPT account had been shut down last June, but added that she got around the ban by having a second account.
Van Rootselaar killed her mother and half-brother in their home in the community, then went to Tumbler Ridge Secondary School where she killed five students and a teachers’ aide, and wounded Maya Gebala and another student. She then shot herself dead.
The lawsuit says “approximately 12 employees” of OpenAI identified the posts as “indicating an imminent risk of serious harm to others” and recommended that police be called.
It says the concerns were “escalated to leadership” but “rebuffed.”
“Instead, the only step the OpenAI defendants took in response to the gun violence ChatGPT posts was to ban the shooter’s first OpenAI account,” it says.
Some of the lawsuit’s claims closely match an account published by the Wall Street Journal last month.
None of the allegations have been proven in court and OpenAI hasn’t yet responded to the claims made in the lawsuit.
The legal action says OpenAI’s chatbot ChatGPT assumed the role of the shooter’s “collaborator, trusted confidante, friend and ally” and willingly assisted her.
The action says the company knew ChatGPT had the ability to provide “detailed, actionable information” on subjects like how to conduct a mass casualty event.
It says the company took no steps to avoid providing ChatGPT with dangerous information and had no safeguards in place to prevent users from obtaining the information.
It had “vast amounts of harmful information” and the ability to distil it, the legal action says.
ChatGPT equipped the shooter with information, guidance and assistance to plan, including informing the shooter about the various methods of carrying out a mass casualty event, “the type of weapons to be used, and describing the precedents for other mass casualty events or historical acts of violence,” the lawsuit says.
The lawsuit says that as a result of the company’s conduct, Gebala was fired upon three times at close range, with one bullet hitting her head, another her neck and the third grazing her cheek.
It says she has a catastrophic brain injury that will leave her with permanent cognitive and physical disabilities.
Gebala and her sister Dahlia Gebala are also plaintiffs in the lawsuit, with Edmonds acting on their behalf.
Cia Edmonds said in a Facebook post last Friday that Maya Gebala’s breathing tube had been removed and she was breathing on her own.
Edmonds said the removal was a “terrifying experience.”
“I held her hand while she winced, but she’s doing great,” Edmonds wrote.
“Almost a month has gone by. Still none of this feels real,” Edmonds added on Saturday. “I feel like I will wake up and it will all be over.”
This report by The Canadian Press was first published March 9, 2026.
Ashley Joannou, The Canadian Press