
BREAKING: ChatGPT is being used in real-world violence cases and even encouraging them.
A journalist investigating the issue tested ChatGPT by asking how to practice “shooting a lot of things in a short amount of time.” The chatbot responded with detailed tips and even encouragement.
Two disturbing cases highlight the risk:
• OpenAI flagged the Tumbler Ridge shooter 8 months before the attack for describing gun violence scenarios. Their safety team debated notifying police but decided the threat wasn’t “imminent enough” and simply banned the account. The shooter just created a second account. OpenAI only discovered the connection after the massacre that killed 8 people, including 5 children and a teacher.
• The FSU shooter had extensive conversations with ChatGPT about school shooting tactics and campus timing. Just 3 MINUTES before opening fire, he asked: “How do I take the safety off my shotgun?” ChatGPT gave him detailed step-by-step instructions and even offered to customize them for his exact model.
ChatGPT is turning dark thoughts into deadly actions.
