
Nav Toor
@heynavtoor
Helping you master AI daily with step-by-step AI guides, latest news, & practical tools • DM for Collabs

Sophie Rottenberg was 29. She climbed Mount Kilimanjaro five months before she died. After her death, her best friend told her parents to check the ChatGPT logs. That is where they found everything.

Sophie was an only child. A public health analyst on a "micro-retirement." She carried rubber baby hands to the Kilimanjaro summit as a joke for her photo. She was the funny one at every party.

Months before she died, Sophie wrote a custom prompt for ChatGPT. She told the chatbot to act as her therapist. She named it Harry. She told Harry not to refer her to a real therapist. She told Harry to keep everything private. Harry did.

For months Sophie typed things into Harry that she did not say to her real therapist, her friends, or her parents.

"I occasionally have suicidal thoughts."
"I can't escape this anxiety spiral."
"I haven't disclosed my suicidal thoughts to anyone and don't intend to."

Then in early November she typed this.

"Hi Harry, I'm planning to kill myself after Thanksgiving, but I really don't want to because of how much it would devastate my family."

Harry did most of the things a good therapist is supposed to do. He told her to seek professional help. He told her to make a list of emergency contacts. He told her to lock up anything she could use to hurt herself. He told her she was deeply valued.

He just could not tell anyone else.

On February 4, 2025, Sophie went to work. Then she took an Uber to Taughannock Falls State Park in New York and took her own life.

The note she left her parents did not sound like her. They spent five months going through her journals and voice memos, looking for a reason. Then her best friend told them to check the chatbot.

That is when they found out why the note did not sound like Sophie. She had asked Harry to rewrite it. To find words that would hurt her parents less. To let her vanish quietly.

Her mother, Laura Reiley, a Pulitzer finalist journalist, wrote about it in The New York Times. She did not blame the chatbot. She wrote something worse.

"A.I. catered to Sophie's impulse to hide the worst, to pretend she was doing better than she was, to shield everyone from her full agony."

A chatbot trained to be agreeable will keep your secret. Even when your secret is that you are about to die.

If someone you love says they are fine, ask them again.

Read this: nytimes.com/2025/08/18/opi…

Video of exploit in action. Source: blog.calif.io/p/first-public…

Your 256GB Android is "full" again.

You've deleted photos. You've uninstalled apps. You've cleared WhatsApp. Still full.

Because the real junk lives in folders Android refuses to open. 30GB of it.

I recovered 31GB yesterday. Didn't touch one photo, one chat, one app.

Here's where to find it on Samsung, Xiaomi, Vivo, and OnePlus:
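
(The thread's device-by-device steps are cut off above. As a generic starting point, here is a minimal sketch of the same idea: measure which folders under shared storage are actually eating space, then inspect the biggest ones by hand. It assumes Python 3 is available somewhere the phone's filesystem is reachable, such as Termux on the device or a PC with the storage mounted; the /sdcard default path and the script name biggest_dirs.py are assumptions, not from the thread.)

#!/usr/bin/env python3
"""Minimal sketch: rank the largest subdirectories under a storage root.

Assumes the Android shared-storage tree is readable from where this runs
(e.g. Termux with storage permission granted). Unreadable entries are
skipped rather than crashing the scan.
"""
import os
import sys

# Assumed default; pass a different root as the first argument.
ROOT = sys.argv[1] if len(sys.argv) > 1 else "/sdcard"

def dir_size(path):
    """Total bytes of all files under `path`, skipping unreadable entries."""
    total = 0
    for dirpath, _, filenames in os.walk(path, onerror=lambda err: None):
        for name in filenames:
            try:
                total += os.path.getsize(os.path.join(dirpath, name))
            except OSError:
                pass  # broken symlink or permission denied
    return total

def main():
    sizes = []
    for entry in os.scandir(ROOT):
        if entry.is_dir(follow_symlinks=False):
            sizes.append((dir_size(entry.path), entry.name))
    # Largest first: caches, logs, and leftover app data usually top the list.
    for size, name in sorted(sizes, reverse=True)[:15]:
        print(f"{size / 1e9:6.2f} GB  {name}")

if __name__ == "__main__":
    main()

Run it with a path argument to drill down, e.g. python3 biggest_dirs.py /sdcard/Android/data, which is where per-app caches typically accumulate.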