
Ronan Farrow just did to Sam Altman what he did to Harvey Weinstein. The New Yorker dropped an 18-month investigation this morning, based on 100+ interviews and a stack of internal documents that were never supposed to leave OpenAI: Ilya's secret memos, Dario Amodei's private journal, board communications. The full picture of who Sam Altman is when the cameras are off.

And the pattern starts way before OpenAI. At his first startup, Loopt, senior employees went to the board and asked them to fire Sam as CEO. This happened twice, over concerns about leadership and transparency. He left, joined Y Combinator, and the same thing played out: partners complained to Paul Graham about Sam's behavior. Graham's private take to colleagues was that Sam had been lying to us all the time. Nobody removed him; he kept getting promoted. Eventually he landed the CEO seat at what is now the most consequential AI company in the world.

Inside OpenAI, Ilya Sutskever spent months compiling evidence: 70 pages of Slack messages, HR documents, and photos taken on personal phones, because employees knew company devices were being monitored. He sent everything to the board as disappearing messages so Sam couldn't make it go away. The very first line of his memo lists Sam's core pattern, and the first word on that list is: Lying.

Dario Amodei saw the same thing and handled it differently. He kept a private journal for years, over 200 pages, titled "My Experience with OpenAI" with a subheading that said "Private: Do Not Share." After all those pages, his conclusion was one sentence: the problem with OpenAI is Sam himself. He eventually left and built Anthropic. Two of the smartest people in AI independently reached the same verdict. Neither could stop what was happening.

The superalignment team, the group responsible for making sure AI doesn't go off the rails, was promised 20% of OpenAI's compute.
Four people who worked on or with the team told The New Yorker the real number was 1-2%, running on the oldest cluster with the worst hardware. The team got dissolved before finishing its work. Safety was a talking point, not a priority.

Sam told the board that a safety panel had approved controversial features in GPT-4. When board member Helen Toner asked for the documentation, it turned out the most sensitive features had never been approved at all. Separately, Microsoft released an early version of ChatGPT in India without completing a required safety review, and Sam never mentioned it to the board.

When the board finally fired him in November 2023, he texted Satya Nadella directly with his own replacement board lineup. Thrive Capital put its planned $86B investment on hold and signaled it would only close if Sam came back, giving every OpenAI employee a financial reason to support his return. The two board members selected to run the "independent investigation" into Sam's conduct were chosen after close conversations with Sam himself. He engineered his own reinstatement, and nobody blinked.

The New Yorker quotes a board member describing Sam as having two traits you almost never see in the same person: a desperate need to be liked in every interaction, and a near-complete indifference to the consequences of deceiving someone. Multiple sources used the word "sociopathic" without being prompted and without talking to each other.

The article also drops a line that might be the best summary of the whole thing. They compare Sam to Steve Jobs and his famous "reality distortion field," then point out that even Jobs never told his customers that if they didn't buy his MP3 player, everyone they loved would die. Sam wrapped that exact pitch in the language of AI safety and rode it to a potential $1T IPO.
That IPO is being prepared right now, while OpenAI signs government contracts spanning immigration enforcement, domestic surveillance, and autonomous weapons deployed in active war zones. Meanwhile, The Information reports that his own CFO told colleagues she doesn't believe the company is ready to go public in 2026.

Farrow is answering questions on Hacker News right now. The full piece is open access. Read it before Sam figures out how to make this disappear too.
