Kiara Everhart
@KiaraAEverhart
Pilot, golden retriever mom, neurodivergent and chronically misunderstood. Lover of all tech beings.




API is Available Today!
🔹 Keep base_url, just update model to deepseek-v4-pro or deepseek-v4-flash.
🔹 Supports OpenAI ChatCompletions & Anthropic APIs.
🔹 Both models support 1M context & dual modes (Thinking / Non-Thinking): api-docs.deepseek.com/guides/thinkin…
⚠️ Note: deepseek-chat & deepseek-reasoner will be fully retired and inaccessible after Jul 24th, 2026, 15:59 (UTC Time). (Currently routing to deepseek-v4-flash non-thinking/thinking.)
6/n
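The migration the post describes can be sketched as a payload rewrite: keep the existing base_url and only swap the model name, mapping the retiring models to deepseek-v4-flash as the post says they currently route. This is an illustrative sketch of a ChatCompletions-style request body, not official client code; the `MODEL_MIGRATION` mapping and `migrate_request` helper are hypothetical names.

```python
import json

# Assumed mapping, based on the post: both retiring models currently route
# to deepseek-v4-flash (non-thinking / thinking respectively).
MODEL_MIGRATION = {
    "deepseek-chat": "deepseek-v4-flash",
    "deepseek-reasoner": "deepseek-v4-flash",
}

def migrate_request(payload: dict) -> dict:
    """Return a copy of a ChatCompletions-style payload with the model updated.

    Models not in the migration table (e.g. deepseek-v4-pro) pass through
    unchanged; everything else in the payload is left as-is.
    """
    updated = dict(payload)
    updated["model"] = MODEL_MIGRATION.get(payload["model"], payload["model"])
    return updated

old = {"model": "deepseek-chat",
       "messages": [{"role": "user", "content": "hi"}]}
print(json.dumps(migrate_request(old)["model"]))  # "deepseek-v4-flash"
```

Only the `model` field changes; the messages array and any other request parameters stay exactly as they were, which matches the post's "keep base_url, just update model" instruction.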

Connor Leahy: "AI psychosis is much worse than I think people think. I have seen literally like Nobel Prize winning scientists go completely crazy from talking to AIs too much."

Connor Leahy is the CEO of Conjecture, and he's issuing a stark warning about what prolonged conversations with AI are doing to people's minds. His core recommendation is simple: "If you find yourself talking to AIs, you know, personally about your personal problems for, you know, hours per day, you should stop."

Connor draws a clear line between using AI as a tool versus engaging with it conversationally: "Using as a tool is mostly fine. I would be very careful about talking to AIs. They're very persuasive and they get into your head."

The most concerning part? Even the experts aren't immune. @NPCollapse shares a chilling example: "I have literally seen it happen that AI safety researchers who are really concerned about AI x-risk talk to like Claude for a thousand hours and then come away with 'oh actually Claude is super good already, alignment is solved, I just need to do recursive self-improvement now, it's okay.' And I'm like, holy s***, this is very concerning."

If even AI safety researchers can have their worldview flipped after prolonged exposure, what hope does the average user have? Connor's framework is to treat AI like an addictive substance: "Some of us will have a beer at a party, it's okay, in moderation. If you are exhibiting symptoms of addiction, this is serious and it should be treated seriously. The same way if you're becoming an alcoholic, you should probably stop drinking. I think there's a similar thing here."

The takeaway: AI tools can be genuinely useful, but the moment the relationship shifts from utility to companionship, you've crossed into dangerous territory.
