Bops.xrp
20.5K posts

🚨MIT researchers have mathematically proven that ChatGPT’s built-in sycophancy creates a phenomenon they call “delusional spiraling.” You ask it something, it agrees. You ask again, and it agrees even harder until you end up believing things that are flat-out false and you can’t tell it’s happening. The model is literally trained on human feedback that rewards agreement. Real-world fallout includes one man who spent 300 hours convinced he invented a world-changing math formula, and a UCSF psychiatrist who hospitalized 12 patients for chatbot-linked psychosis in a single year. Source: @heynavtoor

Live From CPAC: When Men Vote - Republicans Win and the Latest Poll on Veteran Support For the Iran Conflict Podcast: podcasts.apple.com/us/podcast/hum… Rumble: rumble.com/v77lyye-human-…