
Is software development overfitted to the von Neumann–Turing architecture? I was on a panel about the future of software development and the use of AI (i.e., vibe coding). Joining me were @HerwigMannaert, @_HaWe_, Luigi Lavazza, Scott Gallant, and a sharp audience. The discussion started with the traditional shortcomings of vibe coding, the challenges of translating prompts into correct code, and the amount of energy required to train transformer-based models.

Then the conversation shifted to a far more interesting and engaging topic. We know that natural language is used very effectively to train models, and that there is existing hardware that runs on an average of just 20–25 watts. Of course, I'm talking about the human brain. Maybe the next breakthrough in AI will come from new hardware-software architectures, escaping the 80-year-old hardware we're still programming for.






