Here’s some alpha for you mfs. Let’s think about what LLMs are for a second.

1. They predict the next token

Glad we got that out of the way. I’ve said this many times, but when a user submits a query to a model, the user does not get an “answer” back from the model. They get the answer they would have given themselves if they were able to extend their own thought trajectory using the model’s parameter space as a medium.

Under that premise, why the hell are we asking models to generate specific code when they don’t “generate” in the first place? Current AI systems are adept navigators of high-dimensional concept spaces, nothing more. Sure, every now and then you might luck out and cut the right grooves into the model’s embedding space, but if you do, it’s by random chance.

Many people would have far more success with LLM-driven software engineering, and especially systems engineering, if they operated coding agents with this in mind. Treat docs as compilable specs and LLMs as deterministic compilers when properly constrained, and you’ll start shipping shit that drops people’s jaws to the floor.

You’ll have to figure the rest out for yourself ;)
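The "docs as compilable specs" move can be sketched roughly like this. Everything here is hypothetical scaffolding, not a real API: `call_model` stands in for whatever chat-completion client you use, and the deterministic knobs (`temperature=0`, a fixed `seed`) are the "properly constrained" part.

```python
# Sketch: the spec document is the authoritative constraint, and the
# model call is pinned to deterministic decoding settings so the same
# spec + task yields the same output. `call_model` is a hypothetical
# stand-in for any chat-completion API.

def build_compile_prompt(spec: str, task: str) -> str:
    """Embed the full spec so the model navigates toward it, not around it."""
    return (
        "You are a compiler. Emit only code that satisfies this spec.\n"
        f"--- SPEC ---\n{spec}\n--- END SPEC ---\n"
        f"Task: {task}\n"
    )

def compile_from_spec(spec: str, task: str, call_model) -> str:
    # Deterministic decoding: greedy sampling, fixed seed where supported.
    prompt = build_compile_prompt(spec, task)
    return call_model(prompt, temperature=0.0, seed=42)
```

The point of the sketch is that the spec travels with every call: you iterate on the doc, not on one-off prompts.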