
beethoven
536 posts

One way to address this is to write a language model in a process theory with the ⅋ (par) connective. ⅋ allows a type of non-local connection. Basically, your inference pass does constraint solving with an interaction net (graph rewriting).
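To make the interaction-net part concrete, here is a minimal sketch of graph rewriting on active pairs, using the classic unary-addition example. Everything here (the agents Z/S/A, `church`, `reduce_net`) is illustrative and hand-rolled, not from any particular library or from the process-theory setup above:

```python
# Toy interaction net reducing unary addition by local graph rewriting.
# Agents: Z (zero), S (successor), A (add); port 0 is each agent's
# principal port. An "active pair" is a wire joining two principal
# ports, and reduction repeatedly rewrites active pairs in place.
import itertools

class Node:
    _ids = itertools.count()
    def __init__(self, kind, arity):
        self.id = next(Node._ids)
        self.kind = kind
        self.ports = [None] * arity   # each entry: (peer_node, peer_port)

def wire(a, i, b, j):
    a.ports[i] = (b, j)
    b.ports[j] = (a, i)

def church(n, nodes):
    """Build S^n(Z); return the (node, port) of its free principal wire."""
    z = Node("Z", 1); nodes.add(z)
    root = (z, 0)
    for _ in range(n):
        s = Node("S", 2); nodes.add(s)
        wire(s, 1, *root)
        root = (s, 0)
    return root

def reduce_net(nodes):
    """Apply the two addition rules until no active pair remains:
       A(y,r) >< Z    =>  connect y to r            (0 + y = y)
       A(y,r) >< S(x) =>  r := S(r'), A(y,r') >< x  ((1+x) + y = 1 + (x+y))
    """
    while True:
        add = next((n for n in nodes
                    if n.kind == "A" and n.ports[0][1] == 0), None)
        if add is None:
            return
        peer = add.ports[0][0]
        if peer.kind == "Z":
            y_n, y_p = add.ports[1]
            r_n, r_p = add.ports[2]
            wire(y_n, y_p, r_n, r_p)
            nodes -= {add, peer}
        elif peer.kind == "S":
            x_n, x_p = peer.ports[1]
            r_n, r_p = add.ports[2]
            wire(peer, 0, r_n, r_p)   # reuse the S as the output successor
            wire(peer, 1, add, 2)     # fresh result wire back into A
            wire(add, 0, x_n, x_p)    # A's principal now faces x
        else:
            raise ValueError(f"no rule for A >< {peer.kind}")

def read_back(out):
    """Count S agents from the Out handle down to Z."""
    n, _ = out.ports[0]
    count = 0
    while n.kind == "S":
        count += 1
        n, _ = n.ports[1]
    return count

# 2 + 3: wire A's principal to one number, aux port 1 to the other,
# aux port 2 to an Out handle, then reduce.
nodes = set()
m, n = church(2, nodes), church(3, nodes)
add = Node("A", 3); nodes.add(add)
out = Node("Out", 1); nodes.add(out)
wire(add, 0, *m)
wire(add, 1, *n)
wire(add, 2, out, 0)
reduce_net(nodes)
print(read_back(out))  # 5
```

The point of the encoding is the one the post is gesturing at: each rewrite is purely local (it only touches the two agents of an active pair), so independent active pairs can in principle be reduced in parallel, and "non-local" connections are just wires threaded through the graph.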



Almost nobody who has children would press red unless they know for an absolute fact that their kids pressed red. If you think there's even a 5% chance your kids pressed blue, you will risk your life to improve their odds even by a tiny bit.

To take this a step further, most women are probably hard-wired to press blue as a motherly instinct anyway, even if they have no children. So at minimum you have a 25% baseline for blue right there, which completely tips the odds.

So I think blue actually wins by a landslide if this was the real world and not a social media app for incels. In other words, humanity always had the biological programming to ensure blue wins.




Everyone in the world has to take a private vote by pressing a red or blue button. If more than 50% of people press the blue button, everyone survives. If less than 50% of people press the blue button, only people who pressed the red button survive. Which button would you press?
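The tipping point in this game is easy to compute: if some fraction of voters is committed to blue (the 25% baseline one reply above claims), the rest only need to vote blue often enough that the total crosses 50%. A small sketch; the specific numbers besides that 25% are illustrative assumptions, not anyone's claim:

```python
import random

def expected_blue(committed, p_rest):
    """Expected blue share: a committed-blue fraction plus the rest
    voting blue independently with probability p_rest."""
    return committed + (1 - committed) * p_rest

def blue_wins(n_voters, committed, p_rest, seed=0):
    """Monte Carlo version of the vote; True if blue gets > 50%."""
    rng = random.Random(seed)
    blue = sum(1 for _ in range(n_voters)
               if rng.random() < committed or rng.random() < p_rest)
    return blue / n_voters > 0.5

# With a 25% committed-blue baseline, the remaining 75% only need to
# vote blue a third of the time to reach 50% in expectation:
# 0.25 + 0.75 * (1/3) = 0.5.
print(expected_blue(0.25, 0.40))        # ≈ 0.55
print(blue_wins(100_000, 0.25, 0.40))   # True
```

So the baseline claim does most of the work: it shifts the threshold the uncommitted voters have to clear from one half down to one third.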



I don’t have to be convinced that LLMs make programmers more productive. But where’s all the stuff? We’ve now had months and months of 100x or 1000x programmer productivity improvements. Where’s all the stuff they’re building?


Computer scientists often seem incredibly confident one way or the other about computational functionalism. What they should say is that the arguments both for and against provide only inconclusive considerations and the right attitude is therefore one of great uncertainty.


Google DeepMind researcher argues that LLMs can never be conscious, not in 10 years or 100 years. "Expecting an algorithmic description to instantiate the quality it maps is like expecting the mathematical formula of gravity to physically exert weight."


