Zsolt Ero
@hyperknot
Building https://t.co/CUfyhT0Ura and https://t.co/GTLrvnmS0h. Writing on https://t.co/irgNrwubhY. Loves paragliding.

Microsoft weighs legal action over $50bn Amazon-OpenAI cloud deal ft.trib.al/6LZe39E

JUST IN: Google to launch ads on Gemini

Don't think of LLMs as entities but as simulators. For example, when exploring a topic, don't ask: "What do you think about xyz?" There is no "you". Next time try: "What would be a good group of people to explore xyz? What would they say?" The LLM can channel/simulate many perspectives, but it hasn't "thought about" xyz over time and formed its own opinions in the way we're used to. If you force it via the use of "you", it will give you something by adopting a personality implied by the statistics of its finetuning data and then simulating that. It's fine to do, but there is a lot less mystique to it than I find people naively attribute to "asking an AI".
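The "simulator" framing above can be turned into a small prompt-builder. This is a hypothetical sketch (the helper name and example topic are my own, not from the post): instead of asking the model what *it* thinks, the prompt asks it to simulate a panel of relevant perspectives. The resulting string can be sent to any chat-completion API.

```python
# Sketch of the multi-perspective prompting pattern described above.
# build_panel_prompt is a hypothetical helper name, not an API.

def build_panel_prompt(topic: str) -> str:
    """Return a prompt asking for simulated perspectives rather than
    the model's 'own opinion' on the topic."""
    return (
        f"What would be a good group of people to explore {topic}? "
        f"For each person, briefly describe who they are and what "
        f"they would say about {topic}."
    )

prompt = build_panel_prompt("urban cycling infrastructure")
print(prompt)
```

The point of the pattern is that the prompt never says "you": it steers the model toward simulating several explicitly named viewpoints instead of adopting a single default persona.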