Leo 🏴☠️
149 posts


I don't think Rust is the best programming language for LLMs. The only reason Greg said this is that Rust limits the LLM's mistakes by refusing to compile shitty code, but LLMs still suggest reaching for unsafe to bypass the safety rules.
Therefore, I'd say Go is the best, if there is any "best" language for LLMs. Since LLMs are based on next-token prediction, it matters a lot that code looks almost the same everywhere. Go was designed so that every app around the globe looks almost the same; for instance, there's only one way to write a loop, compared to Rust, which has at least five that I can think of (loop, for, while, while let, and map).
I want the best language to be rust tho
Greg Brockman@gdb
rust is a perfect language for agents, given that if it compiles it's ~correct

@leostera I’ve been keeping the number of people I follow at 255.
I had to unfollow someone to make room for you ❤️

@leostera Long time no see indeed!
I thought you were forever gone.

hi folks! long time no see. i’m officially open to work again, primarily interested in the dx + devtools + ai space
if anyone needs a principal engineer or technical product manager with 16+ years in the industry, my DMs are open!
linkedin.com/in/leostera
github.com/leostera

@coreyepstein not yet, but i’m building a product and this is what’s open sourceable from it

@leostera Do you have this running in production for anything?
@leostera @emil_priver i agree to disagree. the best is erlang/otp. ;)

@scheminglunatic @emil_priver you got me.
normal ocaml code is pretty great, i haven't tried reason syntax.

@leostera @emil_priver Have you noticed a difference in lalamo accuracy between normal ocaml and reasonml syntax for ocaml?
because i have this hunch that if u make it look more like a 'normal language' it might be better. on the other hand it might lead to more confusion or interference.

@DanniFriedland oh i'm also the cleanup crew so the types stay like i want them

@leostera that's super nice to try as well. how do you constrain the llm not to add more types?

Today is my birthday and my wish is exactly 103 stars to github.com/dmtrKovalenko/… today

@stuckinforloop good thing is that it's easy to make your own provider
impl Provider for stuckinforloop::SlopMaker {}
and just add it to your llm runner
let llm = LlmRunner::with_provider(SlopMaker);
and well, do whatever you want there

@stuckinforloop i mean ymmv but agents::llm uses our own internal representation that gets mapped from/to the different provider modules i have

if you're building agents in rust and you need or have built your own:
* multi provider llm runners with fallback and usage tracking
* turn-based / streamed agent loops with cancels and steering
* type-safe tool runners, messages, and response formats
i got good news for you!
Leo 🏴☠️@leostera
cargo add agents i did a thing github.com/leostera/agent…