Pinned Tweet

I don't know if this was ever released, but 10M is actually doable now.
If you're interested, I'm sure it would help with code generation and many other things @spawn @cline @replit @cursor_ai @windsurf_ai @boltdotnew @scoutdotnew
lmk, my DMs are open
Magic (@magicailabs):
LTM-2-Mini is our first model with a 100 million token context window. That's 10 million lines of code, or 750 novels. Full blog: magic.dev/blog/100m-toke… Evals, efficiency, and more ↓