Transformer Lab reposted

Looks like it’s confirmed that Cursor’s new model is based on Kimi! It reinforces a couple of things:
- open source keeps being the greatest competition enabler
- another validation that Chinese open source is now the biggest force shaping the global AI stack
- the frontier is no longer just about who trains from scratch, but who adapts, fine-tunes, and productizes fastest (we’re seeing the same thing with OpenClaw, for example)
Lee Robinson (@leerob)
Yep, Composer 2 started from an open-source base! We will do full pretraining in the future. Only ~1/4 of the compute spent on the final model came from the base, the rest is from our training. This is why evals are very different. And yes, we are following the license through our inference partner terms.