

Quasar
@QuasarModels
Bittensor subnet built to crush the long-context barrier | SN24 | Owners : @farahatyoussef0 and @troyquasar Backed by @const_reborn

We've started a small partnership with SN3. Our goal is to make Quasar the face of Bittensor, and achieving that requires support from across the network. SN3 will contribute by training Quasar models, but miners and holders don't need to worry about this for now. The next step is establishing a checkpoint at SN24: we'll define a clear performance target there, and once it's reached, we'll move forward with SN3 to support training. The setup remains consistent: same infrastructure, same approach, just focused on Quasar models. Looking forward to what's coming next.


Happy to announce our collaboration with @adaption_ai. Adaption Labs is an AI research company focused on building adaptive intelligence systems. Through this partnership, Adaption Labs will provide SILX AI with state-of-the-art adaptive data to support the training of the Quasar foundation models. Their role will be to generate and refine high-quality, adaptive datasets at scale, enabling Quasar to continuously improve its reasoning and generalization capabilities. This collaboration strengthens Quasar's path toward achieving SOTA performance and competing with leading closed-source models. The company is co-founded by Sara Hooker, former Vice President of Research at Cohere and a veteran researcher from Google DeepMind, alongside Sudip Roy. Adaption Labs has also raised $50M in seed funding to advance its mission in adaptive AI.





That's not it for April.



First announcement of April. We introduce Quasar-3B (1B active): a looped continuous-time transformer built for long-context intelligence. Quasar uses a hybrid architecture combining Quasar layers + GLA, enabling efficient stateful reasoning with stable long-range dependency handling. This release is the base model of the Quasar system, designed for distributed training and distillation at scale. In the coming days, SN24 miners will begin working with us to distill knowledge from frontier models like Qwen into Quasar, pushing toward a new SOTA in long-context modeling.

The training roadmap is staged:

Stage 1 (Quasar-RoPE): stable pretraining with 16K context, establishing the base representation for the system.

Stage 2 (Quasar-DroPE): continued training with distillation as the core mechanism, removing positional encodings and scaling toward 5M-token context.

In both stages, distillation remains the central training signal, driven collaboratively by SN24 miners using knowledge transfer from frontier models. Read more about Quasar and get ready for mining 👇
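To make the "distillation as the central training signal" idea concrete, here is a minimal sketch of the standard temperature-scaled knowledge-distillation loss (the teacher's softened output distribution is matched by the student via KL divergence). This is a generic illustration, not Quasar's actual objective, which is not specified in the post; the function names and the temperature value are our own assumptions.

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over a list of logits."""
    scaled = [x / temperature for x in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """KL(teacher || student) on temperature-softened distributions,
    scaled by T^2 as in standard knowledge distillation.

    Illustrative only: the teacher would be a frontier model (e.g. Qwen)
    and the student the Quasar base model; the real objective may differ.
    """
    p = softmax(teacher_logits, temperature)  # teacher distribution
    q = softmax(student_logits, temperature)  # student distribution
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return (temperature ** 2) * kl

# Identical logits give zero loss; diverging logits give a positive loss.
match = distillation_loss([1.0, 2.0, 3.0], [1.0, 2.0, 3.0])
mismatch = distillation_loss([3.0, 1.0, 0.0], [0.0, 1.0, 3.0])
```

In a distributed setting like the one described, each miner would compute this loss on its own batch of teacher outputs and contribute gradients (or distilled targets) back to the student.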




