

bitstarter
244 posts

@bitstarterAI
Bittensor’s first crowdfunding platform. Discover new teams. Pledge TAO. And get liftoff 🚀 Open for all: TG @ https://t.co/16ayiZ35IL From @macrozack 🫶

We are pleased to announce our partnership with bitstarter.ai. Together with @bitstarterAI, Systango will act as a development partner for subnet teams, supporting everything from architecture and incentive design to production-grade deployment. bit.ly/421vbQZ

First announcement of April: we introduce Quasar-3B (1B active), a looped continuous-time transformer built for long-context intelligence.

Quasar uses a hybrid architecture combining Quasar layers + GLA, enabling efficient stateful reasoning with stable long-range dependency handling. This release is the base model of the Quasar system, designed for distributed training and distillation at scale. In the coming days, SN24 miners will begin working with us to distill knowledge from frontier models like Qwen into Quasar, pushing toward a new SOTA in long-context modeling.

The training roadmap is staged:
Stage 1 (Quasar-RoPE): stable pretraining with 16K context, establishing the base representation for the system.
Stage 2 (Quasar-DroPE): continued training with distillation as the core mechanism, removing positional encodings and scaling toward a 5M-token context.

In both stages, distillation remains the central training signal, driven collaboratively by SN24 miners using knowledge transfer from frontier models.

Read more about Quasar and get ready for mining 👇
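The thread doesn't include code, but the knowledge-distillation signal it describes is commonly implemented as a KL divergence between the teacher's and student's token distributions. A minimal sketch of that idea, assuming a standard temperature-softened KL loss (the function names and temperature value here are illustrative, not Quasar's actual training code):

```python
import math

def softmax(logits, temperature=1.0):
    """Convert raw logits to a probability distribution at a given temperature."""
    scaled = [x / temperature for x in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(x - m) for x in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    """KL(teacher || student) over one token position's vocabulary distribution.

    Softening both distributions with a temperature > 1 (a common choice,
    assumed here; the thread does not specify one) exposes the teacher's
    relative preferences among non-top tokens.
    """
    p = softmax(teacher_logits, temperature)  # teacher, e.g. a frontier model like Qwen
    q = softmax(student_logits, temperature)  # student, e.g. Quasar
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# A student that matches the teacher incurs zero loss; a diverging one does not.
same = distillation_loss([2.0, 1.0, 0.1], [2.0, 1.0, 0.1])
diff = distillation_loss([2.0, 1.0, 0.1], [0.1, 1.0, 2.0])
```

In full-scale training this loss would be summed over sequence positions and mixed with the usual next-token cross-entropy, but the per-token term above is the core of the "distillation as the central training signal" idea.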