

Madhu Balakrishna

@madospace








Update: Socket has found 121 more compromised npm package artifacts across 84 package names, including 64 UiPath artifacts. Combined with the TanStack compromise, the current known total is 205 affected npm package artifacts spanning enterprise automation, AI/MCP, auth, workflow, and dev tooling.

Introducing SubQ, a major breakthrough in LLM intelligence. It is the first model built on a fully sub-quadratic sparse-attention (SSA) architecture, and the first frontier model with a 12 million token context window, which is:
- 52x faster than FlashAttention at 1M tokens
- Less than 5% the cost of Opus

Transformer-based LLMs waste compute by scoring every possible relationship between tokens (standard attention), yet only a small fraction of those relationships actually matter. @subquadratic finds and focuses only on the ones that do. That's nearly 1,000x less compute and a new way for LLMs to scale.
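The contrast between dense and sparse attention can be illustrated with a small sketch. SubQ's actual SSA architecture is not public, so this is only a generic top-k sparse-attention toy in NumPy, not SubQ's method: each query keeps just its k highest-scoring keys and masks out the rest before the softmax, so only a small fraction of the n×n score matrix contributes to the output.

```python
import numpy as np

def dense_attention(Q, K, V):
    """Standard attention: every query scores every key -- O(n^2) work."""
    scores = Q @ K.T / np.sqrt(Q.shape[-1])
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V

def topk_sparse_attention(Q, K, V, k=8):
    """Toy sparse attention: each query attends only to its k
    highest-scoring keys; all other scores are masked to -inf."""
    scores = Q @ K.T / np.sqrt(Q.shape[-1])
    # per-row threshold = the k-th largest score (ties may keep a few extra)
    thresh = np.partition(scores, -k, axis=-1)[:, -k][:, None]
    masked = np.where(scores >= thresh, scores, -np.inf)
    weights = np.exp(masked - masked.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V

rng = np.random.default_rng(0)
n, d = 64, 16
Q, K, V = rng.standard_normal((3, n, d))
out = topk_sparse_attention(Q, K, V, k=8)
```

With k=8 of 64 keys, only ~12.5% of score entries survive the mask; at a 12M-token context the dense n×n matrix is what makes standard attention intractable, which is the scaling problem sparse approaches target.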







