Zamantika
GenesisOracle.eth @Lightcode_ · 5 May
Woah
Alexander Whedon @alex_whedon

Introducing SubQ, a major breakthrough in LLM intelligence. It is the first model built on a fully sub-quadratic sparse-attention architecture (SSA), and the first frontier model with a 12-million-token context window, which is:

  • 52x faster than FlashAttention at 1M tokens
  • less than 5% the cost of Opus

Transformer-based LLMs waste compute by processing every possible relationship between words (standard attention), when only a small fraction actually matter. @subquadratic finds and focuses on only the ones that do. That's nearly 1,000x less compute and a new way for LLMs to scale.
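The core idea the tweet describes — scoring relationships but attending to only the few that matter — can be sketched in a toy top-k sparse attention. This is purely illustrative: SubQ's actual SSA architecture is not public, and a real sub-quadratic system would avoid ever materializing the full score matrix (e.g., via hashing or routing), whereas this toy computes it and then masks it for clarity.

```python
import numpy as np

def sparse_topk_attention(Q, K, V, k):
    """Toy sparse attention: each query attends only to its k
    highest-scoring keys instead of all keys (dense attention).

    Note: this still computes the full (n_q, n_k) score matrix for
    clarity, so it is NOT sub-quadratic -- it only shows the masking
    idea, not SubQ's (unpublished) architecture.
    """
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)                     # (n_q, n_k)
    # Indices of each row's k largest scores.
    topk = np.argpartition(scores, -k, axis=-1)[:, -k:]
    # Mask everything except the top-k entries with -inf.
    masked = np.full_like(scores, -np.inf)
    np.put_along_axis(masked, topk,
                      np.take_along_axis(scores, topk, axis=-1), axis=-1)
    # Numerically stable softmax over the surviving entries.
    weights = np.exp(masked - masked.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V
```

With k equal to the number of keys, the mask keeps everything and the result reduces to standard dense attention; shrinking k trades fidelity for the compute savings the tweet is claiming.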

© 2025 Zamantika. All rights reserved.