

Chris Seltzer
@ChrisSeltzer
Software Developer, Infantryman


Introducing SubQ - a major breakthrough in LLM intelligence.

It is the first model built on a fully sub-quadratic sparse-attention architecture (SSA), and the first frontier model with a 12 million token context window, which is:

- 52x faster than FlashAttention at 1M tokens
- Less than 5% the cost of Opus

Transformer-based LLMs waste compute by processing every possible relationship between words (standard attention). Only a small fraction actually matter. @subquadratic finds and focuses only on the ones that do.

That's nearly 1,000x less compute and a new way for LLMs to scale.
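SubQ's actual SSA architecture is not public, so the following is only a generic illustration of the idea the post describes: instead of letting every query attend to every key (dense, quadratic attention), a sparse variant keeps only each query's k highest-scoring keys and renormalizes over that subset. All function names here are hypothetical; a NumPy sketch:

```python
import numpy as np

def dense_attention(Q, K, V):
    # Standard attention: every query scores every key -> O(n^2) work.
    scores = Q @ K.T / np.sqrt(Q.shape[-1])
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V

def topk_sparse_attention(Q, K, V, k):
    # Sparse variant: each query keeps only its k highest-scoring keys
    # and renormalizes over that subset; all other keys get zero weight.
    scores = Q @ K.T / np.sqrt(Q.shape[-1])
    n = scores.shape[-1]
    # Indices of the k largest scores in each query's row.
    topk_idx = np.argpartition(scores, n - k, axis=-1)[:, n - k:]
    masked = np.full_like(scores, -np.inf)
    np.put_along_axis(
        masked, topk_idx,
        np.take_along_axis(scores, topk_idx, axis=-1), axis=-1,
    )
    weights = np.exp(masked - masked.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V

rng = np.random.default_rng(0)
n, d = 64, 16
Q, K, V = (rng.normal(size=(n, d)) for _ in range(3))
dense = dense_attention(Q, K, V)
sparse = topk_sparse_attention(Q, K, V, k=8)
print(dense.shape, sparse.shape)
```

Note this toy still computes the full score matrix before masking, so it saves no compute; a genuinely sub-quadratic method must *find* the important keys without scoring all n² pairs (e.g. via hashing, routing, or block sparsity), which is where the claimed speedups would come from.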



Additional footage of Marco Rubio DJing at a family wedding last night.










i feel like nobody knows how to read this movie. the basic idea is we spend the whole movie getting madder and madder at the Maya for what they did to this guy and his family. then suddenly at the end the Spanish show up and we realize they're about to do the same exact thing on a much greater scale. it's an anti-colonialism movie from a very non-woke perspective, which people think is a contradiction for some reason
