@supnullpointer
Back in the game🦸♂️ | turn on notis | dm for promo
Joined December 2019
156 Following · 2.4K Followers
Pinned Tweet

Just let it go bruhh, wind can't do shit
Ghar Ke Kalesh@gharkekalesh
MAN GRIPS PETROL PUMP PILLAR DURING STORM, TRYING TO HOLD STEADY AMID POWERFUL WINDS


MIT just made every AI company’s billion-dollar bet look embarrassing.
They solved AI memory—not by building a bigger brain, but by teaching it how to read.
The Breakthrough: On December 31, 2025, three MIT CSAIL researchers published a paper revealing that AI models don’t need massive context windows. Instead of loading entire documents into memory, they store them as external Python variables. The AI knows these exist, searches them using code and regex, pulls only the relevant sections, and spawns sub-AIs to analyze pieces in parallel. No summarization, no information loss.
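The core move can be pictured with a small sketch. In the paper's setup the model writes retrieval code like this itself inside a REPL; here, `document` and `grep_sections` are hand-written illustrations of the idea (storing the full text as a plain variable and reading back only matching slices), not code from the actual paper or repo.

```python
import re

# The full document lives in the environment as an ordinary Python
# variable; it is never loaded into the model's context window.
document = (
    "Chapter 1. Introduction to storage engines.\n"
    "Chapter 2. B-trees keep keys sorted for range scans.\n"
    "Chapter 3. LSM-trees batch writes into immutable runs.\n"
)

def grep_sections(doc: str, pattern: str, window: int = 1) -> list[str]:
    """Return only the lines matching `pattern`, plus `window` lines of
    surrounding context; the model reads these snippets, not the doc."""
    lines = doc.splitlines()
    hits = [i for i, line in enumerate(lines) if re.search(pattern, line)]
    snippets = []
    for i in hits:
        lo, hi = max(0, i - window), min(len(lines), i + window + 1)
        snippets.append("\n".join(lines[lo:hi]))
    return snippets

# The model would emit a call like this to answer "what are LSM-trees?"
print(grep_sections(document, r"LSM"))
```

Because only the matched snippets enter the context window, the same pattern scales to documents far larger than the window itself; the "sub-AIs" in the post would each receive one such snippet to analyze.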
The Problem: Traditional AI models have a hard context-window limit. Overloading it causes "context rot": facts blur, mid-document information vanishes, and models forget what they read. Retrieval-Augmented Generation (RAG) tried to fix this by chunking documents, but it shredded context and guessed relevance poorly.
The Results:
RLMs (Recursive Language Models) solved complex long-context benchmarks where GPT-5 failed 90% of the time.
Handled 10 million tokens—100× a model’s native window.
Delivered better answers at comparable or cheaper cost.
The Implication: For five years, the AI arms race chased bigger windows: GPT‑3 (4K), GPT‑4 (32K), Claude 3 (200K), Gemini 2 (2M). MIT proved the assumption wrong: more context ≠ better performance. The right approach is teaching AI where to look.
The Impact:
Open source code on GitHub; drop-in replacement for LLM APIs.
Enables tasks spanning weeks or months via self-managed context.
Ends the context window wars—MIT won by walking away.
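The "drop-in replacement" claim can be pictured as a thin wrapper around an existing completion call. This is a hypothetical sketch of such an interface, not the actual `alexzhang13/rlm` API: `base_model`, `MAX_CONTEXT`, and `rlm_complete` are invented names, and the real system has the model write its own retrieval code rather than using a fixed keyword grep.

```python
import re

def base_model(prompt: str) -> str:
    # Stand-in for a real LLM call; any provider's completion API
    # could be substituted here.
    return f"answer based on: {prompt[:60]}"

MAX_CONTEXT = 200  # pretend native window (characters, for the sketch)

def rlm_complete(question: str, long_context: str) -> str:
    """Drop-in style wrapper: if the context fits the window, call the
    model directly; otherwise keep the context external and feed the
    model only the lines that match keywords from the question."""
    if len(long_context) <= MAX_CONTEXT:
        return base_model(f"{long_context}\n\nQ: {question}")
    keywords = re.findall(r"\w{4,}", question)
    snippets = [ln for ln in long_context.splitlines()
                if any(k.lower() in ln.lower() for k in keywords)]
    return base_model("\n".join(snippets) + f"\n\nQ: {question}")
```

Callers keep the same question-plus-context signature they already use, which is what makes the approach feel like a drop-in replacement rather than a new pipeline.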
Sources: Zhang, Kraska, Khattab · MIT CSAIL · arXiv:2512.24601
Paper: arxiv.org/abs/2512.24601
GitHub: github.com/alexzhang13/rlm
#mit #gpt #codex #rag #claude #anthropic


What about small accounts trying their best @nikitabier ???
I was inactive here for some time and my entire reach is gone

Ion know, this is the worst Twitter feed I'm seeing right now. These ridiculous accounts
ಸನಾತನ (सनातन)@sanatan_kannada
Shocking & unacceptable! A couple caught indulging in highly indecent behavior in public in Manipal. This is not PDA; this is openly disrespecting Indian culture, public decency, and the people around. Manipal is an education hub; students & families deserve better. Where are the authorities? Such acts shouldn't be normalized! #Manipal #PublicDecency #Karnataka




