
Benyamin
@damirchelib
20 | previously worked on infrastructure @aws | cs + business @UBC

🚨 Hansi Flick will sign a new deal at Barcelona, a plan confirmed since president Laporta was re-elected. The new deal will be valid for the next two years. 💙❤️ Flick’s agent Pini Zahavi will arrive soon to sort out the details and a potential +1 option until 2029, as @RogerTorello @mundodeportivo report.

@ArtificialAnlys @Scobleizer @xai Progress

Researchers just estimated the sizes of all the major LLMs by asking them knowledge questions of varying degrees of obscurity!
– GPT 5.5: ~10T params
– Claude Opus 4.x: ~4-5T
– Grok 4: ~3T
The idea here is that factual capacity scales log-linearly with parameter count. The paper defines 7 knowledge tiers, and T7 accuracy is essentially ~0% for all models, suggesting there is still significant headroom for pretraining. Gemini 3.1 Pro is likely >10T given it’s used as an anchor, but it has no direct estimate. This means we can infer, to some degree, what different models might cost, and how effective their post-training is (performance on certain non-factual tasks given their size). One of the coolest papers I’ve read of late.
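
For intuition, here is a minimal sketch of the estimation idea, not the paper's actual method: assume accuracy on obscure factual questions grows roughly linearly in log(parameter count), fit that line on open "anchor" models whose sizes are public, then invert the fit for a closed model. Every number and data point below is an invented placeholder.

import numpy as np

# Hypothetical anchor models: (assumed) known parameter counts and measured
# accuracy on a fixed pool of obscure factual questions.
anchor_params = np.array([7e9, 70e9, 400e9, 2e12])   # parameter counts
anchor_acc    = np.array([0.22, 0.38, 0.51, 0.63])   # fraction answered correctly

# Fit: accuracy ≈ slope * log10(params) + intercept
slope, intercept = np.polyfit(np.log10(anchor_params), anchor_acc, deg=1)

def estimate_params(accuracy: float) -> float:
    # Invert the log-linear fit to recover an estimated parameter count.
    return 10 ** ((accuracy - intercept) / slope)

# Example: a closed model scoring 0.70 on the same question pool.
print(f"estimated size: {estimate_params(0.70):.2e} parameters")

The same fit is what makes the post-training point possible: if a model performs much better or worse on non-factual tasks than its estimated size would predict, the gap says something about how effective its post-training is rather than its raw capacity.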

Welcome DeepSeek V4 Pro Max huggingface.co/deepseek-ai/De…
