Rony
2.3K posts
@LocalHost_8083
Emanating
Joined January 2011
320 Following · 68 Followers
Rony retweeted
How To AI @HowToAI_
Yann LeCun was right the entire time. And generative AI might be a dead end.

For the last three years, the entire industry has been obsessed with building bigger LLMs. Trillions of parameters. Billions in compute. The theory was simple: if you make the model big enough, it will eventually understand how the world works.

Yann LeCun said that was stupid. He argued that generative AI is fundamentally inefficient. When an AI predicts the next word, or generates the next pixel, it wastes massive amounts of compute on surface-level details. It memorizes patterns instead of learning the actual physics of reality.

He proposed a different path: JEPA (Joint-Embedding Predictive Architecture). Instead of forcing the AI to paint the world pixel by pixel, JEPA forces it to predict abstract concepts. It predicts what happens next in a compressed "thought space."

But for years, JEPA had a fatal flaw. It suffered from "representation collapse." Because the AI was allowed to simplify reality, it would cheat. It would simplify everything so much that a dog, a car, and a human all looked identical. It learned nothing. To fix it, engineers had to use insanely complex hacks, frozen encoders, and massive compute overheads.

Until today. Researchers just dropped a paper called "LeWorldModel" (LeWM). They completely solved the collapse problem. They replaced the complex engineering hacks with a single, elegant mathematical regularizer. It forces the AI's internal "thoughts" into a perfect Gaussian distribution. The AI can no longer cheat. It is forced to understand the physical structure of reality to make its predictions.

The results completely rewrite the economics of AI. LeWM didn't need a massive, centralized supercomputer. It has just 15 million parameters. It trains on a single, standard GPU in a few hours. Yet it plans 48x faster than massive foundation world models. It intrinsically understands physics. It instantly detects impossible events.

We spent billions trying to force massive server farms to memorize the internet. Now, a tiny model running locally on a single graphics card is actually learning how the real world works.
How To AI tweet media
431 replies · 2.1K reposts · 12.2K likes · 1.3M views
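The tweet doesn't give LeWM's actual formula, only the idea: a penalty that pushes the model's latent "thoughts" toward a Gaussian distribution so they can't all collapse to one point. A minimal sketch of that idea, assuming a VICReg-style mean-and-covariance penalty (`gaussian_regularizer` is a hypothetical name, not from the paper):

```python
import numpy as np

def gaussian_regularizer(z: np.ndarray) -> float:
    """Penalty pushing a batch of latent vectors toward a standard
    Gaussian: zero mean, identity covariance. Collapsed embeddings
    (everything mapped to the same point) have zero covariance, so
    the covariance term keeps the loss high and blocks the cheat."""
    d = z.shape[1]
    mean_penalty = float(np.sum(z.mean(axis=0) ** 2))
    cov = np.cov(z, rowvar=False)
    cov_penalty = float(np.sum((cov - np.eye(d)) ** 2))
    return mean_penalty + cov_penalty

rng = np.random.default_rng(0)
healthy = rng.standard_normal((256, 8))   # diverse, roughly Gaussian embeddings
collapsed = np.ones((256, 8))             # every input mapped to the same point
print(gaussian_regularizer(healthy))      # near zero
print(gaussian_regularizer(collapsed))    # large: collapse is penalized
```

The key property is that the degenerate "everything looks identical" solution, which the tweet calls representation collapse, becomes the worst-scoring one rather than a free minimum.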
Rony retweeted
NASA @NASA
Sky full of stars. Following a successful lunar flyby, the Artemis II astronauts captured this breathtaking photo of our galaxy, the Milky Way, on April 7, 2026.
NASA tweet media
4.3K replies · 66.6K reposts · 361.6K likes · 59.3M views
Rony retweeted
TDM (e/λ) (L8 vibe coder 💫)
Defending my Spring Boot Java app that uses 64GB RAM to return { "status": "ok" }
73 replies · 739 reposts · 11.4K likes · 541.5K views
Rony retweeted
sam🎮 @sam_niac
my daughter lost her first baby teeth! So proud.
sam🎮 tweet media
4.1K replies · 4.5K reposts · 146.9K likes · 10.8M views
Rony retweeted
Ruben Hassid @rubenhassid
BREAKING: Apple just proved AI "reasoning" models like Claude, DeepSeek-R1, and o3-mini don't actually reason at all. They just memorize patterns really well. Here's what Apple discovered: (hint: we're not as close to AGI as the hype suggests)
Ruben Hassid tweet media
2.6K replies · 9.1K reposts · 62.9K likes · 14.2M views
Rony retweeted
Satya Nadella @satyanadella
A couple reflections on the quantum computing breakthrough we just announced...

Most of us grew up learning there are three main types of matter that matter: solid, liquid, and gas. Today, that changed. After a nearly 20-year pursuit, we've created an entirely new state of matter, unlocked by a new class of materials, topoconductors, that enable a fundamental leap in computing.

It powers Majorana 1, the first quantum processing unit built on a topological core. We believe this breakthrough will allow us to create a truly meaningful quantum computer not in decades, as some have predicted, but in years.

The qubits created with topoconductors are faster, more reliable, and smaller. They are 1/100th of a millimeter, meaning we now have a clear path to a million-qubit processor. Imagine a chip that can fit in the palm of your hand yet is capable of solving problems that even all the computers on Earth today combined could not!

Sometimes researchers have to work on things for decades to make progress possible. It takes patience and persistence to have big impact in the world. And I am glad we get the opportunity to do just that at Microsoft. This is our focus: When productivity rises, economies grow faster, benefiting every sector and every corner of the globe. It's not about hyping tech; it's about building technology that truly serves the world.
Satya Nadella tweet media
5.1K replies · 18.5K reposts · 105.4K likes · 27.1M views
Rony retweeted
Sundar Pichai @sundarpichai
Introducing Willow, our new state-of-the-art quantum computing chip with a breakthrough that can reduce errors exponentially as we scale up using more qubits, cracking a 30-year challenge in the field. In benchmark tests, Willow solved a standard computation in <5 mins that would take a leading supercomputer over 10^25 years, far beyond the age of the universe(!).
2.8K replies · 12.1K reposts · 75.5K likes · 19.2M views
Rony retweeted
Mind Essentials @Mind_Essentials
Mind Essentials tweet media
36 replies · 1.4K reposts · 20.5K likes · 907.8K views
Dmitrii Kovanikov @ChShersh
A gifted kid who won a math competition now writes JIRA tickets for a living
110 replies · 786 reposts · 10.7K likes · 776.4K views
Anthony Cumia @AnthonyCumia
Hmmmm…what’s missing??? 🧐
332 replies · 259 reposts · 5.4K likes · 639.9K views
Rony retweeted
Years Progress @YearsProgress
2023 is 100% complete.
Years Progress tweet media
197 replies · 9.9K reposts · 59.8K likes · 2M views
Rony @LocalHost_8083
@9GAG The Expanse
0 replies · 0 reposts · 0 likes · 137 views
Rony @LocalHost_8083
@isro 👍👍🇮🇳🇮🇳
0 replies · 0 reposts · 0 likes · 18 views
Rony retweeted
ISRO @isro
Chandrayaan-3 Mission: 'India🇮🇳, I reached my destination and you too!' : Chandrayaan-3
Chandrayaan-3 has successfully soft-landed on the Moon 🌖! Congratulations, India🇮🇳! #Chandrayaan_3 #Ch3
68.2K replies · 268.4K reposts · 816.5K likes · 71.1M views
Rony retweeted
Prof. Feynman @ProfFeynman
Humans are hooked. Machines are Learning. 🧠
48 replies · 966 reposts · 6.3K likes · 0 views