Nick
@roboflock
2.2K posts

techno-optimist 🚀 | INTJ | all-in TSLA/X.AI/Neuralink | future shepherd of robots 🐑 | Get your Tesla with 500€ off: https://t.co/MFw7eZblE0

Locked in 👨‍💻 · Joined October 2011
267 Following · 327 Followers
Nick retweeted
Franz von Holzhausen @woodhaus2 ·
Happy 10-year anniversary, Model 3!
321 replies · 626 reposts · 8K likes · 301.9K views
Nick @roboflock ·
the struggle is real
[image attached]
0 replies · 0 reposts · 1 like · 3 views
Nick @roboflock ·
🤔
[image attached]
0 replies · 0 reposts · 0 likes · 8 views
Nick @roboflock ·
@NicolasReid great to hear! so far my top 3 improvements vs 2020 after a 2 hr drive:
- build quality
- interior noise
- suspension
0 replies · 0 reposts · 1 like · 12 views
Nick @roboflock ·
On my way to pick up the 2026 M3 LR AWD. Same model, same configuration as my current one, exactly 6 years and 160,000 km later.
HW3 -> HW4 - just in time for EU FSD this summer
Intel Atom -> AMD Ryzen - Grok ftw
Built in Fremont -> Built in Shanghai
480 km -> 660 km WLTP
Bye Attitude Adjuster, hello Gunboat Diplomat!
[image attached]
1 reply · 0 reposts · 3 likes · 111 views
Nick @roboflock ·
@ArbitraryX16542 …unless you want to turn it into a cash-generating asset
1 reply · 0 reposts · 0 likes · 10 views
ArbitraryX @ArbitraryX16542 ·
If you still want a personal car after robotaxis drop to pennies per mile, you're not a driver - you're a collector of obsolete metal. Tesla's autonomy isn't an upgrade, it's a societal reset button.
1 reply · 0 reposts · 0 likes · 7 views
Nick retweeted
Cern Basher @CernBasher ·
Compute in Space: Automating Everything

At first glance, a million compute satellites delivering 100 gigawatts of power sounds massive. As @pbeisel points out, with each satellite producing ~4.8 million queries per day, a million satellites yield roughly 4.8 trillion queries daily. Phil then says that's about 600 queries per day for every person on Earth.

Phil makes me think, so let's extend this idea...

This isn't about humans asking more and more questions. It's about eliminating the concept of a "question" altogether. At this scale, AI stops being something you interact with occasionally and becomes something that runs continuously in the background of your life. Your camera feed is processed in real time. Your financial decisions are constantly optimized. Your communications are filtered, rewritten, and enhanced as they happen. Your car, your software, your environment - all continuously interpreted and improved.

Today, AI is reactive. You ask. It responds. In a world powered by 100 GW of inference compute, AI becomes proactive. It watches. It predicts. It acts.

Viewed through this lens, it's clear that 100 GW of compute in space is not nearly enough. This is why Elon thinks in terawatts, not gigawatts.

The most important shift is economic. When compute reaches this scale, the marginal cost of intelligence collapses. Running one agent or one thousand agents becomes nearly indistinguishable. Processing one decision or millions of decisions becomes trivial. Cognition, once scarce and expensive, becomes abundant and effectively free.

And when "thinking" becomes cheap, everything that can be automated will be automated. Not just tasks - entire systems. Supply chains coordinate themselves. Software writes and maintains itself. Businesses operate through autonomous agents negotiating and executing in real time. The world of business, and hopefully government, becomes faster and much more efficient.
Crucially, most of this compute will not be used by humans. It will be used by machines. The majority of those trillions of daily "queries" will be AI systems talking to other AI systems - coordinating, optimizing, and transacting continuously.

Another thought: because this compute is in orbit, it is inherently distributed. It is not constrained by a single geography, not limited by terrestrial energy bottlenecks, not dependent on centralized data centers. It becomes a globally available, always-on layer.

Which leads to the deeper realization: this is not just more compute. It is a new layer of the planet. A layer where intelligence is always present, always active, and always accessible.

Electricity transformed the world by making energy ubiquitous. This does the same for intelligence.

The end state is not "better AI tools." The end state is a world where intelligence is no longer something you seek out but something that surrounds you.

Thanks Phil!
Quoting phil beisel @pbeisel:

SpaceX Compute in Space

What exactly does a grid of 1 million compute satellites buy you?

Each "AI Sat Mini" is expected to deliver 100 kW of compute, roughly equivalent to 400 AI5 processors running at 250 W each*. Scale that to 1 million satellites and you get 100 GW of total compute power. For context, today the xAI Colossus cluster is on the order of ~2 GW - and it's already massive. This system would be ~50× larger.

Importantly, this would primarily be inference compute, not a tightly coupled training cluster. In other words, this capacity is dedicated to serving real-world requests, not model training. Today, those requests are mostly:
- AI assistant queries
- Image generation
- Video generation

Let's focus on LLM inference (text queries):
- Each satellite produces 100 kW × 24 hours = 2,400 kWh per day. That's roughly equivalent to ~30 fully charged Tesla vehicles (assuming ~80 kWh per vehicle).
- If we assume 0.5 Wh per query (reasonable for detailed or technical prompts), each satellite can handle ~4.8 million queries per day.
- Across a 1-million-satellite constellation: ~4.8 trillion queries per day. (That's ~600 such queries per day for every person on Earth.)

Of course, inference cost varies widely. Video generation or complex multi-agent workloads - like those seen in frontier systems such as Grok 4.20 - consume significantly more compute. But even with that variability, the scale here is the point.

*SpaceX will use a new chip from the Tesla/SpaceX Terafab collaboration, called the "D2," for space-based compute. There are no specs available yet. AI5 is for Robotaxi (FSD) and Optimus.

9 replies · 23 reposts · 181 likes · 11.8K views
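Phil's arithmetic can be sanity-checked in a few lines. A minimal sketch, taking the thread's own assumptions at face value (100 kW per satellite, 0.5 Wh per text query, a 1-million-satellite constellation, ~8 billion people, ~80 kWh per Tesla pack):

```python
# Sanity check of the per-satellite inference arithmetic quoted above.
# All inputs are the thread's assumptions, not independent estimates.
SAT_POWER_KW = 100            # compute power per "AI Sat Mini"
WH_PER_QUERY = 0.5            # assumed cost of a detailed text query
NUM_SATELLITES = 1_000_000
WORLD_POPULATION = 8e9        # rough world population
EV_PACK_KWH = 80              # ~one Tesla battery pack

kwh_per_sat_day = SAT_POWER_KW * 24                        # energy per satellite per day
ev_charges = kwh_per_sat_day / EV_PACK_KWH                 # equivalent full EV charges
queries_per_sat_day = kwh_per_sat_day * 1_000 / WH_PER_QUERY
total_queries_day = queries_per_sat_day * NUM_SATELLITES
queries_per_person = total_queries_day / WORLD_POPULATION

print(f"{kwh_per_sat_day} kWh/day per satellite (~{ev_charges:.0f} EV charges)")
print(f"{queries_per_sat_day:.1e} queries/satellite/day")
print(f"{total_queries_day:.1e} queries/day across the constellation")
print(f"~{queries_per_person:.0f} queries/person/day")
```

This reproduces the thread's chain exactly: 2,400 kWh/day per satellite (~30 EV charges), 4.8 million queries per satellite, 4.8 trillion per day overall, ~600 per person.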
Nick retweeted
Sawyer Merritt @SawyerMerritt ·
Elon Musk: "This chart explains why we need to build the TERAFAB."
[image attached]
251 replies · 789 reposts · 4.9K likes · 1.1M views
Nick retweeted
SpaceX @SpaceX ·
Electromagnetic mass drivers on the Moon
460 replies · 2K reposts · 12.2K likes · 1.2M views
Nick retweeted
AleXandra Merz 🇺🇲 @TeslaBoomerMama ·
Frame this, you'll see it again and again
[image attached]
62 replies · 195 reposts · 1.5K likes · 35.6K views
Nick @roboflock ·
🚀

Quoting Shanaka Anslem Perera ⚡ @shanaka86:
Everyone is covering Terafab as a chip factory. It is not a chip factory.

Last night in Austin, Elon unveiled a facility that makes masks, fabricates chips, and tests them inside a single building with a nine-month recursive improvement cadence. No such loop exists anywhere else on Earth. Then he told you 80% of the output goes to space. Then he showed you a 100-kilowatt AI satellite with solar panels and radiators, scaling to megawatt range. Then he said Optimus plus photovoltaics will be the first von Neumann probe, a machine capable of replicating itself from raw materials found in space.

Nobody connected the sequence. Terafab produces 1 terawatt per year of compute. The entire United States consumes 0.5 terawatts of electricity. Musk is building a single factory whose output in AI silicon exceeds twice the power consumption of the country it sits in. And he is sending 80% of it off-planet because Earth literally cannot power what he is building.

Follow the mechanism. Terafab seeds the chips. Starship launches Optimus robots and solar arrays at 100 million tons per year. The robots mine lunar and asteroid regolith for silicon, iron, and nickel. They 3D-print more robots. They fabricate more solar panels. They assemble more AI satellites. Each satellite runs hotter-burning D3 chips designed specifically for vacuum, where free radiative cooling eliminates the thermal constraints that strangle every terrestrial data center on the planet. The nodes replicate. The replication is exponential.

This is a Dyson swarm bootstrap hidden inside a semiconductor announcement. The math is public. The Sun outputs 3.828 × 10^26 watts. A 2022 paper in Physica Scripta calculated that 5.5 billion satellites at 290 kilograms each, robotically manufactured from Mars resources, capture enough solar energy to meet all of Earth's power needs within 50 years.

A 2025 paper in Solar Energy Materials calculated that a partial swarm capturing 4% of solar output yields 15.6 yottawatts, roughly a billion times current human civilization's total energy budget. Musk just announced the factory that builds the chips that go inside the satellites that replicate themselves forever.

92% of advanced logic chips are fabricated in Taiwan. One factory in Austin does not fix that. But one self-replicating system seeded by that factory, launched by the only company with reusable heavy-lift rockets, assembled by the only humanoid robot in mass production, and powered by the only star within reach, does not fix a supply chain. It obsoletes the concept of supply chains entirely.

The market priced this as a $20 billion capex story about semiconductor independence. The actual announcement was the engineering blueprint for Kardashev Type II. Humanity sits at 0.73 on the Kardashev scale. 18 terawatts. The distance between here and harnessing a star is not a technology gap. It is a recursion gap. And recursion is exactly what a single building in Austin that makes its own masks, builds its own chips, tests its own chips, and launches the output into orbit on its own rockets was designed to close.

Every civilization that makes it past this point never looks back.

0 replies · 0 reposts · 0 likes · 19 views
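The quoted thread's Kardashev numbers can be checked with Sagan's interpolation formula, K = (log10(P_watts) - 6) / 10. A quick sketch using the thread's own inputs (18 TW for humanity, a swarm capturing 4% of the Sun's 3.828 × 10^26 W output):

```python
import math

# Check of the quoted Kardashev/Dyson-swarm figures using Sagan's
# interpolation K = (log10(P_watts) - 6) / 10.
# Inputs are the thread's numbers, not independent estimates.
SOLAR_LUMINOSITY_W = 3.828e26   # nominal solar luminosity
HUMANITY_POWER_W = 18e12        # ~18 TW, per the thread

def kardashev(power_w: float) -> float:
    """Sagan's continuous Kardashev rating for a given power level."""
    return (math.log10(power_w) - 6) / 10

k_now = kardashev(HUMANITY_POWER_W)           # thread quotes 0.73
partial_swarm_w = 0.04 * SOLAR_LUMINOSITY_W   # 4% solar capture
partial_swarm_yw = partial_swarm_w / 1e24     # in yottawatts (1 YW = 1e24 W)

print(f"K today: {k_now:.2f}")
print(f"4% swarm: {partial_swarm_yw:.1f} YW")
print(f"vs today's 18 TW: {partial_swarm_w / HUMANITY_POWER_W:.1e}x")
```

The 0.73 checks out. Note two small discrepancies: 4% of solar output works out to ~15.3 YW rather than the quoted 15.6, and the multiple over today's 18 TW is closer to a trillion than "a billion"; either way, the orders of magnitude are the point.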
Nick retweeted
Elon Musk @elonmusk ·
Optimus+PV will be the first Von Neumann probe, a machine fully capable of replicating itself using raw materials found in space
5.9K replies · 5.6K reposts · 52.5K likes · 49.6M views