
Jerry Scott
@Jerry_Y_Scott
Learning to live in the moment | Cross training | Traveling the country & the world | Serving | Reading | Helping Parents | Retired!






My road trip down to Austin, TX on FSD V14 is officially complete! 1,665 miles driven on FSD V14 without a single intervention. My car drove me across the country all by itself. Tesla AI is absolutely insane. 🔥


Robots will bring billionaire living to a lot more people. I had the blessing to eat with @guysavoy several times. One of the best chefs in the world. He and other top chefs taught me about the importance of getting fresh ingredients. Here is how robots and World Models will bring that. And what do I mean by “everything as a service”?

In three years I will have this conversation with my @1x_tech Neo humanoid robot: “Hey Neo, I want to upgrade our food to billionaire level.” “I can do that. Food as a service costs $500 a month. I will buy only hand-grown, fresh, organic food and I will prepare amazing meals for you and your family.”

Where is the supply chain for such food? Farmers' markets, where everything is fresh and organic. You gotta stop buying at grocery stores to upgrade your diet. “Hey Neo, here are the keys to the @tesla Robotaxi. And here is my credit card. Start up food as a service.” Neo will take an autonomous car to the market. “But Neo, how do you know where to go?” “Well, a guy on X did a video of the farmers' market nearby. I watched it, and now know roughly the kinds of things I can get there.” We are too late to start today, the market is closed now, but we can start next week.

Look at this video the way Grok does. I am playing humanoid today. In one visit my Neo will ingest all of this into its World Model. In the second visit it will get even better. In the third visit, better still. World models are going to be real time by the end of next year, from a variety of companies.

@tesla @robotaxi already serves both our home and the market. Our Tesla drove us there and already knows where it is. Grok is already a world model. In a few minutes it can tell you what it learned by watching this video. It watches all my videos before distributing them to you, so it knows how not to overwhelm @jason's feed with my prolific posting. It will get a lot better soon.
But after three trips to this farmers' market my robot will know everything about this market, including the names of the farmers. Watch this video; you meet one. Grok can do a RAG search and learn everything about him, including that he doesn't have a website and only posts on Facebook. Also that he takes Apple Pay. It already knows everything it sees: the names of the vegetables, fruits, nuts, and what ravioli is. One vendor sells fresh ravioli made early this morning.

If you are freaked out by privacy, have your Neo stay in the garage until it is time to do something for you. In three years I will be eating fresh food with my brother-in-law while football is on the TV. If you don't have a robot you won't eat as well, unless you are a billionaire who can afford to pay a human to shop and cook for you.

The Robotaxi network starts up next year (without humans). The world models get good next year. By 2030 every one of you will have a robot in your home, at least part time.

Who has the best world model? Tesla. Who understands the real world better? Grok. (I didn't give this video to anyone else.) Who soon will have the best humanoid? Tesla. Which company already has a Robotaxi in my driveway? Tesla. Which company has the best video ingestion engine? Tesla. Which company is about to turn on a real-time world model? @xAI. Which company would you want to invest in? Tesla and xAI. Which is why, if you are a Tesla investor and you didn't vote for Tesla to invest in xAI, you are hurting yourself.

Everything as a service is about to arrive. Everyone who can afford a $20,000 robot, which can be financed, will have one next year. I will. Anyone worried about privacy has no idea how useful this all will be to make your lives better, and how much money it will make for a robot company that puts it all together. And only Tesla has all the pieces to make the meal.

🚨 TESLA UNVEILS 3D “IRON MAN” DIAGNOSTICS FOR CYBERTRUCK Tesla's new interactive 3D wiring diagram in Service Mode is blowing minds across the internet, letting owners spin, zoom, and tap through their Cybertruck's electrical system right on the touchscreen. Users can click connectors to see pinouts, wire colors, and signal types in real time, turning hours of diagnostics into minutes. It's all part of Service Mode Plus, available on the Cybertruck and upcoming Hardware 4 models like the refreshed Model Y. Tesla just made traditional car repair tools look like ancient history. Source: @TESLA_winston, @Tesla


🚨TESLA'S FIRST PRINCIPLES: HOW ELON REBUILT THE CAR FROM SCRATCH Tesla didn't improve the internal combustion engine. Elon asked a fundamental question: what if we started over? Optimus embodies this philosophy. It's not a humanoid robot grafted onto existing robotics frameworks. It's atomic-level reconstruction. Strip away assumptions. Rebuild from physics. Your Tesla knows your dinner reservation. It navigates automatically. Seamless. Why? Because Elon integrated OpenTable at the foundation, not as an afterthought. The car and your calendar aren't separate systems bolted together. They're conceived as one. First principles isn't philosophy. It's engineering. Boil everything down to fundamentals. Question every inherited assumption. Rebuild better. That's why Tesla is competing with century-old auto manufacturers and winning. Not through incremental improvements. Through demolition and reconstruction. Optimus proves it works. So does your car finding your restaurant. Source: @Tesla

Just talked a normie investor off the Elon ledge with this response: No, I do not think he's wrong. 1. He hasn't received remuneration from Tesla since 2018, due to the previous pay package's invalidation by the Delaware Court of Chancery. 2. He has stated numerous times that it is not the money he is after, but the voting shares/ratio. To paraphrase him, he does not want to manufacture a robot army only to lose control of the company to an unhinged faction via a shareholder vote. 3. If he leaves, the best minds at Tesla will leave with him.


A new 30-minute presentation from @aelluswamy, Tesla's VP of AI, has been released, where he talks about FSD, AI, and the team's latest progress. Highlights from the presentation:

• Tesla's vehicle fleet can provide 500 years of driving data every single day.

Curse of Dimensionality:
• 8 cameras at a high frame rate = billions of tokens per 30 seconds of driving context.
• Tesla must compress and extract the right correlations between sensory input and control actions.

Data Advantage:
• Tesla has access to a “Niagara Falls of data”: hundreds of years' worth of collective fleet driving.
• Uses smart data triggers to capture rare corner cases (e.g., complex intersections, unpredictable behavior).

Quality and Efficiency:
• Extracts only the essential data needed to train models efficiently.

Debugging and Interpretability:
• Even though the system is end-to-end, Tesla can still prompt the model to output interpretable data: 3D occupancy, road boundaries, objects, signs, traffic lights, etc.
• Natural language querying: ask the model why it made a certain decision.
• These auxiliary predictions don't drive the car but help engineers debug and ensure safety.

Tesla's Advanced Gaussian Splatting (3D Scene Modeling):
• Tesla developed a custom, ultra-fast Gaussian splatting system to reconstruct 3D scenes from limited camera views.
• Produces crisp, accurate 3D renderings even from few camera angles, far better than standard NeRF/splatting approaches.
• Enables rapid visual debugging of the driving environment in 3D.

Evaluation & World Models:
• Evaluation is the hardest challenge: models may perform well offline but fail in real-world conditions.
• Tesla builds balanced, diverse evaluation datasets focusing on edge cases, not just easy highway driving.
• Introduced a learned world simulator (a neural-network-generated video engine):
  • Can simulate all 8 Tesla camera feeds simultaneously, fully synthetic.
  • Used for testing, training, and reinforcement learning.
  • Allows adversarial event injection (e.g., adding a pedestrian or a vehicle cutting in).
  • Enables replaying past failures to verify new model improvements.
  • Can run in near real time, letting testers “drive” inside a simulated world.

What's Next:
• Scale the robotaxi service globally.
• Unlock full autonomy across the entire Tesla fleet.
• Cybercab: a next-gen 2-seat vehicle designed specifically for robotaxi use, targeting the lowest transportation cost (cheaper than public transit).
• The same neural networks will power the Optimus humanoid robot.
• The same video generation system is now being applied to Optimus.
• The system can simulate and plan movement for robots, adapting easily to new forms.

Via the International Conference on Computer Vision (ICCV). Full presentation: youtube.com/watch?v=wHK8GM…
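The “billions of tokens per 30 seconds” point under Curse of Dimensionality is easy to sanity-check with back-of-envelope arithmetic. A minimal sketch, assuming a camera count, frame rate, and per-camera resolution that are illustrative guesses, not Tesla's published specs:

```python
# Back-of-envelope: why 30 s of 8-camera driving context is billions of values.
# Frame rate and resolution below are illustrative assumptions only.
cameras = 8
fps = 36                    # assumed high frame rate
seconds = 30
width, height = 1280, 960   # assumed per-camera resolution

# Raw per-pixel values over the clip, before any tokenization/compression.
raw_values = cameras * fps * seconds * width * height
print(f"raw values in 30 s: {raw_values:,}")  # ~10.6 billion
```

At these assumed numbers the raw input is roughly 10.6 billion values for one 30-second clip, which is why the talk stresses compressing the sensory stream down to the correlations that actually matter for control.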













