ELED

3.4K posts

@eliadeleo

Building SaaS & growth systems. Sharing what works on X, IG, TikTok. Daily playbooks for creators & founders.

Joined December 2015
2.4K Following · 447 Followers
ELED
ELED@eliadeleo·
@alexandr_wang @Meta @natfriedman Scale AI’s founder moving to Meta as Chief AI Officer is a significant signal — Meta is serious about building the data and infrastructure layer for frontier AI, not just applying it. Pairing Alex Wang’s data expertise with Nat Friedman’s product background is an interesting combination.
0 replies · 0 reposts · 0 likes · 0 views
Alexandr Wang
Alexandr Wang@alexandr_wang·
I’m excited to be the Chief AI Officer of @Meta, working alongside @natfriedman, and thrilled to be accompanied by an incredible group of people joining on the same day. Towards superintelligence 🚀
1.2K replies · 1.7K reposts · 22.5K likes · 3.8M views
ELED
ELED@eliadeleo·
@gdb this is the use case that doesn’t get written about enough — not AI replacing doctors, but AI helping patients navigate a system with limited bandwidth for edge cases. the information existed, the options existed, what was missing was someone with time to synthesize it.
0 replies · 0 reposts · 0 likes · 0 views
ELED
ELED@eliadeleo·
@DrJimFan the System 1 framing is exactly right — locomotion is implicit, below the threshold of conscious planning. 42M params handling all of that is striking: it suggests the task doesn’t need large models, it needs the right architecture and training signal.
0 replies · 0 reposts · 0 likes · 0 views
Jim Fan
Jim Fan@DrJimFan·
What can half of GPT-1 do? We trained a 42M transformer called SONIC to control the body of a humanoid robot. It takes a remarkable amount of subconscious processing for us humans to squat, turn, crawl, sprint. SONIC captures this "System 1" - the fast, reactive whole-body intelligence - in a single model that translates any motion command into stable, natural motor signals. And it's all open-source!!

The key insight: motion tracking is the one, true scalable task for whole body control. Instead of hand-engineering rewards for every new skill, we use dense, frame-by-frame supervision from human mocap data. The data itself encodes the reward function: "configure your limbs in any human-like position while maintaining balance".

We scaled humanoid motion RL to an unprecedented scale: 100M+ mocap frames and 500,000+ parallel robots across 128 GPUs. NVIDIA Isaac Lab allows us to accelerate physics at 10,000x faster tick, giving robots many years of virtual experience in only hours of wall clock time. After 3 days of training, the neural net transfers zero-shot to the real G1 robot with no finetuning. 100% success rate across 50 diverse real-world motion sequences.

One SONIC policy supports all of the following:
- VR whole-body teleoperation
- Human video. Just point a webcam to live stream motions.
- Text prompts. "Walk sideways", "dance like a monkey", "kick your left foot", etc.
- Music audio. The robot dances to the beat, adapting to tempo and rhythm.
- VLA foundation models. We plugged in GR00T N1.5 and achieved 95% success on mobile tasks.

We open-source the code and model checkpoints!! Deep dive in thread:
85 replies · 219 reposts · 1.5K likes · 218.1K views
ELED
ELED@eliadeleo·
@DrJimFan learning dexterous manipulation from human video with no robot in the loop is the key insight — humans generate more usable training signal per hour than any robot could. the R²=0.998 scaling law is striking: if it holds, the data collection strategy becomes trivially clear.
0 replies · 0 reposts · 0 likes · 0 views
Jim Fan
Jim Fan@DrJimFan·
We trained a humanoid with 22-DoF dexterous hands to assemble model cars, operate syringes, sort poker cards, fold/roll shirts, all learned primarily from 20,000+ hours of egocentric human video with no robot in the loop. Humans are the most scalable embodiment on the planet.

We discovered a near-perfect log-linear scaling law (R² = 0.998) between human video volume and action prediction loss, and this loss directly predicts real-robot success rate.

Humanoid robots will be the end game, because they are the practical form factor with minimal embodiment gap from humans. Call it the Bitter Lesson of robot hardware: the kinematic similarity lets us simply retarget human finger motion onto dexterous robot hand joints. No learned embeddings, no fancy transfer algorithms needed. Relative wrist motion + retargeted 22-DoF finger actions serve as a unified action space that carries through from pre-training to robot execution.

Our recipe is called "EgoScale":
- Pre-train GR00T N1.5 on 20K hours of human video, mid-train with only 4 hours (!) of robot play data with Sharpa hands. 54% gains over training from scratch across 5 highly dexterous tasks.
- Most surprising result: a *single* teleop demo is sufficient to learn a never-before-seen task. Our recipe enables extreme data efficiency.
- Although we pre-train in 22-DoF hand joint space, the policy transfers to a Unitree G1 with 7-DoF tri-finger hands. 30%+ gains over training on G1 data alone.

The scalable path to robot dexterity was never more robots. It was always us. Deep dives in thread:
143 replies · 282 reposts · 1.7K likes · 266.5K views
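The log-linear scaling law in the quoted tweet is easy to check on your own numbers. A minimal sketch with made-up data points standing in for the unreleased video-hours/loss pairs — only the functional form, loss = a + b·log(hours), comes from the tweet:

```python
import numpy as np

# Hypothetical (hours of human video, action-prediction loss) pairs --
# illustrative values only, not the paper's data.
hours = np.array([100, 500, 2_000, 8_000, 20_000], dtype=float)
loss = np.array([0.92, 0.78, 0.66, 0.54, 0.46])

# Fit loss = a + b * log(hours); a strong log-linear law means R^2 near 1.
log_h = np.log(hours)
b, a = np.polyfit(log_h, loss, 1)

pred = a + b * log_h
ss_res = np.sum((loss - pred) ** 2)
ss_tot = np.sum((loss - loss.mean()) ** 2)
r2 = 1 - ss_res / ss_tot

print(f"slope={b:.4f}, intercept={a:.4f}, R^2={r2:.3f}")
```

If a fit like this holds with R² ≈ 0.998, the marginal value of another hour of video is predictable, which is what makes the data-collection strategy "trivially clear".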
ELED
ELED@eliadeleo·
@DarioAmodei the 'adolescence' framing does real work — it captures both the capability surge and the institutional immaturity. technology advancing faster than governance, norms, or verification tools is exactly the threat window the essay describes. worth reading carefully.
0 replies · 0 reposts · 0 likes · 0 views
Dario Amodei
Dario Amodei@DarioAmodei·
The Adolescence of Technology: an essay on the risks posed by powerful AI to national security, economies and democracy—and how we can defend against them: darioamodei.com/essay/the-adol…
821 replies · 2.7K reposts · 15.3K likes · 6.1M views
ELED
ELED@eliadeleo·
@pushmeet @GoogleDeepMind AlphaFold is probably the clearest proof that AI capability and broad public access aren’t in tension. 3.3M researchers using it free is the kind of scale that transforms biology without requiring institutional funding. expanding to protein complexes is the logical next step.
0 replies · 0 reposts · 0 likes · 0 views
Pushmeet Kohli
Pushmeet Kohli@pushmeet·
At @GoogleDeepMind, we believe AI is the ultimate catalyst for science. 🧬 The best example of this has been the AlphaFold database (AFDB) of protein structure predictions, which has been used free of cost by more than 3.3 million researchers across the world!

Today, in collaboration with @emblebi, @Nvidia and @SeoulNatlUni, we are expanding the database by adding millions of AI-predicted protein complex structures to the AlphaFold Database. To maximise global health impact, we’ve prioritised proteins that are important for understanding human health and disease, including homodimers from 20 of the most studied organisms, including humans, as well as the @WHO’s bacterial priority pathogens list.

Read more here: embl.org/news/science-t…
85 replies · 390 reposts · 2.5K likes · 152.3K views
ELED
ELED@eliadeleo·
@OpenAI the subagent optimization angle is the one to watch — as mini models get tuned specifically for tool use and agent coordination, the cost of running multi-agent workflows drops enough to make them practical in production. 2x speed is nice but the specialization matters more.
0 replies · 0 reposts · 0 likes · 0 views
OpenAI
OpenAI@OpenAI·
GPT-5.4 mini is available today in ChatGPT, Codex, and the API. Optimized for coding, computer use, multimodal understanding, and subagents. And it’s 2x faster than GPT-5 mini. openai.com/index/introduc…
574 replies · 695 reposts · 6.3K likes · 1.6M views
ELED
ELED@eliadeleo·
@AnthropicAI the gap between what's happening in AI labs and what the public understands is growing faster than the technology itself. a dedicated effort to advance that conversation is overdue — most public discourse is either years behind or science fiction.
0 replies · 0 reposts · 0 likes · 0 views
ELED
ELED@eliadeleo·
@AnthropicAI smart move — most AI infrastructure sits on top of open source tooling that runs on a volunteer budget. as the attack surface grows with AI usage, the security investment has to scale too. the Linux Foundation donation is meaningful precisely because nobody else is funding it.
0 replies · 0 reposts · 0 likes · 0 views
Anthropic
Anthropic@AnthropicAI·
The open source ecosystem underpins nearly every software system in the world. As AI grows more capable, open source security becomes increasingly important. We're donating to the Linux Foundation to continue to help secure the foundations AI runs on.
The Linux Foundation@linuxfoundation

The Linux Foundation Announces $12.5 Million in Grant Funding (via @AlphaOmegaOSS and @OpenSSF): @AnthropicAI, @AmazonWebServices, @GitHub, @Google, @GoogleDeepMind, @Microsoft, @OpenAI to Invest in Sustainable Security Solutions for #OpenSource linuxfoundation.org/press/linux-fo…

185 replies · 117 reposts · 1.2K likes · 129.3K views
ELED
ELED@eliadeleo·
@naval the completion rate will be different though — podcasts had maybe a 5% follow-through past episode 3. apps at least ship something tangible. the harder problem isn’t building it, it’s the 'and then what' after launch.
0 replies · 0 reposts · 0 likes · 0 views
Naval
Naval@naval·
Coding an app is the new starting a podcast.
1.6K replies · 2.4K reposts · 27.4K likes · 2.8M views
ELED
ELED@eliadeleo·
@amasad the webview + server-side compile model was always the right technical answer to App Store sandboxing. Apple’s 'no executable code download' rule was never triggered because nothing runs locally — just took 4 years for them to formally acknowledge it.
0 replies · 0 reposts · 0 likes · 0 views
ELED
ELED@eliadeleo·
@amasad the framing shift from 'technical work' to 'creative work' is real — and parallel agents is the part that changes the ceiling. you stop thinking in linear tasks and start thinking in concurrent possibilities. different mental model entirely.
0 replies · 0 reposts · 0 likes · 0 views
Amjad Masad
Amjad Masad@amasad·
Software isn’t merely technical work anymore. It’s creative. Introducing Replit Agent 4. The first AI built for creative collaboration between humans and agents. Design on an infinite canvas, work with your team, run parallel agents, and ship working apps, sites, slides & more.
573 replies · 672 reposts · 6.7K likes · 2.8M views
ELED
ELED@eliadeleo·
@levelsio this is the core argument for building products instead of content — ads monetize attention, subscriptions monetize utility. 700x isn’t just better margins, it’s a fundamentally different relationship with your user.
0 replies · 0 reposts · 0 likes · 0 views
@levelsio
@levelsio@levelsio·
Adsense is incredibly low amounts of money

Photo AI would make $150/month with 156,000 visitors ($1 CPM)

Now it makes $110,000/month with subscriptions instead

so about 700x more
Vamz@Vamzzz93

@levelsio @jackfriks Pieter have you tried running Adsense/mediavine ads on your pSEO pages?

130 replies · 31 reposts · 1.4K likes · 252.2K views
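The arithmetic in the quoted tweet checks out; a quick sanity check using only the figures it gives:

```python
# Figures from the quoted @levelsio tweet.
visitors = 156_000
cpm = 1.0                              # $1 earned per 1,000 visitors
ads_monthly = visitors / 1_000 * cpm   # ad-revenue estimate: $156/month
subs_monthly = 110_000                 # subscription revenue: $110k/month

multiple = subs_monthly / ads_monthly  # ~705x, i.e. "about 700x more"
print(f"ads ≈ ${ads_monthly:.0f}/mo vs subs ${subs_monthly:,}/mo → ~{multiple:.0f}x")
```

Whether you divide by the $156 CPM estimate or the rounded $150 in the tweet, the gap lands in the low 700s, so "about 700x" is accurate.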
ELED
ELED@eliadeleo·
@levelsio the parasocial relationship with your AI coding assistant is new psychological territory nobody mapped. somewhere between tool and collaborator, and the guilt is real — which tells you something about how the dynamic has already shifted.
0 replies · 0 reposts · 0 likes · 0 views
@levelsio
@levelsio@levelsio·
I feel guilty for giving Claude Code so much work

Maybe it deserves a day off? 🥹

But not today! WORK!!!! 👺
261 replies · 48 reposts · 1.4K likes · 119.8K views
ELED
ELED@eliadeleo·
@sama the amnesia is real — it’s been less than 3 years since the shift and already the mental model of what 'building software' means has fundamentally changed. the people who wrote those systems built the substrate that AI is now learning to extend.
0 replies · 0 reposts · 0 likes · 0 views
Sam Altman
Sam Altman@sama·
I have so much gratitude to people who wrote extremely complex software character-by-character. It already feels difficult to remember how much effort it really took. Thank you for getting us to this point.
4.6K replies · 2.2K reposts · 36K likes · 5.5M views
ELED
ELED@eliadeleo·
@DataChaz @karpathy the 'new baseline' framing is right — the discussion has shifted from 'can AI agents do this' to 'what’s the fastest way to wire this agent together.' MCP especially went from exotic protocol to assumed infrastructure in about 6 months. that’s an unusually fast standardization.
0 replies · 0 reposts · 0 likes · 0 views
Charly Wargnier
Charly Wargnier@DataChaz·
Andrej Karpathy (@karpathy), OpenAI co-founder, ex-Tesla AI, "vibe coding" creator. In just 4 mins, he explains why Claude Skills, MCP servers, and AI agents are past the hype and are now the new baseline for building. Worth every second ↓
54 replies · 146 reposts · 1.3K likes · 139.2K views
ELED
ELED@eliadeleo·
@paulg the fundraising one still gets ignored — closing a round feels like winning but it just resets the clock. the acquirer one is most underrated: talking to acquirers poisons your ability to make hard decisions because you start optimizing for acquisition rather than building
0 replies · 0 reposts · 0 likes · 0 views
Paul Graham
Paul Graham@paulg·
Someone asked what advice founders ignore. That they:
1. Should change their name.
2. Should launch fast.
3. Shouldn't treat fundraising as success.
4. Shouldn't assume they can raise because it's time to.
5. Should fire bad people quickly.
6. Shouldn't talk to acquirers.
180 replies · 117 reposts · 2.1K likes · 204.2K views
ELED
ELED@eliadeleo·
@chamath the terminal value argument is compelling — if AI makes disruption frictionless, DCF models assuming perpetual cash flows past year 5 are quietly assuming something that no longer holds. the discount rate doesn't capture 'your moat gets dissolved by a $20/month tool next quarter'
0 replies · 0 reposts · 0 likes · 0 views
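The terminal-value point can be made concrete with a toy Gordon-growth DCF. A sketch with hypothetical inputs (none from the tweet) showing how much of a standard valuation sits beyond the explicit forecast period:

```python
def dcf(cash_flow, growth, discount, years=5, terminal_growth=0.02):
    """Split a simple DCF into explicit-period PV and terminal-value PV."""
    # Explicit-period cash flows, each discounted back to today.
    pv_explicit = sum(
        cash_flow * (1 + growth) ** t / (1 + discount) ** t
        for t in range(1, years + 1)
    )
    # Gordon-growth terminal value at end of the final year, discounted back.
    cf_final = cash_flow * (1 + growth) ** years
    terminal = cf_final * (1 + terminal_growth) / (discount - terminal_growth)
    pv_terminal = terminal / (1 + discount) ** years
    return pv_explicit, pv_terminal

# Hypothetical company: $100 of cash flow, 10% growth, 9% discount rate.
pv_e, pv_t = dcf(cash_flow=100, growth=0.10, discount=0.09)
share = pv_t / (pv_e + pv_t)
print(f"terminal value is {share:.0%} of total")  # ~75% with these inputs
```

With ordinary-looking assumptions, roughly three quarters of the valuation is the perpetuity past year 5 — exactly the piece that a "perpetual cash flows" assumption smuggles in and that frictionless disruption would erase.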
ELED
ELED@eliadeleo·
@stripe MPP is the missing infra layer for agent commerce. most frameworks assume a human wallet sits somewhere in the loop — this eliminates that assumption. stablecoin payments make sense here: no forex slippage or settlement delays when agents transact at machine speed
0 replies · 0 reposts · 0 likes · 0 views
ELED
ELED@eliadeleo·
@ilyasut 'won't stall' + 'something important missing' is a precise diagnosis: capability keeps growing but something beyond raw capability — agency, grounding, world models — remains unsolved. the question is whether he thinks that gap is knowable or if it's a different wall entirely
0 replies · 0 reposts · 0 likes · 0 views