“Nothing, nothing, nothing… then everything.”
@ctnzr VP, Applied Deep Learning Research @NVIDIA on how AlexNet changed AI:
“In 2011, neural networks were dismissed as failed ideas. Most researchers thought deep learning was a hack, and that real AI required more complex math.”
“Then AlexNet happened: image classification made a decade of progress in a single year by scaling models, data, and GPUs. It flipped the field overnight.”
“At NVIDIA, that moment turned early GPU experiments into cuDNN. Slow at first, then suddenly very fast.”
“That’s how every breakthrough starts.”
"OpenClaw is the most popular open source project in history of humanity" - Jensen (NVIDIA CEO)
But most people are using it wrong...
Here's everything I've learned from 10 billion tokens and 200+ hours of using OpenClaw every single day.
Watch this now:
0:00 Intro
0:32 Threaded Chats
3:17 Voice Memos
4:43 Agent-Native Hosting (Sponsor)
6:49 Model Routing
11:18 Subagents & Delegation
14:02 Prompt Optimizations
17:22 Cron Jobs
19:15 Security Best Practices
24:03 Logging & Debugging
25:43 Self Updating
26:28 API vs Subscription
27:52 Documentation/Backup
31:19 Testing
33:11 Building
“I rewrote my code in 30 minutes — and it ran 200× faster.”
@ctnzr VP, Applied Deep Learning Research @NVIDIA on the moment GPUs changed everything:
“NVIDIA showed up in our lab and said, ‘You should try CUDA.’ I plugged in a GPU, rewrote my SVM training code, and it ran 200× faster than my CPU version.”
“I thought, that’s it. This is dramatically easier and clearly the future of machine learning.”
“The vision was simple: accelerate the world’s most important computations by 10× or 100×, and use that to power AI.”
“The compute required for intelligence is essentially unbounded.”
“Scaling models isn’t enough. Memory is what actually makes AI better.”
@charlespacker CEO of @Letta_AI:
“Just scaling the transformer doesn’t make sense long term. Context windows will grow and get cheaper, but that’s not the real breakthrough.”
“The real shift is memory.”
“Over time, your coding agent should become fundamentally different from mine.”
“Just like humans, expertise comes from accumulated experience, not raw intelligence.”
“That’s something model releases alone won’t give you.”
Modern healthcare doesn’t prevent problems. It reacts to them.
@Dominic1King VP of Health @MicrosoftAI says:
“Medicine is very good at dealing with the end consequences of decisions we could have prevented earlier.”
“The average person sees a doctor a few times a year, with long gaps of no data in between.”
“Now we have continuous data from wearables. That makes it much easier to spot patterns early.”
The shift is from reacting to illness → predicting and preventing it.
“Right now, AI only thinks when you’re using it.”
@charlespacker CEO of @Letta_AI:
“That’s what makes systems like ChatGPT feel inhuman... they stop ‘thinking’ the moment you stop interacting.”
“Sleep-time compute is about changing that.”
“What should an agent do when you’re not using it? Humans don’t shut off between interactions; our brains are always working.”
“The challenge is making that time useful instead of generating garbage. If we solve that, it unlocks a massive new layer of compute and capability.”
“What happens when AI can edit its own memory?”
@charlespacker CEO @Letta_AI:
“If models get good enough at using computers, they could start treating memory like files, and editing it themselves.”
“But memory isn’t just storage. What happens if it lives on a device that fails?”
“Humans don’t think of memory that way; it persists across contexts.”
“That forces a deeper question: what is the ‘brain’ of an AI system?”
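The “treating memory like files, and editing it themselves” idea can be sketched in a few lines. This is a toy illustration, not Letta’s actual architecture; `FileMemory` and its methods are hypothetical names invented for the example.

```python
import json
from pathlib import Path


class FileMemory:
    """Toy sketch: agent memory stored as editable files on disk.

    A model with file-editing tools could read, rewrite, and persist
    its own memory blocks across sessions this way.
    (Hypothetical design, not any vendor's real API.)
    """

    def __init__(self, root: str) -> None:
        self.root = Path(root)
        self.root.mkdir(parents=True, exist_ok=True)

    def read(self, key: str) -> dict:
        # Each memory block is one JSON file named after its key.
        path = self.root / f"{key}.json"
        return json.loads(path.read_text()) if path.exists() else {}

    def edit(self, key: str, updates: dict) -> dict:
        # Merge model-proposed updates into the stored block and persist.
        memory = self.read(key)
        memory.update(updates)
        (self.root / f"{key}.json").write_text(json.dumps(memory, indent=2))
        return memory


# Example: the agent revises what it "knows" about a user between sessions.
mem = FileMemory("agent_memory")
mem.edit("user_profile", {"preferred_language": "Python"})
mem.edit("user_profile", {"expertise": "intermediate"})
```

Because the memory lives on disk rather than in a context window, it survives restarts — which is exactly why the quote asks what happens “if it lives on a device that fails.”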
“Slack’s biggest asset isn’t messaging. It’s context.”
Rob Seaman, GM of @SlackHQ:
“Channels naturally reflect a company’s priorities and top projects. They also reflect your individual priorities: what you’re focused on right now.”
“That creates a uniquely rich layer of context across both the company and the individual.”
“Slack was built more like a social network than traditional enterprise software. That’s why it’s well-positioned for AI.”
“The real challenge is identifying the most relevant context in the moment and feeding that into the model.”
“Medical AI can’t miss emergencies. But it also can’t send everyone to the ER.”
Dominic King, VP of Health @MicrosoftAI:
“Getting medical AI right is a delicate balance. If someone says ‘I have crushing chest pain and shortness of breath,’ that’s obvious. But symptoms can also be more ambiguous.”
“If you miss the emergency cases, that’s a huge problem.”
“But if you tell everyone to call 911 or go straight to the emergency department, an already stretched health system breaks.”
“Memory will become more valuable than the model itself.”
@charlespacker CEO of @Letta_AI:
“There will come a time when the memories of an AI system are more valuable than its model weights.”
“Model weights lose value every few months as new models are released.”
“But memories persist and compound.”
“In that world, the most valuable asset an AI company holds isn’t the model — it’s the memory.”
“And for individuals, your most valuable digital asset may be the memories your AI has formed about you.”
“The goal isn’t coding with AI. It’s delegating work to AI.”
@Amasad CEO of @Replit:
“You should be able to describe a problem or a product idea and have the software built as far as it can go.”
“With agents, the system can provision the development environment, install packages, set up databases, and deploy for you.”
“The real vision is working with an agent like a teammate. You hand it a task, and it works on it for hours.”
“That’s the shift from coding with an assistant to delegating work to an agent.”
“Most people fine-tune models for two reasons: cost and speed.”
@varunvummadi CEO of @GigaAI:
“Fine-tuning reduces cost, increases speed, and improves throughput.”
“Some industries like healthcare and finance also prefer it because they don’t want to rely on closed-source models.”
“But when we looked at the data, two use cases dominated: support and coding.”
“So we decided to focus on support and double down there.”
“Slack was always a search and AI tool disguised as a messaging tool.”
Rob Seaman, GM of @SlackHQ:
“That vision is finally coming to fruition. We now position Slack as the operating system for the agentic enterprise.”
“Operating systems hide the complexity of hardware and provide a human interface.”
“Slack did that for apps. Now we’re doing it for agents.”
“It becomes the place where agents are distributed and where people interact with them like colleagues.”
Missed the live show this week?
Major themes: agents, voice agents, more agents, and AI in healthcare.
Guest list:
• Rob Seaman, General Manager @SlackHQ
• @varunvummadi CEO & Co-Founder @GigaAI
• @charlespacker CEO & Co-Founder @Letta_AI
• @Dominic1King VP Health @MicrosoftAI
Full show in the comments. See you next week.
“Everyone should be a GPU programmer.”
@clattner_llvm's goal with @Modular:
“What Modular is doing is opening up the box. We’re fixing the language problem and the platform problem.
"The goal is to let more developers learn modern compute. And to give developers real choice in the hardware they use.”
“Those two things unlock the ecosystem.”
“One rule: you’re not allowed to write code.”
Vinay Perneti, VP of Engineering at @AugmentCode:
“For our engineering offsite, we ran a hackathon with one rule: you’re not allowed to write code. Everything had to be built using agents.”
“The themes were simple: how do you 10x your agent, 10x your team, or 10x yourself?”
“The team built an incredible amount in a single day.”
“But the emotions in the room were mixed: uncertainty, frustration, fear.”
“Alongside curiosity, creativity, and freedom.”