Vanishing Gradients Podcast

318 posts

@VanishingData

a data podcast hosted by @hugobowne. What do we all do when we do data science? What can we change? What does the future hold for DS, ML, and AI?

Joined February 2022
15 Following · 1.6K Followers
Vanishing Gradients Podcast reposted
Hugo Bowne-Anderson @hugobowne ·
Most AI benchmarks test memorization. ARC-AGI, built on @fchollet's definition of intelligence, tests how efficiently models learn new skills — tasks humans solve in 1–2 tries, but even state-of-the-art LLMs still fail. @GregKamradt joined me on @VanishingData to explain why... and one way to know when AGI has arrived 🤖 🎧 Full ep here: vanishinggradients.fireside.fm/48
Vanishing Gradients Podcast reposted
Hugo Bowne-Anderson @hugobowne ·
🧠 Most AI benchmarks measure memorization. @arcprize — proposed by @fchollet — tries to measure intelligence. lu.ma/ohgcnyql
This Friday I’m speaking with @GregKamradt for @VanishingData about:
🧩 Why abstraction & reasoning are central to AGI
📐 How the ARC benchmark actually works
💰 Running a $1M benchmark for generalization
🤔 Whether ARC is the right measure at all
Register for free 👇
Vanishing Gradients Podcast reposted
Hugo Bowne-Anderson @hugobowne ·
“You’re not inventing new notes. You’re composing a new melody—from pre-existing software.” — @gregce10
From our conversation on the @VanishingData podcast. Full episode + demos in the thread ↓
What if building software felt more like composing than coding? 1/
Vanishing Gradients Podcast reposted
Hugo Bowne-Anderson @hugobowne ·
Mind still spinning from livestreaming a podcast with @gregce10 for @VanishingData yesterday about Vibe Coding, Software Composing, and the Future of Programming. Link in comments; podcast out soon. Topics discussed (plus an hour-long demo, including exploring JFK PDFs with VIBE): 💫
Vanishing Gradients Podcast reposted
Hugo Bowne-Anderson @hugobowne ·
🚨 Fragile LLM demos won’t cut it in 2025. 🚨
We’re starting in 2.5 hours – last chance to join and build apps that actually make it to production.
👉 Use code LASTCALL25 for:
• 25% off the course
• A free 30-minute consultation with me – bring your toughest AI/ML problem, and we’ll solve it together.
Can’t make the first class? Don’t worry – everything’s recorded, and we’re here to support you. maven.com/hugo-stefan/bu…
🔧 What you’ll build:
• Production-ready LLM apps – PDF querying tools, agent workflows, and more.
• Monitoring & Debugging – track issues before they break things.
• Prompt Engineering – consistent, structured outputs without hacks.
👥 Who’s already in? Engineers from Meta, Deloitte, Netflix, Salesforce, Atlassian, and the U.S. Air Force.
💡 Guest Speakers & Sessions:
• @swyx – Engineering AI Agents for 2025
• @HamelHusain – Data Literacy for Debugging LLMs
• @SanderSchulhoff – Prompt Engineering in the LLM SDLC
• @charles_irl – LLM Hardware for Developers
• @canyon289 – End-to-End LLM Product Development
🎯 Bonuses for Participants:
• $1,000 in Modal cloud compute credits
• 3 months of Learn Prompting Plus
👉 Sign up now – LASTCALL25 for 25% off + a free 30-min consult.
Vanishing Gradients Podcast reposted
Hugo Bowne-Anderson @hugobowne ·
🤖 @HamelHusain and I were bouncing around ideas for the guest talk he’s giving in our LLM course – and we ended up diving deep into why just looking at your data can solve more LLM issues than you’d think.
From developing a physical connection to data to why pivot tables might be an underrated debugging tool for ML/AI engineers, this clip gives a glimpse into how Hamel thinks about data.
🚀 The course kicks off tomorrow – Hamel’s session is just one piece. There’s also $1K in @modal_labs credits, guest lectures from @swyx, @charles_irl, @SanderSchulhoff, and @canyon289, plus 3 months free of @learnprompting Plus.
👉 Sign up now to grab a spot ⚡ maven.com/s/course/d5606…
Vanishing Gradients Podcast reposted
Hugo Bowne-Anderson @hugobowne ·
🚨 Tomorrow: Live-stream on LLM Agent Deployments
Over 300 real-world LLM agent deployments—what’s working, what’s not? Let’s break it down. lu.ma/464o16an?utm_s…
I’m going live tomorrow at 8am CET with @strickvl (@zenml_io) to unpack lessons from teams like Anthropic, Amazon, and Dropbox.
🔧 Workflows? Still essential.
🏗️ Scaling? Not as smooth as it looks.
⚠️ Pitfalls? We’ll help you spot them early.
Alex’s team has gathered data from hundreds of actual deployments over the past two years—so this is grounded in reality, not just hypotheticals.
If you’re building or scaling LLM agents in 2025, this session is worth your time.
👉 Join us live tomorrow at 8am CET. Register for free here:
Vanishing Gradients Podcast reposted
Hugo Bowne-Anderson @hugobowne ·
💫 I am thrilled to share that @swyx will be speaking on “Engineering AI Agents in 2025” in our course "Building LLM Applications for Data Scientists and Software Engineers," expanding on his recent talk from OAI DevDay. maven.com/s/course/d5606…
👉 If you don’t know Swyx (LOL, what?), here’s a quick rundown:
- Co-host of the Latent Space podcast – two years of top-tier AI guests
- Organizer of ai dot engineer conferences (with AIE NYC 2025 coming up)
- Builder of smol dot ai – 30k daily insiders, 99% automated by AI agents
Swyx has been driving the AI Engineer movement, and his session will dive deep into what’s next for AI agents.
🚨 The course starts in 48 hours! Don’t miss out on this opportunity to learn directly from Swyx and other leading experts. Sign up now to secure your spot! 👉
Vanishing Gradients Podcast reposted
Hugo Bowne-Anderson @hugobowne ·
🎙️ New episode of @VanishingData -- this time, I’m the one being interviewed.
@alex_andorra from Learning Bayesian Statistics had me on his podcast, and we covered a lot of ground—from deploying LLM apps to the realities of AI product development. Alex kindly let me share the full episode on Vanishing Gradients.
In the full episode, we dig into:
⚡ How LLM apps go from “five lines of magic” to integration nightmares
🧰 Why hallucinations, drift, and monitoring issues tank excitement
🛠️ How focusing on first principles—like logging, tracing, and iteration—keeps projects on track
Here’s a clip that highlights why excitement around LLM apps often fades—and what can be done to fix it.
🎧 Full episode available wherever you get your podcasts—and here: vanishinggradients.fireside.fm/42
Big thanks to Alex for the conversation and for letting me share this with the Vanishing Gradients audience!
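The “logging, tracing, and iteration” first principles mentioned here are concrete enough to sketch. A minimal illustration of per-call tracing, assuming a hypothetical `call_llm` stub (a real app would call a model API and ship records to a log store or tracing backend, not stdout):

```python
import json
import time
import uuid

def call_llm(prompt: str) -> str:
    # Hypothetical stand-in for a real model call.
    return f"echo: {prompt}"

def logged_call(prompt: str) -> str:
    """Wrap an LLM call so every request/response pair is traceable."""
    trace_id = str(uuid.uuid4())
    start = time.perf_counter()
    response = call_llm(prompt)
    record = {
        "trace_id": trace_id,
        "prompt": prompt,
        "response": response,
        "latency_s": round(time.perf_counter() - start, 4),
    }
    print(json.dumps(record))  # in production: send to a log store
    return response

logged_call("What is a vanishing gradient?")
```

Once every call emits a structured record like this, debugging drift or hallucinations becomes a matter of querying logs rather than guessing.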
Vanishing Gradients Podcast reposted
Hugo Bowne-Anderson @hugobowne ·
🚨 Excited to share that @HamelHusain will teach “Basic Data Literacy for Debugging and Evaluating LLMs” in our course “Building LLM Applications for Data Scientists and Software Engineers.” maven.com/s/course/d5606…
Hamel (ex-Airbnb, GitHub) now consults on LLM apps across industries. His session will focus on essential data skills for debugging and evaluating LLMs – critical knowledge often overlooked.
The course starts in 3 days. Join us to learn hands-on techniques for building reliable LLM systems.
👉 Reserve your spot now.
Vanishing Gradients Podcast reposted
Hugo Bowne-Anderson @hugobowne ·
🎧 Building LLMs is hard – and getting the GPU piece wrong can derail everything.
I sat down with @charles_irl from @modal_labs to break down what AI/ML developers really need to know about GPUs – from running inference to fine-tuning and training from scratch. We didn’t just talk hardware specs – we dove into the real-world pain points developers hit when scaling models and the trade-offs that make or break performance.
💾 Why GPU memory, not compute, is the real bottleneck for LLMs
🔧 Fine-tuning headaches – why it eats RAM and how to stay efficient
💸 How to actually size your hardware for AI/ML workloads (and avoid overspending)
🛠️ What developers should prioritize when buying GPUs – beyond the marketing
🤖 Where quantization, RAG, and agent-based methods fit into GPU demands
Charles also shared insights from the Modal GPU Glossary – a fantastic breakdown of GPU internals – and walked through a live demo during the episode (linked below).
🎧 Full @VanishingData podcast episode: vanishinggradients.fireside.fm/40
📺 Watch the live stream (with demo): youtube.com/live/INryb8Hjk…
Happy Holidays – hope you enjoy the episode!
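The memory-bottleneck and quantization points above come down to simple arithmetic. A back-of-the-envelope sketch of the weight footprint alone (real usage also needs room for the KV cache, activations, and framework overhead, so treat these as lower bounds):

```python
def weight_memory_gib(n_params: float, bytes_per_param: float) -> float:
    """Estimate GPU memory (GiB) needed just to hold model weights."""
    return n_params * bytes_per_param / 2**30

# A 7B-parameter model at different precisions:
fp16 = weight_memory_gib(7e9, 2)    # 16-bit floats: 2 bytes per parameter
int4 = weight_memory_gib(7e9, 0.5)  # 4-bit quantized: 0.5 bytes per parameter
print(f"fp16: {fp16:.1f} GiB, int4: {int4:.1f} GiB")
# → fp16: 13.0 GiB, int4: 3.3 GiB
```

This is why a 7B model in fp16 already strains a 16 GB card, and why quantization is often the difference between fitting on one GPU and sharding across several.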
Vanishing Gradients Podcast reposted
Hugo Bowne-Anderson @hugobowne ·
Yesterday I did a 2-hour livestream with @charles_irl (@modal_labs) called What Every LLM Developer Needs to Know About GPUs. These are some of the topics we discussed. You can watch it now, and the podcast is out next week: youtube.com/live/INryb8Hjk…
Vanishing Gradients Podcast reposted
Hugo Bowne-Anderson @hugobowne ·
❓ Is Data Science Dead in the Age of AI ❓
In our latest High Signal episode, @hmason shares why the answer is a resounding “No!”—but the field is undergoing profound changes. Automation is transforming workflows, generative AI is reshaping how problems are solved, and the community is moving beyond its “shiny” phase into something more mature.
💡 Fun fact: Hilary was my guest on the first podcast I ever released, nearly a decade ago. It’s amazing to see how much has changed—and how much has stayed the same.
Key themes we covered:
✨ From Hype to Maturity
Automation is redefining entry-level roles and workflows. What does this mean for the future of data science?
📜 Why Generative AI Needs Context, Not Prompts
“Prompting feels like spellcasting, not engineering,” Hilary says. She explains how context-rich systems—like structured data and multimodal inputs—are the key to unlocking AI’s potential.
⏳ A Decade of Interfaces, Not Algorithms
Even if machine learning progress stopped today, it would take years to design the workflows, tools, and interfaces needed to fully harness its power.
🎧 This episode, produced by @delphina_ai with @dsgilchrist, dives deep into the evolving landscape of data science and what it takes to thrive in an AI-driven world.
Catch the full episode wherever you listen to podcasts, and share your thoughts:
• Apple: podcasts.apple.com/us/podcast/hig…
• Spotify: open.spotify.com/show/0VewaA4Bl…
• YouTube: youtube.com/watch?v=X4yfFN…
• More episodes and show notes: high-signal.delphina.ai/episode/what-h…
Vanishing Gradients Podcast reposted
Hugo Bowne-Anderson @hugobowne ·
Building generative AI systems is NOT about choosing the right model—it’s about balancing technical depth with real-world impact. 💫
In this episode of @VanishingData, I sit down with @canyon289, Senior Research Data Scientist at Google Labs, to discuss:
👉 His journey from SpaceX to Sweetgreen to Google Labs, and what he’s learned about building AI systems along the way.
⚙️ How to define meaningful evaluations and avoid jumping too quickly into model selection.
📈 Real-world applications of generative AI, from helping businesses like bakeries to scaling systems at Google.
Here’s a key moment from our conversation, where Ravin explains why starting with evaluations is the principled way to build AI products.
Want to hear more? Catch the full episode here (or on your app of choice): vanishinggradients.fireside.fm/39