
Context memory essentially unlocks Agentic AI. Much needed for Opus 4.6's "multi-agent swarms."

In this SemiDoped pod, @vikramskr talks to Val Bercovici from Weka about context storage.

- How token warehouses save inference costs
- A new networking tier? Context Storage Network!
- High Bandwidth Flash for context?
- Weka's Augmented Memory Grid for context storage
- Where this is all headed

The convo is info-packed. Don't miss out on it! @AccBalanced

Chapters
(00:00) Introduction to Weka and AI Storage Solutions
(05:18) The Evolution of Context Memory in AI
(09:30) Understanding Memory Hierarchies and Their Impact
(16:24) Latency Challenges in Modern Storage Solutions
(21:32) The Role of Networking in AI Storage Efficiency
(29:42) Dynamic Resource Utilization in AI Networks
(30:04) Introducing the Context Memory Network
(31:13) High Bandwidth Flash: A Game Changer
(32:54) Weka's Neural Mesh and Storage Solutions
(35:01) Axon: Transforming GPU Storage into Memory
(39:00) Augmented Memory Grid Explained
(42:00) Pooling DRAM and CXL Innovations
(46:02) Token Warehouses and Inference Economics
(52:10) The Future of Storage Innovations
