Rob Toews

1.3K posts


@_RobToews

Partner @RadicalVCFund, AI columnist @Forbes. "the machine does not isolate man from the great problems of nature but plunges him more deeply into them."

San Francisco Bay Area, CA · Joined September 2012
856 Following · 4.9K Followers
Pinned Tweet
Rob Toews@_RobToews·
10 (bold) predictions for AI in 2026: 1⃣ Anthropic will go public. OpenAI will not. 📈 2⃣ Details of SSI’s research and technology will leak to the public. The big labs will make meaningful adjustments to their research roadmaps as a result. 🤫
5
1
22
7.1K
Rob Toews retweeted
Bryan Johnson@bryan_johnson·
This is it. Everything learned spending millions on longevity. From: Your Immortal Unc and Auntie. To: Our Immortal nieces and nephews.
0. Sleep is the world's most powerful drug.
1. Be in your bed for 8 hours
2. Same bedtime every night, any time before midnight
3. Don't eat right before bed
4. Calm foods for dinner
5. No screens 1 hour before bed
6. Avoid added sugar (be aware it's in everything)
7. Avoid all things in an American convenience store
8. Avoid fried foods
9. Shoes off at the door
10. Eat whole foods, particularly veggies, fruits, nuts, legumes, berries
11. Walk a little after meals, or air squats
12. Get your heart rate high routinely
13. Lift heavy things
14. Stretch daily
15. Water pik, floss, brush, tongue scrape, morning and night
16. Make an effort to drink water
17. Get sunlight when you wake up (UV is low)
18. Protect skin in midday sun
19. Stand up straight
20. See at least one friend once a week
21. Avoid plastic where you can (in all things)
22. Circulate air in rooms
23. When stressed, breathe, learn to calm your body
24. Go to the dentist
25. Avoid sitting for long times
26. Protect your hearing, the world is too loud
27. Alcohol is bad for you
28. Finish coffee before noon
29. Avoid bright lights after sunset
30. If obese, look into a GLP
31. Sleep in a cold room
32. Texting while driving is dangerous
33. Turn off all notifications
34. Limit social media use
35. Don't smoke anything
36. If you struggle to sleep, read a physical book before bed
37. 1 hour before bed have a calm wind-down routine: bath, read, light walk, listen to music
38. The body is a clock and loves routine. Have a daily morning and evening schedule.
39. Avoid long-distance travel where you can
40. Baby steps first: incorporate new things slowly
41. Do less… most things don't work.
Bonus points if you get your blood checked. Start here, it will change your life.
1K
4.8K
42.8K
5.5M
Peter H. Diamandis, MD@PeterDiamandis·
Sundar Pichai just said data centers in space will be "the new normal" within a decade. @elonmusk has been saying this for years. When the CEO of Google starts agreeing with Elon, pay attention. The orbital compute era is closer than you think.
744
1.2K
10.8K
52.2M
Michael Dempsey@mhdempsey·
me to claude after it tells me my sentences are too long-winded. also, nabokov to his new yorker editor.
Michael Dempsey tweet media
22
588
5.6K
174.3K
Rob Toews retweeted
Nick Levine@status_effects·
New work with @AlecRad and @DavidDuvenaud: Have you ever dreamed of talking to someone from the past? Introducing talkie, a 13B model trained only on pre-1931 text. Vintage models should help us to understand how LMs generalize (e.g., can we teach talkie to code?). Thread:
171
367
2.9K
1M
Rob Toews@_RobToews·
Is xAI still a frontier lab?
2
0
3
3.1K
Shahin Farshchi@Farshchi·
We absolutely must understand how the brain works to solve brain diseases. @unconvAI is taking inspiration from the brain but is ultimately solving a different problem: it is building machines that can solve many problems in parallel at massive scale. @unconvAI's goal is to meet the stringent reliability and scalability requirements of tomorrow's AI workloads while drawing on the raw energy efficiency of the human brain. Fantastic piece by a dear friend:
Rob Toews@_RobToews

a call to action.... forbes.com/sites/robtoews…

1
1
9
1.7K
Rob Toews retweeted
Juan Benet@juanbenet·
Excited to launch a new podcast dedicated to conversations on the future of neurotech, computing, intelligence, and more.

First guest: @maxhodak_, founder & CEO of @ScienceCorp_, which is building PRIMA, a retinal prosthetic that's restoring meaningful vision for patients with blindness caused by age-related macular degeneration. Science is also developing a biohybrid brain implant that grows living neurons directly onto a silicon chip, then interfaces that system with the cortex. The possibility space here is vast and new. Imagine growing new areas of the brain.

Sections
00:00 What counts as neurotech?
01:45 History of brain-computer interfaces and the smartphone dividend
07:25 PRIMA - How Science is restoring vision in blind patients
10:10 Why stimulating bipolar cells works when the optic nerve doesn't
30:30 Are we bottlenecked by biology or engineering?
32:40 Expanding the brain's bandwidth beyond 10 bits per second
37:00 Can we add new areas to the brain?
37:46 Biohybrid BCIs: neurons growing on a chip
39:20 What could neural augmentation look like?
01:13:20 How Science drives Fast R&D
01:44:00 How founders learn and level up

This is the kind of discussion I'm excited to explore on this podcast. Enjoy! Full Episode 1 here and in links below.
27
82
362
75K
Rob Toews retweeted
Marlos C. Machado@MarlosCMachado·
A couple of months ago, we released a preprint of one of my favourite papers I've ever written. It lies at the intersection of representation learning and neuroscience. I have now written a blog post about it. Preprint: biorxiv.org/content/10.110… Blog post: medium.com/@marlos.cholodovskis/from-pixels-to-place-cells-where-representation-learning-meets-neuroscience-72140afe6e3f
3
35
182
13.5K
Rob Toews retweeted
Clay Dumas@claydumas·
Near the top of @vsiv's (almost annoyingly) long list of superhuman abilities is how smoothly he can enlist people to a very big mission. That cause is turning data centers into flexible assets, unlocking capacity and affordability right when the world is desperate for both.
Shanu Mathew@ShanuMathew93

Nvidia and @EmeraldAi_ announced a collaboration with AES, Constellation, NextEra, Invenergy and Vistra to develop "flexible AI factories" that adjust power consumption based on grid conditions. Pairs Nvidia's reference architecture with Emerald software to modulate compute workloads dynamically. No firm official project commitments yet but eagerly watching...

2
2
11
1.9K
Nik@NikMilanovic·
Sooner or later, everyone in fintech works for @Plaid.
Nik tweet media
71
16
337
33.1K
Rob Toews retweeted
Gaurab Chakrabarti@Gaurab·
The human brain: 2% of body mass, but it consumes 20% of the body's energy. Cortical neurons fire 0.16 times per second, but they are capable of firing at 40 or more. A 250-fold gap. If more than a few percent of neurons fired at high rates simultaneously, the brain would literally overheat. So less than 1% fire at any given moment.

Frontier AI models have the same two constraints: sparse activation and thermal limits. Mixtral activated 27.6% of its parameters per token. DeepSeek-V2 activated 8.9%. DeepSeek-V3 has 671 billion parameters and activates 37 billion of them. That's 5.5%.

NVIDIA hit the same wall. The GB200 generates 120 kilowatts per rack. Air couldn't cool it. They switched to liquid and unlocked 30% more compute.

Now, what would happen if we could cool our brains? Neurons that fire faster produce measurably higher IQ scores, but three things stop us: heat dissipation, oxygen delivery, and ion channel reset time. There's already a device that achieved a 3°C brain temperature drop in 30 minutes by running chilled saline through the nasal cavity. So the first human IQ-overclock device might look less like Neuralink and more like a beer helmet with tubes running up your nose.
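The ratios in the tweet above are easy to check. A quick sketch (the firing rates and parameter counts are the tweet's figures, taken at face value, not independently verified):

```python
# Sanity-check the ratios quoted above.
baseline_hz, burst_hz = 0.16, 40          # cortical baseline vs. max firing rate
firing_gap = burst_hz / baseline_hz       # ~250-fold gap

total_params_b, active_params_b = 671, 37  # DeepSeek-V3, in billions
active_fraction = active_params_b / total_params_b  # ~0.055, i.e. ~5.5%

print(f"firing-rate gap: {firing_gap:.0f}x")
print(f"DeepSeek-V3 active fraction: {active_fraction:.1%}")
```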
26
55
477
42.4K
Rob Toews retweeted
JJ@JosephJacks_·
We don't have a compute problem… We have an architecture problem.

Paramecium caudatum is a single-celled organism roughly the width of a human hair. It has no brain, no neurons, no synapses, and no central nervous system of any kind. But what it does have is ~100,000 microtubules… With that substrate alone, it can:
→ Swim in controlled helical trajectories
→ Modulate speed continuously
→ Execute graded avoidance reactions (reverse, pivot, resume)
→ Escape predators with emergency burst reversals
→ Fire localized volleys of 8,000 trichocyst harpoons
→ Navigate toward food via chemotaxis
→ Orient in electric fields (galvanotaxis)
→ Orient to gravity (gravitaxis)
→ Sense and navigate thermal gradients
→ Sense and navigate toward light
→ Detect and follow surfaces (thigmotaxis)
→ Forage biofilms
→ Generate feeding currents and sort particles at the cytostome
→ Engage in reciprocal sex with mating-type recognition, nuclear exchange, and complete genomic reconstruction
→ Self-fertilize when no partner is available (autogamy)
→ Habituate to repeated stimuli (primitive learning)
→ Inherit cortical MT architecture epigenetically, independent of the genome

17 distinct behaviors. One lattice. Zero neurons.

The coordination layer is the infraciliary lattice — a microtubule-based grid connecting all 5,000 ciliary basal bodies into a single cell-wide network. Every cilium is a terminal node on a microtubule mesh that coordinates metachronal waves across the entire cell surface — thousands of appendages phase-locked into coherent motion by a substrate that predates the nervous system by a billion years.

The neuron didn't invent computation. It inherited microtubules.
JJ tweet media
81
331
1.5K
75.2K
Rob Toews retweeted
Hadi Vafaii@hadivafaii·
The "decoupling of information and energy" is a major point of divergence between biological and artificial computers. Brains are efficient; modern AI isn't. And energy consumption is the biggest bottleneck in scaling AI (you can't hallucinate electrons into existence). To address this, we need an "energy-aware theory of computation," and this new preprint is an attempt to build one. [1/11] 🧵
Hadi Vafaii tweet media
17
72
335
54.3K
Rob Toews retweeted
Jonathan Gorard@getjonwithit·
I think, in hindsight, we will come to view the development of AI as more akin to a Eukaryotic Revolution than an Industrial one.
74
75
1.1K
70.9K
Rob Toews retweeted
George Sivulka@gsivulka·
Financial AI is here. Wall Street, meet the future of institutional intelligence. See how Oak Hill Advisors, LionTree, @NewYorkLife, @MetLife, & @HSFKramer are already putting it to work.
25
40
301
228.4K
Kenan Saleh@kenanhsaleh·
What's the best AI personal assistant product out now? Looking for something that can call and book restaurant reservations, email for refunds, etc. - that's fully productized and easy to use
49
1
98
35.6K
Rob Toews retweeted
himanshu@himanshustwts·
dude i love when ideas from biology / neuroscience shape how we train AI systems.
> coined the term "pre-pre-training"
> training pipeline becomes: synth data → language data → downstream tasks
> synth data is generated using "neural cellular automata"
> each step is basically cell_state(t+1) = neural_net(neighborhood), which creates evolving patterns

also, if this idea holds at scale, the future training pipeline might look like synth worlds + structured simulations → language → tools/RL (or basically what the thesis of "world models" revolves around)
himanshu tweet media
Seungwook Han@seungwookh

Can language models learn useful priors without ever seeing language? We pre-pre-train transformers on neural cellular automata — fully synthetic, zero language. This improves language modeling by up to 6%, speeds up convergence by 40%, and strengthens downstream reasoning. Surprisingly, it even beats pre-pre-training on natural text! Blog: hanseungwook.github.io/blog/nca-pre-p… (1/n)

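The update rule described in the thread above can be sketched in a few lines. This is a toy illustration of the general neural-cellular-automata idea, not the quoted paper's code: the grid size, channel count, and the random single-layer "net" are all my assumptions.

```python
import numpy as np

# Toy neural cellular automaton: each cell's next state is a function of its
# 3x3 neighborhood, i.e. cell_state(t+1) = neural_net(neighborhood).
rng = np.random.default_rng(0)
H, W, C = 16, 16, 4                   # grid height, width, channels per cell
W1 = rng.normal(0, 0.1, (9 * C, C))   # weights: flattened 3x3xC patch -> new state

def step(grid: np.ndarray) -> np.ndarray:
    # Wrap-pad so every cell has a full 3x3 neighborhood (toroidal grid).
    padded = np.pad(grid, ((1, 1), (1, 1), (0, 0)), mode="wrap")
    out = np.empty_like(grid)
    for i in range(grid.shape[0]):
        for j in range(grid.shape[1]):
            neighborhood = padded[i:i + 3, j:j + 3, :].reshape(-1)  # 9*C values
            out[i, j] = np.tanh(neighborhood @ W1)                  # tiny "neural net"
    return out

grid = rng.normal(size=(H, W, C))
for _ in range(8):                    # iterate to produce evolving patterns
    grid = step(grid)
print(grid.shape)                     # (16, 16, 4)
```

Sequences of grids produced this way are fully synthetic and language-free, which is the kind of corpus the quoted thread uses for "pre-pre-training" before any language data is seen.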
5
34
308
30.2K