

Robert M. Gower @ Neurips 2025

@gowerrobert
Often found scribbling down math with intermittent bursts of bashing out code.


Woah, how did I never hear of this? An optimizer paper that got published in Nature, looks quite substantial

Want to do fundamental ML research in NYC? 🧠 The Center for Computational Mathematics @FlatironInst @SimonsFdn is hiring!
– Flatiron Research Fellow (postdoc, by Dec 1): apply.interfolio.com/173401
– Open Rank (by Jan 15): apply.interfolio.com/173640

How did we improve the sensitivity to learning rates? MuonAdam/MuonMax are steepest descent methods, so we can import tricks such as truncation. Truncation changes the steepest descent model by making use of a known lower bound on the loss. Scaling laws give us a lower bound.
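To make the truncation idea concrete: a minimal sketch (not the paper's actual MuonAdam/MuonMax update) of a steepest descent step whose linear model is truncated at a known lower bound f_lb. The function name and the exact clipping rule are assumptions; the point is just that the lower bound caps the step Polyak-style, so an overly large learning rate gets clipped automatically.

```python
import numpy as np

def truncated_sd_step(x, f, grad, f_lb, lr):
    """One gradient-descent step where the linear model of f is
    truncated at the known lower bound f_lb. Minimizing the truncated
    model caps the step size at (f(x) - f_lb) / ||g||^2, the
    Polyak step, so the step never predicts a loss below f_lb."""
    g = grad(x)
    gap = f(x) - f_lb                     # how far we can possibly descend
    t = min(lr, gap / (np.dot(g, g) + 1e-12))
    return x - t * g

# Toy quadratic with known minimum value 0, used as the lower bound.
f = lambda x: 0.5 * np.dot(x, x)
grad = lambda x: x
x = np.array([3.0, -4.0])
for _ in range(50):
    # lr=10.0 would make plain gradient descent diverge on this
    # quadratic (stability needs lr < 2); truncation rescues it.
    x = truncated_sd_step(x, f, grad, f_lb=0.0, lr=10.0)
```

In this run the clipped step is always the Polyak step 0.5, so the iterates contract toward the minimizer despite the wildly large learning rate, which is the "reduced sensitivity to learning rates" effect the tweet describes.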


I'm on the academic job market! I design and analyze probabilistic machine-learning methods, motivated by real-world scientific constraints and developed in collaboration with scientists in biology, chemistry, and physics. A few highlights of my research areas are:

