Miguel Liu-Schiaffini

11 posts

@mliuschi

CS PhD at Stanford

Stanford, CA · Joined July 2023

63 Following · 79 Followers
Miguel Liu-Schiaffini reposted
Liana
Liana@lianapatel_·
🚀 Thrilled to launch DeepScholar, an openly-accessible DeepResearch system we've been building at Berkeley & Stanford. DeepScholar efficiently processes 100s of articles, demonstrating strong long-form research synthesis capabilities, competitive with OpenAI's DR, while running up to 2x faster! Try it out: deep-scholar.vercel.app
Miguel Liu-Schiaffini reposted
Prof. Anima Anandkumar
Prof. Anima Anandkumar@AnimaAnandkumar·
Neural Operators – Deep learning at any resolution

Extending neural networks to function spaces: While many phenomena are inherently described by functions, neural networks define vector-to-vector mappings that rely on a fixed discretization of the input and output. Neural operators instead define learnable function-to-function mappings that guarantee consistent predictions across different discretizations of the input and output functions. By respecting the functional nature of the data, neural operators can achieve improved performance and generalization.

Translating the success of deep learning to operator learning: Careful engineering of neural architectures has been a key factor in deep learning's success. Translating these architectures to neural operators is crucial for operator learning to enjoy the same empirical optimizations.

Key principles for constructing neural operators:
* Recipes for converting popular architectures (CNNs, GNNs, transformers, etc.) into neural operators
* Guidance for practitioners

arxiv.org/abs/2506.10973 github.com/neuraloperator @julberner @mliuschi @JeanKossaifi Valentin Duruisseaux Boris Bonev @Azizzadenesheli @caltech
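The discretization-consistency property described above can be sketched in a few lines of NumPy: a single Fourier-multiplier layer (the core building block of FNO-style neural operators) whose grid-normalized coefficients make the same weights define the same function-to-function map at any resolution. The function `spectral_conv` and the toy weights below are illustrative assumptions, not the library's actual implementation.

```python
import numpy as np

def spectral_conv(u, weights):
    """Apply a truncated Fourier-multiplier layer to samples of a 1D periodic function.

    Normalizing the coefficients by the grid size makes the layer act on the
    underlying function, so the same `weights` define the same operator at
    any resolution.
    """
    n = u.shape[-1]
    k = len(weights)
    coeffs = np.fft.rfft(u) / n        # resolution-independent Fourier coefficients
    out = np.zeros(n // 2 + 1, dtype=complex)
    out[:k] = coeffs[:k] * weights     # act on the lowest k modes only
    return np.fft.irfft(out * n, n=n)  # back to samples on the input grid

# Toy "learned" complex weights for the first 8 Fourier modes (illustrative):
weights = np.array([0.2, 1.0 - 0.5j, 0.3 + 0.1j, 0.0, 0.1, 0.0, 0.05, 0.0])

# The same underlying function sampled at two different resolutions:
x64, x128 = np.arange(64) / 64, np.arange(128) / 128
v64 = spectral_conv(np.sin(2 * np.pi * x64), weights)
v128 = spectral_conv(np.sin(2 * np.pi * x128), weights)
# At the shared grid points, v64 and v128[::2] agree: one operator, not a per-grid map.
```

Evaluating the two outputs at their shared grid points shows they coincide up to floating-point error, which is exactly the "consistent predictions across different discretizations" property the thread describes.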
Miguel Liu-Schiaffini reposted
Prof. Anima Anandkumar
Prof. Anima Anandkumar@AnimaAnandkumar·
2024 was a pivotal year for AI+Science. Our team made exciting contributions. Here are some highlights: tensorlab.cms.caltech.edu/users/anima/20…
1. Neural Operators as a unifying AI framework for modeling multi-scale processes. We got to write a perspective article in @NatRevPhys and released an open-source library that has been widely adopted.
2. FourCastNet, our Neural-Operator-based model, featured in the @WHOSTP PCAST report on AI+Science. Its speed enables large ensembles to obtain unprecedented predictions of extreme weather events.
3. Designed a novel medical catheter using Neural-Operator-based inverse design.
4. Used Neural Operators with reinforcement learning for turbulence stabilization and wall-friction reduction.
5. Many other applications where Neural Operators have shown a big impact: nuclear fusion, computational imaging, carbon capture and storage, etc.
6. Gave a @TEDTalks talk highlighting AI+Science.
7. Exascale training of genome-scale language models for protein design, recognized as a finalist for the ACM Gordon Bell Prize.
8. State-of-the-art protein-ligand structure prediction that can handle changing protein conformations that AlphaFold and other methods cannot.
9. AI+Math innovations with LLMs + Lean for theorem proving.
10. Hardware-efficient large-scale training, such as gradient projections and mini-sequence transformers.
11. Many honors, such as the @iitmadras Distinguished Alumnus award and @BlavatnikAwards
Miguel Liu-Schiaffini reposted
Jean Kossaifi
Jean Kossaifi@JeanKossaifi·
Introducing NeuralOperator 1.0: a Python library that aims to democratize neural operators for scientific applications by providing all the tools for learning neural operators in PyTorch: state-of-the-art models, built-in trainers for a quick start, and modular neural operator blocks for advanced use in your own workflows or for building new architectures.
Miguel Liu-Schiaffini reposted
Armeet
Armeet@armeetjatyani·
Excited to present at NeurIPS 2024! 🎉 We propose a unified neural operator for Compressed Sensing MRI, adapting to multiple undersampling patterns/rates, with 11% SSIM & 4dB PSNR gains. Full paper & code: armeet.ca/nomri #NeurIPS2024 #ML #MRI #NeuralOperators
Miguel Liu-Schiaffini reposted
Zongyi Li
Zongyi Li@zongyili_nyu·
#NeurIPS I am on the 2024-25 job market seeking faculty positions and postdocs! My goal is to advance AI for scientific computing and discovery. I develop neural operators for partial differential equations (PDEs) with applications in fluid, solid, and earth science.
Miguel Liu-Schiaffini reposted
Julius Berner
Julius Berner@julberner·
Drop by our #ICML2024 posters to chat about neural operators and PDE solvers:
1⃣ Solving Poisson Eqs. using Neural Walk-on-Spheres
2⃣ DPOT: Auto-Regressive Denoising Operator Transformer for Large-Scale PDE Pre-Training
3⃣ Neural Operators w/ Localized Integral & Differential Kernels
Miguel Liu-Schiaffini reposted
Hong Chul Nam
Hong Chul Nam@HongChulNam·
📢 Check out our work at #ICML2024 on solving high-dimensional Poisson equations using neural Walk-on-Spheres (WoS): ⚡️ faster and more accurate than PINNs, Deep Ritz, and other SDE-based methods. Session: Tue 23 Jul 11:30 am - 1 pm, Hall C 4-9 #2811 @julberner @AnimaAnandkumar
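The classical Walk-on-Spheres estimator that the neural method above builds on can be sketched as plain Monte Carlo for the Laplace equation (the f = 0 case of Poisson) with Dirichlet boundary data: repeatedly jump to a uniform point on the largest sphere inside the domain until the walk reaches the boundary, then read off the boundary value. This is a minimal illustrative sketch, not the paper's neural variant; the unit-disk geometry, the boundary data g(x, y) = x (whose harmonic extension is u(x, y) = x), the test point, and the sample count are all assumptions chosen for the example.

```python
import numpy as np

def walk_on_spheres(x0, dist_to_boundary, boundary_value, rng, eps=1e-4, max_steps=1000):
    """One Monte Carlo sample of the harmonic function u at x0 (Laplace eq., Dirichlet data)."""
    x = np.array(x0, dtype=float)
    for _ in range(max_steps):
        d = dist_to_boundary(x)
        if d < eps:                     # close enough to the boundary: stop
            break
        v = rng.normal(size=x.shape)    # uniform direction via a normalized Gaussian
        x = x + d * v / np.linalg.norm(v)
    return boundary_value(x)

# Illustrative setup: unit disk, boundary data g(x, y) = x, so u(x, y) = x exactly.
dist = lambda x: 1.0 - np.linalg.norm(x)   # signed distance to the unit circle (inside)
g = lambda x: x[0]

rng = np.random.default_rng(0)
samples = [walk_on_spheres([0.3, 0.2], dist, g, rng) for _ in range(4000)]
est = np.mean(samples)                      # Monte Carlo estimate of u(0.3, 0.2) = 0.3
```

Averaging independent walks gives an unbiased estimate (up to the eps stopping bias) that converges at the usual O(1/sqrt(N)) Monte Carlo rate; the neural approach in the paper trains a network on such estimates instead of querying them pointwise.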