Peichen Zhong

221 posts

@zhongpc

Moved to LinkedIn: https://t.co/kYc30jlZ5n

Berkeley, CA · Joined January 2023
301 Following · 295 Followers
Peichen Zhong reposted
Biology+AI Daily @BiologyAIDaily
A universal augmentation framework for long-range electrostatics in machine learning interatomic potentials

1. This study introduces the Latent Ewald Summation (LES) method, a universal framework for incorporating long-range electrostatics into machine learning interatomic potentials (MLIPs) without explicit charge labels. LES infers electrostatic interactions directly from energy and force data, markedly improving MLIP accuracy for systems where electrostatics matter.

2. LES is designed as a standalone library compatible with various short-range MLIPs, including MACE, NequIP, CACE, and CHGNet. It yields substantial improvements in predicting Born effective charges (BECs) and dipole moments even when trained exclusively on energy and force data, a capability crucial for accurately simulating systems with a strong dielectric response, such as polar materials and charged molecules.

3. The study benchmarks LES-enhanced models on diverse systems, including bulk water, polar dipeptides, and gold dimer adsorption on defective substrates. Across all systems, LES not only reduces energy and force prediction errors but also accurately captures physical observables such as BECs and adsorption energies, highlighting the robustness and versatility of the method.

4. The LES framework scales to large, chemically diverse datasets. The authors demonstrate this by training MACELES-OFF on the SPICE dataset, which includes organic molecules and biomolecules. MACELES-OFF outperforms its short-range counterpart in predicting bulk liquid properties and electrical response properties such as IR spectra.

5. The LES method addresses a core limitation of current MLIPs by enabling efficient long-range electrostatics without additional training on electrical properties. This opens the door to universal MLIPs with full electrostatic physics, applicable across a wide range of chemical and biological systems.

💻 Code: github.com/ChengUCB/les
📜 Paper: arxiv.org/abs/2507.14302
#MachineLearning #Electrostatics #InteratomicPotentials #MaterialsScience #ComputationalChemistry #LongRangeInteractions
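The long-range energy that an Ewald-style method evaluates on (latent) point charges can be illustrated with a plain reciprocal-space sum in a cubic periodic box. This is an illustrative sketch of the underlying physics, not the LES library's API; the function name, units, and defaults are assumptions:

```python
import numpy as np

def reciprocal_ewald_energy(positions, charges, cell_len, sigma=1.0, kmax=4):
    """Reciprocal-space Ewald energy of Gaussian-smeared point charges in a
    cubic periodic box of side cell_len (illustrative units).
    E = (1/2V) * sum_{k != 0} (4*pi/k^2) * exp(-k^2 sigma^2 / 2) * |S(k)|^2
    """
    volume = cell_len ** 3
    energy = 0.0
    for nx in range(-kmax, kmax + 1):
        for ny in range(-kmax, kmax + 1):
            for nz in range(-kmax, kmax + 1):
                if nx == ny == nz == 0:
                    continue  # k = 0 term excluded (requires charge neutrality)
                k = 2.0 * np.pi * np.array([nx, ny, nz]) / cell_len
                k2 = k @ k
                # structure factor S(k) = sum_i q_i exp(i k . r_i)
                sk = np.sum(charges * np.exp(1j * (positions @ k)))
                energy += (4.0 * np.pi / k2) * np.exp(-k2 * sigma**2 / 2.0) * abs(sk) ** 2
    return energy / (2.0 * volume)
```

Because the energy depends on positions only through |S(k)|², it is exactly invariant under rigid translations of all atoms, a quick sanity check for any implementation.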
Peichen Zhong reposted
Bingqing Cheng @ChengBingqing
Guess what? By learning from energies and forces, machine learning interatomic potentials can now infer electrical responses like polarization and BECs! This means we can perform MLIP MD simulations under electric fields! arxiv.org/pdf/2504.05169
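The field-driven MD mentioned here rests on a simple relation: under a uniform electric field E, each atom feels an extra force Z*ᵢ·E through its Born effective charge tensor Z*ᵢ. A minimal numpy helper sketching that relation (names and units are mine, not from the paper):

```python
import numpy as np

def field_forces(becs, efield):
    """Forces on atoms from a uniform electric field via Born effective
    charges: F_i = Z*_i . E, where becs has shape (N, 3, 3) holding the
    3x3 BEC tensor of each atom and efield has shape (3,)."""
    return np.einsum("iab,b->ia", becs, efield)
```

For an ideal ion with Z* = q·I this reduces to the textbook F = qE; off-diagonal BEC components capture the anisotropic response that makes inferring full tensors from energies and forces nontrivial.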
Peichen Zhong reposted
Tom Dörr @tom_doerr
GPU-accelerated NumPy/SciPy library
Peichen Zhong reposted
David D. Baek @dbaek__
1/9 🚨 New Paper Alert: Cross-Entropy Loss is NOT What You Need! 🚨 We introduce harmonic loss as an alternative to the standard CE loss for training neural networks and LLMs! Harmonic loss achieves 🛠️significantly better interpretability, ⚡faster convergence, and ⏳less grokking!
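As the paper describes it, harmonic loss replaces the dot-product logits of cross-entropy with inverse-power distances to per-class weight vectors. A minimal single-example sketch of that definition (variable names and the epsilon guard are my additions):

```python
import numpy as np

def harmonic_loss(x, W, target, n=2.0, eps=1e-12):
    """Harmonic loss for one example: class probabilities are inverse n-th
    powers of the Euclidean distance from representation x to each class
    weight vector (row of W), normalized to sum to 1; the loss is the
    negative log-probability of the target class."""
    d = np.linalg.norm(W - x, axis=1) + eps  # distance to each class center
    p = d ** (-n) / np.sum(d ** (-n))        # harmonic "softmax"
    return -np.log(p[target]), p
```

The class weight vectors act as interpretable class centers in representation space, which is the source of the interpretability claim in the thread.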
Tian Xie @xie_tian
Excited to finally announce the publication of MatterGen in Nature. MatterGen represents a new paradigm of materials design with generative AI. We are releasing the training and inference code of MatterGen under the MIT license. Looking forward to seeing how the community will use the tool and build on top of it.
Microsoft Research @MSFTResearch

Microsoft researchers introduce MatterGen, a model that can discover new materials tailored to specific needs—like efficient solar cells or CO2 recycling—advancing progress beyond trial-and-error experiments. msft.it/6012U8zX8

Peichen Zhong reposted
Ceder Group @cedergroup
Can universal MLIPs handle complex atomic environments? 🤔 Our study uncovers systematic PES softening in M3GNet, CHGNet, & MACE-MP-0, linking it to biased pre-training datasets. 🌐 A path forward: better PES sampling! Read here 👇 nature.com/articles/s4152…
Peichen Zhong reposted
Materials Research Society (MRS)
👏 Celebrate excellence in materials research at the #F24MRS Graduate Student Awards Special Talk Sessions! 🌟 Join us to support award finalists as they showcase their groundbreaking research! 🧪🎓 🗓️ Dec 3, 12:15–3:15 PM ET 📍 Marriott, 1st Floor #MRSCommunity
Peichen Zhong reposted
Miruna Cretu @MirunaCretu2
We’ve released an improved version of SynFlowNet! SynFlowNet can now handle large action spaces (up to 220k building blocks) and we experiment with training different GFlowNet backward policies. (1/3) arXiv: arxiv.org/abs/2405.01155 code: github.com/mirunacrt/synf…
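One place the backward policy mentioned here enters GFlowNet training is the trajectory balance objective. Below is a minimal sketch of that generic loss for a single trajectory, not SynFlowNet's actual code (names are mine):

```python
import numpy as np

def trajectory_balance_loss(log_Z, log_pf, log_pb, log_reward):
    """Trajectory balance loss for one GFlowNet trajectory tau:
    (log Z + sum_t log P_F(s_{t+1}|s_t) - log R(x) - sum_t log P_B(s_t|s_{t+1}))^2.
    log_pf / log_pb are per-step forward / backward log-probabilities;
    training a different backward policy changes the log_pb terms."""
    return (log_Z + np.sum(log_pf) - log_reward - np.sum(log_pb)) ** 2
```

The loss is zero exactly when the forward flow, backward flow, and reward are consistent along the trajectory, which is what makes the choice of backward policy a tunable design decision.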
Peichen Zhong @zhongpc
The BIDMaP Postdoc Fellows Program is now open for applications from recent PhDs interested in working at the interface of machine learning and natural sciences. Great opportunity to work on AI for Science! bidmap.berkeley.edu/fellowship-opp…
Peichen Zhong reposted
Janosh @jrib_
Should have mentioned this earlier, since people had issues using CHGNet with numpy v2: v0.4.0 is out now and supports both numpy 1 and 2. The Trainer class also now comes with built-in wandb logging: just pass it an account and a project and you're ready to log. Full release notes: github.com/CederGroupHub/… pip install chgnet==0.4.0