Chen Lin

8 posts

@WillLin1028

Research Scientist, Isomorphic Labs

London, England · Joined December 2016
67 Following · 69 Followers
Joey Bose@bose_joey·
🎉Personal update: I'm thrilled to announce that I'm joining Imperial College London @imperialcollege as an Assistant Professor of Computing @ICComputing starting January 2026. My future lab and I will continue to work on building better Generative Models 🤖, the hardest AI4Science applications in computational biology 🧬and chemistry 🧪, and also a sprinkling of Deep Learning theory 📚 that supports these goals.
Chen Lin retweeted
Kalyan R@kalyan_einstein·
New work with @lars__schaaf, @WillLin1028, @wanggrun, and @philiptorr: We optimize neural networks to smoothly represent minimum energy paths and predict transition states for chemical reactions. Compared to the traditional approach, our method shows (i) improved resilience to the initial guess, (ii) easy adaptability to escape local minima, (iii) the ability to capture a complex path on its own, and (iv) potential to generalize to unseen systems. This offers a flexible alternative to discrete methods that could unlock building a universal reaction path predictor. Our paper should appeal to anyone interested in machine learning to advance computational chemistry. Preprint: arxiv.org/abs/2502.15843.
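The tweet above describes optimizing a neural network to smoothly represent a minimum energy path and read off the transition state. As a hedged illustration of the underlying idea only (not the paper's method), the sketch below optimizes a smooth parametric path between two minima on a made-up 2D potential; the potential, the Fourier basis standing in for a neural network, and all hyperparameters are my own assumptions.

```python
import numpy as np

def energy(p):
    # Toy 2D potential (my own choice): two Gaussian wells at (-1,0) and
    # (1,0), plus a Gaussian barrier centered slightly off-axis at (0, 0.3).
    x, y = p[..., 0], p[..., 1]
    wells = -np.exp(-((x + 1)**2 + y**2)) - np.exp(-((x - 1)**2 + y**2))
    barrier = 1.5 * np.exp(-2.0 * (x**2 + (y - 0.3)**2))
    return wells + barrier

A, B = np.array([-1.0, 0.0]), np.array([1.0, 0.0])  # reactant / product minima
K = 3                                # number of Fourier basis terms
t = np.linspace(0, 1, 64)[:, None]   # path parameter samples in [0, 1]

def path(c):
    # Straight-line interpolation between the endpoints plus a smooth
    # learnable perturbation that vanishes at t=0 and t=1, so the
    # endpoints stay fixed while the interior of the path can bend.
    base = (1 - t) * A + t * B
    bump = sum(c[k] * np.sin((k + 1) * np.pi * t) for k in range(K))
    return base + bump

def mean_energy(c):
    # Objective: average energy along the sampled path.
    return energy(path(c)).mean()

c = np.zeros((K, 2))   # start from the straight-line initial guess
lr, eps = 0.05, 1e-5
for _ in range(300):   # central-finite-difference gradient descent
    g = np.zeros_like(c)
    for i in range(K):
        for j in range(2):
            d = np.zeros_like(c)
            d[i, j] = eps
            g[i, j] = (mean_energy(c + d) - mean_energy(c - d)) / (2 * eps)
    c -= lr * g

pts = path(c)
ts = pts[energy(pts).argmax()]  # highest-energy point approximates the TS
print("straight-path mean energy:", mean_energy(np.zeros((K, 2))))
print("optimized mean energy:", mean_energy(c))
print("approximate transition state:", ts)
```

The optimized path bends around the barrier rather than passing through it, and its highest-energy point is the transition-state estimate; the paper's approach replaces this hand-picked basis and toy potential with a neural network and real chemical systems.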
Francesco Di Giovanni@Francesco_dgv·
Really looking forward to giving this talk! I will give an overview of my research on oversquashing and graph rewiring, and I will discuss future directions I am excited to work on within and beyond GNNs 🦕
I-X@ImperialX_AI

Our next I-X Seminar is with @Francesco_dgv @UniofOxford discussing "Understanding message passing: limitations of the paradigm and new possibilities" 🕐13.00 📅 8 Feb 📍 In person | @WCIDLondon Register below! ⬇️ ix.imperial.ac.uk/event/i-x-semi… @ImperialSci @ImperialAI @imperialeee

Chen Lin retweeted
Sebastian Raschka@rasbt·
How can we leverage successful pretraining techniques from transformers to improve purely convolutional networks? The answer is *Sparse Convolutions*! Let's see what happens when purely convolutional networks are pretrained with 1.28 million unlabeled images ... 1/7
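The question in the tweet above — how transformer-style masked pretraining can work for purely convolutional networks — comes down to keeping masked pixels from leaking into visible features. The numpy sketch below is my own toy stand-in for a sparse (submanifold-style) convolution, not the paper's implementation; the potential pitfall it demonstrates is that a plain convolution fills the masked hole back in after a couple of layers, while the sparse variant keeps the hole exactly empty, like dropped tokens in a transformer.

```python
import numpy as np

def conv2d(img, kernel):
    # Plain 'same' convolution (cross-correlation) with zero padding,
    # no mask awareness at all.
    H, W = img.shape
    k = kernel.shape[0]
    p = np.pad(img, k // 2)
    out = np.zeros_like(img)
    for i in range(H):
        for j in range(W):
            out[i, j] = (p[i:i + k, j:j + k] * kernel).sum()
    return out

def sparse_conv2d(img, mask, kernel):
    # Toy submanifold-style sparse convolution: masked pixels contribute
    # nothing (zeroing them is equivalent to excluding them for a
    # bias-free linear conv), and outputs are kept only at visible
    # positions, so the mask pattern never dilates across layers.
    return conv2d(img * mask, kernel) * mask

rng = np.random.default_rng(0)
img = rng.normal(size=(8, 8))
mask = np.ones((8, 8))
mask[2:5, 2:5] = 0.0    # hide a 3x3 patch, as in masked image modeling
k1, k2 = rng.normal(size=(3, 3)), rng.normal(size=(3, 3))

# Two stacked layers: dense convs let visible values bleed into the hole;
# sparse convs preserve the sparsity pattern exactly.
dense = conv2d(conv2d(img * mask, k1), k2)
sparse = sparse_conv2d(sparse_conv2d(img, mask, k1), mask, k2)

print("hole center after two dense convs:", dense[3, 3])
print("hole after two sparse convs:", sparse[2:5, 2:5].ravel())
```

With dense convolutions the hole center becomes nonzero after the second layer (the mask erodes, and the network can key on the mask shape); with the sparse variant the masked region stays identically zero at every depth.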
Chen Lin retweeted
Dmytro Mishkin 🇺🇦@ducha_aiki·
Designing BERT for convolutional networks: sparse and hierarchical masked modeling. Keyu Tian, Yi Jiang, Qishuai Diao, Chen Lin, Liwei Wang, Zehuan Yuan. tl;dr: make a masked image look the same to a CNN as dropped patches do to a transformer (via sparse convolutions), and masked image modeling starts working. arxiv.org/abs/2301.03580…