Tom Barrett

122 posts


@tomdbarrett

Staff Research Scientist @instadeepai

Joined October 2020
165 Following · 175 Followers
Tom Barrett retweeted
Bernardo Almeida @deAlmeida_BP:
🚀 Introducing Nucleotide Transformer v3 (NTv3)

Today, we are very excited to share our latest foundation model for biology - Nucleotide Transformer v3 (NTv3). NTv3 is @instadeepai's new multi-species genomics foundation model, designed for 1 Mb, single-nucleotide-resolution prediction, and for bridging representation learning, sequence-to-function modeling, and generative regulatory design within a single framework 🧬

This work was developed in close collaboration with @AlexanderStark8, @volokuleshov, and @pkoo562, and reflects several years of joint effort at the intersection of machine learning and regulatory genomics.
Tom Barrett retweeted
charliebtan @charliebtan:
Super excited to announce our recent work was accepted to NeurIPS 2025! 🌟 We introduce Prose, a 280M-parameter transferable normalizing flow proposal for efficient sampling of unseen peptide sequences 😮 Many thanks to the fantastic team!
Quoted: Majdi Hassan @majdi_has:

(1/7) New paper!🚀 arxiv.org/abs/2508.18175 ✅Boltzmann distribution sampling for peptides up to 8 residues ✅4.3ms of training MD trajectories ✅Open-source codebase With @charliebtan, @leonklein26, Saifuddin Syed, @dom_beaini @mmbronstein @AlexanderTong7 @k_neklyudov Read on 👇

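The change-of-variables idea behind a normalizing-flow proposal like Prose can be illustrated with a toy, single affine layer. This is a generic NumPy sketch, not the authors' 280M-parameter model; the `scale` and `shift` values and the importance-weighting comment are illustrative assumptions:

```python
import numpy as np

def forward(z, scale, shift):
    """One affine flow layer: x = z * exp(scale) + shift.
    Its log|det Jacobian| is sum(scale)."""
    return z * np.exp(scale) + shift, np.sum(scale)

def log_prob_x(x, scale, shift):
    """Density of x under the flow via the change of variables:
    log q(x) = log N(z; 0, I) - log|det J|, with z the inverse image of x."""
    z = (x - shift) * np.exp(-scale)
    log_pz = -0.5 * np.sum(z**2) - 0.5 * len(z) * np.log(2 * np.pi)
    return log_pz - np.sum(scale)

rng = np.random.default_rng(0)
scale, shift = np.array([0.3, -0.2]), np.array([1.0, 0.5])
z = rng.standard_normal(2)             # sample from the base Gaussian
x, logdet = forward(z, scale, shift)   # flow sample with a tractable density
# A proposal q(x) like this is reweighted toward the Boltzmann target p*(x)
# with importance weights w = p*(x) / q(x).
print(round(float(log_prob_x(x, scale, shift)), 3))
```

The tractable `log_prob_x` is what makes a flow usable as a proposal: every sample comes with its exact density, so importance weights against the Boltzmann target are computable.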
Tom Barrett retweetledi
机器之心 JIQIZHIXIN
机器之心 JIQIZHIXIN@jiqizhixin·
This is huge! A UCLA team managed to build an optical generative model that runs on light instead of GPUs. In their demo, a shallow encoder maps noise into phase patterns, which a free-space optical decoder then transforms into images—digits, fashion, butterflies, faces, even Van Gogh–style art—without any computation during synthesis. ⚡ The results rival digital diffusion models, pointing to ultra-fast, energy-efficient AI powered by photonics.

Optical generative models | Nature. Paper: nature.com/articles/s4158…
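The free-space decoding step described above can be emulated digitally with scalar diffraction. A minimal sketch, assuming a phase-only modulator and angular-spectrum propagation; the wavelength, pixel pitch, and propagation distance are placeholder values, not the UCLA team's parameters:

```python
import numpy as np

def angular_spectrum_propagate(field, wavelength, dx, z):
    """Propagate a complex field a distance z in free space using the
    angular-spectrum method (scalar diffraction, FFT-based)."""
    n = field.shape[0]
    fx = np.fft.fftfreq(n, d=dx)          # spatial frequencies [1/m]
    FX, FY = np.meshgrid(fx, fx)
    # Free-space transfer function; evanescent components are cut off.
    arg = 1.0 / wavelength**2 - FX**2 - FY**2
    H = np.exp(2j * np.pi * z * np.sqrt(np.maximum(arg, 0.0)))
    H[arg < 0] = 0.0
    return np.fft.ifft2(np.fft.fft2(field) * H)

rng = np.random.default_rng(0)
phase = rng.uniform(0, 2 * np.pi, size=(64, 64))  # stand-in for the encoder's phase pattern
field = np.exp(1j * phase)                        # phase-only modulation, unit amplitude
out = angular_spectrum_propagate(field, wavelength=532e-9, dx=8e-6, z=5e-3)
image = np.abs(out) ** 2                          # a detector records intensity
print(image.shape)
```

In the optical system this propagation happens at the speed of light with no compute; the simulation above only stands in for what the free-space decoder does physically.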
Tom Barrett retweetledi
Jorge Bravo Abad
Jorge Bravo Abad@bravo_abad·
Solving the many-electron Schrödinger equation with Transformers

Every material property, in principle, comes from solving the many-electron Schrödinger equation. But the math is brutal: the Hilbert space grows exponentially, and even the best methods—DFT, coupled-cluster, DMRG—hit hard limits when strong electron correlation or large active spaces appear.

Honghui Shang and coauthors present QiankunNet, a neural-network quantum state inspired by large language models. At its core is a Transformer wavefunction ansatz, where attention captures long-range electron correlations directly. Instead of slow Markov chains, it uses autoregressive sampling—generating uncorrelated electron configurations one by one, guided by Monte Carlo tree search. Physics-informed initialization from truncated CI keeps the model close to physical reality from the start.

The result is striking: QiankunNet recovers 99.9% of FCI correlation energy for molecules up to 30 spin orbitals, handles N₂/cc-pVDZ (56 qubits, 14 e⁻) within 3.3 mHa of a DMRG reference, and even tackles the Fenton reaction with a CAS(46e,26o) active space—capturing complex multi-reference chemistry around Fe(II)/Fe(III) oxidation. Compared to previous NNQS, it is both faster (∼10× at 30 orbitals) and more accurate.

This points toward a future where attention models don't just process words, but represent quantum wavefunctions—bringing LLM-inspired architectures into the heart of quantum chemistry.

Paper: nature.com/articles/s4146…
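The autoregressive sampling mentioned above—generating an electron configuration one orbital at a time while conserving particle number—can be sketched as a toy. The uniform `cond_prob` stands in for the Transformer's learned conditional; this is not QiankunNet's code:

```python
import numpy as np

def sample_configuration(n_orbitals, n_electrons, cond_prob, rng):
    """Autoregressively sample one occupation bit-string with a fixed
    particle number, one orbital at a time. cond_prob(prefix) plays the
    role of the network: the probability the next orbital is occupied."""
    config = []
    for i in range(n_orbitals):
        remaining = n_electrons - sum(config)
        slots_left = n_orbitals - i
        if remaining == 0:
            p = 0.0                  # no electrons left to place
        elif remaining == slots_left:
            p = 1.0                  # must fill every remaining orbital
        else:
            p = cond_prob(config)
        config.append(int(rng.random() < p))
    return config

rng = np.random.default_rng(0)
uniform = lambda prefix: 0.5         # stand-in for the learned conditional
cfg = sample_configuration(n_orbitals=10, n_electrons=4, cond_prob=uniform, rng=rng)
print(cfg, sum(cfg))                 # always exactly 4 occupied orbitals
```

Because each configuration is drawn in a single left-to-right pass, samples are independent by construction—the advantage over Markov-chain sampling the tweet alludes to.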
Tom Barrett retweetledi
InstaDeep
InstaDeep@instadeepai·
🔧 Introducing AbBFN2: our multi-modal antibody foundation model. AbBFN2 jointly models 45 data modes spanning sequences, genetic information and developability attributes to provide a rich framework with which to define conditional generation tasks. Join Research Scientist Bora Guloglu as they explore AbBFN2's integrated approach to antibody design— using a unified generative model to potentially enhance scientific efficiency.
Tom Barrett retweeted
Karim Beguir @kbeguir:
🧬Introducing AbBFN2, our latest generative AI model for multi-objective antibody design!✨ Built on our BFN work published in @NatureComms, AbBFN2 masters the dependencies between sequence, genetic attributes, and developability, taking antibody design to the next level! 🧵
Tom Barrett @tomdbarrett:
AbBFN2 highlights how Bayesian Flow Networks enable "condition anywhere, generate anywhere," transforming antibody design workflows from annotation and prediction to complex optimisation tasks.
Tom Barrett @tomdbarrett:
Excited to announce our latest work: AbBFN2, a generative antibody model co-modelling sequences, genetic origins, and developability attributes across 45 diverse data modalities! 🧬🔬 🧵 Thread to follow, including links to the paper 📄 , code 💻, blog ✒️ and web app 🌐!
Tom Barrett retweeted
Biology+AI Daily @BiologyAIDaily:
AbBFN2: A flexible antibody foundation model based on Bayesian Flow Networks

1. AbBFN2 is a generative foundation model for antibodies built on the Bayesian Flow Network (BFN) paradigm, allowing conditional generation across 45 sequence, genetic, and biophysical data modes without task-specific training.
2. Unlike typical models trained for one specific task, AbBFN2 consolidates multiple design tasks—such as sequence inpainting, humanisation, biophysical property optimisation, and de novo library generation—into a unified, flexible framework.
3. The model is trained on over 2M paired human, mouse, and rat antibody sequences from the OAS database, annotated with species, germline gene identity, CDR lengths, and TAP-derived developability metrics.
4. AbBFN2 learns the joint distribution of these diverse data modes, enabling concurrent generation of antibody sequences and associated labels that faithfully reflect natural antibodies across sequence, structural, and genetic properties.
5. Generated sequences from AbBFN2 match held-out data in CDR loop lengths, amino acid frequencies, and structural conformations, as validated by dynamic time warping and t-SNE embedding of loop structures.
6. Germline gene usage patterns learned by the model match natural V(D)J recombination biases, indicating its understanding of antibody genetics even though it operates on amino acid sequences.
7. AbBFN2 achieves state-of-the-art performance in sequence annotation, outperforming tools like ANARCI and IgBLASTp in predicting germline gene identities and biophysical properties from sequences.
8. In sequence inpainting tasks, AbBFN2 accurately recovers framework and CDR residues, with Rosetta-predicted VH-VL interface stabilities indistinguishable from real antibodies, suggesting functional plausibility.
9. For sequence humanisation, AbBFN2 performs iterative masked sampling guided by species logits, achieving >95% human confidence across 25 precursor antibodies while preserving structural similarity (mean CDR RMSD ~0.7 Å).
10. The model's species classification logits correlate with clinical anti-drug antibody (ADA) response rates (R = –0.52), matching results from p-IgGen and supporting its use in immunogenicity risk assessment.
11. In a multi-objective optimisation task, AbBFN2 successfully humanised 91 non-human antibodies while removing TAP liabilities, generating diverse candidates with developable biophysical profiles in a single-step workflow.
12. AbBFN2 can generate rare antibody types conditionally—e.g., 1715 VRC01-like antibodies with rare germline, CDR loop length, and developability constraints—demonstrating its ability to explore highly specific regions of sequence space.
13. By merging multiple antibody design objectives into a single generative pass, AbBFN2 minimizes the inefficiencies of sequential pipelines and provides a practical, tunable platform for next-gen therapeutic antibody design.

💻 Code: github.com/instadeepai/Ab…
📜 Paper: biorxiv.org/content/10.110…

#AntibodyDesign #ProteinEngineering #MachineLearning #FoundationModels #Bioinformatics #Therapeutics #AntibodyEngineering #BayesianFlowNetwork #GenerativeAI #Biotech
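The iterative masked sampling described for humanisation (point 9) can be caricatured with a toy accept/revert loop. This is a hedged illustration only: `humanness` here is a crude stand-in for AbBFN2's species logits, the sequences are made up, and the real workflow resamples masked positions from the generative model rather than uniformly:

```python
import random

AA = "ACDEFGHIKLMNPQRSTVWY"

def humanness(seq, human_ref):
    """Toy species score: fraction of positions matching a human reference."""
    return sum(a == b for a, b in zip(seq, human_ref)) / len(human_ref)

def iterative_masked_humanise(seq, human_ref, preserve, steps=500, seed=0):
    """Repeatedly mask one non-preserved position, resample it, and keep
    the change only if the humanness score does not drop. `preserve`
    marks positions (e.g. CDRs) that must keep their original residues."""
    rng = random.Random(seed)
    seq = list(seq)
    score = humanness(seq, human_ref)
    for _ in range(steps):
        i = rng.randrange(len(seq))
        if preserve[i]:
            continue                   # never touch preserved (CDR) positions
        old = seq[i]
        seq[i] = rng.choice(AA)        # stand-in for sampling from the model
        new = humanness(seq, human_ref)
        if new >= score:
            score = new                # accept the proposal
        else:
            seq[i] = old               # revert
    return "".join(seq), score

mouse = "QVQLKESGPGLVAPSQSLSI"         # made-up precursor fragment
human = "QVQLVESGGGLVQPGGSLRL"         # made-up human reference
preserve = [False] * len(mouse)
preserve[8:12] = [True] * 4            # pretend positions 8-11 are a CDR
out, score = iterative_masked_humanise(mouse, human, preserve)
print(out, round(score, 2))
```

The point the loop illustrates is the constraint structure: framework positions drift toward the human distribution while the binding loops are held fixed, which is why the paper can report high human confidence alongside low CDR RMSD.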