Emilien Dupont

93 posts

@emidup

phd student in machine learning @oxcsml @UniofOxford 🐳 previously research intern @Apple, computational maths @Stanford, theoretical physics @imperialcollege

California, USA · Joined October 2017
140 Following · 1.8K Followers
Emilien Dupont @emidup
We introduce 🌸✨ AlphaEvolve ✨🌸, an evolutionary coding agent using LLMs coupled with automatic evaluators, to tackle open scientific problems 🧑‍🔬 and optimize critical pieces of compute infra ⚙️ deepmind.google/discover/blog/…
[GIF]
0 replies · 13 reposts · 56 likes · 5.4K views
Emilien Dupont retweeted
Jean-François Ton @jeanfrancois287
📢New Paper on Reward Modelling📢 Ever wondered how to choose the best comparisons when building a preference dataset for LLMs? Our latest paper revives classic statistical methods to do it optimally! Here’s a 🧵on how it works 👇 arxiv.org/abs/2502.04354
[image]
1 reply · 9 reposts · 37 likes · 8.5K views
Emilien Dupont retweeted
Stephen James @stepjamUK
🚨Important update from our Robot Learning Lab in London. Following recent news, we’re moving on after a wonderful 2 years… Today, we unveil 4 big pieces of research from our incredible team. Check out the compilation video and thread below to see our final work! 📽️👇
5 replies · 37 reposts · 184 likes · 31.1K views
Emilien Dupont @emidup
We build neural codecs from a *single* image or video, achieving compression performance close to SOTA models trained on large datasets, while requiring ~100x fewer FLOPs for decoding ⚡ #CVPR2024 c3-neural-compression.github.io
[image]
4 replies · 35 reposts · 160 likes · 13.6K views
Emilien Dupont @emidup
We present #FunSearch in @Nature today - a system combining LLMs with evolutionary search to generate new discoveries in math and computer science! 👩‍🔬🔬✨
Google DeepMind @GoogleDeepMind

Introducing FunSearch in @Nature: a method using large language models to search for new solutions in mathematics & computer science. 🔍 It pairs the creativity of an LLM with an automated evaluator to guard against hallucinations and incorrect ideas. 🧵 dpmd.ai/x-funsearch

3 replies · 4 reposts · 45 likes · 2.6K views
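The loop described in the FunSearch thread above (an LLM proposes candidate programs, an automated evaluator scores them to guard against hallucinations, and evolutionary selection keeps the best) can be sketched in a few lines. This is a toy illustration, not the published system: `llm_mutate` is a random stand-in for the LLM proposal step, and the integer-list "programs" with their toy target are assumptions made purely for the sketch.

```python
import random

def evaluate(program):
    # Automated evaluator: scores a candidate "program". Here a program is
    # just a list of integers and the score is distance to a toy target;
    # FunSearch instead runs generated code on real problem instances.
    target = [3, 1, 4]
    return -sum((a - b) ** 2 for a, b in zip(program, target))

def llm_mutate(parent):
    # Stand-in for the LLM proposal step (an assumption of this sketch):
    # a real system prompts an LLM with high-scoring programs instead.
    child = list(parent)
    i = random.randrange(len(child))
    child[i] += random.choice([-1, 1])
    return child

def funsearch_loop(iterations=2000, pool_size=10, seed=0):
    random.seed(seed)
    pool = [[0, 0, 0]]  # initial candidate
    for _ in range(iterations):
        # Select a parent: best of a small random tournament.
        parent = max(random.sample(pool, min(3, len(pool))), key=evaluate)
        pool.append(llm_mutate(parent))
        # Keep only the highest-scoring candidates (the evolutionary step).
        pool = sorted(pool, key=evaluate, reverse=True)[:pool_size]
    return pool[0]

best = funsearch_loop()
```

The key design point, per the tweet, is that the evaluator never trusts the generator: every proposal is scored before it can enter the pool.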
Emilien Dupont retweeted
Jin Xu @jinxu06
We construct neural processes by iteratively transforming a simple stochastic process into an expressive one, similar to flow/diffusion-based models, but in function space! Join us at our #NeurIPS2023 poster session: neurips.cc/virtual/2023/p… on Wednesday morning!
[image]
2 replies · 8 reposts · 44 likes · 5.2K views
Emilien Dupont retweeted
Jonathan Richard Schwarz @schwarzjn_
Very happy to announce that our latest paper on Neural data compression with INRs, Meta Learning & Sparse Subnetwork selection has been accepted to #ICML2023 (Scores 7, 7, 7). 1/N Paper: arxiv.org/abs/2301.09479
[image]
1 reply · 14 reposts · 94 likes · 18.6K views
Emilien Dupont retweeted
Bobby @bobby_he
Can deep transformers be trained without skip connections nor normalisation layers? Our ICLR 2023 paper shows you how, using wide NN signal propagation ideas. We hope this can potentially pave the way to more efficient deep LLMs! (1/9) Paper: arxiv.org/abs/2302.10322
[GIF]
9 replies · 75 reposts · 451 likes · 103.7K views
Emilien Dupont retweeted
Hyunjik Kim @hyunjik11
Previously we had introduced *functa*, a framework for representing data as neural functions (aka neural fields, INRs) and doing deep learning on them. In our recent work *spatial functa* we show how to scale up the approach to ImageNet-1k 256x256. 📝arxiv.org/abs/2302.03130
[image]
Hyunjik Kim @hyunjik11

Ever wondered why deep learning is always done on array data?🤔 Happy to announce our work: From data to functa: Your data point is a function and you can treat it like one 📝arxiv.org/abs/2201.12204 w/ @emidup @arkitus @DaniloJRezende @danrsm, to appear in ICML22

1 reply · 36 reposts · 176 likes · 48.4K views
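The functa thread above (and the single-image C3 codec earlier in the feed) rests on the same idea: a datapoint is a function from coordinates to values, represented by the weights of a small network fit to that one datapoint (an implicit neural representation, aka neural field). A minimal sketch, assuming a 1-D signal and a tiny tanh MLP trained with plain gradient descent; the sizes, learning rate, and sine target are illustrative choices, not details from the papers.

```python
import numpy as np

# Treat one datapoint (a 1-D signal) as a function x -> y and fit a
# network to it. The fitted weights then *are* the representation.
rng = np.random.default_rng(0)
x = np.linspace(-1.0, 1.0, 64)[:, None]  # input coordinates
y = np.sin(3.0 * x)                      # signal values to represent

# Tiny one-hidden-layer tanh MLP.
W1 = rng.normal(0.0, 2.0, (1, 32)); b1 = np.zeros(32)
W2 = rng.normal(0.0, 0.1, (32, 1)); b2 = np.zeros(1)

lr = 0.1
for _ in range(5000):
    h = np.tanh(x @ W1 + b1)             # hidden activations
    err = (h @ W2 + b2) - y              # prediction error
    # Gradient descent on squared error (constant factors folded into lr).
    gW2 = h.T @ err / len(x); gb2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1.0 - h**2)     # backprop through tanh
    gW1 = x.T @ dh / len(x); gb1 = dh.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

# Reconstruction error of the fitted representation.
mse = float(np.mean(((np.tanh(x @ W1 + b1) @ W2 + b2) - y) ** 2))
```

Once data live as weights like this, downstream deep learning (classification, generation) can operate on the weight vectors themselves, which is the "doing deep learning on functa" part of the thread.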