
pierre orhan
@PierreOrhan
Exploring with giants’ treasure maps. PhD student with Yves Boubenec and Jean-Rémi King

This is often misunderstood by commenters. The cost of training a model is not just the cost of the final run: it is that cost plus the cost of all the preceding experiments and tuning runs.
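A toy back-of-the-envelope sketch of this accounting (all figures are made up for illustration, not real training costs):

```python
# Hypothetical cost accounting for a model training project (arbitrary units).
# The headline number people quote is usually `final_run`; the real spend
# also includes every sweep, ablation, and failed run that preceded it.
final_run = 100
experiments = [5, 10, 2, 30, 15]  # hypothetical preliminary runs and sweeps

total = final_run + sum(experiments)
print(total)  # 162 -- substantially more than the final run alone
```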

We then showed with optogenetic experiments and @PierreOrhan's clever simulations that it's because interneurons inherit their tuning from local excitatory inputs and NOT from the upstream thalamic inputs. We predict this rule holds for other cortical networks... [3/n]
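A minimal numpy sketch of the idea (a toy illustration, not the paper's actual simulations): an interneuron that pools many local excitatory cells inherits the tuning of that local population rather than the tuning of any single upstream input.

```python
import numpy as np

# Toy model: interneuron response = positively weighted sum of local
# excitatory responses. All parameters here are illustrative assumptions.
rng = np.random.default_rng(0)
stim = np.linspace(-np.pi, np.pi, 181)  # stimulus feature axis (e.g. orientation)

def tuning(pref, width=0.5):
    """Gaussian tuning curve centred on a preferred feature value."""
    return np.exp(-((stim - pref) ** 2) / (2 * width ** 2))

# Local excitatory population: preferences clustered around 0
exc_prefs = rng.normal(0.0, 0.3, size=50)
exc_rates = np.stack([tuning(p) for p in exc_prefs])  # shape (50, 181)

# Interneuron pools the local excitatory inputs with random positive weights
w = rng.random(50)
inter = w @ exc_rates

# Its preferred stimulus lands near the excitatory cluster's centre (~0),
# i.e. the interneuron "inherits" the local excitatory tuning.
print(stim[np.argmax(inter)])
```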

Want to train your own Bark/MusicGen-like TTS/TTA models? 👀 The SoTA EnCodec model by @MetaAI has now landed in 🤗Transformers! It supports compression down to 1.5 kbps and produces discrete audio representations. ⚡️
Model: huggingface.co/docs/transform…
Colab: github.com/Vaibhavs10/not…
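EnCodec's discrete audio representations come from residual vector quantization (RVQ): each stage picks the nearest codebook entry and the next stage quantizes the leftover residual. A self-contained toy sketch of that mechanism (random codebooks here, whereas EnCodec learns its codebooks end-to-end):

```python
import numpy as np

# Toy residual vector quantization (RVQ). Dimensions and codebook sizes are
# illustrative assumptions, far smaller than EnCodec's actual configuration.
rng = np.random.default_rng(0)
dim, codebook_size, n_quantizers = 8, 16, 4
codebooks = rng.normal(size=(n_quantizers, codebook_size, dim))

def rvq_encode(x):
    """Return one codebook index per stage; each stage codes the residual."""
    residual, codes = x.copy(), []
    for cb in codebooks:
        idx = int(np.argmin(((residual - cb) ** 2).sum(axis=1)))  # nearest code
        codes.append(idx)
        residual -= cb[idx]  # next stage quantizes what is left over
    return codes

def rvq_decode(codes):
    """Reconstruct by summing the selected codebook vectors."""
    return sum(cb[i] for cb, i in zip(codebooks, codes))

x = rng.normal(size=dim)       # stand-in for one frame of encoder output
codes = rvq_encode(x)          # the discrete representation
x_hat = rvq_decode(codes)      # approximate reconstruction
print(codes, float(np.linalg.norm(x - x_hat)))
```

Fewer quantizer stages means fewer bits per frame, which is how EnCodec trades reconstruction quality against bitrates as low as 1.5 kbps.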


A Cookbook of Self-Supervised Learning
Self-supervised learning, dubbed the dark matter of intelligence, is a promising path to advance machine learning. Yet, much like cooking, training SSL methods is a delicate art with a high barrier to entry. While many components are familiar, successfully training an SSL method involves a dizzying set of choices, from the pretext tasks to training hyper-parameters. Our goal is to lower the barrier to entry into SSL research by laying out the foundations and the latest SSL recipes in the style of a cookbook.
abs: arxiv.org/abs/2304.12210
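To make "pretext task" concrete, here is a minimal numpy sketch of one common SSL recipe, a SimCLR-style contrastive (InfoNCE) objective: embeddings of two augmented views of the same sample should match, while other samples in the batch serve as negatives. Embeddings are random stand-ins; a real pipeline would produce them with an encoder over augmented inputs.

```python
import numpy as np

rng = np.random.default_rng(0)
batch, dim, temperature = 4, 16, 0.5

def normalize(z):
    """Project embeddings onto the unit sphere (cosine similarity)."""
    return z / np.linalg.norm(z, axis=1, keepdims=True)

z1 = normalize(rng.normal(size=(batch, dim)))  # embeddings of view 1
z2 = normalize(rng.normal(size=(batch, dim)))  # embeddings of view 2

sim = z1 @ z2.T / temperature                  # pairwise similarities
# InfoNCE-style loss: each row's positive pair sits on the diagonal
log_softmax = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
loss = -np.mean(np.diag(log_softmax))
print(float(loss))
```

The "dizzying set of choices" the abstract mentions shows up even here: temperature, augmentations, batch size (more negatives), and the projection head all materially change what the encoder learns.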
