Eva Dyer

203 posts


@evadyer

Associate Professor @GeorgiaTech

Atlanta, GA · Joined August 2011
358 Following · 916 Followers
Eva Dyer reposted
Kenji Lee
Josh Siegle and I are thrilled to be chairing the first workshop on "Bridging the gap between cell types and spike trains"! We see this as the key link between population-level descriptions of dynamics and real mechanistic understanding from cell types. celltypestospikesworkshop.github.io/2026/
1 reply · 16 reposts · 95 likes · 6.3K views
Eva Dyer reposted
Nanda H Krishna (@nandahkrishna)
I’ll be presenting POSSM at #NeurIPS2025 tomorrow, together with @averyryoo. Come by our poster for cute stickers and to chat about neural decoding, BCIs, and foundation models for neuroscience! 🧠🤖 🗓️ Dec 3rd 🕚 11:00 am – 2 pm 📍 Poster #2000 Exhibit Hall C,D,E
Quoting Nanda H Krishna (@nandahkrishna):

New preprint! 🧠🤖 How do we build neural decoders that are: ⚡️ fast enough for real-time use 🎯 accurate across diverse tasks 🌍 generalizable to new sessions, subjects, and species? We present POSSM, a hybrid SSM architecture that optimizes for all three of these axes! 🧵1/7

2 replies · 7 reposts · 21 likes · 3.8K views
Eva Dyer reposted
Vinam Arora (@vinam_arora)
Excited to share our #NeurIPS2025 work: NuCLR, a framework for learning neuron-level representations 🧠 These embeddings capture the biological identity of neurons and work out-of-the-box on new animals; no finetuning needed 💃 This offers some of the first evidence that large-scale neuroscience models can truly generalize across animals. Paper: arxiv.org/abs/2512.01199 Code: github.com/nerdslab/nuclr If you are at NeurIPS in San Diego, come find us at Poster Session 5 (11am-3pm PT, Exhibit Hall C,D,E, # 2107) 🎉 1/x 🧵
4 replies · 33 reposts · 137 likes · 23.2K views
Eva Dyer reposted
Mehdi Azabou @ NeurIPS (@mehdiazabou)
I’m excited to share that I’ve started a new company: Constellation. Our thesis is simple: the next frontier of AI will be in modeling human experience in all its richness: brain🧠, body🧍and environment 🌐 I’ll be at NeurIPS with my co-founder @Biofall. We're hiring, DM me!
4 replies · 4 reposts · 18 likes · 1.6K views
Eva Dyer reposted
Mehdi Azabou @ NeurIPS (@mehdiazabou)
The Foundation Models for the Brain and Body workshop is happening this week at #NeurIPS2025 🏝️🧠 We have an amazing lineup of keynote speakers, spotlight talks, posters and demos. We can’t wait to welcome everyone on Saturday!
2 replies · 10 reposts · 31 likes · 4.8K views
Eva Dyer reposted
The Transmitter (@_TheTransmitter)
How can we make progress in developing a general model of neural computation rather than a series of disjointed models tied to specific experimental circumstances, ask @evadyer and Blake Richards @tyrell_turing in the latest entry in our NeuroAI series. thetransmitter.org/neuroai/accept…
1 reply · 13 reposts · 26 likes · 6.4K views
Eva Dyer reposted
Denise J. Cai, Ph.D. (@denisejcai)
Looking forward to a visit from @evadyer on Thursday! Eva is working at the forefront of the intersection between machine learning, neuroscience, and neuroAI 🧠 Come check out her talk and learn more about her work here: dyerlab.gatech.edu
5 reposts · 25 likes · 2.6K views
Eva Dyer reposted
Cole Hurwitz (@cole_hurwitz)
What will a foundation model for the brain look like? We argue that it must be able to solve a diverse set of tasks across multiple brain regions and animals. Check out our preprint where we introduce a multi-region, multi-animal, multi-task model (MtM): arxiv.org/abs/2407.14668
5 replies · 62 reposts · 257 likes · 36.4K views
Eva Dyer reposted
Georgia Tech Computing (@gtcomputing)
Meet the @GeorgiaTech experts who are helping unlock the future of #AI. These experts will share their latest research findings in machine learning on the world stage at @icmlconf (July 21-27). Georgia Tech experts are part of 40 teams presenting new research, and the institute is the lead organization on 22 of them. Explore the work through interactive 📊 charts and news highlights from @GTCSE: 🔗sites.gatech.edu/research/icml-…
1 reply · 10 reposts · 45 likes · 8.2K views
Eva Dyer reposted
Ted Werbel (@tedx_ai)
This might be the secret to breaking through the next plateau in deeper reasoning, planning and retrieval capabilities for AI agents 🤔 LGGMs (large generative graph models) are on the rise!

While @Adobe and @intel were first at it with LGGMs, researchers at @GeorgiaTech have trained their own model called GraphFM (links below). It's important that more progress is made on this front to improve causal grounding with graph-based retrieval, DAG generation for LLM Compiler-like planning mechanisms, and graph-based self-discovery + continual learning agents (as in CLIN), to further enhance reasoning, decision-making and environmental grounding for AI agents.

Existing implementations of knowledge graph generation (like GraphRAG) rely on LLMs to define entities/relationships, which isn't always accurate... moving to LGGMs may finally unlock the potential of graphs for many of the use cases outlined above. Excited to see some infrastructure providers start scaling and offering these kinds of models in the next 6-12 months; I think they will play a critical role in making agents substantially more reliable when combined with the design patterns linked below, especially alongside optimization frameworks like DSPy and Agent Symbolic Learning. Have a feeling that domain-specific SGMs (small graph models), or frameworks to build your own SGMs for distributed agentic systems, will be next to come…

Read for yourself, connect the dots, and thank @divyyansha1115, @mehdiazabou, @vinam_arora and @evadyer for their amazing work! 🔥

- GraphFM by Georgia Tech: arxiv.org/abs/2407.11907
- LGGMs by Intel & Adobe: arxiv.org/pdf/2406.05109
- LGGM Code, Demo & Datasets: lggm-lg.github.io
- Self-Discover: arxiv.org/abs/2402.03620
- CLIN: arxiv.org/abs/2310.10134
- LLM Compiler: arxiv.org/abs/2312.04511
- DSPy: github.com/stanfordnlp/ds…
- Agent Symbolic Learning: arxiv.org/abs/2406.18532
2 replies · 3 reposts · 15 likes · 1.4K views
Eva Dyer reposted
Patrick Mineault (@patrickmineault)
Be a content creator for the Neuromatch NeuroAI course! We're looking for people to write tutorials on transfer learning and RL in PyTorch over the next 3 weeks. If you want to help build this amazing course, DM me for details or fill out this app: airtable.com/app32npl2ZlbJv…
4 replies · 13 reposts · 41 likes · 7.7K views
Eva Dyer reposted
Chris Versteeg (@chris_versteeg)
Cosyne Workshop Alert! On Tuesday March 5th, @chethan and I are proud to bring you: Understanding Neural Computation using Task-trained and Data-trained Networks. youtu.be/bJ0stLORdgQ
3 replies · 20 reposts · 60 likes · 12.2K views
Eva Dyer reposted
Jascha Sohl-Dickstein (@jaschasd)
Have you ever done a dense grid search over neural network hyperparameters? Like a *really dense* grid search? It looks like this (!!). Bluish colors correspond to hyperparameters for which training converges, reddish colors to hyperparameters for which training diverges.
298 replies · 2.2K reposts · 11.3K likes · 1.8M views
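The kind of convergence/divergence map the tweet describes can be sketched with a toy experiment. Everything below is an illustrative assumption, not the author's actual setup: a quadratic loss, plain momentum gradient descent, and a "stays finite" test standing in for convergence.

```python
import numpy as np

def trains_stably(lr, momentum, steps=100):
    """Run toy momentum gradient descent on f(w) = w^2 and report
    whether the iterate stays finite (converged) or blows up (diverged)."""
    w, v = 1.0, 0.0
    for _ in range(steps):
        grad = 2 * w                  # d/dw of w^2
        v = momentum * v - lr * grad
        w = w + v
        if not np.isfinite(w) or abs(w) > 1e6:
            return False              # diverged -> "reddish" cell
    return True                       # stayed bounded -> "bluish" cell

# Dense 2-D grid over learning rate and momentum
lrs = np.linspace(0.01, 2.0, 50)
moms = np.linspace(0.0, 0.99, 50)
grid = np.array([[trains_stably(lr, m) for lr in lrs] for m in moms])
# `grid` is a boolean convergence map; imshow(grid) would display the
# kind of converge/diverge boundary the tweet's figure shows.
```

Even this toy produces a sharp boundary in hyperparameter space (e.g. small learning rates converge, `lr = 1.5` with no momentum diverges), which is the qualitative phenomenon being visualized at scale.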
Eva Dyer reposted
Sara Hooker (@sarahookr)
Today, I am very proud to share what we have been working on for the last 14 months. ✨ Introducing Aya -- a new state of the art for massively multilingual models. 🔥🎉
Quoting Cohere Labs (@Cohere_Labs):

Today, we’re launching Aya, a new open-source, massively multilingual LLM & dataset to help support under-represented languages. Aya outperforms existing open-source models and covers 101 different languages – more than double the number covered by previous models. cohere.com/research/aya

70 replies · 158 reposts · 998 likes · 97.9K views
Eva Dyer reposted
Kording Lab 🦖 (@KordingLab)
"Why the simplest explanation isn’t always the best" - commentary with @evadyer highlighting how dimensionality reduction does not usually give us what we want. pnas.org/doi/10.1073/pn… "Major Achievement: Dino scatterplot in paper" unlocked.
7 replies · 65 reposts · 253 likes · 23.8K views
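A toy illustration of the commentary's point, on synthetic data invented for this sketch (not from the paper): the top principal component follows variance, so it can lock onto a high-variance nuisance direction and throw away a low-variance direction that actually carries the labels.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1000

# High-variance nuisance direction vs. low-variance signal direction
nuisance = rng.normal(0, 5.0, n)       # big variance, no label information
signal = rng.normal(0, 0.5, n)         # small variance...
labels = (signal > 0).astype(int)      # ...but it fully determines the labels
X = np.column_stack([nuisance, signal])

# Top principal component = leading eigenvector of the covariance matrix
Xc = X - X.mean(axis=0)
cov = Xc.T @ Xc / n
eigvals, eigvecs = np.linalg.eigh(cov)  # eigh sorts eigenvalues ascending
pc1 = eigvecs[:, -1]

# PC1 aligns with the nuisance axis, so a 1-D PCA projection predicts
# the labels at roughly chance, while the discarded axis is perfect:
proj = Xc @ pc1
acc_pc1 = np.mean((proj > 0) == labels)        # ~0.5 (chance)
acc_signal = np.mean((X[:, 1] > 0) == labels)  # 1.0 by construction
```

The dimensionality-reduced view "explains" nearly all the variance while discarding exactly the structure a downstream question cares about, which is the failure mode the commentary highlights.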
Eva Dyer reposted
Guillaume Lajoie (@g_lajoie_)
Neural spiking data and Transformers are a tricky match. Temporal segmentation and tokenization are the crux. Together with an all-star team, we figured out a scalable solution. The results are exciting: training and transferring on multi-session, multi-subject neural decoding tasks.
Quoting Mehdi Azabou (@mehdiazabou):

Is a universal brain decoder possible? Can we train a decoding system that easily transfers to new individuals/tasks? Check out our #NeurIPS2023 paper where we show that it’s possible to transfer from a large pretrained model to achieve SOTA 🧠! Link: poyo-brain.github.io 🧵

5 reposts · 65 likes · 5.2K views
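One plausible reading of the tokenization idea mentioned above is to emit one token per spike rather than binning time. The synthetic spikes and the exact token layout below are illustrative assumptions for this sketch, not the paper's implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic spiking data: each spike is a (unit_id, time_in_seconds)
# pair, the form spike sorting produces from extracellular recordings.
n_units, n_spikes = 4, 20
spikes = list(zip(rng.integers(0, n_units, n_spikes).tolist(),
                  rng.uniform(0.0, 1.0, n_spikes).tolist()))
spikes.sort(key=lambda s: s[1])  # order the token sequence by spike time

# One token per spike, no temporal binning: the unit id indexes a
# learned per-unit embedding, and the raw timestamp feeds a continuous
# positional/time encoding, so irregular spike timing is preserved
# exactly and sequence length simply tracks the spike count.
unit_ids = np.array([u for u, _ in spikes])    # -> embedding lookup
timestamps = np.array([t for _, t in spikes])  # -> time encoding
```

Under this scheme, transferring to a new session or subject mainly means fitting new unit embeddings, while the sequence model over tokens is shared, which is one way a pretrained decoder could transfer as the quoted tweet describes.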