Markus Frey
@CYHSM
206 posts

Post-Training @ Lamarr Institute https://t.co/tv32QsnNsx

Joined August 2015
336 Following · 464 Followers

Pinned Tweet
Markus Frey @CYHSM·
Our paper "Magnetic resonance-based eye tracking using deep neural networks" is now published in @NatureNeuro Paper: nature.com/articles/s4159… Code: github.com/DeepMReye/Deep… Source Data & Model weights: osf.io/mrhk9/
Matthias Nau @NauMatt

#DeepMReye is out! Use #deeplearning to perform #eyetracking in #fMRI without camera! nature.com/articles/s4159… @CYHSM & I are thrilled to finally share our Code + Data + Notebooks + User Documentation + Paper @NatureNeuro! With @doellerlab @KISNeuro @MPI_CBS Thread below!👇1/7

Markus Frey @CYHSM·
I will be presenting this at the ICLR 2026 Latent & Implicit Thinking Workshop. Paper Link: arxiv.org/abs/2603.08391 We are also hiring at Lamarr Institute. If you're interested in working on these kinds of problems, feel free to reach out.
Markus Frey @CYHSM·
RT @maxplanckpress: Congratulations to all our amazingly talented young researchers who were honoured with an Otto-Hahn-Medal today for the…
Markus Frey reposted
Mackenzie Weygandt Mathis, PhD @TrackingActions·
🎉 new preprint posted of potential interest, led by talented master's student @AChRD4: 🔥fully unsupervised 3D cell segmentation that is as good as (or better than) leading supervised methods, with @napari_imaging integration! 👏 biorxiv.org/content/10.110… ➕new 3D GT dataset ⬇️🧵
Lab of Adaptive Intelligence @EPFL @mwmathislab

Have you tried #CellSeg3D yet for automating your 3D lightsheet microscopy analysis? We just hit >75K pip installations! 🥳 We have some 🔥 coming soon ... but in the meantime, github.com/AdaptiveMotorC… led by @AChRD4 @Neuro_X_EPFL

Markus Frey reposted
AndrejBicanski @AndrejBicanski·
Dear colleagues, please circulate these ads for two fully funded PhD positions in my new group. Topics can cover grid cells, retrosplenial cortex, head direction, subiculum and boundary coding + cog maps generally. cbs.mpg.de/stellenmarkt/s…
Markus Frey @CYHSM·
@ProfData @ken_lxl @Rob_Mok Yes CLIP is trained using image-text pairs but the vision and language layers are separated, with the language layers capturing less of the spatial information (see lesioning plot); also, there are many spatial cells in models with random weights
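The separation described here can be illustrated with Hugging Face's CLIP implementation (a sketch, not the analysis from the blog post; it builds a randomly initialised CLIP, so no weights are downloaded, which also echoes the random-weights point):

```python
# Illustration: CLIP's vision and text towers are independent submodules
# that only meet in the joint embedding space via two projection heads.
# A randomly initialised model is enough to show the architecture.
from transformers import CLIPConfig, CLIPModel

model = CLIPModel(CLIPConfig())  # random weights, no download needed

# Collect parameter identities of each tower
vision_params = {id(p) for p in model.vision_model.parameters()}
text_params = {id(p) for p in model.text_model.parameters()}

# No parameter is shared between the two encoders; only the separate
# visual_projection / text_projection heads map into the joint space.
print(vision_params.isdisjoint(text_params))  # True
```

Because the towers share no parameters, lesioning the language layers leaves the vision pathway intact, which is what makes the lesioning comparison in the thread meaningful.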
Bradley Love @ProfData·
@CYHSM @ken_lxl @Rob_Mok Cool blog! Thanks for sharing. My memory (probably wrong) of CLIP is that it has a joint embedding of visual and text (caption) information. I wonder if language could introduce spatial info? It's nice to see another case of spatial info lurking where one might not suspect.
Bradley Love @ProfData·
"The inevitability and superfluousness of cell types in spatial cognition" w @ken_lxl @Rob_Mok Whether place, border, head direction, Jennifer Aniston, or whatever cells, are we fooling ourselves? Are these intuitive findings scientific discoveries? 1/6 doi.org/10.1101/2024.0…
Markus Frey reposted
Mackenzie Weygandt Mathis, PhD @TrackingActions·
🚨job alert! Silvia Arber, Alex Mathis, and I are looking for a postdoc to work with us on hierarchical modeling of sensorimotor control, using a lot of incredible data 🔥🐭 This builds on our lines of forelimb motor control tasks & modeling! Ad: recruiting.epfl.ch/Vacancies/3178…
Markus Frey reposted
Mona Garvert @mona_garvert·
I am hiring a PhD student in cognitive computational #neuroscience! 👩🏻‍🎓👨🏻‍🎓Come and help us discover how the human brain learns, decides and guides flexible behaviour in beautiful Würzburg 🧠🙌 Find out more & apply here: shorturl.at/twHL9 @bioDGPs_DGPA @FENSorg @SfNtweets
Markus Frey reposted
Martin Hebart @martin_hebart·
I am looking for two postdocs to join us at Giessen University @jlugiessen, in collab. with the Max Planck Institute @MPI_CBS in Leipzig, funded by @ERC_Research & @ProLOEWE. We have exciting projects lined up but also offer a lot of freedom to implement your own ideas!👇 pls RT!
Markus Frey reposted
Auschwitz Memorial @AuschwitzMuseum·
@elonmusk Support memory by excluding tweets of @AuschwitzMuseum from the limits. Most of our tweets commemorate individual victims of the Auschwitz camp on their birthdays. We remember their names, their faces and their fates. They all deserve to be seen and remembered.
Markus Frey reposted
Matthias Nau @NauMatt·
#DeepMReye has a new @streamlit app! 🎊 Camera-less #eyetracking in #fMRI has never been easier. github.com/DeepMReye/Deep… ◦ Install & open ◦ Pick a pretrained model ◦ Upload fMRI data ◦ Download gaze coordinates Kudos to @CYHSM for the app & thanks to all contributors! 1/3
Markus Frey @CYHSM

@NauMatt and I are excited to introduce DeepMReye v0.2 🥳 Now featuring a local Streamlit app for efficient extraction of gaze coordinates using pre-trained model weights: github.com/DeepMReye/Deep… Please let us know if you run into any problems with this!

Markus Frey @CYHSM·
@NauMatt and I are excited to introduce DeepMReye v0.2 🥳 Now featuring a local Streamlit app for efficient extraction of gaze coordinates using pre-trained model weights: github.com/DeepMReye/Deep… Please let us know if you run into any problems with this!
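The install-and-run flow behind this announcement can be sketched as a shell session (a minimal sketch: the PyPI package name `deepmreye` is real, but the app script path below is an assumption — see the repository README, "Option 4: Streamlit app", for the exact command):

```shell
# Install DeepMReye and Streamlit
pip install deepmreye streamlit

# Clone the repository to get the app script
# (path below is an assumption; check the README for the real location)
git clone https://github.com/DeepMReye/DeepMReye.git
cd DeepMReye

# Launch the local app in the browser, then pick a pretrained model,
# upload fMRI data, and download the predicted gaze coordinates
streamlit run deepmreye/streamlit/app.py
```

Running everything locally means the fMRI data never leaves your machine, which matters for participant privacy.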
Markus Frey @CYHSM·
Excited to share our work on scene perception in artificial neural networks #CVPR2023. Will talk about egocentric to allocentric reference frame transformations & unsupervised scene segmentation (10.30am - TUE-AM, 202) Can't make it? See Video below👇 youtu.be/anPiPsqjJz8