Anirvan Sengupta
@AnirvanMS
67 posts

Theoretical Physicist interested in Neuroscience, Machine Learning, Quantum Physics. Professor @RutgersU and Senior Research Associate @FlatironInst.

Joined January 2015
87 Following · 41 Followers
Anirvan Sengupta retweeted

Yasaman Bahri @yasamanbb
I'm looking forward to giving a talk tomorrow morning at the ICML workshop on High-Dimensional Learning Dynamics (HiDL) sites.google.com/view/hidimlear…. Come by at 9 am!
Anirvan Sengupta retweeted

Jiequn Han @JiequnH
Sad to miss #ICML2025 this time; go check out the awesome @rdMorel presenting our DISCO model. It learns the evolution operator from short trajectory data without knowing physics and stays SOTA on next-step prediction! Poster W-107, this Wed (July 16)!
Rudy Morel @rdMorel

For evolving unknown PDEs, ML models are trained on next-state prediction. But do they actually learn the time dynamics: the "physics"? Check out our poster (W-107) at #ICML2025 this Wed, Jul 16. Our "DISCO" model learns the physics while staying SOTA on next-state prediction!
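The setup DISCO addresses can be illustrated with a much simpler toy. The sketch below is my generic illustration, not the DISCO architecture: it fits a linear evolution operator A from short-trajectory snapshot pairs (x_t, x_{t+1}) by least squares, DMD-style, so that the learned object is the dynamics itself rather than just a next-step predictor.

```python
import numpy as np

# Generic illustration (not the DISCO model): recover a linear evolution
# operator A from one short trajectory, using next-step pairs (x_t, x_{t+1}).
rng = np.random.default_rng(0)
d = 6
A_true = np.linalg.qr(rng.normal(size=(d, d)))[0] * 0.95  # stable dynamics

x0 = rng.normal(size=d)
traj = [x0]
for _ in range(20):                      # one short trajectory
    traj.append(A_true @ traj[-1])
traj = np.array(traj)

X, Y = traj[:-1], traj[1:]               # snapshot pairs (x_t, x_{t+1})
A_hat = np.linalg.lstsq(X, Y, rcond=None)[0].T   # solve X @ B = Y, A = B^T
```

Once A_hat is in hand it can be rolled out for many steps, which is the sense in which "learning the physics" is stronger than fitting next-state prediction alone.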

Anirvan Sengupta @AnirvanMS
We also show how this attention model relates to a one-step update of a modern Hopfield network, clarifying a potential connection indicated by Ramsauer et al. (openreview.net/forum?id=tL89R…)
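The connection mentioned here can be sketched numerically. The snippet below is my illustration of the Ramsauer et al. observation, not the authors' exact model: one update step of a modern continuous Hopfield network, xi_new = X^T softmax(beta * X xi), has the same form as softmax dot-product attention reading out the stored patterns X.

```python
import numpy as np

# Hedged sketch of the Hopfield/attention correspondence (my toy example):
# one Hopfield update step is exactly a softmax attention readout.
rng = np.random.default_rng(1)
d, m = 16, 5
X = rng.normal(size=(m, d))             # m stored patterns, one per row
xi = X[2] + 0.1 * rng.normal(size=d)    # corrupted query near pattern 2
beta = 8.0                              # high beta -> sharp retrieval

scores = beta * (X @ xi)                # similarity to each stored pattern
p = np.exp(scores - scores.max())
p /= p.sum()                            # softmax = attention weights
xi_new = p @ X                          # one Hopfield step = attention output
```

With beta large enough, the softmax concentrates on the nearest stored pattern, so the single update step retrieves it, which is why a one-step update already behaves like an attention lookup.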
Anirvan Sengupta @AnirvanMS
We investigate how a one-layer transformer can solve an in-context denoising problem by averaging over samples from a distribution with the right attention weights.
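The averaging mechanism described here can be seen in a minimal toy, which is my illustration rather than the paper's exact construction: when the context holds noisy samples of a signal, a single softmax attention head denoises a query simply by forming an attention-weighted average of the context.

```python
import numpy as np

# Illustrative sketch (not the paper's exact construction): softmax
# attention over noisy in-context samples acts as a denoiser by averaging.
rng = np.random.default_rng(0)
d, n = 8, 64
mu = rng.normal(size=d)                        # underlying clean signal
context = mu + 0.5 * rng.normal(size=(n, d))   # noisy in-context samples
query = mu + 0.5 * rng.normal(size=d)          # noisy point to denoise

beta = 1.0                                     # attention temperature
scores = beta * (context @ query)              # dot-product attention scores
w = np.exp(scores - scores.max())
w /= w.sum()                                   # softmax attention weights
denoised = w @ context                         # output = weighted average
```

The attention output averages away much of the noise, so it lands closer to the clean signal than the raw query does; "the right attention weights" in the tweet refers to weights that realize this averaging.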
Anirvan Sengupta @AnirvanMS
At #ICML2025, presenting work done with Matthew Smart and @albertobietti on in-context denoising (arxiv.org/abs/2502.05164). Matthew's oral is on Thursday, 4:15-4:30 PM, at West Ballroom A, and our poster #E-3207 is presented on Thursday, 4:30-7:00 PM, at East Exhibition Hall A-B.
Anirvan Sengupta retweeted

Dmitry Krotov @DimaKrotov
Nice article! I appreciate that it mentions my work and the work of my students. I want to add to it.

It is true that there is some inspiration from spin glasses, but Hopfield is much bigger than spin glasses. The key ideas that resurrected artificial neural networks in 1982 were collective and distributed computation. Individual compute units (neurons) comply with only local rules; despite that, computation EMERGES at the network level. We can kill individual neurons and connections (up to an extent) and the network will continue performing computation. The ideas of emergence (things that you don't put into the system "by hand" but that appear spontaneously by means of interactions) are ubiquitous in Physics, far beyond the relatively small field of spin glasses. Physicists LOVE emergence and have some of the best tools to study it.

Some of the ideas that were credited by the Nobel Prize existed before. What was remarkable about Hopfield and Hinton is that they had good taste, the right vision, and relentlessly executed on that vision despite the community at large believing they were wrong.

I find it amusing when I speak with some of my colleagues from more traditional physics departments and hear comments like "hmmm, this is nice, but is this really Physics?". The field of Physics of Computation was created by R. Feynman, J. Hopfield, and C. Mead in the 1980s. Back then it was not at all controversial that THIS IS PHYSICS. Fast forward to 2025: AI is running a trillion-dollar economy and impacting every aspect of our lives, some of the most consequential experiments in human history are being run in data centers around the world, and the Nobel Prize in Physics has been awarded to artificial neural networks. How come we still wonder if this is Physics?

My humble prediction: those departments and companies that embrace Physics of Neural Computation and invest in it will thrive, and those who don't will become obsolete very soon.
Quanta Magazine @QuantaMagazine

The 2024 Nobel Prize in Physics went to Geoffrey Hinton (left) and John Hopfield for their work on the statistical physics of neural networks. Modern AI would not exist without their clever methods of studying and applying randomness. quantamagazine.org/the-strange-ph…

Anirvan Sengupta retweeted

Alberto Bietti @albertobietti
Come hear Matt Smart's talk about in-context denoising with transformers at the Associative memory workshop #ICLR25, 2:15pm! This task refines the connection between transformers and associative memories. w/ M Smart and @AnirvanMS at @FlatironInst Paper: arxiv.org/abs/2502.05164
Anirvan Sengupta @AnirvanMS
@SuryaGanguli Done! Also, I don't think the current madness is just a result of the left's overreach (however irritating). The trouble is that in part of the madness, especially the attack on academia, there is method in't.
Anirvan Sengupta retweeted

Kelton Minor @keltonminor
There are days in life that shake you. I’m shattered 💔 to share that I just found out that the US Government terminated my 2024 NIH Director’s Early Independence Award (~$2 million), threatening my long-promised assistant professor job at @Columbia & academic career... 1/🧵
Anirvan Sengupta retweeted

Nihal Sarin @NihalSarin
[image-only tweet]
Anirvan Sengupta @AnirvanMS
@Rahulda04798151 @ChessbaseIndia @MagnusCarlsen @DGukesh Nihal Sarin is tantalizingly close to a 2700+ rating (classical live rating 2694.2 right now). Two wins over strong enough players would do it. He can surely get there, but perhaps needs some strategizing. His longtime coach Srinath is now working exclusively with Arjun Erigaisi.
Dr Das @Rahulda04798151
@ChessbaseIndia @MagnusCarlsen @DGukesh As a Keralite, I'm curious to know how well Nihal Sarin is doing recently. Has he completely gone, or is he just out of the media's sight?
ChessBase India @ChessbaseIndia
World no.1 Magnus Carlsen, FIDE World Cup 2023 winner, on his best match at the tournament: "I think I had by far my best day at the tournament on the first day against Gukesh, which was a game that I really enjoyed... I think Gukesh is extremely strong, and he was unlucky to face me on that one day when I really showed my best." Photo: @photochess