Tanmoy Chakraborty

1.5K posts


@Tanmoy_Chak

Chair Prof in AI, Associate Prof @iitdelhi; ACM Distinguished Speaker; Lab @lcs2lab; Previously @IIITDelhi @UofMaryland @iitkgp; #NLP #LLMs

New Delhi, India · Joined October 2014
817 Following · 2.5K Followers
Pinned Tweet
Tanmoy Chakraborty @Tanmoy_Chak
🌟 A New Textbook -- Introduction to Large Language Models 🌟

I am excited to share the release of my new textbook, Introduction to Large Language Models (#LLMs) -- perhaps the first textbook on LLMs.

Target Audience:
👉 Students/beginners looking for a structured starting point to learn LLMs
👉 Teachers planning to offer a course on LLMs
👉 Industry professionals seeking to deepen their understanding of LLMs

Explore the Book:
🔗 Book Website: tanmoychak.com/llmbook/
📑 Table of Contents: tanmoychak.com/llmbook/toc.pdf
🛒 Available on Amazon: amazon.in/dp/936386474X/

Enhance Your Learning Experience:
👉 Slides & Lecture Videos: Chapter-wise resources -- lcs2-iitd.github.io/ELL881-AIL821-…
👉 Exercises & Solutions: Practice with detailed chapter exercises (solutions available on request).
👉 Upcoming @nptel_official Course: Starting January 2025! Preview here: onlinecourses.nptel.ac.in/noc25_cs45/pre…

Book Endorsements:
📖 Foreword by Prof. Tim Baldwin @eltimster
👏 Endorsements from Prof. Iryna Gurevych @IGurevych and Prof. Pushpak Bhattacharyya

#LLMs #Textbook @iitdelhi @WileyIndiaPL @lcs2lab
Tanmoy Chakraborty retweeted
Lossfunk @lossfunk
2/ The organising committee for CAISc 2026 is led by @paraschopra, @dhruvtrehan9, and @gargdhruv36. We are glad to have @Tanmoy_Chak (IIT Delhi), Palash Goyal (Google Research), Dr Mohan Kankanhalli (NUS AI Institute), and Shirish Karande (TCS Research) on our steering committee, and @murari_ai and Pratik Narang as our Program Committee Chairs. Additionally, our program committee for the final human review spans CS, mathematics, and electrical engineering, not just ML.
SpiceJet @flyspicejet
@Tanmoy_Chak We sincerely regret any inconvenience caused, Tanmoy. We are in receipt of your email, and have updated our team to check and revert soon.
Tanmoy Chakraborty @Tanmoy_Chak
@flyspicejet Refund of ~₹1.97L for a failed booking on 6 Mar is still pending despite multiple follow-ups (even after the promised 5 working days). Case no: 10509519. No response to email and unsatisfactory customer support. If the refund is not processed immediately, I will be forced to escalate this to consumer protection authorities and financial regulators. Please don't expect people to have infinite bandwidth to follow up regularly.
Tanmoy Chakraborty @Tanmoy_Chak
@flyspicejet Our flight from FJR to Delhi got cancelled again today -- SJ9085 at 13:05 -- with no rescheduling notification. This is the second time we have booked tickets. ₹4.5L of my ticket fees is on hold. I am travelling with my wife and 3-year-old son. Please reschedule the flight ASAP; we are not in a position to book another flight due to financial constraints. Please understand the situation.
Tanmoy Chakraborty @Tanmoy_Chak
@flyspicejet We booked tickets from Fujairah to Delhi for 7th Mar at 13:05. The payment was successful, but we did not receive the tickets. Please check ASAP.
Tanmoy Chakraborty @Tanmoy_Chak
@flyspicejet Please reply to my DM ASAP. We are waiting. Either send us the tickets or cancel the transaction and initiate a refund so that we can book again.
Tanmoy Chakraborty @Tanmoy_Chak
Our new study on interpretability explains -- the Physics of KV Cache Compression for LLMs

Pre-print: arxiv.org/abs/2603.01426

As context lengths continue to grow, the KV cache has become the primary memory bottleneck during inference. While many compression techniques report impressive memory savings with minimal drops in benchmark accuracy, we asked a more structural question:
👉 What actually happens to attention and reasoning when we compress the KV cache?

We frame KV compression as a controlled perturbation of token-level routing in self-attention. Rather than evaluating only final task accuracy, we design synthetic datasets to probe: (1) multi-entity tracking, (2) coreference resolution, and (3) multi-hop reasoning. This setup allows us to disentangle three critical dimensions: information retention, accessibility, and utilisation.

Our findings reveal an interesting pattern:
👉 Moderate compression often preserves surface-level accuracy despite substantial internal representational degradation -- suggesting significant redundancy in current models.
👉 Near extreme compression, we observe a sharp "safety cliff" in hallucinations, driven by global erasure of answer-critical tokens.
👉 We also uncover a second failure mode -- representational rigidity -- where tokens remain present but routing flexibility collapses.

These results suggest that evaluating compression solely through downstream accuracy can mask stronger structural effects on reasoning. Understanding these internal dynamics is crucial as we move toward longer-context and more memory-efficient LLMs.

Brilliant work by Ayan Sengupta and Samhruth Ananthanarayanan.

#ScienceofLLMs #Interpretability #KVCache #ModelCompression
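The kind of eviction-style compression the thread discusses can be illustrated with a toy policy: keep only the cached tokens that have received the most cumulative attention and drop the rest. This is a minimal sketch, not the paper's actual method -- the function name, the top-k scoring policy, and the list-of-vectors data layout are all assumptions made for illustration.

```python
def compress_kv_cache(keys, values, attn_scores, keep_ratio=0.5):
    """Toy KV cache compression: evict the least-attended cached tokens.

    keys, values : lists of per-token vectors (the cached K/V projections)
    attn_scores  : cumulative attention mass each cached token has received
    keep_ratio   : fraction of the cache to retain
    """
    n = len(keys)
    k = max(1, int(n * keep_ratio))
    # Rank token positions by attention mass, keep the top-k,
    # then restore original sequence order for the surviving tokens.
    ranked = sorted(range(n), key=lambda i: attn_scores[i])
    keep = sorted(ranked[-k:])
    return [keys[i] for i in keep], [values[i] for i in keep], keep

# Toy cache of 8 tokens with scalar "embeddings"
keys = [[float(i)] for i in range(8)]
values = [[float(-i)] for i in range(8)]
scores = [0.9, 0.1, 0.4, 0.05, 0.8, 0.2, 0.7, 0.3]
k_kept, v_kept, kept = compress_kv_cache(keys, values, scores)
print(kept)  # -> [0, 2, 4, 6]: half the cache is evicted
```

Under an aggressive `keep_ratio`, a policy like this can evict an answer-critical token outright -- the "global erasure" behind the hallucination cliff the thread describes -- which is exactly why probing internal retention and routing, not just task accuracy, matters.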
Joseph Imperial @josephimperial_
Excited to share that my paper was desk rejected for referencing a table in the appendix #ACL2026
Tanmoy Chakraborty retweeted
LCS2 Lab @lcs2lab
Happy to announce that our paper has been accepted to #ICLR2026! 🎉 📜 Beyond Markovian Drifts: Action-Biased Geometric Walks with Memory for Personalized Summarization 👥 Parthiv Chatterjee, Asish Batha, Tashvi Patel, @sourish_rygbee, @Tanmoy_Chak Congratulations to all authors!
Tanmoy Chakraborty @Tanmoy_Chak
@DrDatta_AIIMS Always yes. A PhD is not only about research; it is also a life experience -- dedication, perseverance, tolerance, hard work, persistence, and much more.
Manu Awasthi @mnwsth
Alright, who wants to nominate me for a NASI fellowship?