Hadeel Alnegheimish

342 posts


@halnegheimish

I tweet, occasionally. Postdoc @MIT_CSAIL. Previously Computing PhD @imperialcollege, R.S. intern @DeepMind

Cambridge, MA · Joined May 2016
361 Following · 384 Followers
Hadeel Alnegheimish retweeted
alex zhang @a1zhang ·
All the recordings for the @GPU_MODE x @scaleml series are up as a playlist in case you missed it 😁 There's so much value in these ~8 hours of lectures, from proving quantization error bounds on a whiteboard to a deep-dive into GPU warp schedulers! Plz take advantage of it!
[media attachment]
7 replies · 101 reposts · 644 likes · 61.2K views
Hadeel Alnegheimish retweeted
Omar Khattab @lateinteraction ·
Imagine learning the latest research/practice on MoEs, quantization, positional encodings, and other foundation model topics in a single week by listening to this mixture of wonderful experts. Don’t miss it!
alex zhang@a1zhang

announcing the @GPU_MODE x @scaleml summer speaker series happening next week, a 5⃣-day series where top researchers will teach about the algorithmic and systems-level advances that underpin `gpt-oss`! all content will be live-streamed & recorded for FREE on GPU MODE's YouTube!

0 replies · 10 reposts · 108 likes · 8.9K views
Hadeel Alnegheimish @halnegheimish ·
@yassoma Oh I meant it's the year your lunar and solar birthdays are within the same week
1 reply · 0 reposts · 2 likes · 150 views
Yasmeen @yassoma ·
@halnegheimish me being a year younger than my class will never stop haunting me
1 reply · 0 reposts · 1 like · 176 views
Yasmeen @yassoma ·
20 days till I turn 33
2 replies · 0 reposts · 16 likes · 836 views
Hadeel Alnegheimish retweeted
ibkfellowship.mit @ibkfellowship ·
We are excited to share the recent MIT News article on Sarah Alnegheimish, a former researcher with our sibling program, CCES. Sarah is a PhD student in @kveeramac's lab in @MITLIDS. Her goal is to make machine learning systems more accessible. news.mit.edu/2025/anomaly-d…
0 replies · 2 reposts · 9 likes · 650 views
Shahad | شهد الطخيم @Shahad_cpc ·
When a poem becomes a life story, and a dream becomes reality. With great pride, I congratulate my mother, Hanadi Abdulaziz Al-Musallam @H_M__ksa, on her promotion to Minister Plenipotentiary at the Ministry of Foreign Affairs. This promotion is not only the crowning of a professional career full of giving, but a testament to what sincere intention, honest work, and patience in the face of challenges can achieve. My mother has been, and remains, a role model in balancing her different roles without giving up her ambition or her humanity. This moment carries a special meaning for me, because my mother does not walk this path alone; she continues the legacy of her father, my grandfather, Abdulaziz Mohammed Al-Musallam, a diplomat passionate about this country who saw its service as an honor above all others. The homeland lived in his heart and occupied his thoughts even in his private moments. My grandfather was not only a diplomat but a poet who wrote his verses with passion. He left his mark in his words, in those he loved, and in those who carried his message after him; the homeland was his greatest poem. My mother, with all her loyalty, love, and giving, continues that poem, following in his footsteps with her own imprint and her own voice. I am certain that if he were with us today, his eyes would fill with pride and love, because the journey he began has not ended; it has continued and flourished in its most beautiful form. Who else but you, Mother? My pride in you is beyond words. The country is lucky to have you, and I am luckier still. Congratulations to us all!
[media attachments]
2 replies · 4 reposts · 11 likes · 11K views
Hadeel Alnegheimish @halnegheimish ·
@_joestacey_ Break it down! It's much less daunting when it's small, concrete tasks. One thing I really loved from the thesis retreat is having a meta doc where you plan your writing: it defines each chapter, the items to be done, the materials you have, etc. Having everything in one place helped!!
1 reply · 0 reposts · 2 likes · 134 views
Joe Stacey @ EACL @_joestacey_ ·
90 minutes into writing up my PhD thesis, and my word count is still 0. Any nice tips on thesis writing?
11 replies · 0 reposts · 29 likes · 3.3K views
Hadeel Alnegheimish retweeted
MIT NLP @nlp_mit ·
Hello everyone! We are quite a bit late to the Twitter party, but welcome to the MIT NLP Group account! Follow along for the latest research from our labs as we dive deep into language, learning, and logic 🤖📚🧠
[media attachment]
26 replies · 54 reposts · 551 likes · 105.5K views
Hadeel Alnegheimish retweeted
Lakshya Jain @lxeagle17 ·
I'm teaching databases this semester at Berkeley. My students all seem unusually brilliant. Not many go to office hours, and not too many folks post on the course forum asking project questions. Weirdly, the exam had the lowest recorded average in my 10 semesters teaching it.
1.2K replies · 5.5K reposts · 118.8K likes · 10.2M views
Hadeel Alnegheimish retweeted
Isha Puri @ishapuri101 ·
[1/x] can we scale small, open LMs to o1 level? Using classical probabilistic inference methods, YES! Joint @MIT_CSAIL / @RedHat AI Innovation Team work introduces a particle filtering approach to scaling inference w/o any training! check out …abilistic-inference-scaling.github.io
[media attachment]
2 replies · 67 reposts · 234 likes · 45K views
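The thread above applies particle filtering to inference-time scaling. As a rough illustration of the classical resampling step such methods build on (toy candidate states and made-up reward scores, not the authors' code or algorithm):

```python
import numpy as np

rng = np.random.default_rng(1)

# Generic particle-filter resampling step (illustrative only):
# keep N candidate "particles", weight each by a hypothetical
# reward score, then resample particles in proportion to weight
# so high-reward candidates survive and multiply.
N = 6
particles = [f"draft-{i}" for i in range(N)]  # stand-in candidate states
rewards = rng.uniform(size=N)                 # assumed reward scores

weights = rewards / rewards.sum()             # normalize to a distribution
idx = rng.choice(N, size=N, p=weights)        # multinomial resampling
particles = [particles[i] for i in idx]       # surviving population
```

In a real inference-scaling loop this propose/score/resample cycle would repeat per generation step, with the reward coming from a verifier or reward model rather than random numbers.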
Hadeel Alnegheimish retweeted
Jay Alammar @JayAlammar ·
We're ecstatic to bring you "How Transformer LLMs Work" -- a free course with ~90 minutes of video, code, and crisp visuals and animations that explain the modern Transformer architecture, tokenizers, embeddings, and mixture-of-experts models. @MaartenGr and I have developed a lot of the visual language over the last several years (tens of thousands of iterations for hundreds of figures) for the book. For the opportunity to collaborate with the legendary @AndrewYNg, we took them to the next level, with animations and a concise narrative meant to enable technical learners to pick up an ML paper and understand the architecture description. Link in comments
Andrew Ng@AndrewYNg

Announcing How Transformer LLMs Work, created with @JayAlammar and @MaartenGr, co-authors of the beautifully illustrated book, “Hands-On Large Language Models.” This course offers a deep dive into the inner workings of the transformer architecture that powers large language models (LLMs).

The transformer architecture revolutionized generative AI; in fact, the "GPT" in ChatGPT stands for "Generative Pre-Trained Transformer." Originally introduced in the Google Brain team's groundbreaking 2017 paper "Attention Is All You Need," by Vaswani and others, transformers were a highly scalable model for machine translation tasks. Variants of this architecture now power today’s LLMs such as those from OpenAI, Google, Meta, Cohere, Anthropic, and DeepSeek.

In this course, you’ll learn in detail how LLMs process text, and you'll work through code examples that illustrate the transformer's individual components. In detail, you’ll learn:
- How the representation of language has evolved, from Bag-of-Words to Word2Vec embeddings to the transformer architecture, which captures a word's meaning by taking into account the context of the other words in the input.
- How inputs are broken down into tokens before they are sent to the language model.
- The details of a transformer's main stages: tokenization and embedding, the stack of transformer blocks, and the language model head.
- The inner workings of the transformer block, including attention, which calculates relevance scores, and the feedforward layer, which incorporates stored information learned in training.
- How cached calculations make transformers faster.
- Some of the most recent ideas in the latest models, such as Mixture-of-Experts (MoE), which uses multiple sub-models and a router on each layer to improve the quality of LLMs.

By the end of this course, you’ll have a deep understanding of how LLMs actually process text and be able to read through papers describing the latest models and understand the details. Gaining this intuition will improve your approach to building LLM applications. Please sign up here: deeplearning.ai/short-courses/…

28 replies · 211 reposts · 1.5K likes · 137.3K views
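The course outline above describes attention as computing relevance scores and mixing value vectors. A minimal sketch of single-head scaled dot-product self-attention (toy dimensions, random stand-in weights, NumPy only, not the course's code) might look like:

```python
import numpy as np

rng = np.random.default_rng(0)

d = 8  # embedding size (toy)
n = 4  # sequence length (four "tokens")
X = rng.normal(size=(n, d))  # token embeddings after the embedding stage

# Random projections stand in for the learned Q/K/V weight matrices.
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
Q, K, V = X @ Wq, X @ Wk, X @ Wv

# Relevance scores: each token scores every token, scaled by sqrt(d).
scores = Q @ K.T / np.sqrt(d)

# Row-wise softmax turns scores into attention weights summing to 1.
weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
weights /= weights.sum(axis=-1, keepdims=True)

# Each output row is a relevance-weighted mix of the value vectors.
out = weights @ V
print(out.shape)  # (4, 8): one updated vector per token
```

A full transformer block would add multiple heads, a feedforward layer, residual connections, and layer normalization around this core, and the KV cache the course mentions works by reusing `K` and `V` rows across decoding steps.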
Hadeel Alnegheimish retweeted
Foreign Ministry 🇸🇦 @KSAmofaEN ·
#Statement | The Foreign Ministry affirms that Saudi Arabia’s position on the establishment of a Palestinian state is firm and unwavering. HRH Prince Mohammed bin Salman bin Abdulaziz Al Saud, Crown Prince and Prime Minister, clearly and unequivocally reaffirmed this stance.
[media attachment]
1.3K replies · 4.5K reposts · 13.8K likes · 2.4M views
Hadeel Alnegheimish @halnegheimish ·
Arriving in Riyadh just in time to join in the celebrations of this historic day. Looking forward to welcoming the world with warm Saudi hospitality 💚 🇸🇦
Saudi Arabian Football Federation @saudiFF

Officially: the international football federation FIFA announces that the Kingdom of Saudi Arabia has won the bid to host the 2034 World Cup 🏆 #أهلًا_بالعالم #السعودية34

0 replies · 0 reposts · 3 likes · 498 views
Hadeel Alnegheimish retweeted
Songlin Yang @SonglinYang4 ·
(1/10) Excited to share one of the most elegant works I’ve been working on: Parallelizing Linear Transformers with the Delta Rule over Sequence Length! 🎉 📄 Published at NeurIPS ‘24 📍 Catch my poster in person: NeurIPS East Exhibit Hall A-C #2009 🗓️ Fri, Dec 13 | 4:30–7:30 p.m.
[media attachments]
3 replies · 70 reposts · 339 likes · 38.3K views
Hadeel Alnegheimish retweeted
Sarah Alnegheimish @salnegheimish ·
Excited to see Orion (github.com/sintel-dev/Ori…) reaching 100k downloads on PyPI! It's rewarding to see the impact of developing a usable library for time series anomaly detection!
0 replies · 2 reposts · 9 likes · 2.6K views
Hadeel Alnegheimish retweeted
Jason Hickel @jasonhickel ·
The images that I see coming out of Gaza each day—of shredded children, piles of twisted corpses, dehumanisation in torture camps, people being burned alive—are morally indistinguishable from the images I have seen in Holocaust museums. Pure evil on a horrifying scale.
69 replies · 2K reposts · 5.3K likes · 114.9K views