Miles Williams

12 posts


@miles_wil

PhD student at @SheffieldNLP

Joined April 2022
240 Following · 138 Followers
Miles Williams retweeted
Nikos Aletras @nikaletras
Gutted to miss #NAACL2025 😭 but Miles @miles_wil will be there presenting the following papers: 📄 Main: Self-Calibration for Language Model Quantization and Pruning 📄 RepL4NLP: Vocabulary-Level Memory Efficiency for LM Fine-Tuning Check them out!
Miles Williams retweeted
Nikos Aletras @nikaletras
Synthetic calibration data (for pruning and quantization) generated by the LLM itself is a better approx of the pre-training data dist than "external" data. Really cool work by Miles (@miles_wil) and George (@soon1otis) to be presented at #NAACL2025 Link to the paper 👇
Miles Williams retweeted
Nikos Aletras @nikaletras
We're looking for a #PhD student to work on multimodal LLMs. This is a fully-funded scholarship (including stipend), open to home and international candidates. Deadline: 29/1/2025. Please spread the word! #nlproc
Cass Zhixue @casszzx

📢 Fully funded PhD opportunity alert! Open to international🌎and UK applicants. Join Sheffield NLP, one of the biggest NLP groups in academia in the UK. Plus, you'll get to work with me & the amazing @nikaletras 🤝 (not to brag but we're pretty cool 😎) findaphd.com/phds/project/f…

Miles Williams retweeted
Cass Zhixue @casszzx
🔍 Lighter is better! 📚 Our latest TACL paper reveals some fascinating aha moments for #LLMs. Check it out! 🚀 #NLProc
Nikos Aletras @nikaletras

"Investigating Hallucinations in Pruned #LLMs for Abstractive Summarization" to appear in TACL! Work w/ George @soon1otis, Cass @casszzx & Miles @miles_wil 📜 arxiv.org/pdf/2311.09335 🔎Hallucinations from 🪚 LLMs ⬇️ 🔎🪚models rely more on the src for summary generation

Miles Williams retweeted
Nikos Aletras @nikaletras
Job opportunity @SheffieldNLP 🚨: I'm looking for a #postdoc (24 months) in #NLProc. The (very) broad topic is addressing LLM limitations (e.g. hallucinations, "reasoning", interpretability, etc.). If you are interested, drop me an email or DM. Apply: jobs.ac.uk/job/DHP918/res…
Miles Williams retweeted
Nikos Aletras @nikaletras
A fully-funded 3.5-year scholarship (UK/home applicants) for a PhD in #NLProc is available! Broad topic: efficient and robust alignment of LLMs. Application deadline: June 10th. More info: findaphd.com/phds/project/e…
Miles Williams retweeted
Leon Derczynski ⚒️☁️🏔️🌲
Efficient NLP methods - an up-to-date survey, to appear in TACL. We cover efficiency wrt: * Data * Model design * Pre-training * Fine-tuning * Inference & compression * Hardware utilization * Evaluation * Model selection This was a blast to co-produce! arxiv.org/abs/2209.00099
Miles Williams retweeted
EngineeringSheffield @SheffUniEng
The generative AI race has a dirty secret. Dr Nafise Sadat Moosavi, from @shefcompsci comments on the need to make large language models like the ones used by Google and Microsoft more efficient. wired.co.uk/article/the-ge…
Miles Williams retweeted
Nikos Aletras @nikaletras
If you're interested in memory-efficient pre-trained LMs, check out Huiyin's new paper on vocabulary- and tokenization-independent HashFormer models (to appear at #EMNLP2022) 👇
Huiyin Xue @HuiyinXue

👏Happy to announce that our paper 'HashFormers: Towards Vocabulary-independent Pre-trained Transformers' got accepted at #EMNLP2022. All thanks to my favourite supervisor @nikaletras🙏🍺 Paper👀👇: arxiv.org/abs/2210.07904. [1/k]
