Leon Engländer
@LeonEnglaender
code gen agents research @cohere | open-source @AdapterHub | prev. @UKPLab
Germany · Joined October 2022
276 Following · 48 Followers
33 posts

Pinned Tweet
Leon Engländer @LeonEnglaender
📢 New preprint 🎉 We introduce "M2QA: Multi-domain Multilingual Question Answering", a benchmark for evaluating joint language and domain transfer. We present 5 key findings - one of them: Current transfer methods are insufficient, even for LLMs! 📜arxiv.org/abs/2407.01091 🧵👇
Leon Engländer Retweeted
Clifton Poth @clifapt
Took Claude for a spin on the weekend and started a quick open-source, self-hosted re-implementation of Thinking Machines' Tinker API: github.com/calpt/open-tin…
Leon Engländer @LeonEnglaender
@derguene @PfeiffJo We also have composition blocks for combining adapters (like stacking or multi-task learning) besides plain adapter merging. Plus, methods like bottleneck adapters & PyReFT (and their variants), which go beyond what's standard in PEFT, giving you more building blocks for research
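The composition idea can be sketched framework-free: each adapter is a small residual transform on the hidden state, and a stack applies them in sequence (e.g. a language adapter followed by a task adapter). This loosely mirrors the library's `Stack` composition block, but the functions below are illustrative toys, not the Adapters API.

```python
# Toy sketch of adapter composition: each adapter is a residual
# transform on the hidden states; stack() chains them in order.

def bottleneck_adapter(scale):
    """Stand-in for a bottleneck adapter: h + up(relu(down(h))),
    collapsed here to a simple scaled residual."""
    def apply(hidden):
        return [h + scale * h for h in hidden]
    return apply

def stack(*adapters):
    """Compose adapters sequentially, like Stack("lang", "task")."""
    def apply(hidden):
        for adapter in adapters:
            hidden = adapter(hidden)
        return hidden
    return apply

lang_adapter = bottleneck_adapter(0.1)   # e.g. trained on the target language
task_adapter = bottleneck_adapter(0.5)   # e.g. trained on the QA task
composed = stack(lang_adapter, task_adapter)

print(composed([1.0, 2.0]))  # each h -> (h * 1.1) * 1.5
```

The point of the design is that the same trained pieces can be recombined at inference time: swap `lang_adapter` for another language without retraining the task adapter.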
Leon Engländer @LeonEnglaender
@derguene @PfeiffJo Hi @derguene, as one of the maintainers of Adapters, I can answer this 🙂 Adapters is more specialised towards research: we focus on making it easy to modify/create methods & offer more flexibility. For instance, you can add multiple heads to a model & switch them out (1/2)
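The multi-head idea can be illustrated with a toy model: register several prediction heads on one backbone and switch the active one at runtime. The names `add_head` and `active_head` loosely echo the library's head API, but this is a hypothetical sketch, not the real implementation.

```python
# Toy sketch of multiple switchable prediction heads on one backbone.

class ToyModel:
    def __init__(self):
        self.heads = {}
        self.active_head = None

    def add_head(self, name, num_labels):
        # each head maps a pooled feature to `num_labels` scores
        self.heads[name] = lambda feature: [feature * (i + 1) for i in range(num_labels)]
        self.active_head = name  # newest head becomes active

    def forward(self, feature):
        # only the active head runs; the backbone is shared
        return self.heads[self.active_head](feature)

model = ToyModel()
model.add_head("sentiment", num_labels=2)
model.add_head("topic", num_labels=3)

model.active_head = "sentiment"
print(model.forward(1.0))  # [1.0, 2.0]
model.active_head = "topic"
print(model.forward(1.0))  # [1.0, 2.0, 3.0]
```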
Leon Engländer @LeonEnglaender
Thrilled about our new Adapters release!🎉I had a blast working on this version, especially contributing to the new plugin interface (like adding ModernBERT) and helping with the VeRA adapter method. Have a look at the full thread for all the awesome updates from our team 👇
Quoting AdapterHub @AdapterHub:

🚀Adapters v1.2 is out!🚀 We've made Adapters incredibly flexible: Add adapter support to ANY Transformer architecture with minimal code! We used this to add 8 new models out-of-the-box, incl. ModernBERT, Gemma3 & Qwen3! Explore this +2 new adapter methods in this thread👇(1/5)
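The "add adapter support to any architecture" idea can be sketched as a plugin registry: declare, per model architecture, which sublayer of each block the adapter should wrap, then inject residual adapters at those points. All names here (`register_interface`, `inject_adapters`) are illustrative; the real Adapters plugin interface differs in detail.

```python
# Toy sketch of a plugin-interface pattern: register per-architecture
# injection points, then wrap the registered sublayer of each block
# with a residual adapter.

ADAPTER_INTERFACES = {}

def register_interface(arch, layer_attr):
    """Declare which attribute of a block holds the sublayer to wrap."""
    ADAPTER_INTERFACES[arch] = layer_attr

def inject_adapters(arch, blocks):
    """Wrap the registered sublayer of every block with an adapter."""
    attr = ADAPTER_INTERFACES[arch]
    for block in blocks:
        original = block[attr]
        # residual adapter: original output plus a small learned delta
        block[attr] = lambda h, f=original: f(h) + 0.1 * h
    return blocks

# a fake 2-block "ModernBERT-like" model; each block is a dict of sublayers
register_interface("modernbert", "output")
blocks = [{"output": lambda h: 2 * h} for _ in range(2)]
inject_adapters("modernbert", blocks)
print(blocks[0]["output"](1.0))  # 2*1.0 + 0.1*1.0 = 2.1
```

One registration per architecture is enough to make every adapter method work with that model, which is why a single interface can unlock many new backbones at once.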

Leon Engländer Retweeted
AdapterHub @AdapterHub
🎁 A new update of the Adapters library is out! Check out all the novelties, changes & fixes here: github.com/adapter-hub/ad…
Leon Engländer Retweeted
UKP Lab @UKPLab
🎉M2QA has been accepted to #EMNLP Findings!🎉 M2QA is a new multilingual and multidomain QA dataset. We show that current transfer methods are insufficient and that language & domain transfer aren't independent! 📄 Paper: arxiv.org/abs/2407.01091 👇👇👇 twitter.com/LeonEnglaender…
Quoting Leon Engländer @LeonEnglaender:

📢 New preprint 🎉 We introduce "M2QA: Multi-domain Multilingual Question Answering", a benchmark for evaluating joint language and domain transfer. We present 5 key findings - one of them: Current transfer methods are insufficient, even for LLMs! 📜arxiv.org/abs/2407.01091 🧵👇

Leon Engländer Retweeted
Hannah @h_sterz
Do you DARE? Introducing ✨DARE✨, a multiple-choice VQA benchmark with: - 4 robustness evaluations ⛓️ - 5 diverse categories 🧩 - extensive analysis of 4 widely used VLMs 🤖
Leon Engländer Retweeted
Leon Engländer @LeonEnglaender
We've just released v1.0 of Adapters!🎉 A lot has happened since the original release of Adapters. Major highlight: Adapter Merging, which opens up fascinating possibilities for transfer learning – hence it's at the heart of my current research. Stay tuned for upcoming papers!🔬
Quoting AdapterHub @AdapterHub:

🎉Adapters 1.0 is here!🚀 Our open-source library for modular and parameter-efficient fine-tuning got a major upgrade! v1.0 is packed with new features (ReFT, Adapter Merging, QLoRA, ...), new models & improvements! Blog: adapterhub.ml/blog/2024/08/a… Highlights in the thread! 🧵👇
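Adapter merging, at its core, is a weighted average over the parameters of several trained adapters, producing one new adapter. This framework-free sketch loosely mirrors the library's adapter-averaging support; `merge_adapters` and the parameter names are illustrative, not the Adapters API.

```python
# Toy sketch of adapter merging: weighted average of parameter dicts
# that share the same keys (i.e. adapters of the same architecture).

def merge_adapters(adapters, weights):
    """Return a new adapter whose parameters are the weighted average."""
    assert abs(sum(weights) - 1.0) < 1e-9, "weights should sum to 1"
    keys = adapters[0].keys()
    return {
        k: sum(w * a[k] for a, w in zip(adapters, weights))
        for k in keys
    }

lang_adapter = {"down.weight": 0.2, "up.weight": -0.4}  # toy 1-param "matrices"
task_adapter = {"down.weight": 0.6, "up.weight": 0.0}
merged = merge_adapters([lang_adapter, task_adapter], weights=[0.5, 0.5])
print(merged)  # {'down.weight': 0.4, 'up.weight': -0.2}
```

Because merging happens in parameter space, the result is a single adapter with no inference-time overhead, which is what makes it attractive for transfer learning.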

Leon Engländer Retweeted
AdapterHub @AdapterHub
🎉Adapters 1.0 is here!🚀 Our open-source library for modular and parameter-efficient fine-tuning got a major upgrade! v1.0 is packed with new features (ReFT, Adapter Merging, QLoRA, ...), new models & improvements! Blog: adapterhub.ml/blog/2024/08/a… Highlights in the thread! 🧵👇
Leon Engländer Retweeted
AdapterHub @AdapterHub
📢 New preprint 🎉 We - the AdapterHub team - present the M2QA benchmark to evaluate joint domain and language transfer! 🔬 Key highlight: We show that adapter-based methods on small language models can reach the performance of Llama 3 on M2QA! 🚀 👇
Quoting Leon Engländer @LeonEnglaender:

📢 New preprint 🎉 We introduce "M2QA: Multi-domain Multilingual Question Answering", a benchmark for evaluating joint language and domain transfer. We present 5 key findings - one of them: Current transfer methods are insufficient, even for LLMs! 📜arxiv.org/abs/2407.01091 🧵👇

Leon Engländer @LeonEnglaender
Overall, we see that Aya23 and gpt-3.5 perform best and - equally exciting - we see that our adapter-based methods can surpass the performance of Llama 2 and reach the performance of Llama 3 even though they only build on the much smaller model XLM-RoBERTa!