Jan Buchmann

34 posts

Jan Buchmann
@Koby_Loby

PhD student in NLP @UKPLab · Scientific documents, document structure, linking

Frankfurt/Darmstadt · Joined August 2013
124 Following · 112 Followers

Jan Buchmann retweeted
UKP Lab @UKPLab ·
📢 10 papers authored or co-authored by UKP members have been accepted for publication at #EMNLP2024 in Miami! 🇺🇸 And 2 papers from CL and TACL will be presented at the conference 🎉 Congratulations to everyone involved! 💐 – be sure to check them out in this thread (1/🧵):
[image]
Jan Buchmann retweeted
Nico Daheim @ndaheim_ ·
Please drop by our poster tomorrow and talk to my co-authors about how IVON can directly optimize a variational objective with similar performance and cost as Adam but many things on top (uncertainty, model adaptation, diagnosis, ...) Thu. 13:30-15:00, hall c 4-9, poster #1402
UKP Lab@UKPLab

🤔 Variational learning is often thought to be impractical 🔥 Plot twist: it actually works better than Adam! Meet IVON, a new optimizer that brings the best out of variational learning – 🧵 (1/9) #NLProc #ICML2024 📰 arxiv.org/abs/2402.17641 youtu.be/TRNYnRRJBRg

Jan Buchmann @Koby_Loby ·
In case you are looking for it: the @eaclmeeting commitment link is here: openreview.net/group?id=eacl.…
Jan Buchmann retweeted
UKP Lab @UKPLab ·
Pointwise V-usable information (PVI) excels in many #NLProc tasks. But fine-tuning #LLMs with it is very time-consuming 🐌 Is in-context PVI the necessary next step? Yes! 🎉Check out our empirical analysis accepted at #EMNLP2023 – and this 🧵 (1/7) 📄 arxiv.org/abs/2310.12300
[image]
Jan Buchmann retweeted
UKP Lab @UKPLab ·
Are Emergent Abilities in Large Language Models just In-Context Learning? Spoiler: YES 🤯 Through a series of over 1,000 experiments, we provide compelling evidence: arxiv.org/abs/2309.01809 Our results allay safety concerns regarding latent hazardous abilities. A🧵👇 #NLProc
[image]
Jan Buchmann @Koby_Loby ·
@emnlpmeeting In case anyone is interested: I received an answer from the EMNLP Program Chairs via e-mail. In short: 1. Yes. 2. No.
Jan Buchmann @Koby_Loby ·
@emnlpmeeting I have two questions regarding the author response: 1. Does "EMNLP 2023 template" refer to the paper template? 2. Do references count towards the one-page limit? Would appreciate any help :)
Jan Buchmann @Koby_Loby ·
@_mm85 Kind of ironic considering that Twitter is blocked in China?
Marcel Münch @_mm85 ·
Could you do the same where you live? I just grabbed a seat at this table in a Mall in Shenzhen. The girl beside me left her belongings like laptop and Christian Dior bag unattended. She didn’t ask anyone to take care of it and she hasn’t returned for more than 30 minutes …
[image]
Guangdong, People's Republic of China 🇨🇳
Jan Buchmann @Koby_Loby ·
Hey @signalapp, why do you keep insisting that I turn on notifications? Being donation-funded and tracking-less (which I support very much), Signal shouldn't need to get me hooked, right?
[image]
Jan Buchmann retweeted
Ethan Perez @EthanJPerez ·
Some ppl have asked why we’d expect larger language models to do worse on tasks (inverse scaling). We train LMs to imitate internet text, an objective that is often misaligned w human preferences; if the data has issues, LMs will mimic those issues (esp larger ones). Examples: 🧵
Jan Buchmann retweeted
Peter Welinder @npew ·
GPT-3 is amazing at complex tasks like creative writing and summarizing. But it's surprisingly bad at reversing words. 🤔 The reason is that GPT-3 doesn't see the world the way we humans do. 👀 If you teach it to reason, it can get around its limitations to get really good. 💡
[image]
Jan Buchmann retweeted
Deutsche Wohnen & Co Enteignen @dwenteignen ·
Referendum, HERE WE COME! 💛💜 With a staggering 343,591 signatures 🎉 🎈 Given the number of signatures (even the preliminary count is already a record for a citizens' initiative in Berlin), we expect the referendum to take place on 26.09.! 💪
[image]
Jan Buchmann @Koby_Loby ·
Really interesting, well-researched (and long) article on the origin of SARS-CoV-2 and the dangers of gain-of-function research on viruses. Bottom line: the lab-escape theory seems to be more plausible than some have reported. thebulletin.org/2021/05/the-or…
Jan Buchmann retweeted
Robert G. Reeve @RobertGReeve ·
I'm back from a week at my mom's house and now I'm getting ads for her toothpaste brand, the brand I've been putting in my mouth for a week. We never talked about this brand or googled it or anything like that. As a privacy tech worker, let me explain why this is happening. 🧵
Jan Buchmann retweeted
Yann LeCun @ylecun ·
"An international effort to produce & distribute mRNA-based Covid-19 vaccines worldwide" English version of a petition first proposed by @axelkahn to coordinate an emergency effort to produce and distribute mRNA-based Covid vaccines to the entire world. change.org/p/president-jo…