Jeremias Knoblauch

398 posts


@LauchLab

Associate Professor & EPSRC Fellow @ UCL. Post-Bayesian seminar series sign-up @ https://t.co/a0MyAQOh16 Research mission @ https://t.co/kNIjvCrGne

London · Joined April 2019
506 Following · 1.5K Followers
Jeremias Knoblauch @LauchLab
𝗣𝗿𝗲𝗱𝗶𝗰𝘁𝗶𝘃𝗲𝗹𝘆 𝗢𝗿𝗶𝗲𝗻𝘁𝗲𝗱 𝗣𝗼𝘀𝘁𝗲𝗿𝗶𝗼𝗿𝘀 define a new statistical principle that derives model uncertainty from predictive accuracy. This approach predictively dominates classical ones and keeps epistemic uncertainty from collapsing. arxiv.org/abs/2510.01915
[image]
0 replies · 5 reposts · 34 likes · 1.5K views
Probability and Statistics
Bayesian machine learning is an approach to modeling and inference that treats unknown parameters and predictions as random variables and updates beliefs using Bayes’ rule as new data arrives. Instead of producing single best guesses, it produces full probability distributions that quantify uncertainty. In probability theory, Bayesian ML builds directly on conditional probability, likelihoods, and prior distributions, providing a coherent framework for learning from data. In machine learning, it powers methods such as Bayesian neural networks, Gaussian processes, and probabilistic graphical models, enabling robust prediction, uncertainty estimation, and principled model comparison. In real life, Bayesian ML is used in medicine, finance, robotics, and recommendation systems, where decisions must be made under uncertainty and models must adapt as evidence accumulates. Image: share.google/R4mi0AVfeKLeZb…
[image]
6 replies · 116 reposts · 767 likes · 29.8K views
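The belief-updating loop described in the tweet above can be sketched with a minimal conjugate toy model: a Beta prior on a coin's bias, updated with Bayes' rule as flips arrive. All names here are illustrative, not from any library mentioned in the thread.

```python
# Minimal Bayesian updating sketch: a Beta(alpha, beta) prior over a
# coin's bias is conjugate to the Bernoulli likelihood, so each new
# observation updates the posterior in closed form.

def update(alpha, beta, flips):
    """Return posterior (alpha, beta) after observing 0/1 coin flips."""
    for flip in flips:
        alpha += flip        # a head increments alpha
        beta += 1 - flip     # a tail increments beta
    return alpha, beta

# Start from a uniform prior Beta(1, 1) and learn from data in batches,
# just as beliefs "adapt as evidence accumulates".
alpha, beta = 1.0, 1.0
alpha, beta = update(alpha, beta, [1, 1, 0, 1])  # first batch of flips
alpha, beta = update(alpha, beta, [0, 1])        # later evidence

# The output is a full distribution, not a single best guess:
posterior_mean = alpha / (alpha + beta)
posterior_var = (alpha * beta) / ((alpha + beta) ** 2 * (alpha + beta + 1))
print(posterior_mean, posterior_var)
```

The same two-line update rule runs online: the posterior after one batch is the prior for the next, which is what makes conjugate models a convenient mental model for the sequential setting.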
Jeremias Knoblauch retweeted
Diana Cai @dianarycai
I'm on the academic job market! I design and analyze probabilistic machine-learning methods, motivated by real-world scientific constraints and developed in collaboration with scientists in biology, chemistry, and physics. A few highlights of my research areas:
[image]
8 replies · 54 reposts · 518 likes · 77.2K views
Jeremias Knoblauch retweeted
Kevin Patrick Murphy @sirbayes
I am pleased to share our new NeurIPS paper for online Bayesian inference for neural networks. Instead of focusing on updating the parameter posterior, we work with the predictive posterior (which makes much more sense for non-identifiable models, and gives us more algorithmic freedom for developing faster methods).
Gerardo Duran-Martin @grrddm

Our paper “Martingale Posterior Neural Networks for Fast Sequential Decision Making” has been accepted at #neurips2025! Joint work with @l_sbetancourt, @AlvaroCartea and @sirbayes Blog: grdm.io/posts/bnn-with… Paper: arxiv.org/abs/2506.11898 Code: github.com/gerdm/martinga…

8 replies · 41 reposts · 528 likes · 61.5K views
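The martingale-posterior idea behind the paper above can be illustrated in a toy conjugate setting (this is a sketch of the general principle via a Pólya urn, not the paper's neural-network method; the helper name is hypothetical): instead of positing a parameter posterior, repeatedly imagine future data from the one-step predictive and update the predictive — the long-run statistic of each imagined future is one posterior draw.

```python
import random

random.seed(0)  # reproducible imagined futures

def martingale_posterior_sample(heads, tails, horizon=2000, rng=random):
    """One posterior draw via predictive resampling (Polya urn).

    Starting from observed coin-flip counts and a uniform Beta(1, 1)
    prior, repeatedly (i) form the one-step predictive probability,
    (ii) sample an imagined next flip from it, (iii) update the counts.
    The limiting frequency of heads is a draw from the induced
    (here: Beta) posterior over the coin's bias -- no explicit
    parameter posterior is ever written down.
    """
    a, b = heads + 1, tails + 1
    for _ in range(horizon):
        if rng.random() < a / (a + b):  # predictive prob. of heads
            a += 1
        else:
            b += 1
    return a / (a + b)

# Observed data: 3 heads, 1 tail; each imagined future gives one draw.
draws = [martingale_posterior_sample(3, 1) for _ in range(500)]
print(sum(draws) / len(draws))  # concentrates near the Beta(4, 2) mean 2/3
```

The appeal, as the tweet notes, is that everything is phrased in terms of the predictive distribution, which remains meaningful even when parameters are non-identifiable.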
Jeremias Knoblauch retweeted
Diana Cai @dianarycai
The application for a research fellowship at the Flatiron Institute in the Center for Computational Math is now live! This includes positions for ML and stats. The deadline is Dec 1. Links below with more details.
2 replies · 18 reposts · 106 likes · 13.2K views
Jeremias Knoblauch retweeted
François-Xavier Briol @fx_briol
There’s been lots of interest in gradient flows, but I’ve been a bit sceptical as I wasn’t clear on the advantages. This paper shows that MMD gradient flows can integrate a large class of functions exactly without needing to converge to a global minimum: stationarity is enough!
Hudson @Hudson19990518

New paper on Stationary MMD points 📣 arxiv.org/pdf/2505.20754 1️⃣ Samples generated by MMD flow exhibit 'super-convergence' 2️⃣ A discrete-time finite-particle convergence result for MMD flow Joint work with Toni Karvonen, Heishiro Kanagawa, @fx_briol, Chris J. Oates

0 replies · 2 reposts · 22 likes · 1.4K views
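For readers unfamiliar with the object being discussed, here is a minimal 1-D sketch of an MMD gradient flow (illustrative helper names; a plain discretised flow, not the paper's super-convergence analysis): particles descend the squared maximum mean discrepancy to a fixed target sample under a Gaussian kernel.

```python
import math
import random

def mmd_flow_step(xs, ys, h=1.0, lr=1.0):
    """One gradient-descent step of particles xs on MMD^2(xs, ys).

    Gaussian kernel k(a, b) = exp(-(a - b)^2 / (2 h^2)); the gradient
    combines repulsion among the particles (first term) and attraction
    toward the target sample ys (second term).
    """
    def dk(a, b):  # d/da of k(a, b)
        return -((a - b) / h**2) * math.exp(-((a - b) ** 2) / (2 * h**2))

    n, m = len(xs), len(ys)
    new_xs = []
    for x in xs:
        g = (2 / n**2) * sum(dk(x, x2) for x2 in xs) \
            - (2 / (n * m)) * sum(dk(x, y) for y in ys)
        new_xs.append(x - lr * g)
    return new_xs

rng = random.Random(0)
target = [rng.gauss(0.0, 1.0) for _ in range(100)]       # target sample
particles = [rng.uniform(1.0, 2.0) for _ in range(30)]   # poor initialisation
for _ in range(600):
    particles = mmd_flow_step(particles, target)
mean = sum(particles) / len(particles)
print(mean)  # drifts from ~1.5 toward the target mean near 0
```

The stationary points of this descent are exactly the configurations the paper studies: the claim is that quadrature at such a stationary particle set can already be exact for a large function class, even if it is not the global minimiser.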
Jeremias Knoblauch retweeted
ELLIS @ELLISforEurope
🆕 The ELLIS Unit London (@ucl) will host the first workshop uniting disparate subfields of post-Bayesian methods, including PAC Bayes, generalized Bayes, predictive resampling, Martingale posteriors, and online learning. 📍 London 🏴󠁧󠁢󠁥󠁮󠁧󠁿 📅 May 15-16 🔗 bit.ly/3GgwZ0G
0 replies · 5 reposts · 11 likes · 1.5K views
Jeremias Knoblauch retweeted
Matias Altamirano Montero @matialtamiranom
Super happy and grateful to be awarded the Bloomberg fellowship. Huge thanks to my amazing supervisors @LauchLab and @fx_briol , as well as all my collaborators for their support!
Tech At Bloomberg @TechAtBloomberg

Congratulations to @UCL / @stats_UCL's @matialtamiranom on being one of the 2024-2025 @Bloomberg #DataScience Ph.D. Fellows! Learn more about Matías’ research focus and our latest cohort of Ph.D. Fellows: bloom.bg/4itDDiN #AI #ML #NLProc #NeurIPS2024

6 replies · 2 reposts · 23 likes · 1.4K views
Siu Lun Chau @Chau9991
🔔Excited to join the College of Computing & Data Science @ NTU Singapore 🇸🇬 as Assistant Professor of Statistical ML in May 2025! My research aims to advance uncertainty awareness, explainability, and preference alignment in ML methods. Recruiting PhD students soon; stay tuned! 🙌
6 replies · 0 reposts · 29 likes · 1.4K views
Jeremias Knoblauch @LauchLab
🥳🥳🥳Massive congratulations to my PhD student @matialtamiranom for winning one of the 3 Bloomberg Fellowships for his work on robust & efficient post-Bayesian methods! I cannot think of anyone more deserving!😊 Read more at tinyurl.com/MatiasBloomberg
1 reply · 2 reposts · 33 likes · 1.8K views
Kevin Patrick Murphy @sirbayes
The Achilles heel of Bayes is that it is only optimal if the model is the true DGP, which it never is. (That, plus all the computational issues.) Generalized Bayes to the rescue! (Attaching a nice summary slide from @lauchlab; if you like it, sign up for the talk series :)
[image]
Jeremias Knoblauch @LauchLab

📢 Post-Bayesian online seminar series coming!📢 To stay posted, sign up at tinyurl.com/postBayes We'll discuss cutting-edge methods for posteriors that no longer rely on Bayes Theorem. (e.g., PAC-Bayes, generalised Bayes, Martingale posteriors, ...) Pls circulate widely!

6 replies · 32 reposts · 316 likes · 62K views
Jeremias Knoblauch @LauchLab
📢 Post-Bayesian online seminar series coming!📢 To stay posted, sign up at tinyurl.com/postBayes We'll discuss cutting-edge methods for posteriors that no longer rely on Bayes Theorem. (e.g., PAC-Bayes, generalised Bayes, Martingale posteriors, ...) Pls circulate widely!
2 replies · 41 reposts · 185 likes · 48.9K views