Lessons from Sampling Bayesian Neural Networks

Talk
Bayesian Deep Learning
Uncertainty Quantification
Sampling
Machine Learning
Talk @ Singular Learning Theory Seminar – YouTube video available

August 25, 2025

Had a great time presenting at the Singular Learning Theory Seminar together with David Rügamer and Julius Kobialka!

Abstract: Sampling-based inference is often regarded as the gold standard for posterior inference in Bayesian neural networks (BNNs), yet it continues to face skepticism regarding its practicality in large-scale or complex models. This perception has been challenged by recent methodological and computational advances that significantly broaden the scope of feasible applications. The presentation examines how sampling operates in BNNs, how performance can be improved through targeted adaptations, and why not all sampling procedures are equally effective. It further explores the role of implicit regularization induced by both the network architecture and the sampling dynamics. The discussion points toward future opportunities where sampling may redefine Bayesian deep learning, contingent on addressing current challenges in scalability, efficiency, and inference cost.
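To make the idea of sampling-based posterior inference concrete, here is a minimal, illustrative sketch (not the method from the talk) of stochastic gradient Langevin dynamics (SGLD) sampling the posterior of a single weight in a toy linear model with a standard normal prior. All names and hyperparameters below are assumptions chosen for the example.

```python
import numpy as np

# Toy data: y = w_true * x + Gaussian noise (illustrative, not from the talk).
rng = np.random.default_rng(0)
n = 200
x = rng.normal(size=n)
w_true = 2.0
sigma = 0.5  # known observation noise std (assumption)
y = w_true * x + rng.normal(scale=sigma, size=n)

def grad_neg_log_post(w, xb, yb, n_total):
    """Minibatch estimate of the gradient of the negative log posterior."""
    resid = yb - w * xb
    # Likelihood gradient, rescaled from minibatch to full dataset size.
    grad_lik = -(n_total / len(xb)) * np.sum(resid * xb) / sigma**2
    grad_prior = w  # standard normal prior N(0, 1)
    return grad_lik + grad_prior

# SGLD: a gradient step on the negative log posterior plus injected
# Gaussian noise whose variance matches the step size.
eps = 1e-4  # step size (assumption; constant for simplicity)
w = 0.0
samples = []
for step in range(5000):
    idx = rng.choice(n, size=32, replace=False)
    g = grad_neg_log_post(w, x[idx], y[idx], n)
    w = w - 0.5 * eps * g + rng.normal(scale=np.sqrt(eps))
    if step >= 1000:  # discard burn-in
        samples.append(w)

posterior_mean = float(np.mean(samples))
```

With a small, fixed step size the chain's stationary distribution only approximates the posterior; in practice step sizes are decayed or a Metropolis correction is added, and the same minibatch-noise trade-off is one reason not all sampling procedures for BNNs are equally effective.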

For the recording, click here.