Parametrization-invariant Bayesian Deep Learning
Bálint Mucsányi (Ph.D. Student)
Bayesian deep learning is of continued and renewed theoretical and practical interest, both for uncertainty quantification and for an increased understanding of deep architectures. Local geometric approaches, such as Laplace approximations and linearized deep networks, are an interesting direction within this domain: they offer analytic and computationally lightweight functionality based on automatic differentiation and numerical linear algebra, as opposed to the multiplicative cost overhead of Monte Carlo ensemble methods.

However, these local geometric approaches explicitly use the parametrization of the network (i.e., its description in terms of layers, nonlinearities, and weights/biases) and are thus not invariant under (non-linear) re-parametrizations. At first glance, this is an issue for their use in uncertainty propagation, as it breaks with the laws of measure theory: the resulting uncertainty estimates do not transform according to the change-of-variables rule. Interestingly, these forms of Bayesian deep learning share this flaw with nearly all methods widely used for deep training itself, including all first-order optimization methods such as gradient descent and Adam. Thus, ways to “fix” localized Bayesian deep learning also promise insights into how to improve deep training as a whole, beyond the Bayesian perspective.

In this project, we will explore computational formalisms for training and uncertainty quantification that are invariant to re-parametrizations of the network architecture. In collaboration with existing PhD students in the group of Philipp Hennig, we will leverage perspectives from differential geometry to identify a primary “anchor” geometry upon which probability measures can be defined, and transform them according to the standard rules of Riemannian geometry and measure theory under re-parametrizations. This will also provide a new perspective on software development for deep learning.
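To make the non-invariance concrete: under a smooth re-parametrization \(\theta = g(\eta)\), a posterior density transforms with a Jacobian determinant, and this extra term shifts the mode that a Laplace approximation is built around:

\[
p_\eta(\eta) = p_\theta\big(g(\eta)\big)\,\bigl|\det J_g(\eta)\bigr|, \qquad J_g(\eta) = \frac{\partial g}{\partial \eta}(\eta).
\]

The mode of \(p_\eta\) maximizes \(\log p_\theta(g(\eta)) + \log\bigl|\det J_g(\eta)\bigr|\); for non-linear \(g\) the log-determinant term is non-constant, so in general \(g(\eta^\ast) \neq \theta^\ast\), and the Laplace approximations constructed in the two parametrizations describe different measures. The same mechanism affects training: gradients transform as \(\nabla_\eta L = J_g^\top \nabla_\theta L\), so a gradient-descent step taken in \(\eta\)-coordinates corresponds to \(\Delta\theta \approx -\alpha\, J_g J_g^\top \nabla_\theta L\) in \(\theta\)-coordinates, rather than \(-\alpha\, \nabla_\theta L\).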
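A minimal numerical sketch of this effect, assuming a toy one-dimensional “posterior” (a Gamma(3, 1) density standing in for a network posterior) and a log re-parametrization; the density and bounds here are illustrative choices, not part of the project:

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Toy unnormalized log-"posterior" over a positive parameter theta:
# a Gamma(3, 1) density, standing in for a deep network's posterior.
def log_post_theta(theta):
    return 2.0 * np.log(theta) - theta

# Laplace approximation in the theta parametrization.
res = minimize_scalar(lambda t: -log_post_theta(t),
                      bounds=(1e-6, 50.0), method="bounded")
theta_map = res.x  # analytic mode: theta* = 2

# Curvature at the mode via central finite differences.
h = 1e-4
hess_theta = -(log_post_theta(theta_map + h) - 2.0 * log_post_theta(theta_map)
               + log_post_theta(theta_map - h)) / h**2
var_theta = 1.0 / hess_theta  # Laplace variance: theta*^2 / 2 = 2

# Re-parametrize theta = exp(eta). The transformed density picks up the
# Jacobian factor |d theta / d eta| = exp(eta), i.e. "+ eta" in log-space.
def log_post_eta(eta):
    return log_post_theta(np.exp(eta)) + eta

res2 = minimize_scalar(lambda e: -log_post_eta(e),
                       bounds=(-10.0, 10.0), method="bounded")
eta_map = res2.x  # analytic mode: eta* = log 3

# The two Laplace approximations disagree as measures: the mode itself moves.
print(theta_map, np.exp(eta_map))  # ~2.0 vs ~3.0
```

An invariant construction would instead fix one reference geometry and transport the measure between parametrizations with the Jacobian factor, which is the role the “anchor” geometry plays in the project description above.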
Primary Host: Philipp Hennig (University of Tübingen)
Exchange Host: Yee Whye Teh (University of Oxford & DeepMind)
PhD Duration: 01 June 2024 - 30 May 2027
Exchange Duration: 01 October 2025 - 31 January 2026 and 01 October 2026 - 31 January 2027