Anytime Uncertainty in (Bayesian) Deep Learning
Metod Jazbec (Ph.D. Student)
The predictions generated by neural networks (NNs) are of limited use without a corresponding notion of confidence or uncertainty. Accordingly, methods for cheaply quantifying the uncertainty of NNs have received much attention. Yet one issue that is often ignored is that our models operate under dynamic computation constraints. Sometimes we need an answer quickly, and then we require fast uncertainty quantification along with it; at other times we can afford to let the model "ponder." In this research project, we propose to investigate methods that can provide anytime, adaptive uncertainty estimates for deep neural networks.
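To make the "anytime" idea concrete, here is a minimal illustrative sketch (not part of the project itself): a deep ensemble evaluated member by member, where after every member the caller already has a valid prediction and a variance-based uncertainty proxy, both of which refine as more compute is spent. The `anytime_ensemble` generator and the toy linear "members" below are hypothetical names invented for this example.

```python
# Hypothetical sketch of anytime uncertainty via a sequentially
# evaluated ensemble: each yielded pair is a usable estimate, so the
# caller can stop whenever its compute budget runs out.
from statistics import mean, pvariance
from typing import Callable, Iterable, Iterator, Tuple


def anytime_ensemble(
    members: Iterable[Callable[[float], float]],  # stand-ins for trained NNs
    x: float,
) -> Iterator[Tuple[float, float]]:
    preds = []
    for f in members:
        preds.append(f(x))
        # Population variance across members seen so far as a crude
        # uncertainty proxy; undefined (infinite) with a single member.
        var = pvariance(preds) if len(preds) > 1 else float("inf")
        yield mean(preds), var


# Toy "ensemble": slightly perturbed linear models in place of real networks.
members = [lambda x, a=a: a * x for a in (0.9, 1.0, 1.1)]
for pred, unc in anytime_ensemble(members, 2.0):
    pass  # each iteration is a valid anytime (prediction, uncertainty) pair
print(round(pred, 3), round(unc, 4))
```

The design point is simply that the estimate is monotonically refined rather than produced only at the end; an actual method for deep NNs would need to amortize this, e.g. via early-exit architectures, rather than running independent full forward passes.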
Primary Advisor: Eric Nalisnick (University of Amsterdam)
Industry Advisor: Dan Zhang (Bosch Center for AI)
PhD Duration: 01 October 2022 - 30 September 2026