Priors and Inference for Deep Probabilistic Models
Vincent Fortuin (Ph.D. Student)
While deep learning techniques have led to impressive advances in supervised and representation learning, these advances have mostly occurred in domains where large homogeneous sets of structured data are available. In contrast, probabilistic models are more data-efficient and often provide better interpretability as well as uncertainty estimates. Recent efforts have started to combine these two paradigms, for instance in variational autoencoders, Bayesian neural networks, and deep Gaussian processes. In my research, I focus on improving (1) the interpretability and data-efficiency of these models through the design of better priors and (2) their practical applicability through the development of more efficient and effective inference techniques.
Primary Host: Gunnar Rätsch (ETH Zürich)
Exchange Host: Richard E. Turner (University of Cambridge)
PhD Duration: 01 November 2017 - 30 November 2021
Exchange Duration: 01 August 2019 - 01 November 2019 - Ongoing