Flexible Densities for Deep Generative Models
Didrik Nielsen (Ph.D. Student)
Probability distributions play a central role in machine learning. In probabilistic modeling, they serve as likelihoods and prior distributions, whereas in variational inference, they are employed as approximate posterior distributions. The distributions typically used in practice are simple, such as members of the exponential family. However, overly simple distributions can limit performance: an overly simple likelihood can lead to serious model misspecification, whereas an overly simple variational distribution can lead to poor posterior approximations and loose variational bounds. In this project, we will explore the use of flexible densities for probabilistic models and variational inference, with a focus on deep generative models.
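One common way to construct such flexible densities (not necessarily the approach taken in this project) is a normalizing flow: a simple base distribution is pushed through an invertible map, and the change-of-variables formula gives the density of the result. The sketch below illustrates the idea with a single hypothetical affine layer; all function names are illustrative.

```python
import numpy as np

# Change-of-variables formula for an invertible map f with inverse g:
#   log p_X(x) = log p_Z(g(x)) + log |det J_g(x)|
# Stacking many invertible (typically nonlinear) layers yields densities
# far more flexible than any single exponential-family member.

def base_logpdf(z):
    """Log-density of a standard normal base distribution."""
    return -0.5 * (z ** 2 + np.log(2 * np.pi))

def affine_flow_logpdf(x, scale, shift):
    """Log-density of x = scale * z + shift with z ~ N(0, 1).

    The inverse map is z = (x - shift) / scale, whose Jacobian
    contributes -log|scale| to the log-density.
    """
    z = (x - shift) / scale
    return base_logpdf(z) - np.log(np.abs(scale))

x = np.array([-1.0, 0.0, 1.0, 2.0])
print(affine_flow_logpdf(x, scale=2.0, shift=1.0))
```

A single affine layer only recovers another Gaussian; the flexibility comes from composing many such layers with nonlinear invertible maps, each adding its own log-determinant term to the density.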
Primary Host: Ole Winther (University of Copenhagen and Technical University of Denmark)
Exchange Host: Max Welling (University of Amsterdam and Qualcomm)
PhD Duration: 01 January 2019 - 31 December 2019
Exchange Duration: 13 January 2020 - 29 May 2020