Generative Models and Stochastic Processes
Cameron Stewart (Ph.D. Student)
This project will focus on the further development and analysis of generative models, including Generative Adversarial Networks (GANs), energy-based models, normalising flows, diffusion models, and score-based generative models. It will investigate Wasserstein gradient flows of the metrics used to train such models, for example the Kullback-Leibler (KL) divergence and the Maximum Mean Discrepancy (MMD). A particular focus will be placed on whether these flows converge to the global optimum, and on the conditions the distributions must satisfy for convergence to hold.
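To illustrate the kind of object studied here, the following is a minimal sketch (not the project's method) of a particle discretisation of the Wasserstein gradient flow of the squared MMD with a Gaussian kernel: model particles descend the MMD gradient, which attracts them toward target samples and makes them repel one another. The kernel bandwidth, step size, and one-dimensional Gaussian data are illustrative choices.

```python
import numpy as np

def kernel_grad_sum(x, ys, sigma=1.0):
    """Gradient w.r.t. x of sum_j k(x, y_j) for the Gaussian kernel
    k(x, y) = exp(-||x - y||^2 / (2 sigma^2))."""
    diffs = x - ys                                          # (m, d)
    k = np.exp(-np.sum(diffs**2, axis=1) / (2 * sigma**2))  # (m,)
    return -(diffs * k[:, None]).sum(axis=0) / sigma**2

def mmd_flow_step(particles, target, step=0.5, sigma=1.0):
    """One forward-Euler step of the Wasserstein gradient flow of MMD^2.

    The within-particle term repels particles from each other;
    the cross term attracts them toward the target samples."""
    n, m = len(particles), len(target)
    grads = np.stack([
        (2.0 / n**2) * kernel_grad_sum(x, particles, sigma)
        - (2.0 / (n * m)) * kernel_grad_sum(x, target, sigma)
        for x in particles
    ])
    return particles - step * grads

# Toy example: move particles from N(0, 0.25) toward N(2, 0.25).
rng = np.random.default_rng(0)
particles = rng.normal(loc=0.0, scale=0.5, size=(50, 1))  # model samples
target = rng.normal(loc=2.0, scale=0.5, size=(50, 1))     # data samples
init_mean = particles.mean()
for _ in range(300):
    particles = mmd_flow_step(particles, target)
```

After the loop, the particle mean has drifted toward the target mean; questions such as whether (and how fast) this discretised flow reaches the target globally are exactly the convergence issues the project description raises.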
Primary Advisor: Arthur Gretton (University College London)
Industry Advisor: Arnaud Doucet (University of Oxford & Google DeepMind)
PhD Duration: 26 September 2022 - 25 September 2026