Kai Lüdemann

PhD
University of Tübingen

Score-based generative models are now state of the art for unsupervised machine learning, and have rapidly developed an extensive but brittle software stack. In this PhD project, we will explore how diffusion can be naturally phrased and embedded within the probabilistic machine learning stack. In particular, we will leverage current research by both PIs that extends the Gaussian process compute stack to diffusion models, as well as the probabilistic numerics perspective on the simulation of (ordinary, stochastic, partial) differential equations.

The goal is to endow diffusion models with fundamental functionality, such as nonparametric extensions, a functional programming interface (lazy instantiation, closure under conditioning and instantiation, etc.), a rich class of analytic, nonparametric priors, and efficient, stable numerical algorithms for generation and training. To maximize impact, we will leverage the GPyTorch software stack co-developed by the secondary advisor, as well as the ProbNum ODE tools in JAX built in the group of the primary advisor.
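The link between diffusion models and differential equation simulation mentioned above can be made concrete: sampling from a score-based model amounts to integrating its probability-flow ODE backward in time, which is exactly where numerical ODE solvers (and their probabilistic-numerics counterparts) enter. The following is a minimal, illustrative sketch, not project code: it uses a one-dimensional toy example with a closed-form Gaussian score and a plain Euler loop in NumPy; a real pipeline would replace the score with a learned network and the Euler loop with a solver such as those in ProbNum.

```python
import numpy as np

# Toy variance-preserving (VP) diffusion with constant noise schedule
# beta(t) = BETA, started from "data" distributed as N(0, SIGMA0_SQ).
BETA = 1.0
SIGMA0_SQ = 0.25

def marginal_var(t):
    # Variance of x_t under the VP diffusion: alpha(t)^2 * sigma0^2 + (1 - alpha(t)^2),
    # with alpha(t)^2 = exp(-BETA * t).
    return 1.0 + np.exp(-BETA * t) * (SIGMA0_SQ - 1.0)

def score(x, t):
    # Exact score of the Gaussian marginal: d/dx log N(x; 0, var_t).
    return -x / marginal_var(t)

def sample(n_samples=5000, n_steps=1000, seed=0):
    rng = np.random.default_rng(seed)
    # Draw from the exact marginal at t = 1 and integrate back to t = 0.
    t = 1.0
    x = rng.normal(0.0, np.sqrt(marginal_var(t)), size=n_samples)
    dt = t / n_steps
    for _ in range(n_steps):
        # Probability-flow ODE of the VP diffusion:
        #   dx/dt = -0.5 * beta * (x + score(x, t))
        drift = -0.5 * BETA * (x + score(x, t))
        x -= drift * dt  # Euler step backward in time
        t -= dt
    return x

samples = sample()
print(np.var(samples))  # should be close to SIGMA0_SQ = 0.25
```

Because the score here is exact, the recovered sample variance lands near the data variance; with a learned score, solver accuracy and stability become the central numerical questions the project targets.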

Track:
Academic Track
PhD Duration:
October 1st, 2026 - October 1st, 2029