Aayush Mishra
Amortized Bayesian Inference (ABI) is a new family of methods that, in principle, enables Bayesian inference in real time via deep neural networks. However, the required simulation-based training constitutes a major bottleneck for current ABI methods. Not only is training slow, but it is also not robust: after training, inference on real data may be unreliable even if the neural network training has converged and inference on simulated data is sufficiently accurate. In his PhD, Aayush will explore multiple research directions to increase the robustness and efficiency of ABI methods. Among other directions, he will study the inclusion of likelihood density information into the simulation-based training, as well as the development of new loss functions that also use real data for training.
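To make the idea of simulation-based training concrete, the following is a deliberately minimal toy sketch, not Aayush's actual setup or any specific ABI method: parameters are drawn from the prior, data sets are simulated from the model, and an "amortizer" (here just a least-squares fit on a summary statistic, standing in for a deep neural network) is trained on the simulated pairs. Afterwards, inference on a new data set is a single cheap forward pass; the model, priors, and statistic below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
sigma, n_obs, n_sims = 1.0, 10, 50_000

# Simulation-based training set: draw parameters from the prior,
# then simulate one data set per parameter from the model.
theta = rng.normal(0.0, 1.0, n_sims)                          # prior draws
x = theta[:, None] + rng.normal(0.0, sigma, (n_sims, n_obs))  # simulated data
s = x.mean(axis=1)                                            # summary statistic

# "Amortizer": least-squares fit s -> theta. Minimizing expected squared
# error over the simulations approximates the posterior mean E[theta | s].
A = np.column_stack([s, np.ones_like(s)])
w, b = np.linalg.lstsq(A, theta, rcond=None)[0]

# Amortized inference on a new, unseen data set is a single forward pass.
x_new = rng.normal(0.5, sigma, n_obs)
posterior_mean_hat = w * x_new.mean() + b

# For this conjugate Gaussian toy model the posterior mean is analytic,
# so the learned amortizer can be checked against the exact answer.
posterior_mean_true = n_obs / (n_obs + sigma**2) * x_new.mean()
```

The point of the sketch is the division of cost: all simulation and fitting happens once, up front, while per-data-set inference is nearly free. It also illustrates the fragility noted above: the amortizer is only trained on draws from the assumed model, so data that falls outside the simulated distribution can yield unreliable estimates.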