Hans Olischläger
Amortized Bayesian inference (ABI) is a powerful method for efficiently solving inverse problems without structural requirements. However, the lack of guarantees under model misspecification remains a key obstacle to its adoption. We seek to robustify the amortized Bayesian workflow by extending the literature on diagnostics, aiming not just to detect problems but also to provide understanding and actionable guidance, and by contributing well-crafted tools to open-source libraries such as BayesFlow. For example, studying how to efficiently learn ensembles, and how to compare and combine ensemble members, should improve diagnostics and mitigate sensitivity to initialization and to variability in the training data. The main drawback of training an ensemble is its computational cost. We see potential, however, to make ensembles economical by exploiting their parallelizability, even on regular CPUs/GPUs, and by choosing parsimonious architectures such as feed-forward point-estimation networks. Finally, the exchange between Dortmund and Helsinki, together with the highly collaborative culture of the BayesFlow development team and the wider community of scientific simulation and statistics, is bound to provide further opportunities for contributions to concrete applications and to general issues of learning models from data.
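As an illustrative sketch of the parallelizability argument (not BayesFlow's API, and with made-up dimensions): all members of an ensemble of small feed-forward point estimators can share a leading "ensemble" axis in their weight tensors, so one batched matrix multiply evaluates every member at once on ordinary CPU/GPU hardware, and the spread across members serves as a cheap disagreement diagnostic.

```python
import numpy as np

rng = np.random.default_rng(0)
M, in_dim, hidden, out_dim = 8, 4, 16, 2   # M ensemble members (toy sizes)

# Stack every member's weights along a leading ensemble axis.
W1 = rng.normal(scale=0.1, size=(M, in_dim, hidden))
b1 = np.zeros((M, 1, hidden))
W2 = rng.normal(scale=0.1, size=(M, hidden, out_dim))
b2 = np.zeros((M, 1, out_dim))

def ensemble_point_estimates(x):
    """Evaluate all M point estimators on a batch x of shape (batch, in_dim)
    with two batched matmuls instead of M separate forward passes."""
    h = np.tanh(np.einsum("bi,mih->mbh", x, W1) + b1)
    return np.einsum("mbh,mho->mbo", h, W2) + b2

x = rng.normal(size=(32, in_dim))
preds = ensemble_point_estimates(x)        # shape (M, 32, out_dim)
spread = preds.std(axis=0).mean()          # member disagreement as a diagnostic
```

The same pattern extends to training: because each member's gradient computation is independent, a framework-level vectorizing transform (e.g. `jax.vmap` or batched einsums as above) trains the whole ensemble in one pass, which is what makes ensembling parsimonious architectures economical.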