Bayesian Imaging with Plug & Play Priors: Implicit, Explicit & Unrolled Cases.
We consider inverse problems in imaging where the likelihood is known and the prior is encoded by a neural network, and we discuss different ways to maximise the posterior distribution and to sample from it. The first part of the talk concentrates on the case where the plug & play prior is implicit in a pretrained neural-network denoiser; we present the plug & play stochastic gradient descent (PnP-SGD) algorithm for posterior maximisation and the plug & play unadjusted Langevin algorithm (PnP-ULA) for posterior sampling. The second part of the talk shows how to repurpose a pretrained hierarchical VAE as a prior for posterior sampling, in the particular case of single-image super-resolution. The third part of the talk considers posterior maximisation for super-resolution and deblurring with a spatially varying blur kernel. The plug & play ADMM algorithm would require a computationally expensive inversion of the spatially varying blur operator; to avoid this, we propose a linearised plug & play algorithm and an unrolled version thereof, which produces results competitive with state-of-the-art algorithms.
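To fix ideas, here is a minimal sketch of a plug & play Langevin update of the kind discussed in the first part of the talk. It assumes a Gaussian likelihood y = Ax + noise and uses Tweedie's identity to turn a denoiser into an approximate prior score, (D(x) - x)/eps. The shrinkage "denoiser", step sizes, and problem dimensions below are illustrative stand-ins, not the actual models used in the talk.

```python
import numpy as np

def pnp_langevin_step(x, y, A, sigma2, denoiser, eps, delta, rng):
    """One plug & play Langevin step (sketch).

    Gradient of the Gaussian log-likelihood for y = A x + noise
    (noise variance sigma2), plus a Tweedie-style prior gradient
    (denoiser(x) - x) / eps, plus injected Gaussian noise.
    """
    grad_lik = A.T @ (y - A @ x) / sigma2
    grad_prior = (denoiser(x) - x) / eps
    noise = np.sqrt(2.0 * delta) * rng.standard_normal(x.shape)
    return x + delta * (grad_lik + grad_prior) + noise

# Toy usage: a denoising problem (A = I) with a linear shrinkage
# operator standing in for a learned neural-network denoiser.
rng = np.random.default_rng(0)
A = np.eye(4)
x_true = np.ones(4)
y = x_true + 0.1 * rng.standard_normal(4)
denoiser = lambda x: 0.9 * x  # hypothetical stand-in denoiser

x = np.zeros(4)
for _ in range(500):
    x = pnp_langevin_step(x, y, A, sigma2=0.01, denoiser=denoiser,
                          eps=1.0, delta=1e-3, rng=rng)
```

Collecting the iterates x after a burn-in period yields approximate posterior samples; dropping the noise term and averaging gradients instead recovers a PnP-SGD-style maximisation scheme.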
Joint work with: Rémi Laumont, Jean Prost, Charles Laroche, Valentin De Bortoli, Julie Delon, Antoine Houdard, Nicolas Papadakis, Marcelo Pereyra, Matias Tassano.
Related preprints:
https://arxiv.org/abs/2103.04715
https://arxiv.org/abs/2201.06133
https://arxiv.org/abs/2205.10347
https://arxiv.org/abs/2204.10109