Fast Samplers for Inverse Problems in Iterative Refinement Models

University of California, Irvine
NeurIPS'24

*Indicates Equal Contribution

Qualitative results on 4\(\times\) super-resolution, Gaussian deblurring, and inpainting (number of function evaluations: 5).

TLDR

We develop a plug-and-play method for accelerating sampling in posterior-approximation methods such as \(\Pi\)GDM and DPS, which solve inverse problems using pre-trained diffusion models. Our approach generates substantially higher-quality samples than these baselines in as few as 5 NFEs.

Full Abstract

Constructing fast samplers for unconditional diffusion and flow-matching models has received much attention recently; however, existing methods for solving inverse problems, such as super-resolution, inpainting, or deblurring, still require hundreds to thousands of iterative steps to obtain high-quality results. We propose a plug-and-play framework for constructing efficient samplers for inverse problems, requiring only pre-trained diffusion or flow-matching models. We present Conditional Conjugate Integrators, which leverage the specific form of the inverse problem to project the respective conditional diffusion/flow dynamics into a more amenable space for sampling. Our method complements popular posterior approximation methods for solving inverse problems using diffusion/flow models. We evaluate the proposed method's performance on various linear image restoration tasks across multiple datasets, employing diffusion and flow-matching models. Notably, on challenging inverse problems like 4\(\times\) super-resolution on the ImageNet dataset, our method can generate high-quality samples in as few as 5 conditional sampling steps and outperforms competing baselines requiring 20-1000 steps.

Methodology

An overview of our Conditional Conjugate Integrators is illustrated in the figure above. Our approach projects the conditional diffusion dynamics into a space that is more amenable to fast sampling; once sampling is complete, the dynamics are mapped back to the original space. The only constraint is that the mapping \(\Phi\) be invertible. In this work, \(\Phi\) and its inverse depend on the degradation operator \(H\) and the guidance scale \(w\) (see our paper for more details); a minimal sketch of this project-integrate-invert structure is given below.
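To make the structure concrete, here is a minimal, self-contained PyTorch sketch for a linear inverse problem \(y = Hx + n\). The specific choices below, the linear map phi and its inverse phi_inv, the pseudo-inverse-style guidance term, the VE-style Euler update, and the placeholder denoiser interface, are illustrative assumptions for this toy example and are not the paper's exact parameterization of \(\Phi\) or its conditional integrator; refer to the paper for the precise formulation.

# Toy sketch of the "project -> integrate -> invert" pattern behind
# Conditional Conjugate Integrators. All choices below are illustrative
# stand-ins, not the paper's exact formulation.
import torch


def phi(x, H, w):
    """Illustrative invertible mapping into the projected space.

    Rescales the component of x lying in the row space of H by the
    guidance scale w; any invertible, H-dependent linear map would do
    for this sketch.
    """
    A = torch.eye(H.shape[1]) + (w - 1.0) * (H.T @ H)
    return x @ A


def phi_inv(z, H, w):
    """Exact inverse of the illustrative phi above."""
    A = torch.eye(H.shape[1]) + (w - 1.0) * (H.T @ H)
    return torch.linalg.solve(A, z.T).T


@torch.no_grad()
def conjugate_sample(denoiser, y, H, w=1.0, num_steps=5,
                     sigma_max=80.0, sigma_min=0.01):
    """Few-step conditional sampler for a linear inverse problem y = H x + n.

    Structure: map the state with phi, take Euler steps of an (approximate)
    conditional probability-flow ODE in the projected coordinates, and map
    back with phi_inv at the end. `denoiser(x, sigma)` is assumed to be a
    pretrained unconditional denoiser returning an estimate of the clean x.
    """
    batch, d = y.shape[0], H.shape[1]
    sigmas = torch.linspace(sigma_max, sigma_min, num_steps + 1)

    x = sigmas[0] * torch.randn(batch, d)        # start from pure noise
    z = phi(x, H, w)                             # project the initial state

    for i in range(num_steps):
        t, t_next = sigmas[i], sigmas[i + 1]
        x = phi_inv(z, H, w)                     # the network runs in data space
        x0_hat = denoiser(x, t)                  # unconditional denoised estimate
        # Simple pseudo-inverse-style guidance toward the measurement y.
        x0_hat = x0_hat + w * (y - x0_hat @ H.T) @ torch.linalg.pinv(H).T
        dx = (x - x0_hat) / t                    # VE probability-flow direction
        z = z + (t_next - t) * phi(dx, H, w)     # Euler step in projected space

    return phi_inv(z, H, w)

In this toy setting the projection mainly illustrates the interface; the key idea from the paper is that \(\Phi\), which depends on \(H\) and \(w\), absorbs the analytically tractable part of the conditional dynamics so that the remaining dynamics can be integrated accurately in very few steps.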

Additional Qualitative Results

Conjugate-\(\Pi\)GDM (Ours) vs. \(\Pi\)GDM (Baseline)
(Top row: ground truth; middle row: degraded inputs; NFE = 5)
Conjugate-\(\Pi\)GDM (Ours) vs. Pseudoinverse (Baseline) for noisy 4\(\times\) super-resolution
(\(\sigma_y = 0.05\), NFE = 5; top row: ground-truth samples)

BibTeX

@misc{pandey2024fastsamplersinverseproblems,
  title={Fast Samplers for Inverse Problems in Iterative Refinement Models},
  author={Kushagra Pandey and Ruihan Yang and Stephan Mandt},
  year={2024},
  eprint={2405.17673},
  archivePrefix={arXiv},
  primaryClass={cs.CV},
  url={https://arxiv.org/abs/2405.17673},
}