[2102.09672] Improved Denoising Diffusion Probabilistic Models

Denoising diffusion probabilistic models (DDPM) are a class of generative models which have recently been shown to produce excellent samples. We show that with a few simple modifications, DDPMs can also achieve competitive log-likelihoods while maintaining high sample quality. Additionally, we find that learning variances of the reverse diffusion process allows sampling with an order of magnitude fewer forward passes with a negligible difference in sample quality, which is important for the practical deployment of these models. We additionally use precision and recall to compare how well DDPMs and GANs cover the target distribution. Finally, we show that the sample quality and likelihood of these models scale smoothly with model capacity and training compute, making them easily scalable. We release our code at https://github.com/openai/improved-diffusion
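The two ideas highlighted in the abstract — learning the reverse-process variance and sampling over a strided subset of timesteps — can be illustrated with a short sketch. The snippet below is a minimal NumPy illustration under stated assumptions, not the released openai/improved-diffusion code: `dummy_model` is a hypothetical stand-in for the trained network, and the learned variance follows the paper's log-space interpolation between beta_t and the posterior variance beta_tilde_t.

```python
# Minimal sketch of (1) learned reverse-process variance via log-space
# interpolation and (2) ancestral sampling over a strided timestep subset.
# Not the released implementation; `dummy_model` is a hypothetical stand-in.
import numpy as np

T = 1000                                   # training diffusion steps
betas = np.linspace(1e-4, 0.02, T)         # linear schedule (original DDPM)
alphas_cumprod = np.cumprod(1.0 - betas)

def dummy_model(x, t):
    """Placeholder for the trained network: returns (eps_pred, v), where
    v in [0, 1] interpolates the log-variance. A real model is a U-Net."""
    eps = np.zeros_like(x)                 # pretend noise prediction
    v = np.full_like(x, 0.5)
    return eps, v

def p_sample_strided(shape, num_steps=50, rng=np.random.default_rng(0)):
    """Ancestral sampling over an evenly strided subset of the T steps."""
    timesteps = np.linspace(0, T - 1, num_steps).round().astype(int)[::-1]
    x = rng.standard_normal(shape)
    for i, t in enumerate(timesteps):
        t_prev = timesteps[i + 1] if i + 1 < len(timesteps) else -1
        ac_t = alphas_cumprod[t]
        ac_prev = alphas_cumprod[t_prev] if t_prev >= 0 else 1.0
        # Recompute the per-step beta for the strided schedule from the
        # cumulative alpha products, so fewer steps stay consistent.
        beta_t = 1.0 - ac_t / ac_prev
        beta_tilde = beta_t * (1.0 - ac_prev) / (1.0 - ac_t)  # posterior var
        eps, v = dummy_model(x, t)
        # Standard DDPM posterior mean, computed from the predicted noise.
        mean = (x - beta_t / np.sqrt(1.0 - ac_t) * eps) / np.sqrt(1.0 - beta_t)
        # Learned variance: interpolate between beta_t and beta_tilde_t
        # in log space, as parameterized in the paper.
        log_var = (v * np.log(beta_t)
                   + (1.0 - v) * np.log(np.maximum(beta_tilde, 1e-20)))
        if t_prev < 0:                     # final step: return the mean
            x = mean
        else:
            x = mean + np.exp(0.5 * log_var) * rng.standard_normal(shape)
    return x

sample = p_sample_strided((1, 32 * 32 * 3))  # e.g. one flattened 32x32 RGB image
```

With a trained model in place of `dummy_model`, reducing `num_steps` from the full schedule to ~50 is the order-of-magnitude speedup the abstract refers to; the learned variance is what keeps sample quality from degrading at these short schedules.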

4 mentions: @arankomatsuzaki, @ak92501
Date: 2021/02/23 02:21

Referring Tweets

@arankomatsuzaki Improved Denoising Diffusion Probabilistic Models Improves DDPM to achieve competitive NLL and image quality on par with SotA image models. code: t.co/ZuW9poSFnx abstract: t.co/T8QuWb0QHd t.co/c0LI76bewM

Related Entries

Read more [2002.09543] Modelling Latent Skills for Multitask Language Generation
0 users, 4 mentions 2020/02/26 15:50
Read more [2009.02252] KILT: a Benchmark for Knowledge Intensive Language Tasks
0 users, 5 mentions 2020/09/08 15:52
Read more [2102.01951] Pitfalls of Static Language Modelling
0 users, 5 mentions 2021/02/04 03:51
Read more [2102.01335] Neural Data Augmentation via Example Extrapolation
0 users, 6 mentions 2021/02/04 11:22
Read more [2102.11972] Do Transformer Modifications Transfer Across Implementations and Applications?
1 user, 6 mentions 2021/02/25 02:22