This survey explores state-of-the-art advances in accelerating diffusion models, focusing on techniques that address their computational and memory inefficiencies. Diffusion models have achieved remarkable success in generative AI, surpassing prior paradigms such as GANs across applications including image synthesis, text-to-image generation, and video generation. However, their reliance on a large number of sequential sampling steps makes them markedly less efficient than other generative approaches. This survey categorises and analyses 11 recent works aimed at overcoming this bottleneck, spanning quantization techniques, knowledge distillation, and distributed parallel sampling. We aim to convey the intuition, theory, and trade-offs behind these techniques. Finally, this work offers a reference for researchers and practitioners seeking to build or deploy fast diffusion model architectures, with a clear overview of the benchmarking parameters used in each of the surveyed works.
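To make the sequential-sampling bottleneck concrete, the following is a minimal, illustrative sketch (not any surveyed paper's method) of a DDPM-style ancestral sampling loop: each step requires one network forward pass and depends on the previous step's output, so latency scales linearly with the number of steps. The denoiser, step size, and dimensions here are all hypothetical stand-ins.

```python
import math
import random

def toy_denoiser(x, t):
    """Hypothetical stand-in for the neural network's noise prediction."""
    return [xi * math.exp(-0.01 * t) for xi in x]

def sample(num_steps, dim=4, seed=0):
    """DDPM-style ancestral sampling: the loop is inherently sequential,
    because the input at step t is the output of step t+1, which is why
    reducing the number of steps (e.g. via distillation) cuts latency."""
    rng = random.Random(seed)
    x = [rng.gauss(0, 1) for _ in range(dim)]  # start from pure noise
    calls = 0
    for t in reversed(range(num_steps)):
        eps = toy_denoiser(x, t)               # one network call per step
        calls += 1
        x = [xi - 0.05 * ei for xi, ei in zip(x, eps)]
    return x, calls

# A typical DDPM budget vs. a distilled few-step budget: the number of
# sequential network calls (the dominant cost) differs by ~250x.
_, calls_slow = sample(num_steps=1000)
_, calls_fast = sample(num_steps=4)
print(calls_slow, calls_fast)
```

Techniques in this survey attack exactly this loop: distillation compresses many steps into few, quantization cheapens each forward pass, and distributed parallel sampling spreads the work across devices.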