Publications

International Conference

Soft Truncation: A Universal Training Technique of Score-based Diffusion Model for High Precision Score Estimation
Category
Machine Learning
Author
Dongjun Kim, Seungjae Shin, Kyungwoo Song, Wanmo Kang, and Il-Chul Moon
Year
2022
Conference Name
International Conference on Machine Learning (ICML 2022)
Presentation Date
Jul 17
City
Baltimore
Country
USA
File
Soft Truncation A Universal Training Technique of Score-based Diffusion Model for High Precision Score Estimation.pdf (17.0 MB)

Dongjun Kim, Seungjae Shin, Kyungwoo Song, Wanmo Kang, and Il-Chul Moon, Soft Truncation: A Universal Training Technique of Score-based Diffusion Model for High Precision Score Estimation, International Conference on Machine Learning (ICML 2022), Baltimore, USA, Jul 17, 2022


Abstract

Recent advances in diffusion models have brought state-of-the-art performance on image generation tasks. However, empirical results from previous research on diffusion models imply an inverse correlation between density estimation and sample generation performance. This paper shows, with extensive empirical evidence, that this inverse correlation arises because density estimation is dominated by small diffusion times, whereas sample generation mainly depends on large diffusion times. However, training a score network well across the entire range of diffusion times is demanding because the loss scale is significantly imbalanced across diffusion times. For successful training, therefore, we introduce Soft Truncation, a universally applicable training technique for diffusion models that softens the fixed and static truncation hyperparameter into a random variable. In experiments, Soft Truncation achieves state-of-the-art performance on the CIFAR-10, CelebA, CelebA-HQ 256x256, and STL-10 datasets.
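The core idea in the abstract can be sketched in a few lines: instead of training with a fixed truncation hyperparameter (the smallest diffusion time used in the loss), Soft Truncation redraws the truncation level for each mini-batch and samples diffusion times above it. The sketch below is illustrative only and assumes a log-uniform distribution for the truncation variable; the function and parameter names (`sample_diffusion_times`, `eps_min`, `T`) are hypothetical, and the paper derives its own distribution for the truncation random variable.

```python
import numpy as np

def sample_diffusion_times(batch_size, eps_min=1e-5, T=1.0, rng=None):
    """Soften a static truncation eps into a random variable (sketch).

    Draws a per-batch truncation level eps, then samples diffusion
    times t uniformly on [eps, T] for the score-matching loss.
    """
    rng = np.random.default_rng() if rng is None else rng
    # Assumption: log-uniform draw between eps_min and T; the paper's
    # actual choice of distribution for eps differs.
    eps = float(np.exp(rng.uniform(np.log(eps_min), np.log(T))))
    # Diffusion times for this mini-batch are restricted to [eps, T].
    t = rng.uniform(eps, T, size=batch_size)
    return eps, t

eps, t = sample_diffusion_times(batch_size=128)
```

Because `eps` varies across mini-batches, the score network still sees small diffusion times (which the abstract links to density estimation) without every batch being dominated by their large loss scale.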


@article{kim2021soft,
  title={Soft Truncation: A Universal Training Technique of Score-based Diffusion Model for High Precision Score Estimation},
  author={Kim, Dongjun and Shin, Seungjae and Song, Kyungwoo and Kang, Wanmo and Moon, Il-Chul},
  journal={arXiv preprint arXiv:2106.05527},
  year={2021}
}


Source Website:

https://proceedings.mlr.press/v162/kim22i/kim22i.pdf