Publications

International Conference

SAAL: Sharpness-Aware Active Learning
Category
Machine Learning
Author
Yoon-Yeong Kim, Youngjae Cho, JoonHo Jang, Byeonghu Na, Yeongmin Kim, Kyungwoo Song, Wanmo Kang, and Il-Chul Moon
Year
2023
Conference Name
International Conference on Machine Learning (ICML 2023)
Presentation Date
Jul 25-27
City
Hawaii
Country
USA
File
SAAL_camera_ready_ver1.pdf (5.8M)

Yoon-Yeong Kim, Youngjae Cho, JoonHo Jang, Byeonghu Na, Yeongmin Kim, Kyungwoo Song, Wanmo Kang, and Il-Chul Moon, SAAL: Sharpness-Aware Active Learning, International Conference on Machine Learning (ICML 2023), Hawaii, USA, Jul 25-27, 2023 


Abstract

While deep neural networks play significant roles in many research areas, they are also prone to overfitting under limited data instances. To overcome overfitting, this paper introduces the first active learning method to incorporate the sharpness of the loss space into the acquisition function. Specifically, our proposed method, Sharpness-Aware Active Learning (SAAL), constructs its acquisition function by selecting unlabeled instances whose perturbed loss becomes maximal. Unlike sharpness-aware learning with fully labeled datasets, we design a pseudo-labeling mechanism to anticipate the perturbed loss w.r.t. the ground-truth label, for which we provide a theoretical bound on the optimization. We conduct experiments on various benchmark datasets for vision-based tasks in image classification, object detection, and domain-adaptive semantic segmentation. The experimental results confirm that SAAL outperforms the baselines by selecting instances that have the potentially maximal perturbation on the loss. The code is available at https://github.com/YoonyeongKim/SAAL. 
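The acquisition idea described in the abstract can be sketched as follows: pseudo-label each unlabeled sample with the model's current prediction, take a gradient-ascent step of radius rho on the parameters (the SAM-style worst-case perturbation), and score the sample by its loss under the perturbed parameters. This is a minimal illustrative sketch on a linear-softmax model, not the authors' implementation; the function names (`saal_acquire`, `ce_loss`, `ce_grad`) and the value `rho=0.05` are assumptions for illustration only.

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def ce_loss(W, x, y):
    # Cross-entropy of a linear-softmax model on one sample.
    p = softmax(x @ W)
    return -np.log(p[y] + 1e-12)

def ce_grad(W, x, y):
    # Analytic gradient of the cross-entropy w.r.t. the weight matrix W.
    p = softmax(x @ W)
    p[y] -= 1.0
    return np.outer(x, p)

def saal_acquire(W, X_pool, k, rho=0.05):
    """Rank unlabeled samples by their sharpness-aware (perturbed) loss."""
    scores = []
    for x in X_pool:
        y_pseudo = int(np.argmax(softmax(x @ W)))      # pseudo-label
        g = ce_grad(W, x, y_pseudo)
        eps = rho * g / (np.linalg.norm(g) + 1e-12)    # worst-case ascent step
        scores.append(ce_loss(W + eps, x, y_pseudo))   # loss at perturbed weights
    return np.argsort(scores)[::-1][:k]                # top-k sharpest instances
```

In a real active-learning loop, the top-k indices returned here would be sent to an oracle for labeling and added to the training set before retraining the model.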


@inproceedings{kim2023saal, 

title={SAAL: sharpness-aware active learning}, 

author={Kim, Yoon-Yeong and Cho, Youngjae and Jang, JoonHo and Na, Byeonghu and Kim, Yeongmin and Song, Kyungwoo and Kang, Wanmo and Moon, Il-Chul}, 

booktitle={International Conference on Machine Learning}, 

pages={16424--16440}, 

year={2023}, 

organization={PMLR} 

} 


Source Website:

https://proceedings.mlr.press/v202/kim23c.html