Mingi Ji, Feature Based Knowledge Distillation for Image Recognition, PhD Dissertation, Department of Industrial and Systems Engineering, KAIST, 2022
- File: Feature Based Knowledge Distillation for Image Recognition.pdf (14.1M)
Abstract
The performance of visual recognition models, such as those for image classification and object detection, has improved markedly with the development of deep learning. However, training a deep learning based model requires a model with many parameters and a large amount of data. This thesis studies knowledge distillation methods that address this cost. First, we propose an attention-based meta-network that models the relationships between the teacher model layers and the student model layers to distill knowledge effectively. We empirically show that the proposed method is more effective than heuristically designating the layer links between the teacher model and the student model. Second, for self-knowledge distillation, in which the student model distills knowledge from itself without a teacher model, we propose a novel method that utilizes spatial information. To this end, we introduce an auxiliary network for training, obtained by adapting an architecture used in existing object detection models.
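A minimal PyTorch sketch of the first idea, attention-based matching between teacher and student feature maps. This is an illustrative assumption, not the dissertation's actual implementation: the class name, the query/key projections, the pooling choice, and the assumption that all feature maps share one channel dimension are all hypothetical.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class AttentionLayerMatcher(nn.Module):
    """Hypothetical sketch: learn soft links between teacher and student layers.

    Instead of hand-picking which teacher layer supervises which student
    layer, a learned attention weight over all (student, teacher) layer
    pairs decides how strongly each pair contributes to the loss.
    Assumes every feature map has the same channel dimension `dim`.
    """

    def __init__(self, dim: int):
        super().__init__()
        # Projections that embed per-layer summaries for the attention scores.
        self.query = nn.Linear(dim, dim)
        self.key = nn.Linear(dim, dim)

    def forward(self, teacher_feats, student_feats):
        # Summarize each (B, C, H, W) feature map into a (C,) vector by
        # global average pooling over space and the batch.
        t = torch.stack([f.mean(dim=(2, 3)).mean(0) for f in teacher_feats])
        s = torch.stack([f.mean(dim=(2, 3)).mean(0) for f in student_feats])
        # Scaled dot-product attention: one distribution over teacher
        # layers for every student layer.
        attn = F.softmax(self.query(s) @ self.key(t).T / t.size(1) ** 0.5, dim=1)
        # Attention-weighted feature-matching loss over all layer pairs.
        loss = 0.0
        for i, sf in enumerate(student_feats):
            for j, tf in enumerate(teacher_feats):
                # Resize the student map to the teacher's spatial size
                # before matching.
                sf_r = F.interpolate(sf, size=tf.shape[2:], mode="bilinear",
                                     align_corners=False)
                loss = loss + attn[i, j] * F.mse_loss(sf_r, tf)
        return loss
```

The point of the sketch is that the layer links are soft and learned end-to-end with the student, rather than fixed by a heuristic such as matching layers by depth.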
@phdthesis{Ji:2022,
author = {Mingi Ji},
advisor = {Il-Chul Moon},
title = {Feature Based Knowledge Distillation for Image Recognition},
school = {KAIST},
year = {2022}
}