Publications

Thesis

Yongjin Shin, Localized Binary Cross-Entropy for Federated Learning, Master's Thesis, Department of Industrial and Systems Engineering, KAIST, 2021


Abstract

Federated learning is a distributed learning methodology for training a server model across devices that keep their data locally. It has recently established itself as a research field for addressing social issues such as privacy infringement. However, distributed data environments do not follow an independent and identically distributed (i.i.d.) assumption, which makes it difficult to apply existing machine learning methodologies directly. This data heterogeneity leads to over-fitting on local data in federated learning, which undermines the performance of the server model. In this work, we present Localized Binary Cross-Entropy (LBCE), a loss function that prevents over-fitting to local data in distributed environments. When a local machine trains a model with the conventional cross-entropy function, it produces error signals even for data classes that are not present in its local data. To limit this over-fitting, the LBCE loss function keeps each class independent by replacing the softmax activation with a sigmoid function, while regulating the signals for classes that do not appear in the local data. LBCE outperformed conventional softmax cross-entropy in various non-i.i.d. distributed-data settings.
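The mechanism described in the abstract can be illustrated with a short sketch. The following is a minimal PyTorch sketch of the idea, assuming raw per-class logits, integer labels, and a set local_classes of classes held by the client; the function name, the zero-masking scheme, and all variable names are illustrative assumptions, not the thesis implementation.

# Minimal sketch of the idea described in the abstract (not the thesis code).
# Assumptions: `logits` are raw per-class outputs, `targets` are integer labels,
# and `local_classes` is the set of classes present on the client; the exact
# regulation scheme in the thesis may differ from the simple masking used here.
import torch
import torch.nn.functional as F

def localized_bce_loss(logits, targets, local_classes):
    """Per-class sigmoid BCE with error signals suppressed for absent classes."""
    num_classes = logits.size(1)
    # One-hot targets: each class is treated as an independent binary problem.
    onehot = F.one_hot(targets, num_classes).float()
    # Mask: 1 for classes held locally, 0 for classes the client never sees.
    mask = torch.zeros(num_classes, device=logits.device)
    mask[list(local_classes)] = 1.0
    # Element-wise BCE on sigmoid outputs keeps the classes independent,
    # unlike softmax cross-entropy, which couples all classes together.
    per_class = F.binary_cross_entropy_with_logits(logits, onehot, reduction="none")
    # Regulate (here: zero out) error signals for classes absent from local data.
    per_class = per_class * mask
    return per_class.sum(dim=1).mean()

# Example usage with hypothetical shapes:
# logits = model(x)                                        # [batch, num_classes]
# loss = localized_bce_loss(logits, y, local_classes={0, 3, 7})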


@mastersthesis{Shin:2021,
  author  = {Yongjin Shin},
  advisor = {Il-Chul Moon},
  title   = {Localized Binary Cross-Entropy for Federated Learning},
  school  = {KAIST},
  year    = {2021}
}