Alleviating Long-Tailed Image Classification via Dynamical Classwise Splitting
Ye Yuan,
Jiaqi Wang,
Xin Xu,
Ruoshi Li,
Yongtong Zhu,
Lihong Wan,
Qingdu Li and
Na Liu
Additional contact information
Ye Yuan: Institute of Machine Intelligence, University of Shanghai for Science and Technology, Shanghai 200093, China
Jiaqi Wang: Institute of Machine Intelligence, University of Shanghai for Science and Technology, Shanghai 200093, China
Xin Xu: Institute of Machine Intelligence, University of Shanghai for Science and Technology, Shanghai 200093, China
Ruoshi Li: Institute of Machine Intelligence, University of Shanghai for Science and Technology, Shanghai 200093, China
Yongtong Zhu: Institute of Machine Intelligence, University of Shanghai for Science and Technology, Shanghai 200093, China
Lihong Wan: Institute of Machine Intelligence, University of Shanghai for Science and Technology, Shanghai 200093, China
Qingdu Li: Institute of Machine Intelligence, University of Shanghai for Science and Technology, Shanghai 200093, China
Na Liu: Institute of Machine Intelligence, University of Shanghai for Science and Technology, Shanghai 200093, China
Mathematics, 2023, vol. 11, issue 13, 1-12
Abstract:
With the rapid increase in data scale, real-world datasets tend to exhibit long-tailed class distributions (i.e., a few classes account for most of the data, while most classes contain only a few data points). General solutions typically rely on class rebalancing strategies such as resampling and reweighting based on the number of samples per class. In this work, we explore an orthogonal direction, category splitting, motivated by the empirical observation that naively splitting majority classes can alleviate the heavy imbalance between majority and minority classes. To this end, we propose a novel classwise splitting (CWS) method built upon dynamic clustering, in which classwise prototypes are updated with a moving average technique. CWS generates intra-class pseudo labels that split the samples within each class according to their point-to-point distances to the prototypes. Moreover, a group mapping module is developed to recover the ground-truth labels of the training samples. CWS can be plugged into any existing method as a complementary component. Comprehensive experiments were conducted on artificially induced long-tailed image classification datasets, including CIFAR-10-LT, CIFAR-100-LT, and OCTMNIST. Our results show that when trained with the proposed class-balanced loss, the network achieves significant performance gains on long-tailed datasets.
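As a rough illustration of the mechanism described in the abstract, the NumPy sketch below splits each class into a fixed number of sub-classes, assigns intra-class pseudo labels by nearest prototype, updates the matched prototype with a moving average, and uses a group mapping to recover the original class labels. The class name ClasswiseSplitter, the number of splits, and the momentum value are illustrative assumptions; the paper's exact clustering and update rules may differ.

# Minimal sketch of the classwise-splitting idea (assumed formulation, not the
# authors' exact algorithm): per-class sub-prototypes, moving-average updates,
# distance-based intra-class pseudo labels, and a group mapping back to classes.
import numpy as np

class ClasswiseSplitter:
    def __init__(self, num_classes, splits_per_class, feat_dim, momentum=0.9):
        self.K = splits_per_class
        self.momentum = momentum
        # One prototype per sub-class, initialised randomly.
        self.prototypes = np.random.randn(num_classes, splits_per_class, feat_dim)
        # Group mapping: sub-class (pseudo) label -> original class label.
        self.group_of = {
            c * splits_per_class + k: c
            for c in range(num_classes)
            for k in range(splits_per_class)
        }

    def assign_pseudo_labels(self, feats, labels):
        """Assign each sample an intra-class pseudo label via its nearest prototype."""
        pseudo = np.empty(len(feats), dtype=int)
        for i, (f, c) in enumerate(zip(feats, labels)):
            # Point-to-prototype distances within the sample's own class.
            dists = np.linalg.norm(self.prototypes[c] - f, axis=1)
            k = int(np.argmin(dists))
            pseudo[i] = c * self.K + k
            # Moving-average update of the matched prototype.
            self.prototypes[c, k] = (
                self.momentum * self.prototypes[c, k] + (1.0 - self.momentum) * f
            )
        return pseudo

    def recover_labels(self, pseudo_labels):
        """Group mapping module: map sub-class pseudo labels back to ground truth."""
        return np.array([self.group_of[int(p)] for p in pseudo_labels])

# Toy usage: 2 classes, each split into 3 sub-classes, over 5-d features.
splitter = ClasswiseSplitter(num_classes=2, splits_per_class=3, feat_dim=5)
feats = np.random.randn(8, 5)
labels = np.random.randint(0, 2, size=8)
pseudo = splitter.assign_pseudo_labels(feats, labels)
assert (splitter.recover_labels(pseudo) == labels).all()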
Keywords: deep learning; class-imbalance learning; feature clustering; long-tailed classification; classwise splitting
JEL-codes: C
Date: 2023
Downloads: (external link)
https://www.mdpi.com/2227-7390/11/13/2996/pdf (application/pdf)
https://www.mdpi.com/2227-7390/11/13/2996/ (text/html)
Persistent link: https://EconPapers.repec.org/RePEc:gam:jmathe:v:11:y:2023:i:13:p:2996-:d:1187226