Semi-Supervised Interior Decoration Style Classification with Contrastive Mutual Learning
Lichun Guo,
Hao Zeng,
Xun Shi,
Qing Xu,
Jinhui Shi,
Kui Bai,
Shuang Liang and
Wenlong Hang
Additional contact information
Lichun Guo: College of Art and Design, Nanjing Audit University Jinshen College, Nanjing 210023, China
Hao Zeng: College of Art and Design, Nanjing Audit University Jinshen College, Nanjing 210023, China
Xun Shi: College of Art and Design, Nanjing Audit University Jinshen College, Nanjing 210023, China
Qing Xu: College of Art and Design, Nanjing Audit University Jinshen College, Nanjing 210023, China
Jinhui Shi: College of Art and Design, Nanjing Audit University Jinshen College, Nanjing 210023, China
Kui Bai: College of Computer and Information Engineering, Nanjing Tech University, Nanjing 211816, China
Shuang Liang: School of Internet of Things, Nanjing University of Posts and Telecommunications, Nanjing 210023, China
Wenlong Hang: College of Computer and Information Engineering, Nanjing Tech University, Nanjing 211816, China
Mathematics, 2024, vol. 12, issue 19, 1-19
Abstract:
Precisely identifying interior decoration styles holds substantial significance for guiding interior decoration practice. Nevertheless, constructing accurate models for the automatic classification of interior decoration styles remains challenging due to the scarcity of expert annotations. To address this problem, we propose a novel pseudo-label-guided contrastive mutual learning framework (PCML) for semi-supervised interior decoration style classification that harnesses large amounts of unlabeled data. Specifically, PCML introduces two distinct subnetworks and selectively uses the diverse pseudo-labels generated by each to supervise the other, thereby mitigating confirmation bias. For labeled images, the inconsistent pseudo-labels produced by the two subnetworks are used to identify images that are prone to misclassification, and we devise an inconsistency-aware relearning (ICR) regularization to perform a review training process on them. For unlabeled images, we introduce a class-aware contrastive learning (CCL) regularization that learns discriminative feature representations using the corresponding pseudo-labels. Since using two distinct subnetworks reduces the risk of both models producing identical erroneous pseudo-labels, CCL lowers the chance of sampling noisy data and thus enhances the effectiveness of contrastive learning. The performance of PCML is evaluated on five interior decoration style image datasets. In average AUC, accuracy, sensitivity, specificity, precision, and F1 score, PCML improves on the state-of-the-art method by 1.67%, 1.72%, 3.65%, 1.0%, 4.61%, and 4.66%, respectively, demonstrating the superiority of our method.
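The abstract describes three interacting components: mutual supervision with pseudo-labels exchanged between two subnetworks, inconsistency-aware relearning (ICR) on labeled images where the subnetworks disagree, and class-aware contrastive learning (CCL) on unlabeled images with confident pseudo-labels. The sketch below is not taken from the paper; it is a minimal PyTorch-style illustration of how such a training step could be assembled, and every name and hyper-parameter in it (pcml_step, tau, temperature, the assumption that each subnetwork returns both logits and a feature embedding, and the simple unweighted sum of losses) is an assumption made for illustration only.

# Hypothetical sketch of one PCML-style training step (names and thresholds assumed).
import torch
import torch.nn.functional as F

def pcml_step(net_a, net_b, x_lab, y_lab, x_unlab, tau=0.95, temperature=0.1):
    # Assumes each subnetwork returns (classification logits, feature embedding).
    logits_a_l, _ = net_a(x_lab)
    logits_b_l, _ = net_b(x_lab)

    # Supervised loss on labeled images for both subnetworks.
    sup = F.cross_entropy(logits_a_l, y_lab) + F.cross_entropy(logits_b_l, y_lab)

    # ICR (assumed form): re-weight labeled images on which the two
    # subnetworks give inconsistent predictions, so they are "reviewed".
    disagree = (logits_a_l.argmax(dim=1) != logits_b_l.argmax(dim=1)).float()
    icr = (disagree * (F.cross_entropy(logits_a_l, y_lab, reduction="none")
                       + F.cross_entropy(logits_b_l, y_lab, reduction="none"))).mean()

    # Cross pseudo-label supervision on unlabeled images: confident
    # pseudo-labels from one subnetwork supervise the other.
    logits_a_u, feat_a_u = net_a(x_unlab)
    logits_b_u, feat_b_u = net_b(x_unlab)
    conf_a, pl_a = logits_a_u.softmax(dim=1).detach().max(dim=1)
    conf_b, pl_b = logits_b_u.softmax(dim=1).detach().max(dim=1)
    mask_a, mask_b = (conf_a >= tau).float(), (conf_b >= tau).float()
    cps = (mask_a * F.cross_entropy(logits_b_u, pl_a, reduction="none")).mean() \
        + (mask_b * F.cross_entropy(logits_a_u, pl_b, reduction="none")).mean()

    # CCL (assumed form): pull together unlabeled features sharing a confident
    # pseudo-label, push apart the rest (a supervised-contrastive-style loss).
    z = F.normalize(torch.cat([feat_a_u, feat_b_u], dim=0), dim=1)
    pl = torch.cat([pl_a, pl_b], dim=0)
    keep = torch.cat([mask_a, mask_b], dim=0).bool()
    z, pl = z[keep], pl[keep]
    ccl = z.new_zeros(())
    if pl.numel() > 1:
        sim = z @ z.t() / temperature
        eye = torch.eye(len(pl), device=z.device)
        log_prob = sim - torch.logsumexp(
            sim.masked_fill(eye.bool(), float("-inf")), dim=1, keepdim=True)
        pos = ((pl.unsqueeze(0) == pl.unsqueeze(1)).float() - eye).clamp(min=0)
        ccl = -((pos * log_prob).sum(dim=1) / pos.sum(dim=1).clamp(min=1)).mean()

    return sup + icr + cps + ccl

In a real implementation each term would normally carry its own weight and a ramp-up schedule over training epochs; the sketch omits these details for brevity.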
Keywords: semi-supervised learning; contrastive learning; interior decoration style
JEL-codes: C
Date: 2024
Downloads:
https://www.mdpi.com/2227-7390/12/19/2980/pdf (application/pdf)
https://www.mdpi.com/2227-7390/12/19/2980/ (text/html)
Persistent link: https://EconPapers.repec.org/RePEc:gam:jmathe:v:12:y:2024:i:19:p:2980-:d:1485508