
Quality Grading of Oudemansiella raphanipes Using Three-Teacher Knowledge Distillation with Cascaded Structure for Lightweight Neural Networks

Haoxuan Chen, Huamao Huang, Yangyang Peng, Hui Zhou, Haiying Hu and Ming Liu
Additional contact information
Haoxuan Chen: Guangdong Key Laboratory for New Technology Research of Vegetables, Vegetable Research Institute, Guangdong Academy of Agricultural Sciences, Guangzhou 510640, China
Huamao Huang: School of Physics and Optoelectronics, South China University of Technology, Guangzhou 510640, China
Yangyang Peng: Guangdong Key Laboratory for New Technology Research of Vegetables, Vegetable Research Institute, Guangdong Academy of Agricultural Sciences, Guangzhou 510640, China
Hui Zhou: Guangdong Key Laboratory for New Technology Research of Vegetables, Vegetable Research Institute, Guangdong Academy of Agricultural Sciences, Guangzhou 510640, China
Haiying Hu: School of Civil Engineering and Transportation, South China University of Technology, Guangzhou 510640, China
Ming Liu: Guangdong Key Laboratory for New Technology Research of Vegetables, Vegetable Research Institute, Guangdong Academy of Agricultural Sciences, Guangzhou 510640, China

Agriculture, 2025, vol. 15, issue 3, 1-18

Abstract: Oudemansiella raphanipes is valued for its rich nutritional content and medicinal properties, but traditional manual grading methods are time-consuming and labor-intensive. To address this, deep learning techniques are employed to automate the grading process, and knowledge distillation (KD) is used to raise the accuracy of a small-parameter model while keeping resource usage low and response times fast on resource-limited devices. This study employs a three-teacher KD framework and investigates three cascaded structures: a parallel model, a standard series model, and a series model with residual connections (residual-series model). The student model is a lightweight ShuffleNet V2 0.5x, while the teacher models are VGG16, ResNet50, and Xception. Our experiments show that the cascaded structures improve performance compared with a traditional equal-weight ensemble of teachers; in particular, the residual-series model outperforms the other models, achieving a grading accuracy of 99.7% on the testing dataset with an average inference time of 5.51 ms. These findings support broader application of KD for automated quality grading in resource-limited environments.
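
To make the distillation setup concrete, the sketch below shows a generic multi-teacher KD loss in PyTorch: the student is trained against a weighted combination of the teachers' softened outputs plus the usual cross-entropy on hard labels, with equal weights standing in for the plain ensemble baseline mentioned in the abstract. This is an illustrative sketch only, not the authors' cascaded (parallel, series, or residual-series) implementation; the number of quality grades, the loss weight alpha, the temperature, and the use of ResNet101 as a stand-in for Xception (which torchvision does not ship) are all assumptions.

# Hypothetical sketch of multi-teacher knowledge distillation (KD); not the
# paper's cascaded structure, just the common weighted-soft-target baseline.
import torch
import torch.nn.functional as F

def multi_teacher_kd_loss(student_logits, teacher_logits_list, labels,
                          weights=None, temperature=4.0, alpha=0.7):
    """Cross-entropy on hard labels plus KL divergence between the student's
    softened prediction and a weighted average of the teachers' softened
    predictions (standard Hinton-style KD, extended to several teachers)."""
    if weights is None:
        # Equal weights correspond to the plain ensemble-of-teachers baseline.
        weights = [1.0 / len(teacher_logits_list)] * len(teacher_logits_list)
    # Weighted mixture of teacher probability distributions at temperature T.
    teacher_probs = sum(w * F.softmax(t / temperature, dim=1)
                        for w, t in zip(weights, teacher_logits_list))
    kd = F.kl_div(F.log_softmax(student_logits / temperature, dim=1),
                  teacher_probs, reduction="batchmean") * temperature ** 2
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1.0 - alpha) * ce

if __name__ == "__main__":
    from torchvision import models
    num_classes = 4  # assumed number of quality grades
    # ShuffleNet V2 0.5x student; VGG16 and ResNet50 teachers as in the paper,
    # ResNet101 substituted for Xception purely for illustration.
    student = models.shufflenet_v2_x0_5(num_classes=num_classes)
    teachers = [models.vgg16(num_classes=num_classes),
                models.resnet50(num_classes=num_classes),
                models.resnet101(num_classes=num_classes)]
    x = torch.randn(2, 3, 224, 224)          # dummy image batch
    y = torch.randint(0, num_classes, (2,))  # dummy grade labels
    with torch.no_grad():
        teacher_logits = [t.eval()(x) for t in teachers]
    loss = multi_teacher_kd_loss(student(x), teacher_logits, y)
    loss.backward()  # gradients flow only into the student
    print(float(loss))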

Keywords: Oudemansiella raphanipes; quality grading; knowledge distillation; multi-teacher model (search for similar items in EconPapers)
JEL-codes: Q1 Q10 Q11 Q12 Q13 Q14 Q15 Q16 Q17 Q18 (search for similar items in EconPapers)
Date: 2025
References: View complete reference list from CitEc

Downloads: (external link)
https://www.mdpi.com/2077-0472/15/3/301/pdf (application/pdf)
https://www.mdpi.com/2077-0472/15/3/301/ (text/html)

Related works:
This item may be available elsewhere in EconPapers: Search for items with the same title.

Persistent link: https://EconPapers.repec.org/RePEc:gam:jagris:v:15:y:2025:i:3:p:301-:d:1580407

Agriculture is currently edited by Ms. Leda Xuan

More articles in Agriculture from MDPI
Bibliographic data for series maintained by MDPI Indexing Manager.

 
Handle: RePEc:gam:jagris:v:15:y:2025:i:3:p:301-:d:1580407