A Multi-Level Knowledge Distillation for Enhanced Crop Segmentation in Precision Agriculture
Zhiyong Li,
Lan Xiang,
Jun Sun,
Dingyi Liao,
Lijia Xu and
Mantao Wang
Additional contact information
Zhiyong Li: College of Information Engineering, Sichuan Agricultural University, Ya’an 625000, China
Lan Xiang: College of Information Engineering, Sichuan Agricultural University, Ya’an 625000, China
Jun Sun: Observation and Research Station of Land Ecology and Land Use in Chengdu Plain, Ministry of Natural Resources, Chengdu 610045, China
Dingyi Liao: College of Information Engineering, Sichuan Agricultural University, Ya’an 625000, China
Lijia Xu: College of Mechanical and Electrical Engineering, Sichuan Agricultural University, Ya’an 625000, China
Mantao Wang: College of Information Engineering, Sichuan Agricultural University, Ya’an 625000, China
Agriculture, 2025, vol. 15, issue 13, 1-25
Abstract:
In this paper, we propose a knowledge distillation framework specifically designed for semantic segmentation tasks in agricultural scenarios. The framework addresses several prevalent challenges in smart agriculture, including limited computational resources, strict real-time constraints, and suboptimal segmentation accuracy on crop images. Traditional single-level feature distillation methods often suffer from insufficient knowledge transfer and inefficient utilization of multi-scale features, which significantly limits their ability to accurately segment complex crop structures in dynamic field environments. To overcome these issues, we propose a multi-level distillation strategy that combines feature distillation with embedding-patch distillation, jointly transferring high-level semantic features and low-level texture details. This approach enables the precise capture of fine-grained agricultural elements, such as crop boundaries, stems, petioles, and weed clusters, which are critical for robust segmentation. Additionally, we integrate an enhanced attention mechanism into the framework, which strengthens and fuses key crop-related features during distillation, further improving the model's performance and image understanding. Extensive experiments on two agricultural datasets (sweet pepper and sugar beet) demonstrate that our method improves segmentation accuracy by 7.59% and 6.79%, respectively, without significantly increasing model complexity. Further validation on two widely used public datasets shows strong generalization, demonstrating applicability beyond the agricultural domain.
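As a rough illustration of the multi-level idea described in the abstract, the sketch below (Python/PyTorch) pairs a dense feature-map distillation term with an embedding-patch term computed over non-overlapping patches of the aligned feature maps. All module names, the cosine-similarity patch loss, and the weighting factors are illustrative assumptions rather than the authors' exact formulation, and the enhanced attention module is omitted for brevity.

```python
# Hypothetical sketch of a multi-level distillation loss: feature-map
# distillation (high-level semantics) plus embedding-patch distillation
# (local texture/boundary cues). Loss choices and weights are assumptions,
# not the paper's exact method.
import torch
import torch.nn as nn
import torch.nn.functional as F


class MultiLevelDistillationLoss(nn.Module):
    def __init__(self, student_dim, teacher_dim, patch_size=8,
                 w_feat=1.0, w_patch=1.0):
        super().__init__()
        # 1x1 projection so student features match the teacher's channel width.
        self.align = nn.Conv2d(student_dim, teacher_dim, kernel_size=1)
        self.patch_size = patch_size
        self.w_feat = w_feat
        self.w_patch = w_patch

    def _to_patches(self, feat):
        # Flatten a feature map into non-overlapping patch embeddings:
        # (B, C, H, W) -> (B, N, C * p * p).
        p = self.patch_size
        patches = F.unfold(feat, kernel_size=p, stride=p)   # (B, C*p*p, N)
        return patches.transpose(1, 2)                       # (B, N, C*p*p)

    def forward(self, student_feat, teacher_feat):
        student_feat = self.align(student_feat)
        if student_feat.shape[-2:] != teacher_feat.shape[-2:]:
            # Resize the student map to the teacher's spatial resolution.
            student_feat = F.interpolate(
                student_feat, size=teacher_feat.shape[-2:],
                mode="bilinear", align_corners=False)

        # Level 1: dense feature-map distillation.
        loss_feat = F.mse_loss(student_feat, teacher_feat)

        # Level 2: embedding-patch distillation via cosine similarity
        # between corresponding patch embeddings.
        s_patch = F.normalize(self._to_patches(student_feat), dim=-1)
        t_patch = F.normalize(self._to_patches(teacher_feat), dim=-1)
        loss_patch = (1.0 - (s_patch * t_patch).sum(dim=-1)).mean()

        return self.w_feat * loss_feat + self.w_patch * loss_patch


# Example usage: distilling a 512-channel teacher stage into a
# 128-channel student stage (random tensors stand in for real features).
criterion = MultiLevelDistillationLoss(student_dim=128, teacher_dim=512)
loss = criterion(torch.randn(2, 128, 64, 64), torch.randn(2, 512, 64, 64))
```

In practice, such a term would be computed per feature stage and added to the standard segmentation loss on the student's predictions.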
Keywords: knowledge distillation; semantic segmentation; multi-scale features; embedding patch; attention mechanism; generalization capability; precision agriculture
JEL-codes: Q1 Q10 Q11 Q12 Q13 Q14 Q15 Q16 Q17 Q18
Date: 2025
Downloads: (external link)
https://www.mdpi.com/2077-0472/15/13/1418/pdf (application/pdf)
https://www.mdpi.com/2077-0472/15/13/1418/ (text/html)
Persistent link: https://EconPapers.repec.org/RePEc:gam:jagris:v:15:y:2025:i:13:p:1418-:d:1691522
Agriculture is currently edited by Ms. Leda Xuan
Bibliographic data for series maintained by MDPI Indexing Manager.