
YOLO11-ARAF: An Accurate and Lightweight Method for Apple Detection in Real-World Complex Orchard Environments

Yangtian Lin, Yujun Xia, Pengcheng Xia, Zhengyang Liu, Haodi Wang, Chengjin Qin (), Liang Gong and Chengliang Liu
Additional contact information
Yangtian Lin: State Key Laboratory of Mechanical System and Vibration, School of Mechanical Engineering, Shanghai Jiao Tong University, Shanghai 200240, China
Yujun Xia: State Key Laboratory of Mechanical System and Vibration, School of Mechanical Engineering, Shanghai Jiao Tong University, Shanghai 200240, China
Pengcheng Xia: State Key Laboratory of Mechanical System and Vibration, School of Mechanical Engineering, Shanghai Jiao Tong University, Shanghai 200240, China
Zhengyang Liu: State Key Laboratory of Mechanical System and Vibration, School of Mechanical Engineering, Shanghai Jiao Tong University, Shanghai 200240, China
Haodi Wang: State Key Laboratory of Mechanical System and Vibration, School of Mechanical Engineering, Shanghai Jiao Tong University, Shanghai 200240, China
Chengjin Qin: State Key Laboratory of Mechanical System and Vibration, School of Mechanical Engineering, Shanghai Jiao Tong University, Shanghai 200240, China
Liang Gong: State Key Laboratory of Mechanical System and Vibration, School of Mechanical Engineering, Shanghai Jiao Tong University, Shanghai 200240, China
Chengliang Liu: State Key Laboratory of Mechanical System and Vibration, School of Mechanical Engineering, Shanghai Jiao Tong University, Shanghai 200240, China

Agriculture, 2025, vol. 15, issue 10, 1-24

Abstract: Accurate object detection is a fundamental component of autonomous apple-picking systems. To address the insufficient recognition performance and poor generalization of existing detection algorithms in unstructured orchard scenes, we constructed a customized apple image dataset captured under varying illumination conditions and introduced an improved detection architecture, YOLO11-ARAF, derived from YOLO11. First, to enhance the model’s ability to capture apple-specific features, we replaced the original C3k2 module with the CARConv convolutional layer. Second, to reinforce feature learning in visually challenging orchard environments, the enhanced attention module AFGCAM was embedded into the model architecture. Third, we applied knowledge distillation to transfer the enhanced model to a compact YOLO11n framework, reducing computational cost while maintaining high detection efficiency and making the model suitable for deployment on devices with limited computational resources. To assess the method’s performance, we conducted comparative experiments on the constructed apple image dataset. The improved YOLO11-ARAF model attained 89.4% accuracy, 86% recall, 92.3% mAP@50, and 64.4% mAP@50:95, which are 0.3%, 1.1%, 0.72%, and 2% higher than those of YOLO11, respectively. Furthermore, the distilled model significantly reduces the parameter count and doubles the inference speed (FPS), enabling rapid and precise apple detection in challenging orchard settings with limited computational resources.
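
The distillation step summarized in the abstract can be illustrated with a minimal, hypothetical PyTorch sketch. The paper does not publish its training code, so the module names (TeacherNet, StudentNet), the channel-aligning projection, and the feature-mimicking MSE loss below are illustrative assumptions rather than the authors' implementation; in practice the distillation term would be combined with the standard YOLO detection losses on the labelled apple dataset.

import torch
import torch.nn as nn
import torch.nn.functional as F

class TeacherNet(nn.Module):
    """Stand-in for the enhanced (larger) YOLO11-ARAF teacher backbone."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(nn.Conv2d(3, 64, 3, padding=1), nn.SiLU())

    def forward(self, x):
        return self.features(x)  # feature map used as the distillation target

class StudentNet(nn.Module):
    """Stand-in for the compact YOLO11n-style student backbone."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(nn.Conv2d(3, 32, 3, padding=1), nn.SiLU())
        self.project = nn.Conv2d(32, 64, 1)  # align channel count with the teacher

    def forward(self, x):
        return self.project(self.features(x))

teacher = TeacherNet().eval()            # frozen teacher; no gradient updates
student = StudentNet()
optimizer = torch.optim.AdamW(student.parameters(), lr=1e-3)

images = torch.randn(4, 3, 640, 640)     # dummy batch; a real run would use orchard images
with torch.no_grad():
    t_feat = teacher(images)
s_feat = student(images)

# Feature-mimicking distillation loss; in a full pipeline this is added to the
# usual detection losses (box and class terms) computed on the labelled apples.
loss = F.mse_loss(s_feat, t_feat)
loss.backward()
optimizer.step()
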

Keywords: YOLO11-ARAF; diverse orchard environment; deep learning; apple detection
JEL-codes: Q1 Q10 Q11 Q12 Q13 Q14 Q15 Q16 Q17 Q18
Date: 2025

Downloads: (external link)
https://www.mdpi.com/2077-0472/15/10/1104/pdf (application/pdf)
https://www.mdpi.com/2077-0472/15/10/1104/ (text/html)

Related works:
This item may be available elsewhere in EconPapers: Search for items with the same title.


Persistent link: https://EconPapers.repec.org/RePEc:gam:jagris:v:15:y:2025:i:10:p:1104-:d:1660152


Agriculture is currently edited by Ms. Leda Xuan

More articles in Agriculture from MDPI
Bibliographic data for series maintained by MDPI Indexing Manager ().

 
Page updated 2025-05-21
Handle: RePEc:gam:jagris:v:15:y:2025:i:10:p:1104-:d:1660152