A Deployment-Oriented Benchmarking of You Only Look Once (YOLO) Models for Orange Detection and Segmentation in Agricultural Robotics
Caner Beldek,
Emre Sariyildiz and
Gursel Alici
Additional contact information
Caner Beldek: School of Mechanical, Materials, Mechatronic and Biomedical Engineering, University of Wollongong, Wollongong, NSW 2522, Australia
Emre Sariyildiz: School of Mechanical, Materials, Mechatronic and Biomedical Engineering, University of Wollongong, Wollongong, NSW 2522, Australia
Gursel Alici: School of Mechanical, Materials, Mechatronic and Biomedical Engineering, University of Wollongong, Wollongong, NSW 2522, Australia
Agriculture, 2025, vol. 15, issue 20, 1-22
Abstract:
The deployment of autonomous robots is critical for advancing sustainable agriculture, but their effectiveness hinges on visual perception systems that can reliably operate in natural, real-world environments. Selecting an appropriate vision model for these robots requires a practical evaluation that extends beyond standard accuracy metrics to include critical deployment factors such as computational efficiency, energy consumption, and robustness to environmental disturbances. To address this need, this study presents a deployment-oriented benchmark of state-of-the-art You Only Look Once (YOLO)-based models for orange detection and segmentation. Following a systematic selection process, the chosen models were evaluated on a unified public dataset, annotated to rigorously assess real-world challenges. Performance was compared across five key dimensions: (i) identification accuracy, (ii) robustness, (iii) model complexity, (iv) execution time, and (v) energy consumption. The results show that the YOLOv5 variants achieved the most accurate detection and segmentation. Notably, YOLO11-based models demonstrated strong and consistent results under all disturbance levels, highlighting their robustness. Lightweight architectures proved well suited to resource-constrained operations. Interestingly, custom models did not consistently outperform their baselines, while nanoscale models showed demonstrable potential for meeting real-time and energy-efficiency requirements. These findings offer valuable, evidence-based guidelines for the vision systems of precision agriculture robots.
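For context, the sketch below illustrates how two of the abstract's deployment dimensions, model complexity and execution time, can be measured for YOLO checkpoints. It is a minimal, hypothetical example assuming the open-source ultralytics Python package; the checkpoint names, image size, and iteration count are assumptions for illustration, not the authors' actual benchmarking protocol.

```python
# Hypothetical deployment-oriented benchmark sketch (not the paper's pipeline).
# Assumes the `ultralytics` package; model names and settings are assumptions.
import time
import numpy as np
from ultralytics import YOLO

MODELS = ["yolov5nu.pt", "yolo11n.pt"]  # nano detection variants (assumed names)
frame = np.zeros((640, 640, 3), dtype=np.uint8)  # stand-in for an orchard image

for name in MODELS:
    model = YOLO(name)
    # Model complexity: total parameter count of the underlying torch module.
    n_params = sum(p.numel() for p in model.model.parameters())

    model.predict(frame, verbose=False)  # warm-up run, excluded from timing
    t0 = time.perf_counter()
    for _ in range(50):
        model.predict(frame, verbose=False)
    latency_ms = (time.perf_counter() - t0) / 50 * 1e3  # execution time per frame

    print(f"{name}: {n_params / 1e6:.1f} M params, {latency_ms:.1f} ms/frame")
```

Accuracy and robustness scores would additionally require the annotated orange dataset (e.g., via a validation call on that data), and energy consumption would require hardware-level power sampling, so those dimensions are omitted from this sketch.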
Keywords: YOLO; agricultural robotics; benchmarking; object detection; instance segmentation
JEL-codes: Q1 Q10 Q11 Q12 Q13 Q14 Q15 Q16 Q17 Q18
Date: 2025
Downloads: (external link)
https://www.mdpi.com/2077-0472/15/20/2170/pdf (application/pdf)
https://www.mdpi.com/2077-0472/15/20/2170/ (text/html)
Persistent link: https://EconPapers.repec.org/RePEc:gam:jagris:v:15:y:2025:i:20:p:2170-:d:1775172