Small Target Ewe Behavior Recognition Based on ELFN-YOLO

Jianglin Wu, Shufeng Li, Baoqin Wen, Jing Nie, Na Liu, Honglei Cen, Jingbin Li and Shuangyin Liu
Additional contact information
Jianglin Wu: College of Mechanical and Electrical Engineering, Shihezi University, Shihezi 832003, China
Shufeng Li: College of Mechanical and Electrical Engineering, Shihezi University, Shihezi 832003, China
Baoqin Wen: College of Mechanical and Electrical Engineering, Shihezi University, Shihezi 832003, China
Jing Nie: College of Mechanical and Electrical Engineering, Shihezi University, Shihezi 832003, China
Na Liu: College of Mechanical and Electrical Engineering, Shihezi University, Shihezi 832003, China
Honglei Cen: College of Mechanical and Electrical Engineering, Shihezi University, Shihezi 832003, China
Jingbin Li: College of Mechanical and Electrical Engineering, Shihezi University, Shihezi 832003, China
Shuangyin Liu: College of Mechanical and Electrical Engineering, Shihezi University, Shihezi 832003, China

Agriculture, 2024, vol. 14, issue 12, 1-24

Abstract: To address the poor performance of long-distance small-target recognition and the demands of real-time intelligent monitoring, this paper proposes a deep-learning-based recognition method aimed at improving the recognition and monitoring of various behaviors of captive ewes. In addition, a system platform based on ELFN-YOLO was developed to monitor ewe behaviors. ELFN-YOLO enhances the overall performance of the model by combining ELFN with the CBAM attention mechanism. ELFN strengthens feature extraction across multiple layers with fewer parameters, while the attention mechanism further emphasizes channel information interaction on top of ELFN and improves its ability to extract spatial information when small targets are occluded, leading to better recognition results. On an ewe behavior dataset built on commercial farms, the proposed ELFN-YOLO achieved an accuracy of 92.5%, an F1 score of 92.5%, and a mAP@0.5 of 94.7%, outperforming YOLOv7-Tiny by 1.5%, 0.8%, and 0.7% in accuracy, F1 score, and mAP@0.5, respectively. It also outperformed other baseline models such as Faster R-CNN, YOLOv4-Tiny, and YOLOv5s. These results indicate that the proposed approach outperforms existing methods in scenarios involving multi-scale detection of small objects. The method is of significant importance for strengthening animal welfare and ewe management, and it provides valuable data support for subsequent tracking algorithms that monitor the activity status of ewes.
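Note: ELFN is specific to this paper and its exact design is not reproduced here. For illustration only, below is a minimal PyTorch-style sketch of the CBAM attention block referenced in the abstract (channel attention followed by spatial attention), assuming the standard CBAM formulation; the reduction ratio, the 7x7 spatial kernel, and the module names are assumptions, not details taken from the paper, and the placement of the block inside ELFN-YOLO is determined by the paper's architecture.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class ChannelAttention(nn.Module):
        # Channel attention: squeeze spatial dimensions with average and max
        # pooling, pass both through a shared MLP, and gate channels via sigmoid.
        def __init__(self, channels, reduction=16):  # reduction=16 is an assumption
            super().__init__()
            self.mlp = nn.Sequential(
                nn.Conv2d(channels, channels // reduction, kernel_size=1, bias=False),
                nn.ReLU(inplace=True),
                nn.Conv2d(channels // reduction, channels, kernel_size=1, bias=False),
            )

        def forward(self, x):
            avg = self.mlp(F.adaptive_avg_pool2d(x, 1))
            mx = self.mlp(F.adaptive_max_pool2d(x, 1))
            return x * torch.sigmoid(avg + mx)

    class SpatialAttention(nn.Module):
        # Spatial attention: pool across channels (mean and max), concatenate,
        # and produce a one-channel spatial gate with a 7x7 convolution.
        def __init__(self, kernel_size=7):  # kernel_size=7 is an assumption
            super().__init__()
            self.conv = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2, bias=False)

        def forward(self, x):
            avg = x.mean(dim=1, keepdim=True)
            mx, _ = x.max(dim=1, keepdim=True)
            return x * torch.sigmoid(self.conv(torch.cat([avg, mx], dim=1)))

    class CBAM(nn.Module):
        # CBAM applies channel attention first, then spatial attention.
        def __init__(self, channels, reduction=16, kernel_size=7):
            super().__init__()
            self.ca = ChannelAttention(channels, reduction)
            self.sa = SpatialAttention(kernel_size)

        def forward(self, x):
            return self.sa(self.ca(x))

    if __name__ == "__main__":
        feat = torch.randn(1, 64, 40, 40)   # hypothetical feature map
        print(CBAM(64)(feat).shape)         # torch.Size([1, 64, 40, 40])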

Keywords: intelligent supervision system; ewe; YOLOv7; deep learning
JEL-codes: Q1 Q10 Q11 Q12 Q13 Q14 Q15 Q16 Q17 Q18
Date: 2024

Downloads: (external link)
https://www.mdpi.com/2077-0472/14/12/2272/pdf (application/pdf)
https://www.mdpi.com/2077-0472/14/12/2272/ (text/html)

Persistent link: https://EconPapers.repec.org/RePEc:gam:jagris:v:14:y:2024:i:12:p:2272-:d:1541626

Agriculture is currently edited by Ms. Leda Xuan

More articles in Agriculture from MDPI
Bibliographic data for series maintained by MDPI Indexing Manager.

Handle: RePEc:gam:jagris:v:14:y:2024:i:12:p:2272-:d:1541626