Automatic Position Detection and Posture Recognition of Grouped Pigs Based on Deep Learning
Hengyi Ji,
Jionghua Yu,
Fengdan Lao,
Yanrong Zhuang,
Yanbin Wen and
Guanghui Teng
Additional contact information
Hengyi Ji: College of Water Resources & Civil Engineering, China Agricultural University, Beijing 100083, China
Jionghua Yu: Key Laboratory of Agricultural Engineering in Structure and Environment, Ministry of Agriculture and Rural Affairs, Beijing 100083, China
Fengdan Lao: Key Laboratory of Agricultural Engineering in Structure and Environment, Ministry of Agriculture and Rural Affairs, Beijing 100083, China
Yanrong Zhuang: College of Water Resources & Civil Engineering, China Agricultural University, Beijing 100083, China
Yanbin Wen: Key Laboratory of Agricultural Engineering in Structure and Environment, Ministry of Agriculture and Rural Affairs, Beijing 100083, China
Guanghui Teng: College of Water Resources & Civil Engineering, China Agricultural University, Beijing 100083, China
Agriculture, 2022, vol. 12, issue 9, 1-17
Abstract:
The accurate and rapid detection of objects in videos facilitates the identification of abnormal behaviors in pigs and the introduction of preventive measures to reduce morbidity. In addition, accurate and effective pig detection algorithms provide a basis for pig behavior analysis and management decision-making. Monitoring the posture of pigs enables the precursors of pig diseases to be detected in a timely manner and the factors that impact pigs’ health to be identified, which helps to evaluate their health status and comfort. Excessive sitting is an abnormal behavior that occurs when pigs are frustrated in a restricted environment. Existing research on the automatic recognition of grouped pigs focuses mainly on standing and lying postures and shows a lack of recognition of sitting posture. The main contributions of this paper are as follows: a human-annotated dataset of standing, lying, and sitting postures, captured by 2D cameras during the day and night in a pig barn, was established, and a simplified copy-paste and label smoothing strategy was applied to solve the class imbalance caused by the scarcity of sitting postures in the dataset. The improved YOLOX achieves an average precision at an intersection-over-union threshold of 0.5 (AP@0.5) of 99.5% and an average precision over thresholds of 0.5–0.95 (AP@0.5–0.95) of 91% in pig position detection; an AP@0.5 of 90.9% and an AP@0.5–0.95 of 82.8% in sitting posture recognition; and a mean average precision of 95.7% (mAP@0.5) and 87.2% (mAP@0.5–0.95) across all posture classes. The proposed method effectively improves position detection and posture recognition in grouped pigs, especially sitting posture recognition, and meets the needs of practical application on pig farms.
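The abstract names two generic ingredients that are easy to illustrate in isolation: label smoothing of the posture targets for the rare sitting class, and the IoU thresholds (0.5 and 0.5–0.95) behind the reported AP values. The Python sketch below is not taken from the paper; the class list, epsilon value, and function names are assumptions for illustration, and the paper's simplified copy-paste augmentation step is not reproduced here.

```python
# Hypothetical sketch (not the authors' code): label smoothing for an imbalanced
# posture class set, and the IoU test that underlies AP@0.5 / AP@0.5-0.95.

import numpy as np

CLASSES = ["standing", "lying", "sitting"]  # assumed class set from the abstract


def smooth_labels(class_index: int, num_classes: int, eps: float = 0.1) -> np.ndarray:
    """Turn a hard one-hot target into a smoothed distribution.

    The true class gets 1 - eps; the remaining eps is spread uniformly over the
    other classes, which softens over-confident predictions on imbalanced data.
    """
    target = np.full(num_classes, eps / (num_classes - 1))
    target[class_index] = 1.0 - eps
    return target


def iou(box_a, box_b) -> float:
    """Intersection over union of two [x1, y1, x2, y2] boxes."""
    x1 = max(box_a[0], box_b[0])
    y1 = max(box_a[1], box_b[1])
    x2 = min(box_a[2], box_b[2])
    y2 = min(box_a[3], box_b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0


if __name__ == "__main__":
    # Smoothed target for a "sitting" example (class index 2): [0.05, 0.05, 0.9].
    print(smooth_labels(CLASSES.index("sitting"), len(CLASSES)))
    # A detection counts toward AP@0.5 only if its IoU with a ground truth is >= 0.5;
    # AP@0.5-0.95 averages this over thresholds from 0.5 to 0.95 in steps of 0.05.
    pred = [100, 80, 220, 200]
    gt = [110, 90, 230, 210]
    print(f"IoU = {iou(pred, gt):.2f}, matched at 0.5 threshold: {iou(pred, gt) >= 0.5}")
```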
Keywords: pig behavior; object detection; deep learning; computer vision
JEL-codes: Q1 Q10 Q11 Q12 Q13 Q14 Q15 Q16 Q17 Q18
Date: 2022
Citations: 2
Downloads:
https://www.mdpi.com/2077-0472/12/9/1314/pdf (application/pdf)
https://www.mdpi.com/2077-0472/12/9/1314/ (text/html)
Persistent link: https://EconPapers.repec.org/RePEc:gam:jagris:v:12:y:2022:i:9:p:1314-:d:898364
Agriculture is currently edited by Ms. Leda Xuan
More articles in Agriculture from MDPI
Bibliographic data for series maintained by MDPI Indexing Manager.