
Behavior classification and spatiotemporal analysis of grazing sheep using deep learning.

Authors :
Jin, Zhongming
Shu, Hang
Hu, Tianci
Jiang, Chengxiang
Yan, Ruirui
Qi, Jingwei
Wang, Wensheng
Guo, Leifeng
Source :
Computers & Electronics in Agriculture. May 2024, Vol. 220.
Publication Year :
2024

Abstract

Highlights:
• Spatiotemporal distribution of livestock behavior was monitored using multi-modal data.
• Multi-source data fusion outperformed single-source data in multi-behavior classification.
• The CNN-LSTM model classified multiple behaviors with higher accuracy.
• Multi-source fusion data at 1 Hz still achieved reasonable classification accuracy.

Monitoring multiple behavioral patterns and grazing trajectories of individual sheep can provide valuable information for various aspects of livestock production. The emergence of low-cost miniature sensors, coupled with the continuous advancement of deep learning technologies, has ushered in a new generation of intelligent solutions for precision livestock farming. This study explores the deployment positions of motion sensors, the selection of data collection frequencies, and the choice of deep learning algorithms to accurately classify multiple behaviors of grazing sheep. Building on this, the classification results are combined with acquired location information to characterize the spatiotemporal distribution of grazing sheep behaviors. Devices capable of collecting Inertial Measurement Unit (IMU) and location data were attached to the jaw, neck, and hind leg of sheep. Four datasets were created from IMU data sampled at 20 Hz with a 5 s time window, covering different positions (neck, neck & leg, jaw, jaw & leg). Two deep learning models, Convolutional Neural Network - Long Short-Term Memory (CNN-LSTM) and Temporal Convolutional Network (TCN)-Transformer, were employed to classify six grazing sheep behaviors: walking, standing, grazing, lying, standing-ruminating, and lying-ruminating. The results indicate that fusing data from the neck- and leg-mounted devices and using the CNN-LSTM model yielded the highest accuracy of 99.3 %. Furthermore, the behavior classification accuracy of this fused data was compared across different IMU sampling frequencies (20, 10, 5, and 1 Hz). Even when the data frequency was reduced to 1 Hz, the classification accuracy for the six sheep behaviors still exceeded 96 %. Additionally, the highest-accuracy trained model was applied to monitor grazing sheep behavior under two different management procedures. Further analysis of sheep behavior time budgets revealed differences in the durations of behaviors between management procedures. For extensively grazing sheep, location information was also monitored and combined with the behavior classification results to generate the spatiotemporal distribution of sheep behavior. The technologies presented are important for gaining further insights into the health status of grazing livestock and grassland, thereby enhancing grazing management by informing grazing choices and optimizing grassland utilization. [ABSTRACT FROM AUTHOR]
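
The abstract describes classifying 5 s windows of fused neck-and-leg IMU data sampled at 20 Hz into six behaviors with a CNN-LSTM. The following is a minimal sketch of such a pipeline, assuming 6-axis IMUs (accelerometer + gyroscope) on the neck and hind leg fused into 12 channels per time step; the layer sizes, channel count, and training settings are illustrative assumptions, not the authors' exact architecture.

```python
# Sketch of a CNN-LSTM classifier for windowed, fused IMU data.
# Assumptions (not from the paper): 6-axis IMU per device, 12 fused channels,
# illustrative layer widths, and sparse-label training with Keras.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

WINDOW_LEN = 5 * 20   # 5 s window at 20 Hz -> 100 time steps
N_CHANNELS = 12       # neck (ax, ay, az, gx, gy, gz) + leg (ax, ay, az, gx, gy, gz)
N_CLASSES = 6         # walking, standing, grazing, lying, standing-ruminating, lying-ruminating

def build_cnn_lstm() -> tf.keras.Model:
    model = models.Sequential([
        layers.Input(shape=(WINDOW_LEN, N_CHANNELS)),
        # 1D convolutions extract short-range motion features within the window
        layers.Conv1D(64, kernel_size=5, padding="same", activation="relu"),
        layers.MaxPooling1D(pool_size=2),
        layers.Conv1D(128, kernel_size=3, padding="same", activation="relu"),
        layers.MaxPooling1D(pool_size=2),
        # LSTM models the temporal ordering of the convolutional feature sequence
        layers.LSTM(64),
        layers.Dropout(0.3),
        layers.Dense(N_CLASSES, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

if __name__ == "__main__":
    # Random windows stand in for labeled neck-and-leg IMU segments
    X = np.random.randn(32, WINDOW_LEN, N_CHANNELS).astype("float32")
    y = np.random.randint(0, N_CLASSES, size=32)
    model = build_cnn_lstm()
    model.fit(X, y, epochs=1, batch_size=8, verbose=0)
    print(model.predict(X[:2]).shape)  # (2, 6) per-class probabilities
```

Lower sampling frequencies (10, 5, or 1 Hz) would simply shrink WINDOW_LEN (e.g. 5 time steps at 1 Hz) while the rest of the pipeline stays the same, which is consistent with the reported finding that classification accuracy degrades only modestly as the data rate drops.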

Details

Language :
English
ISSN :
0168-1699
Volume :
220
Database :
Academic Search Index
Journal :
Computers & Electronics in Agriculture
Publication Type :
Academic Journal
Accession Number :
176686595
Full Text :
https://doi.org/10.1016/j.compag.2024.108894