Efficient and accurate detection of herd pigs based on Ghost-YOLOv7-SIoU.
- Author
- Sun, Donglai; Zhang, Lijuan; Wang, Jianqiang; Liu, Xintong; Wang, Zhengbo; Hui, Zhenqiao; Wang, Jichao
- Subjects
- *SWINE, *MATING grounds, *FEATURE extraction, *COMPUTER vision, *TRAINING of boxers (Sports), *DEATH rate
- Abstract
Computer vision methods for non-contact detection of herd pigs can help detect disease early and reduce mortality rates by analyzing pig behavior. Because breeding space and cost are limited, pigs are housed at high density per unit area, which makes it difficult to detect every pig continuously over long periods. To improve detection performance, this paper proposes an end-to-end, efficient, and accurate herd-pig detection framework based on the YOLOv7 object detection model, named Ghost-YOLOv7-SIoU. In this framework, the feature-extraction backbone consists of a series of directly connected efficient layer aggregation network (ELAN) modules and downsampling modules, while the neck combines a feature pyramid network and a path aggregation network. Ghost convolution replaces the 3 × 3 standard convolution in the ELAN modules of the backbone and in the scaled-up ELAN modules of the neck, yielding rich features while reducing the number of parameters and the computational cost. Furthermore, to speed up model convergence and improve robustness and accuracy, the SIoU loss is used for bounding-box regression during training. On the VOC2012 dataset, the number of parameters and floating-point operations decreased by 13.4% and 15.7%, respectively, compared to YOLOv7, with comparable detection accuracy; on our pig dataset, they decreased by 13.7% and 16.1%. Ghost-YOLOv7-SIoU also surpasses YOLOv4-CSP and YOLOR-CSP in accuracy. Experimental results demonstrate that the proposed method improves detection efficiency while preserving detection accuracy. [ABSTRACT FROM AUTHOR]
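The parameter savings reported in the abstract come from Ghost convolution, which produces part of the output channels with a standard convolution and the rest with a cheap depthwise "ghost" operation. The sketch below is not the authors' code; it is a minimal illustration of the parameter arithmetic, assuming the common Ghost-module configuration (ratio 2, i.e., half the output channels from the primary convolution) and ignoring biases and batch-norm parameters.

```python
# Parameter count of a standard 3x3 convolution versus a Ghost module that
# produces the same number of output channels (ratio-2 configuration assumed).

def standard_conv_params(c_in: int, c_out: int, k: int = 3) -> int:
    """Weights of a standard k x k convolution (bias omitted)."""
    return c_in * c_out * k * k

def ghost_conv_params(c_in: int, c_out: int, k: int = 3, ratio: int = 2) -> int:
    """Weights of a Ghost module: a primary k x k convolution generates
    c_out // ratio channels, then a depthwise k x k 'cheap operation'
    generates the remaining 'ghost' channels from them."""
    primary = c_out // ratio            # channels from the primary convolution
    cheap = c_out - primary             # ghost channels from the depthwise op
    return c_in * primary * k * k + cheap * k * k

if __name__ == "__main__":
    c_in, c_out = 256, 256              # hypothetical ELAN-layer channel widths
    std = standard_conv_params(c_in, c_out)
    ghost = ghost_conv_params(c_in, c_out)
    print(f"standard: {std}, ghost: {ghost}, saving: {1 - ghost / std:.1%}")
```

For these (hypothetical) channel widths the Ghost module needs roughly half the weights of the standard convolution, which is consistent in direction with the overall 13–16% model-level reductions the abstract reports (only the ELAN convolutions are replaced, so the whole-model saving is smaller).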
- Published
- 2024