Image Segmentation-Based Oilseed Rape Row Detection for Infield Navigation of Agri-Robot.
- Source :
- Agronomy. Sep 2024, Vol. 14, Issue 9, p1886. 18p.
- Publication Year :
- 2024
Abstract
- The segmentation and extraction of oilseed rape crop rows are crucial steps in visual navigation line extraction. Agricultural autonomous navigation robots face challenges in path recognition in field environments due to factors such as complex crop backgrounds and varying light intensities, resulting in poor segmentation and slow detection of navigation lines in oilseed rape crops. Therefore, this paper proposes VC-UNet, a lightweight semantic segmentation model that enhances the U-Net model. Specifically, VGG16 replaces the original backbone feature extraction network of U-Net, and the Convolutional Block Attention Module (CBAM) is integrated at the upsampling stage to enhance focus on segmentation targets. Furthermore, channel pruning of the network's convolutional layers is employed to optimize and accelerate the model. The crop row trapezoidal ROI regions are delineated using end-to-end vertical projection methods with serialized region thresholds. Then, the centerline of each oilseed rape crop row is fitted using the least squares method. Experimental results demonstrate an average accuracy of 94.11% for the model and an image processing speed of 24.47 fps. After transfer learning for soybean and maize crop rows, the average accuracy reaches 91.57%, indicating strong model robustness. The average yaw angle deviation of navigation line extraction is 3.76°, with an average pixel offset of 6.13 pixels. The single-image transmission time is 0.009 s, ensuring real-time detection of navigation lines. This study provides upper-level technical support for the deployment of agricultural robots in field trials. [ABSTRACT FROM AUTHOR]
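The least-squares centerline fitting and yaw angle computation mentioned in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the pixel centroids are invented example data, and the fit assumes one row-pixel centroid per image row inside the detected ROI.

```python
import numpy as np

# Hypothetical crop-row pixel centroids (image_row, image_col), e.g. one
# centroid per image row inside the trapezoidal ROI of a segmentation mask.
points = np.array([
    [400, 322], [380, 318], [360, 315],
    [340, 311], [320, 308], [300, 304],
])
rows, cols = points[:, 0], points[:, 1]

# Least-squares fit of the centerline as col = a * row + b.
a, b = np.polyfit(rows, cols, deg=1)

# Yaw deviation of the fitted line from the image's vertical axis, in degrees.
yaw_deg = np.degrees(np.arctan(a))
```

A navigation controller would then steer to drive `yaw_deg` (and the lateral pixel offset of the line at the image bottom) toward zero.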
Details
- Language :
- English
- ISSN :
- 2073-4395
- Volume :
- 14
- Issue :
- 9
- Database :
- Academic Search Index
- Journal :
- Agronomy
- Publication Type :
- Academic Journal
- Accession number :
- 180011728
- Full Text :
- https://doi.org/10.3390/agronomy14091886