
Asymmetric convolutional multi-level attention network for micro-lens segmentation.

Authors :
Zhong, Shunshun
Zhou, Haibo
Yan, YiXiong
Zhang, Fan
Duan, Ji'an
Source :
Engineering Applications of Artificial Intelligence. Jul 2024, Vol. 133, Part A.
Publication Year :
2024

Abstract

Tiny target recognition in automation is currently a hot research task that usually suffers from typical issues such as complex backgrounds, dim targets, and slow detection speed. In the current study, a data-driven method is proposed to recognize the posture of micro-lenses during optical device coupling so that the gripper can clamp them accurately. First, we establish a pixel-by-pixel labeled optical micro-lens dataset named single-frame micro-lens target (SFMT), which provides data support for the subsequently proposed convolutional neural network. Subsequently, an asymmetric convolutional multi-level attention network (ACMANet) is proposed to realize accurate segmentation of micro-lenses by employing an embedded multi-scale asymmetric convolutional module (MACM) and a multi-level interactive attention module (MIAM). MACM not only reduces computational complexity but also enhances robustness to rotated images through multi-scale asymmetric convolutional kernels. Furthermore, MIAM improves segmentation accuracy by connecting the down-sampling and up-sampling stages and fusing pixel position details with key channel features. Extensive experimental results based on our self-constructed image acquisition system demonstrate that the normalized intersection over union and Dice values are 91.41% and 95.50%, respectively, and the processing speed is 3.3 s per 100 images, which demonstrates the advantage of ACMANet. [ABSTRACT FROM AUTHOR]
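The abstract names two building blocks (MACM and MIAM) but gives no layer-level details. The sketch below is a minimal, hypothetical PyTorch illustration of the two ideas it describes: multi-scale asymmetric (1×k / k×1) convolutions, and an attention-based fusion of encoder and decoder features; all module names, kernel sizes, and the gating scheme are assumptions, not the authors' actual implementation.

```python
# Hypothetical sketch only; ACMANet's real MACM/MIAM definitions are not in this record.
import torch
import torch.nn as nn


class MultiScaleAsymmetricConv(nn.Module):
    """Parallel 1xk / kx1 (asymmetric) convolutions at several scales,
    summed with a plain 3x3 branch (assumed stand-in for MACM)."""

    def __init__(self, in_ch: int, out_ch: int, scales=(3, 5)):
        super().__init__()
        self.square = nn.Conv2d(in_ch, out_ch, 3, padding=1)
        self.asym = nn.ModuleList()
        for k in scales:
            self.asym.append(nn.Sequential(
                nn.Conv2d(in_ch, out_ch, (1, k), padding=(0, k // 2)),
                nn.Conv2d(out_ch, out_ch, (k, 1), padding=(k // 2, 0)),
            ))
        self.bn = nn.BatchNorm2d(out_ch)
        self.act = nn.ReLU(inplace=True)

    def forward(self, x):
        y = self.square(x)
        for branch in self.asym:
            y = y + branch(x)  # fuse multi-scale asymmetric responses
        return self.act(self.bn(y))


class InteractiveAttentionFusion(nn.Module):
    """Fuses an encoder (down-sampling) feature with a decoder (up-sampling)
    feature via channel and spatial gating (assumed stand-in for MIAM)."""

    def __init__(self, enc_ch: int, dec_ch: int, out_ch: int):
        super().__init__()
        self.proj = nn.Conv2d(enc_ch + dec_ch, out_ch, 1)
        self.channel_gate = nn.Sequential(   # "key channel features"
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(out_ch, out_ch, 1),
            nn.Sigmoid(),
        )
        self.spatial_gate = nn.Sequential(   # "pixel position details"
            nn.Conv2d(out_ch, 1, 7, padding=3),
            nn.Sigmoid(),
        )

    def forward(self, enc_feat, dec_feat):
        fused = self.proj(torch.cat([enc_feat, dec_feat], dim=1))
        fused = fused * self.channel_gate(fused)
        fused = fused * self.spatial_gate(fused)
        return fused


if __name__ == "__main__":
    x = torch.randn(1, 64, 128, 128)
    enc = MultiScaleAsymmetricConv(64, 64)(x)        # encoder-side feature
    dec = torch.randn(1, 64, 128, 128)               # decoder-side feature
    out = InteractiveAttentionFusion(64, 64, 64)(enc, dec)
    print(out.shape)                                 # torch.Size([1, 64, 128, 128])
```

The asymmetric branches replace a k×k kernel with a 1×k followed by a k×1 kernel, which is the standard way such factorization cuts parameter count while keeping the receptive field, consistent with the complexity reduction the abstract claims.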

Details

Language :
English
ISSN :
0952-1976
Volume :
133
Database :
Academic Search Index
Journal :
Engineering Applications of Artificial Intelligence
Publication Type :
Academic Journal
Accession number :
177605625
Full Text :
https://doi.org/10.1016/j.engappai.2024.108355