Cross-modal pedestrian re-recognition based on attention mechanism.
- Source :
- Visual Computer. Apr 2024, Vol. 40, Issue 4, p2405-2418. 14p.
- Publication Year :
- 2024
Abstract
- Person re-identification, an essential research direction in intelligent security, has attracted considerable attention from researchers. In practical scenarios, visible-light cameras depend heavily on lighting conditions and have limited detection capability in poor light, so many researchers have gradually shifted toward cross-modality person re-identification. However, relevant studies remain scarce, and bridging the differences between images of different modalities is still challenging. To address these problems, this paper adopts an attention-based approach to narrow the gap between the two modalities and guide the network in a more suitable direction, improving its recognition performance. Although attention mechanisms can improve training efficiency, they can easily make model training unstable. This paper therefore proposes a cross-modal pedestrian re-identification method based on an attention mechanism. A new attention module is designed so that the network spends less time while focusing on the more critical features of a person. In addition, a cross-modality hard center triplet loss is designed to better supervise model training. Extensive experiments with both methods on two publicly available datasets achieve better performance than comparable current methods and verify the effectiveness and feasibility of the proposed approach. [ABSTRACT FROM AUTHOR]
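- The record gives no implementation details for the cross-modality hard center triplet loss. The sketch below illustrates one common way such a loss is formulated (per-identity feature centers computed in each modality, with a margin over the hardest cross-modality negative); the function name, tensor shapes, and margin value are assumptions for illustration, not the authors' actual code.

```python
# Hypothetical sketch of a cross-modality hard center triplet loss.
# All names and the exact formulation are assumptions; the paper's
# own definition is not reproduced in this record.
import torch
import torch.nn.functional as F

def cross_modality_hard_center_triplet(feat_vis, feat_ir, labels, margin=0.3):
    """feat_vis, feat_ir: (N, D) features from the visible / infrared branches,
    assumed to share the same identity labels (N,)."""
    ids = labels.unique()
    # Per-identity centers in each modality.
    c_vis = torch.stack([feat_vis[labels == i].mean(0) for i in ids])  # (C, D)
    c_ir  = torch.stack([feat_ir[labels == i].mean(0) for i in ids])   # (C, D)
    # Pairwise distances between visible and infrared identity centers.
    dist = torch.cdist(c_vis, c_ir)                                    # (C, C)
    eye = torch.eye(len(ids), dtype=torch.bool, device=dist.device)
    pos = dist[eye]                          # same identity, across modalities
    # Hardest (closest) wrong-identity center in the other modality.
    neg_v2i = dist.masked_fill(eye, float('inf')).min(dim=1).values
    neg_i2v = dist.masked_fill(eye, float('inf')).min(dim=0).values
    loss = F.relu(pos - neg_v2i + margin).mean() + \
           F.relu(pos - neg_i2v + margin).mean()
    return loss
```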
- Subjects :
- *PEDESTRIANS
*VISIBLE spectra
*NETWORK performance
*RESEARCH personnel
*ATTENTION
Details
- Language :
- English
- ISSN :
- 0178-2789
- Volume :
- 40
- Issue :
- 4
- Database :
- Academic Search Index
- Journal :
- Visual Computer
- Publication Type :
- Academic Journal
- Accession number :
- 176465104
- Full Text :
- https://doi.org/10.1007/s00371-023-02926-7