
A deep-learning real-time visual SLAM system based on multi-task feature extraction network and self-supervised feature points.

Authors :
Li, Guangqiang
Yu, Lei
Fei, Shumin
Source :
Measurement (02632241). Jan 2021, Vol. 168.
Publication Year :
2021

Abstract

• A simplified multi-task CNN is used for feature point detection.
• Extracted features keep the same descriptor format as ORB features.
• Depth information from the RGB-D camera is used to construct a dense 3D map.

Simultaneous Localization and Mapping (SLAM) is the basis for intelligent mobile robots working in unknown environments. However, the traditional feature extraction algorithms that visual SLAM systems rely on have difficulty with texture-less regions and other complex scenes, which limits the development of visual SLAM. Studies of feature point extraction based on deep learning show that learned features handle complex scenes better than traditional methods, but these studies pursue accuracy while neglecting efficiency. To address these problems, this paper proposes a real-time deep-learning visual SLAM system based on a multi-task feature extraction network and self-supervised feature points. By designing a simplified Convolutional Neural Network (CNN) that detects feature points and descriptors to replace the traditional feature extractor, the accuracy and stability of the visual SLAM system are improved. Experimental results on a dataset and in real environments show that the proposed system maintains high accuracy in a variety of challenging scenes, runs in real time on a GPU, and supports the construction of dense 3D maps. Moreover, its overall performance is better than that of current traditional visual SLAM systems. [ABSTRACT FROM AUTHOR]
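
The abstract describes a multi-task CNN that predicts both feature point locations and descriptors from a shared backbone, with descriptors kept compatible with the ORB format. The paper's actual architecture and training details are not given in this record, so the sketch below is only a minimal, hypothetical illustration of such a network (a SuperPoint-style shared encoder with a detector head and a descriptor head, plus a binarization step so descriptors can be matched with Hamming distance like ORB); all layer sizes, names, and the binarization scheme are assumptions.

```python
# Minimal sketch of a multi-task feature extraction network: a shared
# convolutional encoder feeding a keypoint-detector head and a descriptor head.
# Layer sizes and the descriptor binarization are illustrative assumptions,
# not the paper's actual design.
import torch
import torch.nn as nn
import torch.nn.functional as F


class MultiTaskFeatureNet(nn.Module):
    def __init__(self, desc_dim: int = 256):
        super().__init__()
        # Shared encoder: downsamples the grayscale image by a factor of 8.
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 64, 3, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(64, 64, 3, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(64, 128, 3, stride=2, padding=1), nn.ReLU(inplace=True),
        )
        # Detector head: 64 channels for the positions inside each 8x8 cell
        # plus one "no keypoint" dustbin channel (SuperPoint convention).
        self.detector = nn.Conv2d(128, 65, 1)
        # Descriptor head: one coarse descriptor per 8x8 cell.
        self.descriptor = nn.Conv2d(128, desc_dim, 1)

    def forward(self, image: torch.Tensor):
        feat = self.encoder(image)                        # B x 128 x H/8 x W/8
        logits = self.detector(feat)                      # B x 65  x H/8 x W/8
        prob = F.softmax(logits, dim=1)[:, :-1]           # drop the dustbin
        heatmap = F.pixel_shuffle(prob, 8)                # B x 1 x H x W keypoint map
        desc = F.normalize(self.descriptor(feat), dim=1)  # unit-length descriptors
        return heatmap, desc


def binarize_descriptors(desc: torch.Tensor) -> torch.Tensor:
    """Threshold float descriptors into 256-bit binary vectors so they can be
    matched with Hamming distance, like ORB descriptors (an assumed scheme)."""
    return (desc > 0).to(torch.uint8)


if __name__ == "__main__":
    net = MultiTaskFeatureNet().eval()
    with torch.no_grad():
        heatmap, desc = net(torch.rand(1, 1, 480, 640))   # one grayscale VGA frame
    print(heatmap.shape, desc.shape)  # (1, 1, 480, 640), (1, 256, 60, 80)
```

Keeping the descriptors in the same 256-bit binary format as ORB would let such a network drop into an existing ORB-based SLAM front end (Hamming-distance matching, bag-of-words loop closure) without changing the rest of the pipeline, which is consistent with the abstract's claim of replacing only the feature extractor.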

Details

Language :
English
ISSN :
02632241
Volume :
168
Database :
Academic Search Index
Journal :
Measurement (02632241)
Publication Type :
Academic Journal
Accession number :
146562557
Full Text :
https://doi.org/10.1016/j.measurement.2020.108403