
A New Scene Sensing Model Based on Multi-Source Data from Smartphones

Authors :
Zhenke Ding
Zhongliang Deng
Enwen Hu
Bingxun Liu
Zhichao Zhang
Mingyang Ma
Source :
Sensors, Vol 24, Iss 20, p 6669 (2024)
Publication Year :
2024
Publisher :
MDPI AG, 2024.

Abstract

Smartphones with integrated sensors play an important role in daily life, and in advanced multi-sensor fusion navigation systems the use of each individual sensor's information is crucial. Because environments differ, the weights assigned to the sensors also differ, which in turn affects both the method and the results of multi-source fusion positioning. Based on multi-source data from smartphone sensors, this study explores five types of information: Global Navigation Satellite System (GNSS), Inertial Measurement Unit (IMU), cellular network, optical sensor, and Wi-Fi data. It characterizes the temporal, spatial, and statistical features of these data and constructs a multi-scale, multi-window, context-connected scene sensing model that accurately detects the environmental scene in indoor, semi-indoor, outdoor, and semi-outdoor spaces, thereby providing an environmental basis for multi-sensor fusion localization in a navigation system. The model comprises four main parts: multi-sensor data mining, a multi-scale convolutional neural network (CNN), a bidirectional long short-term memory (BiLSTM) network combined with contextual information, and a meta-heuristic optimization algorithm.

Details

Language :
English
ISSN :
14248220
Volume :
24
Issue :
20
Database :
Directory of Open Access Journals
Journal :
Sensors
Publication Type :
Academic Journal
Accession number :
edsdoj.5fb4a6deaedd420c8cfd1917ba1d1066
Document Type :
article
Full Text :
https://doi.org/10.3390/s24206669