
PNAS-MOT: Multi-Modal Object Tracking with Pareto Neural Architecture Search

Authors :
Peng, Chensheng
Zeng, Zhaoyu
Gao, Jinling
Zhou, Jundong
Tomizuka, Masayoshi
Wang, Xinbing
Zhou, Chenghu
Ye, Nanyang
Source :
IEEE Robotics and Automation Letters, 2024
Publication Year :
2024

Abstract

Multiple object tracking is a critical task in autonomous driving. Existing works primarily focus on the heuristic design of neural networks to obtain high accuracy. As tracking accuracy improves, however, neural networks become increasingly complex, and the resulting latency poses challenges for practical deployment in real driving scenarios. In this paper, we explore neural architecture search (NAS) methods to find efficient architectures for tracking, aiming for low real-time latency while maintaining relatively high accuracy. Another challenge for object tracking is the unreliability of a single sensor; we therefore propose a multi-modal framework to improve robustness. Experiments demonstrate that our algorithm can run on edge devices within low latency constraints, greatly reducing the computational requirements of multi-modal object tracking.

Comment: IEEE Robotics and Automation Letters 2024. Code is available at https://github.com/PholyPeng/PNAS-MOT
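The abstract frames the search as a trade-off between tracking accuracy and on-device latency. As a rough illustration of the Pareto idea (this is a conceptual sketch, not the authors' implementation; the candidate names, latency numbers, error values, and latency budget below are all hypothetical), one can keep only architecture candidates that are not dominated in both latency and error:

```python
# Conceptual sketch of Pareto filtering for architecture candidates.
# Not the PNAS-MOT code: all names and numbers are hypothetical placeholders.
from dataclasses import dataclass
from typing import List


@dataclass
class Candidate:
    name: str
    latency_ms: float  # measured latency on the target edge device
    error: float       # tracking error, e.g. 1 - accuracy


def pareto_front(cands: List[Candidate]) -> List[Candidate]:
    """Return candidates that no other candidate beats in both latency and error."""
    front = []
    for c in cands:
        dominated = any(
            o.latency_ms <= c.latency_ms and o.error <= c.error
            and (o.latency_ms < c.latency_ms or o.error < c.error)
            for o in cands
        )
        if not dominated:
            front.append(c)
    return sorted(front, key=lambda c: c.latency_ms)


if __name__ == "__main__":
    pool = [
        Candidate("arch_a", latency_ms=35.0, error=0.22),
        Candidate("arch_b", latency_ms=60.0, error=0.18),
        Candidate("arch_c", latency_ms=80.0, error=0.19),  # dominated by arch_b
    ]
    budget_ms = 50.0  # hypothetical real-time latency constraint
    for c in pareto_front(pool):
        status = "within budget" if c.latency_ms <= budget_ms else "too slow"
        print(f"{c.name}: {c.latency_ms:.0f} ms, error {c.error:.2f} ({status})")
```

In this toy setup, a latency budget then selects the feasible point(s) on the front; the actual PNAS-MOT search and its latency constraints are described in the paper and repository linked above.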

Details

Database :
arXiv
Journal :
IEEE Robotics and Automation Letters, 2024
Publication Type :
Report
Accession number :
edsarx.2403.15712
Document Type :
Working Paper
Full Text :
https://doi.org/10.1109/LRA.2024.3379865