
On-Board Vision-Language Models for Personalized Autonomous Vehicle Motion Control: System Design and Real-World Validation

Authors:
Cui, Can
Yang, Zichong
Zhou, Yupeng
Peng, Juntong
Park, Sung-Yeon
Zhang, Cong
Ma, Yunsheng
Cao, Xu
Ye, Wenqian
Feng, Yiheng
Panchal, Jitesh
Li, Lingxi
Chen, Yaobin
Wang, Ziran
Publication Year:
2024

Abstract

Personalized driving refers to an autonomous vehicle's ability to adapt its driving behavior or control strategies to match individual users' preferences and driving styles while maintaining safety and comfort standards. However, existing approaches either fail to capture each individual's preferences precisely or become computationally inefficient as the user base expands. Vision-Language Models (VLMs) offer a promising solution to this challenge through their natural language understanding and scene reasoning capabilities. In this work, we propose a lightweight yet effective on-board VLM framework that provides low-latency personalized driving performance while maintaining strong reasoning capabilities. Our solution incorporates a Retrieval-Augmented Generation (RAG)-based memory module that enables continuous learning of individual driving preferences from human feedback. Through comprehensive real-world vehicle deployment and experiments, our system has demonstrated the ability to provide safe, comfortable, and personalized driving experiences across various scenarios, reducing takeover rates by up to 76.9%. To the best of our knowledge, this work represents the first end-to-end VLM-based motion control system deployed in real-world autonomous vehicles.
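The abstract's central mechanism is the RAG-based memory module: human feedback is stored as retrievable preference entries, and the entries most relevant to the current driving scene are retrieved to condition the on-board VLM. The sketch below illustrates that idea only; the paper's actual embedding model, retrieval method, and interfaces are not given in this record, so the embedding here is a toy hash-based stand-in and the names (PreferenceMemory, embed, add_feedback, retrieve) are hypothetical.

import hashlib
import numpy as np

def embed(text: str, dim: int = 64) -> np.ndarray:
    # Toy deterministic text embedding: average of per-token random vectors
    # seeded by a hash of the token. A real system would use a learned
    # sentence-embedding model instead.
    vecs = []
    for token in text.lower().split():
        seed = int(hashlib.md5(token.encode()).hexdigest()[:8], 16)
        rng = np.random.default_rng(seed)
        vecs.append(rng.standard_normal(dim))
    v = np.mean(vecs, axis=0) if vecs else np.zeros(dim)
    return v / (np.linalg.norm(v) + 1e-8)

class PreferenceMemory:
    """Stores driver feedback and retrieves entries relevant to a scene."""

    def __init__(self) -> None:
        self.entries: list[str] = []
        self.vectors: list[np.ndarray] = []

    def add_feedback(self, feedback: str) -> None:
        # Continuous learning: each takeover or verbal comment becomes
        # a new memory entry.
        self.entries.append(feedback)
        self.vectors.append(embed(feedback))

    def retrieve(self, scene_description: str, k: int = 2) -> list[str]:
        # Rank stored preferences by cosine similarity (vectors are
        # unit-normalized, so a dot product suffices) and return the top k.
        if not self.entries:
            return []
        q = embed(scene_description)
        sims = np.array([v @ q for v in self.vectors])
        top = np.argsort(sims)[::-1][:k]
        return [self.entries[i] for i in top]

if __name__ == "__main__":
    memory = PreferenceMemory()
    memory.add_feedback("Brake earlier and more gently when approaching stop signs.")
    memory.add_feedback("Keep a larger following distance on the highway.")

    scene = "Approaching a stop sign on a residential street."
    prefs = memory.retrieve(scene, k=1)
    # The retrieved preferences would be prepended to the VLM's prompt so
    # that the generated motion commands reflect this driver's style.
    prompt = f"Scene: {scene}\nDriver preferences: {prefs}\nPropose a control action."
    print(prompt)

In this framing, personalization scales with memory contents rather than with per-user model retraining, which is consistent with the abstract's claim of avoiding computational inefficiency as the user base grows.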

Details

Database: arXiv
Publication Type: Report
Accession number: edsarx.2411.11913
Document Type: Working Paper