1. Vision-Based High-Speed Driving With a Deep Dynamic Observer
- Author
- Brian Goldfain, James M. Rehg, Grady Williams, Evangelos A. Theodorou, and Paul Drews
- Subjects
Control and Optimization, Observer (control theory), Computer science, Biomedical Engineering, Convolutional neural network, Vehicle dynamics, Artificial Intelligence, Inertial measurement unit, Computer vision, Artificial neural network, Mechanical Engineering, Deep learning, Computer Science Applications, Human-Computer Interaction, Model predictive control, Control and Systems Engineering, Computer Vision and Pattern Recognition, Artificial intelligence, Particle filter
- Abstract
In this letter, we present a framework for combining deep learning-based road detection, particle filters, and model predictive control (MPC) to drive aggressively using only a monocular camera, IMU, and wheel speed sensors. This framework uses deep convolutional neural networks combined with LSTMs to learn a local cost map representation of the track in front of the vehicle. A particle filter uses this dynamic observation model to localize in a schematic map, and MPC is used to drive aggressively using this particle-filter-based state estimate. We show extensive real-world testing results and demonstrate reliable operation of the vehicle at the friction limits on a complex dirt track. We reach speeds above 27 mph (12 m/s) on a dirt track with a 105 ft (32 m) long straight using our 1:5 scale test vehicle.
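As a rough illustration of the pipeline the abstract outlines (monocular image → learned cost map → particle-filter localization → MPC), a minimal Python sketch follows. All class and function names here are hypothetical placeholders, not the authors' implementation; the network, observation model, and controller are stubbed out and only the data flow between the three stages is shown.

```python
import numpy as np

def costmap_from_image(image, lstm_state):
    """Hypothetical stand-in for the CNN+LSTM that maps a monocular image
    (plus recurrent state) to a local cost map of the track ahead."""
    return np.zeros((40, 40)), lstm_state  # dummy cost map, unchanged state

class ParticleFilter:
    """Toy particle filter over (x, y, heading) in a schematic track map."""
    def __init__(self, n=500):
        self.p = np.zeros((n, 3))        # particle poses
        self.w = np.full(n, 1.0 / n)     # particle weights

    def predict(self, yaw_rate, wheel_speed, dt):
        # Propagate particles with IMU yaw rate and wheel-speed odometry.
        self.p[:, 0] += wheel_speed * np.cos(self.p[:, 2]) * dt
        self.p[:, 1] += wheel_speed * np.sin(self.p[:, 2]) * dt
        self.p[:, 2] += yaw_rate * dt
        self.p += np.random.normal(scale=0.05, size=self.p.shape)

    def update(self, costmap, schematic_map):
        # Reweight particles by how well the predicted cost map matches the
        # schematic map rendered at each particle pose (stubbed as uniform).
        self.w = np.full(len(self.p), 1.0 / len(self.p))

    def estimate(self):
        return np.average(self.p, axis=0, weights=self.w)

def mpc_step(state, schematic_map):
    """Hypothetical MPC controller returning steering and throttle."""
    return 0.0, 0.5

# One control-loop iteration (all sensor inputs faked for illustration).
pf, lstm_state, schematic_map = ParticleFilter(), None, None
image, yaw_rate, wheel_speed, dt = np.zeros((160, 128, 3)), 0.1, 8.0, 0.02
costmap, lstm_state = costmap_from_image(image, lstm_state)
pf.predict(yaw_rate, wheel_speed, dt)
pf.update(costmap, schematic_map)
steering, throttle = mpc_step(pf.estimate(), schematic_map)
```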
- Published
- 2019