
WOMD-LiDAR: Raw Sensor Dataset Benchmark for Motion Forecasting

Authors :
Chen, Kan
Ge, Runzhou
Qiu, Hang
Al-Rfou, Rami
Qi, Charles R.
Zhou, Xuanyu
Yang, Zoey
Ettinger, Scott
Sun, Pei
Leng, Zhaoqi
Baniodeh, Mustafa
Bogun, Ivan
Wang, Weiyue
Tan, Mingxing
Anguelov, Dragomir
Publication Year :
2023

Abstract

Widely adopted motion forecasting datasets substitute the observed sensory inputs with higher-level abstractions such as 3D boxes and polylines. These sparse shapes are inferred by annotating the original scenes with perception systems' predictions. Such intermediate representations tie the quality of motion forecasting models to the performance of computer vision models. Moreover, the human-designed explicit interfaces between perception and motion forecasting typically pass only a subset of the semantic information present in the original sensory input. To study the effect of these modular approaches, design new paradigms that mitigate these limitations, and accelerate the development of end-to-end motion forecasting models, we augment the Waymo Open Motion Dataset (WOMD) with large-scale, high-quality, diverse LiDAR data for the motion forecasting task. The augmented dataset, WOMD-LiDAR, consists of over 100,000 scenes, each spanning 20 seconds and containing well-synchronized and calibrated high-quality LiDAR point clouds captured across a range of urban and suburban geographies (https://waymo.com/open/data/motion/). Compared to the Waymo Open Dataset (WOD), WOMD-LiDAR contains 100x more scenes. Furthermore, we integrate the LiDAR data into motion forecasting model training and provide a strong baseline. Experiments show that the LiDAR data brings improvements to the motion forecasting task. We hope that WOMD-LiDAR will provide new opportunities for boosting end-to-end motion forecasting models.

Comment: ICRA 2024 camera-ready version. Dataset website: https://waymo.com/open/data/motion/
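As a rough illustration of how such scenario records might be consumed, the sketch below iterates over WOMD-style TFRecord shards and parses them into Scenario protos using the public waymo-open-dataset tooling. The local file pattern and the LiDAR field name (compressed_frame_laser_data) are assumptions not confirmed by this record and should be checked against the official dataset tutorial.

# Minimal sketch: read WOMD-LiDAR scenario records and inspect their contents.
# Assumes the standard WOMD layout of TFRecord files holding Scenario protos;
# the file pattern and the LiDAR field name below are assumptions, not
# confirmed details of the release.
import tensorflow as tf
from waymo_open_dataset.protos import scenario_pb2

# Hypothetical local path to the LiDAR-augmented scenario shards.
FILE_PATTERN = "/data/womd_lidar/training/*.tfrecord*"

dataset = tf.data.TFRecordDataset(
    tf.io.gfile.glob(FILE_PATTERN), compression_type="")

for raw_record in dataset.take(1):
    scenario = scenario_pb2.Scenario()
    scenario.ParseFromString(raw_record.numpy())

    print("scenario id:", scenario.scenario_id)
    print("agent tracks:", len(scenario.tracks))
    print("timestamps:", len(scenario.timestamps_seconds))

    # The LiDAR augmentation is expected to ship as compressed per-frame
    # laser data attached to each scenario; the field name is assumed here,
    # and decompression helpers live in the official dataset utilities.
    lidar_frames = getattr(scenario, "compressed_frame_laser_data", None)
    if lidar_frames is not None:
        print("LiDAR frames attached:", len(lidar_frames))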

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2304.03834
Document Type :
Working Paper