We have released the Waymo Open Dataset publicly to aid the research community in making advancements in machine perception and autonomous driving technology.
The Waymo Open Dataset is composed of two datasets: the Perception dataset, with high-resolution sensor data and labels for 2,030 scenes, and the Motion dataset, with object trajectories and corresponding 3D maps for 103,354 scenes.
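For orientation, here is a minimal sketch of how records from each dataset can be parsed with the protos shipped in this repository. It assumes the pip package and TensorFlow are installed; the `.tfrecord` paths are placeholders.

```python
# Minimal sketch: parsing one Perception frame and one Motion scenario.
# Assumes the waymo-open-dataset pip package and TensorFlow are installed;
# the .tfrecord paths below are placeholders.
import tensorflow as tf

from waymo_open_dataset import dataset_pb2
from waymo_open_dataset.protos import scenario_pb2

# Perception dataset: each record is a serialized dataset_pb2.Frame.
perception_file = '/path/to/perception_segment.tfrecord'  # placeholder
for record in tf.data.TFRecordDataset(perception_file, compression_type=''):
  frame = dataset_pb2.Frame()
  frame.ParseFromString(record.numpy())
  print(frame.context.name, len(frame.laser_labels))
  break

# Motion dataset: each record is a serialized scenario_pb2.Scenario.
motion_file = '/path/to/motion_shard.tfrecord'  # placeholder
for record in tf.data.TFRecordDataset(motion_file, compression_type=''):
  scenario = scenario_pb2.Scenario()
  scenario.ParseFromString(record.numpy())
  print(scenario.scenario_id, len(scenario.tracks))
  break
```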
This code repository (excluding the src/waymo_open_dataset/wdl_limited folder) is licensed under the Apache License, Version 2.0. The code appearing in src/waymo_open_dataset/wdl_limited is licensed under terms appearing therein. The Waymo Open Dataset itself is licensed under separate terms; please visit https://waymo.com/open/terms/ for details.

Code in each of the subfolders under src/waymo_open_dataset/wdl_limited is licensed under (a) a BSD 3-clause copyright license and (b) an additional limited patent license. Each limited patent license is applicable only to code under the respective wdl_limited subfolder, and is licensed for use only with the use case laid out in such license in connection with the Waymo Open Dataset, as authorized by and in compliance with the Waymo Dataset License Agreement for Non-Commercial Use. See wdl_limited/camera/, wdl_limited/camera_segmentation/, and wdl_limited/sim_agents_metrics/, respectively, for details.
The Rules have been updated to allow training (including pre-training, co-training, or fine-tuning models) using frozen, pre-trained weights from publicly available open-source models for submissions to the Challenges. We also added a new set of fields to the submission metadata (now required; the server will return an error if they are missing) to track how participants generated their submissions. We updated the tutorials to reflect this change; check out the new fields in the submission proto files for motion, sim agents, and occupancy flow, as sketched below.
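For illustration only, the sketch below shows how such metadata might be populated on a motion prediction submission. The exact field names and types (e.g. `uses_public_model_pretraining`, `public_model_names`, `num_model_parameters`) are assumptions here and should be checked against the current motion submission proto before use.

```python
# Hypothetical sketch of filling the new submission-tracking metadata on a
# motion prediction submission. Field names and types below are assumptions;
# verify them against waymo_open_dataset/protos/motion_submission.proto.
from waymo_open_dataset.protos import motion_submission_pb2

submission = motion_submission_pb2.MotionChallengeSubmission()
submission.submission_type = (
    motion_submission_pb2.MotionChallengeSubmission.MOTION_PREDICTION)
submission.account_name = 'you@example.com'        # placeholder
submission.unique_method_name = 'my_motion_model'  # placeholder
submission.authors.extend(['Jane Doe'])            # placeholder

# New fields tracking how the submission was generated (assumed names).
submission.uses_lidar_data = False
submission.uses_camera_data = False
submission.uses_public_model_pretraining = True
submission.public_model_names.extend(['some-open-source-backbone'])
submission.num_model_parameters = '60M'  # assumed to be a free-form string

with open('/tmp/my_submission.binproto', 'wb') as f:
  f.write(submission.SerializeToString())
```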
This update contains several changes and additions to the datasets:
Perception dataset (v1.4.3 and v2.0.1):
Motion dataset (v1.2.1):
We also made the following changes to the code supporting the challenges.
Motion prediction:
Sim Agents:
We released version 1.6.1 of the pip package with fixes for the WOSAC metrics:
We released a large-scale object-centric asset dataset containing over 1.2M images and lidar observations of two major categories (vehicles and pedestrians) from the Perception Dataset (v2.0.0).
The asset dataset includes camera patches from the most_visible_camera, projected lidar returns on the corresponding camera, per-pixel camera ray information, and auto-labeled 2D panoptic segmentation that supports object NeRF reconstruction.
This major update includes supporting code for four challenges at waymo.com/open, and dataset updates to both the Perception and Motion Datasets:
- v2.0.0 of the Perception Dataset
- v1.4.2 of the Perception Dataset
- v1.2.0 of the Motion Dataset
- Added supporting code for the four 2023 Waymo Open Dataset Challenges
We released v1.4.1 of the Perception dataset.
We released v1.4.0 of the Perception dataset.
See Compute Metrics in the tutorial.
We released v1.3.2 of the Perception dataset to improve the quality and accuracy of the labels.
Added num_top_lidar_points_in_box in dataset.proto for the 3D Camera-Only Detection Challenge.
We released v1.3.1 of the Perception dataset to support the 2022 Challenges and have updated this repository accordingly.
Fixed some inconsistency in projected_lidar_labels in dataset.proto.
We released v1.3.0 of the Perception dataset and the 2022 challenges. We have updated this repository to add support for the new labels and the challenges.
We released v1.1 of the Motion dataset to include lane connectivity information. To read more on the technical details, please read lane_neighbors_and_boundaries.md.
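As a rough illustration, the sketch below walks the map features of a Motion dataset scenario and prints each lane's connectivity. The field names (`entry_lanes`, `exit_lanes`, `left_neighbors`, `right_neighbors`, `left_boundaries`, `right_boundaries`) follow lane_neighbors_and_boundaries.md and map.proto as recalled here, so verify them against the installed protos.

```python
# Rough sketch: reading lane connectivity from a Motion dataset scenario.
# Field names are assumptions based on map.proto / lane_neighbors_and_boundaries.md;
# verify them against the installed protos. The path is a placeholder.
import tensorflow as tf

from waymo_open_dataset.protos import scenario_pb2

motion_file = '/path/to/motion_shard.tfrecord'  # placeholder
for record in tf.data.TFRecordDataset(motion_file, compression_type=''):
  scenario = scenario_pb2.Scenario()
  scenario.ParseFromString(record.numpy())
  for feature in scenario.map_features:
    if not feature.HasField('lane'):
      continue
    lane = feature.lane
    print(
        f'lane {feature.id}: '
        f'{len(lane.entry_lanes)} entry, {len(lane.exit_lanes)} exit, '
        f'{len(lane.left_neighbors)} left / {len(lane.right_neighbors)} right neighbors, '
        f'{len(lane.left_boundaries)} left / {len(lane.right_boundaries)} right boundary segments')
  break
```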
We expanded the Waymo Open Dataset to also include a Motion dataset comprising object trajectories and corresponding 3D maps for over 100,000 segments. We have updated this repository to add support for this new dataset.
Additionally, we added instructions and examples for the real-time detection challenges. Please follow these instructions.
To read more about the dataset and access it, please visit https://www.waymo.com/open.
This code repository contains:
@InProceedings{Sun_2020_CVPR, author = {Sun, Pei and Kretzschmar, Henrik and Dotiwalla, Xerxes and Chouard, Aurelien and Patnaik, Vijaysai and Tsui, Paul and Guo, James and Zhou, Yin and Chai, Yuning and Caine, Benjamin and Vasudevan, Vijay and Han, Wei and Ngiam, Jiquan and Zhao, Hang and Timofeev, Aleksei and Ettinger, Scott and Krivokon, Maxim and Gao, Amy and Joshi, Aditya and Zhang, Yu and Shlens, Jonathon and Chen, Zhifeng and Anguelov, Dragomir}, title = {Scalability in Perception for Autonomous Driving: Waymo Open Dataset}, booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)}, month = {June}, year = {2020} }
@InProceedings{Ettinger_2021_ICCV, author={Ettinger, Scott and Cheng, Shuyang and Caine, Benjamin and Liu, Chenxi and Zhao, Hang and Pradhan, Sabeek and Chai, Yuning and Sapp, Ben and Qi, Charles R. and Zhou, Yin and Yang, Zoey and Chouard, Aur\'elien and Sun, Pei and Ngiam, Jiquan and Vasudevan, Vijay and McCauley, Alexander and Shlens, Jonathon and Anguelov, Dragomir}, title={Large Scale Interactive Motion Forecasting for Autonomous Driving: The Waymo Open Motion Dataset}, booktitle={Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV)}, month={October}, year={2021}, pages={9710-9719} }
@InProceedings{Kan_2024_icra, author={Chen, Kan and Ge, Runzhou and Qiu, Hang and Al-Rfou, Rami and Qi, Charles R. and Zhou, Xuanyu and Yang, Zoey and Ettinger, Scott and Sun, Pei and Leng, Zhaoqi and Mustafa, Mustafa and Bogun, Ivan and Wang, Weiyue and Tan, Mingxing and Anguelov, Dragomir}, title={WOMD-LiDAR: Raw Sensor Dataset Benchmark for Motion Forecasting}, month={May}, booktitle={Proceedings of the IEEE International Conference on Robotics and Automation (ICRA)}, year={2024} }
The following table is necessary for this dataset to be indexed by search engines such as Google Dataset Search.
| property | value |
| --- | --- |
| name | Waymo Open Dataset: An autonomous driving dataset |
| alternateName | Waymo Open Dataset |
| url | https://github.com/waymo-research/waymo-open-dataset |
| sameAs | https://github.com/waymo-research/waymo-open-dataset |
| sameAs | https://www.waymo.com/open |
| description | The Waymo Open Dataset is comprised of high-resolution sensor data collected by autonomous vehicles operated by the Waymo Driver in a wide variety of conditions. We’re releasing this dataset publicly to aid the research community in making advancements in machine perception and self-driving technology. |
| provider | Waymo |
| license | Waymo Dataset License Agreement for Non-Commercial Use (https://waymo.com/open/terms/) |