The ECMD Datasets

An Event-Centric Multisensory Driving Dataset for SLAM.




This is the first event-based SLAM dataset specifically focused on urban autonomous driving. We explore the question: are event cameras ready for autonomous driving? At the same time, we also investigate the perceptual capabilities of various sensors, including LiDAR, standard cameras, infrared cameras, and GNSS-RTK/INS.
The contributions of our work can be summarized as follows:

  • Our sensor platform consists of various novel sensors, including two sets of stereo event cameras with distinct resolutions (640×480 and 346×260), an infrared camera, stereo industrial cameras, three mechanical LiDARs (including two slanted LiDARs), an onboard inertial measurement unit (IMU), and two global navigation satellite system (GNSS) receivers. For the ground truth, we adopt a centimeter-level positioning system that combines GNSS real-time kinematic (RTK) with a fiber-optic gyroscope-integrated inertial system, referred to as GNSS-RTK/INS.
  • ECMD collects 81 sequences covering over 200 kilometers of trajectories in various driving scenarios, including dense streets, urban areas, tunnels, highways, bridges, and suburbs. These sequences are recorded in daylight and at night, providing challenging situations for visual and LiDAR SLAM, e.g., dynamic objects, high-speed motion, repetitive scenarios, and HDR scenes. Meanwhile, we evaluate existing state-of-the-art visual and LiDAR SLAM algorithms with various sensor modalities on our dataset. Moreover, our dataset and benchmark results are publicly available on our website.

We hope this work contributes to the development of event-based vision, especially event-based multi-sensor fusion for autonomous driving.
Visualizations of each sequence are available in the Download section and on Bilibili.
If you have any suggestions or questions, do not hesitate to open an issue on our GitHub repository.


News

December 2, 2023 We release our GNSS-RTK/INS ground truth and M8T/F9P GNSS data in the Download section.
November 28, 2023 Our work has been accepted by IEEE Transactions on Intelligent Vehicles!
November 21, 2023 Calibration results and rosbags are available on the Calibration page.
November 19, 2023 We release our sequences in the Download section.
November 07, 2023 The preprint version is available at arXiv.
October 31, 2023 Watch our video presentation on Bilibili or YouTube.
August 28, 2023 We finish the evaluation of ECMD using various LiDAR SLAM methods (recorded videos).
August 22, 2023 We complete the collection of all sequences (Bilibili visualization).
June 2, 2023 Driver code and time synchronization of event cameras are now available (Code, Bilibili).
June 1, 2023 The ECMD dataset goes live!

BibTeX

Please cite the following publication when using this benchmark in an academic context:

  1. P. Chen, W. Guan, F. Huang, Y. Zhong, W. Wen, L. Hsu, and P. Lu. ECMD: An Event-Centric Multisensory Driving Dataset for SLAM. IEEE Transactions on Intelligent Vehicles, vol. 9, no. 1, pp. 407-416, 2023.
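
For convenience, a BibTeX entry assembled from the reference above might look as follows; the entry key is our own choice, and the author names and issue year should be verified against the official IEEE Xplore record:

  @article{chen2023ecmd,
    author  = {Chen, P. and Guan, W. and Huang, F. and Zhong, Y. and Wen, W. and Hsu, L. and Lu, P.},
    title   = {{ECMD}: An Event-Centric Multisensory Driving Dataset for {SLAM}},
    journal = {IEEE Transactions on Intelligent Vehicles},
    volume  = {9},
    number  = {1},
    pages   = {407--416},
    year    = {2023}
  }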

Other resources

Some tools for pre-processing the dataset, as well as the HKU event-based handheld & drone dataset, are available here.

License

This work is released under the GPLv3 license. For commercial inquiries, please contact Dr. Peng Lu (lupeng@hku.hk).



Acknowledgement

This work was supported in part by the General Research Fund under Grant 17204222, and in part by the Seed Fund for Collaborative Research and the General Funding Scheme of the HKU-TCL Joint Research Center for Artificial Intelligence.