Dataset and Benchmark: Novel Sensors for Autonomous Vehicle Perception

Spencer Carmichael1 (specarmi@umich.edu)
Austin Buchan1 (adbuchan@umich.edu)
Mani Ramanagopal1* (srmani@umich.edu)
Radhika Ravi1* (rradhika@umich.edu)
Ram Vasudevan1,2 (ramv@umich.edu)
Katherine A. Skinner1 (kskin@umich.edu)

1Robotics Department, University of Michigan, Ann Arbor

2Department of Mechanical Engineering, University of Michigan, Ann Arbor

*Mani Ramanagopal and Radhika Ravi contributed to this work while employed at the University of Michigan.

Preprint


IROS 2023 Workshop

Abstract

Conventional cameras employed in autonomous vehicle (AV) systems support many perception tasks, but they are challenged by low-light and high dynamic range scenes, adverse weather, and fast motion. Novel sensors, such as event and thermal cameras, offer capabilities with the potential to address these scenarios, but they have yet to be fully exploited. This paper introduces the Novel Sensors for Autonomous Vehicle Perception (NSAVP) dataset to facilitate future research on this topic. The dataset was captured with a platform including stereo event, thermal, monochrome, and RGB cameras as well as a high-precision navigation system providing ground truth poses. The data was collected by repeatedly driving two ~8 km routes and includes varied lighting conditions and opposing viewpoint perspectives. We provide benchmarking experiments on the task of place recognition to demonstrate challenges and opportunities for novel sensors to enhance critical AV perception tasks. To our knowledge, the NSAVP dataset is the first to include stereo thermal cameras together with stereo event and monochrome cameras.

Citation