1. What did you find out?
The key feature of an event-based camera is that it reports only changes in illumination, per pixel and asynchronously. In addition, this sensor has a very high dynamic range and a very high temporal resolution. An event-based camera can therefore excellently complement a traditional frame-based RGB camera at night and in fast-moving traffic situations. In the work "TUMTraf Event: Calibration and Fusion Resulting in a Dataset for Roadside Event-Based and RGB Cameras," we developed a calibration approach that aligns the image of a stationary event-based camera with the image of a traditional RGB camera, using the moving traffic at a complex intersection instead of a calibration pattern. This calibration projects the event stream into the RGB camera image, which is a prerequisite for sensor fusion.
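To make the underlying idea concrete, here is a minimal sketch, not the method from the paper: events are accumulated into a frame, features that the moving traffic produces in both the event frame and the RGB image are matched, and a homography mapping event pixels into the RGB image is estimated. The OpenCV pipeline, function names, and parameters below are illustrative assumptions.

```python
import cv2
import numpy as np

def accumulate_events(events, width, height, t0, t1):
    """Accumulate events (x, y, t, polarity) within [t0, t1) into a 2D frame."""
    frame = np.zeros((height, width), dtype=np.float32)
    for x, y, t, _ in events:
        if t0 <= t < t1:
            frame[int(y), int(x)] += 1.0
    # Normalize to 8-bit so standard feature detectors can operate on it.
    frame = cv2.normalize(frame, None, 0, 255, cv2.NORM_MINMAX)
    return frame.astype(np.uint8)

def estimate_homography(event_frame, rgb_image):
    """Match features between the event frame and the RGB image and
    estimate a homography that maps event pixels into RGB pixels."""
    gray = cv2.cvtColor(rgb_image, cv2.COLOR_BGR2GRAY)
    orb = cv2.ORB_create(nfeatures=2000)
    kp_e, des_e = orb.detectAndCompute(event_frame, None)
    kp_r, des_r = orb.detectAndCompute(gray, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des_e, des_r), key=lambda m: m.distance)
    src = np.float32([kp_e[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp_r[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    # RANSAC rejects mismatches caused by sensor noise and ambiguous features.
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    return H
```

With such a homography H, cv2.warpPerspective(event_frame, H, (rgb_width, rgb_height)) would project the event representation into the RGB image plane for fusion.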
Building on this "targetless" calibration, we developed and evaluated several fusion approaches for the event-based and RGB cameras that take the characteristics of the event-based camera into account. Overall, the developed sensor fusion with the event-based camera increased the detection performance of the RGB camera by 9% mAP during the day and by 13% mAP at night.
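As one illustrative, strongly simplified example of such a fusion (not the architecture from the paper), detections from an RGB branch and an event branch can be merged at the decision level, where unmatched event detections are exactly where the event camera helps most at night. The box format, score handling, and threshold below are assumptions for illustration.

```python
import numpy as np

def iou(a, b):
    """Intersection over union of two boxes given as (x1, y1, x2, y2)."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter + 1e-9)

def late_fusion(rgb_dets, event_dets, iou_thr=0.5):
    """Merge detections (box, score, label) from both modalities.
    Overlapping detections of the same class are collapsed into one,
    keeping the higher score; event detections without an RGB match
    are added to the result."""
    fused = list(rgb_dets)
    for e_box, e_score, e_label in event_dets:
        matched = False
        for i, (r_box, r_score, r_label) in enumerate(fused):
            if r_label == e_label and iou(e_box, r_box) >= iou_thr:
                if e_score > r_score:
                    fused[i] = (e_box, e_score, e_label)
                matched = True
                break
        if not matched:
            fused.append((e_box, e_score, e_label))
    return fused
```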
Lastly, we published the TUMTraf Event dataset. To the best of our knowledge, it is the first dataset of its kind available to the scientific community.
2. What unique challenges did you face during your research?
The biggest challenges arose while recording the camera data. The data was collected during the day, at night, and in rain, sleet, and freezing cold at the Providentia++ test field. Our fingers literally froze, and the prototype hardware sporadically failed due to moisture and cold. Furthermore, the first recordings revealed that street lighting at night caused unexpected noise in the event-based camera, so we had to optimize the sensor driver and repeat the challenging data recordings. Nevertheless, it was worth it: we published the first dataset with synchronized event-based and RGB camera images for roadside sensor infrastructure.
3. What are your new findings suitable for (practical use)?
In practice, sensor calibration is usually performed with a calibration pattern (e.g., a checkerboard). However, such a pattern cannot be placed on a busy highway every time the camera orientation shifts slightly due to weather. A calibration approach that does not require a pattern ("targetless calibration") is therefore needed. With this work, event-based and RGB cameras can now be calibrated to each other without a pattern. The work also shows how to meaningfully combine the data of an event-based camera with that of an RGB camera so that road users can be detected optimally at night. Looking ahead, the published dataset enables further research on event-based and RGB cameras for roadside sensor infrastructure systems.
Read more and find the datasets: innovation-mobility.com/en/project-providentia/a9-dataset/
Publication: Christian Creß, Walter Zimmer, Nils Purschke, Bach Ngoc Doan, Sven Kirchner, Venkatnarayanan Lakshminarasimhan, Leah Strand, Alois C. Knoll: "TUMTraf Event: Calibration and Fusion Resulting in a Dataset for Roadside Event-Based and RGB Cameras," IEEE Transactions on Intelligent Vehicles, April 2024.