Abstract: Compared with conventional image sensors, event cameras have attracted attention thanks to their potential in environments with fast motion and high dynamic range (HDR). To tackle the lost-track problem caused by rapid illumination changes in HDR scenes such as tunnels, an object tracking framework based on event count images from an event camera is presented. The framework contains an offline-trained detector and an online-trained tracker that complement each other: the detector benefits from pre-labelled data during training but may produce false or missing detections, whereas the tracker provides persistent results for each initialised object but may suffer from drift or even failure. In addition, process and measurement equations are modelled, and a Kalman fusion scheme is proposed to incorporate measurements from both the detector and the tracker. Self-initialisation and track maintenance in the fusion scheme ensure autonomous real-time tracking without user intervention. Experiments on self-collected event data from urban driving scenarios demonstrate the performance of the proposed framework and the fusion scheme.
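The idea of fusing a detector and a tracker through a Kalman filter can be illustrated with a minimal sketch. The state, noise levels, and frame interval below are illustrative assumptions, not the paper's actual process and measurement models: a 1-D constant-velocity state receives position measurements from two sources with different noise covariances, applied as sequential updates.

```python
import numpy as np

# Minimal 1-D constant-velocity Kalman filter fusing two measurement
# sources (a detector and a tracker) via sequential updates.
# All matrices and noise values are illustrative assumptions.

dt = 0.05                          # frame interval (s), assumed
F = np.array([[1.0, dt],           # state transition: [position, velocity]
              [0.0, 1.0]])
Q = 0.01 * np.eye(2)               # process noise (assumed)
H = np.array([[1.0, 0.0]])         # both sources measure position only

R_det = np.array([[0.5]])          # detector: accurate but sporadic (assumed)
R_trk = np.array([[2.0]])          # tracker: persistent but may drift (assumed)

def predict(x, P):
    """Propagate state and covariance through the process model."""
    return F @ x, F @ P @ F.T + Q

def update(x, P, z, R):
    """Standard Kalman measurement update for one source."""
    y = z - H @ x                      # innovation
    S = H @ P @ H.T + R                # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)     # Kalman gain
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P
    return x, P

# One frame: predict, then apply whichever measurements arrived.
x = np.array([[0.0], [1.0]])           # initial state [position, velocity]
P = np.eye(2)                          # initial covariance
x, P = predict(x, P)
z_det = np.array([[0.06]])             # example detector measurement
z_trk = np.array([[0.10]])             # example tracker measurement
x, P = update(x, P, z_det, R_det)      # detector update (if a detection exists)
x, P = update(x, P, z_trk, R_trk)      # tracker update (if the track is alive)
print(float(x[0, 0]))                  # fused position estimate
```

In this sequential-update form, a missing detection (or a drifting track) is handled by simply skipping that source's update for the frame, which is one way the complementary behaviour described above can be realised.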