Code: Event-based Feature Tracking

Hypotheses-based Asynchronous Feature Tracking for Event Cameras

This code release contains the implementation of the feature-tracking algorithms for event streams described in the two papers listed below. The sparsity and asynchronicity of the event stream are explicitly exploited, enabling efficient, asynchronous tracking of features using event-by-event processing. We rely on a novel hypothesis-based tracking paradigm for event-based features that avoids explicitly optimizing the underlying, expensive alignment problem. This event-based feature tracking software is publicly available and can be accessed from this link.


Users of this software are kindly asked to cite at least one of the following papers, in which it was introduced:

Ignacio Alzugaray and Margarita Chli, "Asynchronous Multi-Hypothesis Tracking of Features with Event Cameras", in Proceedings of the IEEE International Conference on 3D Vision (3DV), 2019. Research Collection | Video

Ignacio Alzugaray and Margarita Chli, "HASTE: multi-Hypothesis Asynchronous Speeded-up Tracking of Events", in Proceedings of the British Machine Vision Conference (BMVC), 2020. Research Collection | Presentation | Video

Video: HASTE: multi-Hypothesis Asynchronous Speeded-up Tracking of Events
Video: Asynchronous Multi-Hypothesis Tracking of Features with Event Cameras