Urban Place Recognition Dataset

This dataset provides visual and inertial data, together with manually annotated ground truth, for sequences captured specifically for place recognition in outdoor urban scenes. The recordings were made with a side-looking camera, using hand-held setups at different heights as well as a drone, revisiting the same scenes at different times of the day and of the year. As a result, the dataset poses significant challenges in viewpoint, illumination and situational changes.

All sequences were recorded with a high-quality visual-inertial sensor providing monocular, grayscale, global-shutter images at 20 Hz together with time-synchronized inertial measurements. The dataset is freely available and can be downloaded here.
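For orientation, the sketch below shows one way such a visual-inertial sequence might be parsed. The assumed on-disk layout (a per-sequence folder containing a cam0/ directory of timestamp-named PNG frames and an imu.csv with timestamp, gyroscope and accelerometer columns) is purely illustrative and is not specified by this page; adapt it to the actual files in the download.

```python
import csv
from pathlib import Path


def load_imu(csv_path):
    """Load IMU samples from a CSV file.

    Assumed columns (illustrative only, not specified by the dataset page):
    timestamp [ns], gyro x/y/z [rad/s], accel x/y/z [m/s^2].
    """
    samples = []
    with open(csv_path, newline="") as f:
        reader = csv.reader(f)
        next(reader, None)  # skip a header row if present
        for row in reader:
            t = int(row[0])
            gyro = tuple(float(v) for v in row[1:4])
            accel = tuple(float(v) for v in row[4:7])
            samples.append((t, gyro, accel))
    return samples


def load_image_index(image_dir):
    """Index grayscale frames assumed to be stored as <timestamp_ns>.png."""
    frames = sorted(Path(image_dir).glob("*.png"))
    return [(int(p.stem), p) for p in frames]


if __name__ == "__main__":
    imu = load_imu("sequence_01/imu.csv")          # hypothetical path
    frames = load_image_index("sequence_01/cam0")  # hypothetical path
    print(f"{len(frames)} frames, {len(imu)} IMU samples")
    # At 20 Hz imagery, consecutive frame timestamps should differ by roughly 50 ms.
```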

Users of this dataset are asked to cite the following paper, where it was originally presented:

Fabiola Maffra, Zetao Chen and Margarita Chli, “Viewpoint-tolerant Place Recognition combining 2D and 3D information for UAV navigation”, in Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), 2018.

Video: Viewpoint-tolerant Place Recognition combining 2D and 3D information for UAV navigation