1. Style Transfer for Keypoint Matching Under Adverse Conditions
- Author
- Abdelaziz Djelouah, Simone Schaub-Meyer, and Ali Uzpak
- Subjects
- Computer science, Artificial intelligence, Pattern recognition, Feature extraction, Feature (computer vision), Image processing, Visualization, Semantics, Location awareness, Matching
- Abstract
In this work, we address the difficulty of matching local features between images captured at distant points in time, which results in large global appearance changes. Inspired by recent neural style transfer techniques, we propose to use an image transformation network to translate night images into a day-like appearance, with the goal of improving matching performance. We extend traditional style transfer, which optimizes for content and style, with a keypoint matching loss function. Jointly optimizing these losses allows our model to generate images that significantly improve the performance of local feature matching, in a self-supervised way. As a result, our approach is flexible and does not require paired training data, which is difficult to obtain in practice. We show how our method can be used as an extension to a state-of-the-art differentiable feature extractor to improve its performance in challenging scenarios. This is demonstrated in our evaluation on day-night image matching and on visual localization with night-rain image queries.
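The abstract describes a joint objective that combines the standard style-transfer terms (content and style) with a keypoint matching loss. Below is a minimal, hypothetical PyTorch sketch of what such a combined loss could look like; the function names, loss weights, feature-extractor interface, and `matcher` callable are illustrative assumptions, not the authors' actual implementation.

```python
# Hypothetical sketch: style-transfer losses (content + style) plus a
# keypoint matching loss, optimized jointly. Interfaces and weights are
# assumptions for illustration only.
import torch
import torch.nn as nn


def gram_matrix(feat: torch.Tensor) -> torch.Tensor:
    """Gram matrix of a feature map, used for the style loss."""
    b, c, h, w = feat.shape
    f = feat.view(b, c, h * w)
    return f @ f.transpose(1, 2) / (c * h * w)


def joint_loss(translated_day, night_img, day_style_img,
               vgg_features, matcher,
               w_content=1.0, w_style=10.0, w_match=1.0):
    """Content + style + keypoint-matching loss (illustrative weights).

    vgg_features: callable returning a list of feature maps (assumed).
    matcher: differentiable feature extractor/matcher returning a scalar
             matching loss between two images (assumed).
    """
    # Content loss: the translated image should preserve the night image's content.
    feat_t = vgg_features(translated_day)
    feat_n = vgg_features(night_img)
    l_content = nn.functional.mse_loss(feat_t[-1], feat_n[-1])

    # Style loss: match Gram statistics of a reference day-time image.
    feat_d = vgg_features(day_style_img)
    l_style = sum(nn.functional.mse_loss(gram_matrix(a), gram_matrix(b))
                  for a, b in zip(feat_t, feat_d))

    # Keypoint matching loss: the translated image should match well
    # against the day reference under a differentiable matcher.
    l_match = matcher(translated_day, day_style_img)

    return w_content * l_content + w_style * l_style + w_match * l_match
```

In this reading, the image transformation network is trained by backpropagating the combined loss, so no paired night/day ground truth is needed; the matcher term is what steers the translation toward images that are easier to match, consistent with the self-supervised setup described above.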
- Published
- 2020