Unsupervised learning-based long-term superpixel tracking.
- Authors
- Conze, Pierre-Henri; Tilquin, Florian; Lamard, Mathieu; Heitz, Fabrice; Quellec, Gwenolé
- Subjects
- *ARTIFICIAL satellite tracking, *PIXELS, *OBJECT tracking (Computer vision), *PLURALITY voting, *VIDEO processing, *COMPUTER vision, *TEMPORAL integration
- Abstract
Finding correspondences between the structural entities that decompose images is of high interest for computer vision applications. In particular, we analyze how to accurately track superpixels - visual primitives generated by aggregating adjacent pixels sharing similar characteristics - over extended time periods relying on unsupervised learning and temporal integration. A two-step video processing pipeline dedicated to long-term superpixel tracking is proposed. First, unsupervised learning-based superpixel matching provides correspondences between consecutive and distant frames using new context-rich features extended from greyscale to multi-channel, together with forward-backward consistency constraints. The resulting elementary matches are then combined along multi-step paths running through the whole sequence with various inter-frame distances. This produces a large set of candidate long-term superpixel pairings upon which majority voting is performed. Video object tracking experiments demonstrate the accuracy of our elementary estimator against state-of-the-art methods and prove the ability of multi-step integration to provide accurate long-term superpixel matches compared to the usual direct and sequential integration.
• A video processing pipeline dedicated to long-term superpixel tracking is proposed.
• Multi-step superpixel pairings are estimated in an unsupervised learning fashion.
• Unsupervised learning exploits robust context-rich features extended to multi-channel.
• Elementary matches are combined through multi-step integration and majority voting.
• Accurate long-term superpixel matches are shown for video object tracking experiments.
[ABSTRACT FROM AUTHOR]
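The multi-step integration and majority-voting idea described in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation: the dictionary-of-matches representation, the frame-index paths, and the helper names are all illustrative assumptions. It only shows how elementary matches between frame pairs could be chained along several paths and the candidate pairings reconciled by a vote.

```python
from collections import Counter

def compose_path(matches, steps):
    """Follow elementary superpixel matches along one multi-step path.

    matches[(a, b)] maps superpixel ids in frame a to their match in frame b
    (assumed illustrative representation). steps is a list of frame indices,
    e.g. [0, 2, 4] for a path with inter-frame distance 2.
    """
    def follow(sp):
        for a, b in zip(steps, steps[1:]):
            sp = matches[(a, b)].get(sp)
            if sp is None:  # no consistent match found: this path is broken
                return None
        return sp
    return follow

def majority_vote(matches, paths, sp0):
    """Combine candidate long-term pairings from several paths by voting."""
    candidates = [compose_path(matches, p)(sp0) for p in paths]
    candidates = [c for c in candidates if c is not None]
    if not candidates:
        return None
    return Counter(candidates).most_common(1)[0][0]

# Toy example: two paths from frame 0 to frame 2 (sequential and direct).
matches = {(0, 1): {7: 3}, (1, 2): {3: 5, 4: 6}, (0, 2): {7: 5, 8: 6}}
paths = [[0, 1, 2], [0, 2]]
print(majority_vote(matches, paths, 7))  # both paths agree on superpixel 5
print(majority_vote(matches, paths, 8))  # sequential path breaks, direct path votes 6
```

Running several paths with different inter-frame distances and keeping the most frequent endpoint is what makes the pairing robust to a single broken elementary match, which is the motivation the abstract gives for multi-step integration over purely direct or sequential integration.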
- Published
- 2019