
Dual-Tuning: Joint Prototype Transfer and Structure Regularization for Compatible Feature Learning

Authors :
Bai, Yan
Jiao, Jile
Wu, Shengsen
Lou, Yihang
Liu, Jun
Feng, Xuetao
Duan, Ling-Yu
Publication Year :
2021

Abstract

Visual retrieval systems face frequent model updates and redeployment, and re-extracting features for the entire database on every update is a heavy workload. Feature compatibility enables newly learned visual features to be compared directly with the old features stored in the database. In this way, when updating the deployed model, we can bypass the inflexible and time-consuming feature re-extraction process. However, the old feature space that needs to be compatible is not ideal, and it faces a distribution discrepancy with the new space caused by different supervision losses. In this work, we propose Dual-Tuning, a global optimization method that obtains feature compatibility across different networks and losses. A feature-level prototype loss is proposed to explicitly align the two types of embedding features by transferring global prototype information. Furthermore, we design a component-level mutual structural regularization to implicitly optimize the intrinsic feature structure. Experimental results on million-scale datasets demonstrate that our Dual-Tuning is able to obtain feature compatibility without sacrificing performance. (Our code will be available at https://github.com/yanbai1993/Dual-Tuning)
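To make the idea of a feature-level prototype loss concrete, the following is a minimal sketch of one plausible form: each new embedding is pulled toward the prototype of its class as computed in the old feature space, so new and old features remain directly comparable. The function name, the cosine-alignment formulation, and the toy data are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def prototype_transfer_loss(new_feats, labels, old_prototypes):
    """Hypothetical sketch of a prototype-alignment objective: the mean
    cosine distance between each new embedding and the old-space prototype
    of its class. Minimizing this pulls the new feature space toward the
    old one, which is the intuition behind compatible feature learning."""
    # L2-normalize new embeddings and old-space class prototypes
    new_feats = new_feats / np.linalg.norm(new_feats, axis=1, keepdims=True)
    protos = old_prototypes / np.linalg.norm(old_prototypes, axis=1, keepdims=True)
    # pick each sample's class prototype and measure cosine similarity
    target = protos[labels]
    cos_sim = np.sum(new_feats * target, axis=1)
    return float(np.mean(1.0 - cos_sim))

# Toy example: 4 samples from 2 classes with 3-dim features.
# New features lie near their old-space prototypes, so the loss is small.
rng = np.random.default_rng(0)
old_protos = rng.normal(size=(2, 3))
labels = np.array([0, 0, 1, 1])
feats = old_protos[labels] + 0.01 * rng.normal(size=(4, 3))
loss = prototype_transfer_loss(feats, labels, old_protos)
```

In a real training loop such a term would be added to the task loss, trading off retrieval accuracy of the new model against backward compatibility with the deployed gallery features.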

Details

Database :
arXiv
Publication Type :
Report
Accession Number :
edsarx.2108.02959
Document Type :
Working Paper