MOVE: Effective and Harmless Ownership Verification via Embedded External Features

Authors :
Li, Yiming
Zhu, Linghui
Jia, Xiaojun
Bai, Yang
Jiang, Yong
Xia, Shu-Tao
Cao, Xiaochun
Publication Year :
2022

Abstract

Currently, deep neural networks (DNNs) are widely adopted in different applications. Despite their commercial value, training a well-performing DNN is resource-consuming, and the well-trained model is therefore valuable intellectual property for its owner. However, recent studies have revealed the threat of model stealing, where adversaries can obtain a function-similar copy of the victim model even when they can only query it. In this paper, we propose an effective and harmless model ownership verification (MOVE) method to defend against different types of model stealing simultaneously, without introducing new security risks. In general, we conduct ownership verification by checking whether a suspicious model contains the knowledge of defender-specified external features. Specifically, we embed the external features by tampering with a few training samples via style transfer. We then train a meta-classifier to determine whether a model was stolen from the victim. This approach is inspired by the observation that stolen models should contain the knowledge of features learned by the victim model. In particular, we develop our MOVE method under both white-box and black-box settings to provide comprehensive model protection. Extensive experiments on benchmark datasets verify the effectiveness of our method and its resistance to potential adaptive attacks. The code for reproducing the main experiments of our method is available at \url{https://github.com/THUYimingLi/MOVE}.

Comment: 15 pages. The journal extension of our conference paper in AAAI 2022 (https://ojs.aaai.org/index.php/AAAI/article/view/20036). arXiv admin note: substantial text overlap with arXiv:2112.03476
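The pipeline described in the abstract (style-transfer a few training samples while keeping their labels, then use a meta-classifier to decide whether a suspect model carries the victim's knowledge of those external features) can be sketched with toy linear models. Everything below is an illustrative assumption, not the paper's implementation: the fixed blended "style" pattern stands in for style transfer, logistic regressions stand in for DNNs, and per-probe residuals stand in for the gradient features MOVE's white-box meta-classifier consumes.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 20
w_true = rng.normal(size=d)      # shared ground-truth task all models are trained on
style = rng.normal(size=d)       # fixed "style" pattern: toy stand-in for style transfer

def transfer(x, alpha=0.6):
    # Blend the style pattern into x; labels stay untouched (the "harmless" part).
    return (1 - alpha) * x + alpha * style

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-np.clip(z, -30, 30)))

def train_logreg(X, y, epochs=2000, lr=0.5):
    # Plain gradient-descent logistic regression stands in for training a DNN.
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        w -= lr * X.T @ (sigmoid(X @ w) - y) / len(y)
    return w

def sample_task(seed, n=600):
    r = np.random.default_rng(seed)
    X = r.normal(size=(n, d))
    return X, (X @ w_true > 0).astype(float)

# Victim: style-transfer 10% of its training samples, keep their original labels.
X, y = sample_task(1)
idx = rng.choice(len(X), size=60, replace=False)
X_mix = X.copy()
X_mix[idx] = transfer(X[idx])
w_victim = train_logreg(X_mix, y)
X_probe, y_probe = X_mix[idx], y[idx]    # defender-held verification samples

def signature(w):
    # Per-probe residuals sigmoid(w.x) - y: a cheap proxy for the gradient
    # features a white-box meta-classifier could consume.
    return sigmoid(X_probe @ w) - y_probe

# Suspects: one stolen by distilling the victim's labels, one trained independently.
Xq = rng.normal(size=(3000, d))
w_stolen = train_logreg(Xq, (Xq @ w_victim > 0).astype(float))
w_indep = train_logreg(*sample_task(2))

# Meta-classifier: victim signature is the positive, clean shadow models the negatives.
shadows = [train_logreg(*sample_task(s)) for s in range(10, 16)]
feats = np.vstack([signature(w_victim)] + [signature(w) for w in shadows])
meta = train_logreg(feats, np.array([1.0] + [0.0] * len(shadows)), epochs=500)

score = lambda w: sigmoid(signature(w) @ meta)
print(f"stolen: {score(w_stolen):.3f}  independent: {score(w_indep):.3f}")
```

In this sketch the stolen model, having distilled the victim's decision function, reproduces the victim's response on the transferred probes and receives a higher meta-classifier score than the independently trained model; the real method operates on DNN gradients and genuine style-transferred images rather than these linear stand-ins.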

Details

Database :
OAIster
Publication Type :
Electronic Resource
Accession number :
edsoai.on1381558349
Document Type :
Electronic Resource