Locally Black-box Adversarial Attack on Time Series
- Author
- YANG Wen-bo, YUAN Ji-dong
- Subjects
- black-box adversarial attack, time series classification, local perturbations, genetic algorithm, shapelet, Computer software, QA76.75-76.765, Technology (General), T1-995
- Abstract
Deep neural networks (DNNs) for time series classification raise security concerns because they are vulnerable to adversarial attacks. Existing attack methods on time series apply global perturbations based on gradient information, so the generated adversarial examples are easily perceived. This paper proposes a locally black-box method that attacks DNNs without gradient information. First, under the assumption that the attacker cannot access any internal information of the model, the attack is formulated as a constrained optimization problem, which is then solved with a genetic algorithm. Second, since time series shapelets provide the most discriminative information among categories, the shapelet is used to define a local perturbation interval. Experimental results on UCR datasets with potential security concerns show that the proposed method effectively attacks DNNs and generates adversarial examples. In addition, compared with the benchmark, the method significantly reduces the mean squared error while maintaining a high success rate.
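The approach the abstract describes, a genetic-algorithm search over perturbations restricted to a shapelet interval, guided only by black-box model outputs, could be sketched roughly as below. This is an illustrative assumption, not the authors' implementation: `attack_ga`, `predict_proba`, the fitness weight `lam`, and all parameter values are hypothetical.

```python
import math
import random

def attack_ga(series, predict_proba, true_label, interval, pop_size=20,
              generations=80, eps=0.5, mutation_rate=0.3, lam=0.1):
    """Genetic-algorithm search for a local adversarial perturbation.

    series        : list of floats (the time series to attack)
    predict_proba : black-box function, series -> class-probability list
    interval      : (start, end) indices where perturbation is allowed,
                    e.g. the location of a discriminative shapelet
    Returns a misclassified series, or None if the budget is exhausted.
    """
    start, end = interval
    width = end - start

    def apply_pert(pert):
        # Perturb only inside the interval; the rest stays untouched.
        out = list(series)
        for i, d in enumerate(pert):
            out[start + i] += d
        return out

    def label(x):
        probs = predict_proba(x)
        return probs.index(max(probs))

    def fitness(pert):
        # Reward low confidence in the true class, penalize large
        # perturbations (mean squared error over the interval).
        adv = apply_pert(pert)
        mse = sum(d * d for d in pert) / width
        return (1.0 - predict_proba(adv)[true_label]) - lam * mse

    pop = [[random.uniform(-eps, eps) for _ in range(width)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        best = apply_pert(pop[0])
        if label(best) != true_label:
            return best                       # adversarial example found
        parents = pop[: pop_size // 2]        # elitist selection
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, width)  # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < mutation_rate:
                child[random.randrange(width)] += random.uniform(-eps, eps)
            children.append(child)
        pop = parents + children
    return None
```

With a toy logistic "model" over the series sum standing in for a DNN, the returned series differs from the input only inside the chosen interval, which is what makes the perturbation local and hard to perceive.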
- Published
- 2022