
A novel hardware authentication primitive against modeling attacks.

Authors :
Liu, Wenrui
Cheng, Jiafeng
Sun, Nengyuan
Peng, Zhaokang
Sun, Caiban
Wang, Chunyang
Wen, Yiming
Zhang, Hongliu
Zhang, Pengcheng
Yu, Weize
Source :
International Journal of Circuit Theory & Applications. Jun2023, Vol. 51 Issue 6, p2993-3001. 9p.
Publication Year :
2023

Abstract

Summary: Traditional hardware security primitives such as physical unclonable functions (PUFs) are vulnerable to machine learning (ML) attacks. The primary reason is that PUFs rely on process mismatches between two identically designed circuit blocks to generate deterministic math functions as the secret information sources, and ML algorithms are highly efficient at modeling deterministic math functions. To resist ML attacks, in this letter, a novel hardware security primitive named neural network (NN) chain is proposed, which utilizes noise data to generate chaotic NNs for achieving authentication. In an NN chain, two independent batches of noise data are utilized as the input and output training data of the NNs, respectively, to maximize the uncertainty within the chain. In contrast to a regular PUF, the proposed NN chain is capable of achieving over 20 times the ML attack resistance and 100% reliability with less than 39% power and area overhead. [ABSTRACT FROM AUTHOR]
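The core idea in the abstract — training small networks so that two independent batches of noise serve as the input and output data, then chaining them for challenge-response authentication — can be illustrated with a toy sketch. This is not the authors' implementation; the layer sizes, training loop, and the use of a seeded PRNG in place of on-chip noise are all assumptions made for illustration.

```python
import math
import random

def tanh_layer(W, x):
    """One tanh layer: y = tanh(W x)."""
    return [math.tanh(sum(w * xi for w, xi in zip(row, x))) for row in W]

def train_layer(xs, ys, n_in, n_out, rng, lr=0.5, epochs=200):
    """Fit a single tanh layer to (noise, noise) pairs by plain SGD.
    Because both xs and ys are independent noise, the learned weights
    encode device-unique randomness rather than a clean math function."""
    W = [[rng.uniform(-0.5, 0.5) for _ in range(n_in)] for _ in range(n_out)]
    for _ in range(epochs):
        for x, t in zip(xs, ys):
            y = tanh_layer(W, x)
            for j in range(n_out):
                # gradient of MSE w.r.t. the pre-activation of unit j
                g = (y[j] - t[j]) * (1 - y[j] ** 2)
                for i in range(n_in):
                    W[j][i] -= lr * g * x[i]
    return W

def build_nn_chain(seed, n_links=3, dim=4, n_samples=16):
    """Build a chain of small NNs; link k is trained to map noise
    batch k to noise batch k+1 (seeded PRNG stands in for analog noise)."""
    rng = random.Random(seed)
    batches = [[[rng.uniform(-0.8, 0.8) for _ in range(dim)]
                for _ in range(n_samples)] for _ in range(n_links + 1)]
    return [train_layer(batches[k], batches[k + 1], dim, dim, rng)
            for k in range(n_links)]

def respond(chain, challenge):
    """Authentication response: pass the challenge through every link."""
    x = challenge
    for W in chain:
        x = tanh_layer(W, x)
    return x
```

In this sketch, two devices built from different noise (different seeds) give different responses to the same challenge, while the same device reproduces its own response exactly — the reproducibility standing in for the 100% reliability claim, and the noise-derived weights for the resistance to modeling.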

Details

Language :
English
ISSN :
00989886
Volume :
51
Issue :
6
Database :
Academic Search Index
Journal :
International Journal of Circuit Theory & Applications
Publication Type :
Academic Journal
Accession number :
164136597
Full Text :
https://doi.org/10.1002/cta.3566