KDCTime: Knowledge distillation with calibration on InceptionTime for time-series classification.
- Source :
- Information Sciences. Oct 2022, Vol. 613, p184-203. 20p.
- Publication Year :
- 2022
Abstract
- Time-series classification approaches based on deep neural networks easily overfit the UCR datasets because those datasets offer few training samples (the few-shot problem). To alleviate this overfitting and further improve accuracy, we first propose label smoothing for InceptionTime (LSTime), which uses soft label information rather than hard labels alone. Next, instead of manually adjusting the soft labels as in LSTime, knowledge distillation for InceptionTime (KDTime) is proposed, in which a teacher model generates the soft labels automatically while the inference model is compressed. Finally, to rectify soft labels that the teacher model predicts incorrectly, knowledge distillation with calibration for InceptionTime (KDCTime) is proposed, which contains two optional calibration strategies, i.e., KDC by translating (KDCT) and KDC by reordering (KDCR). The experimental results show that the KDCTime accuracy is promising, while its inference time is orders of magnitude shorter than that of state-of-the-art approaches. [ABSTRACT FROM AUTHOR]
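- The label smoothing and teacher-generated soft labels mentioned in the abstract follow the standard formulations (uniform-mixture smoothing and Hinton-style temperature distillation); the sketch below illustrates those generic loss terms in NumPy, not the paper's actual KDCTime implementation, and all function names are illustrative:

```python
import numpy as np

def smooth_labels(hard, num_classes, eps=0.1):
    """Label smoothing: mix the one-hot target with a uniform distribution."""
    onehot = np.eye(num_classes)[hard]
    return (1.0 - eps) * onehot + eps / num_classes

def softmax(z, T=1.0):
    """Temperature-scaled softmax; T > 1 softens the distribution."""
    z = np.asarray(z, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kd_loss(student_logits, teacher_logits, hard, T=4.0, alpha=0.5):
    """Generic distillation loss: cross-entropy on hard labels plus
    KL divergence to the teacher's temperature-softened soft labels."""
    p_s = softmax(student_logits)
    ce = -np.log(p_s[np.arange(len(hard)), hard]).mean()
    p_t = softmax(teacher_logits, T)
    p_s_T = softmax(student_logits, T)
    kl = (p_t * (np.log(p_t) - np.log(p_s_T))).sum(axis=-1).mean()
    # T**2 rescales the gradient magnitude of the soft term (Hinton et al.)
    return (1.0 - alpha) * ce + alpha * (T ** 2) * kl
```

- A calibration step in the spirit of KDCT/KDCR would then adjust `p_t` whenever the teacher's argmax disagrees with the hard label, before it is used in the KL term.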
- Subjects :
- *ARTIFICIAL neural networks
*MACHINE learning
*CALIBRATION
*INFERENCE (Logic)
Details
- Language :
- English
- ISSN :
- 0020-0255
- Volume :
- 613
- Database :
- Academic Search Index
- Journal :
- Information Sciences
- Publication Type :
- Periodical
- Accession number :
- 159928179
- Full Text :
- https://doi.org/10.1016/j.ins.2022.08.057