An empirical evaluation of meta residual network for classifying sensor drift samples.
- Authors
- Zhu, Zhengyang; Ling, Haikui; Zhang, Yiyi; Liu, Jiefeng; Shuang, Feng; Xu, Min; Jia, Pengfei
- Subjects
- SENSOR networks; PATTERN recognition systems; ELECTRONIC noses; SUPERVISED learning; GAS detectors
- Abstract
With advances in materials and pattern recognition algorithms, the electronic nose (E-nose) has continued to develop and to find increasingly widespread applications. However, E-noses are prone to sensor drift in practical use. Currently, the primary methods for classifying drifted samples are semi-supervised and unsupervised learning. Supervised learning has been difficult to apply in the past because it requires a large number of up-to-date samples to retrain the models; this challenge can now be overcome with meta-learning. We applied the model-agnostic meta-learning (MAML) architecture and designed a residual classifier with learnable per-parameter learning rates. Leveraging the ability of 'learning to learn', our method does not require a large amount of target-domain data, and unlike other algorithms it needs no extensive recomputation when transferring to a new target domain, making it highly deployable. Through multiple experiments, we demonstrate the feasibility of our approach and achieve very high classification accuracy on a publicly available long-term drift dataset compared with other methods. • The MAML architecture from meta-learning is used to address gas sensor drift. • A learning rate and update direction are learned for each parameter; in the outer-loop phase, different learning rates are likewise applied to different parameters. • Compared with other algorithms, our model requires only two samples per class for knowledge transfer, without extensive repeated training. [ABSTRACT FROM AUTHOR]
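The core idea the abstract describes, a MAML-style inner-loop adaptation where each parameter has its own learnable learning rate, can be sketched in a few lines. The toy linear model, data, and all names below are illustrative assumptions, not the authors' code or dataset:

```python
import numpy as np

def loss_and_grad(theta, X, y):
    """Mean squared error of a linear model and its gradient w.r.t. theta."""
    err = X @ theta - y
    return np.mean(err ** 2), 2 * X.T @ err / len(y)

def inner_adapt(theta, alpha, X_sup, y_sup, steps=5):
    """Inner-loop adaptation: alpha has the same shape as theta, so each
    parameter is updated with its own (meta-learned) step size."""
    adapted = theta.copy()
    for _ in range(steps):
        _, g = loss_and_grad(adapted, X_sup, y_sup)
        adapted = adapted - alpha * g  # elementwise: one rate per parameter
    return adapted

rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
X_sup = rng.normal(size=(8, 2))        # tiny support set (few-shot regime)
y_sup = X_sup @ true_w

theta = np.zeros(2)                    # meta-learned initialisation
alpha = np.full(2, 0.1)                # learnable per-parameter rates
before, _ = loss_and_grad(theta, X_sup, y_sup)
after, _ = loss_and_grad(inner_adapt(theta, alpha, X_sup, y_sup), X_sup, y_sup)
print(after < before)                  # adaptation reduces the support loss
```

In full MAML, an outer loop would then differentiate the post-adaptation (query-set) loss with respect to both `theta` and `alpha`, which is how the per-parameter rates themselves get learned; that step is omitted here for brevity.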
- Published
- 2024