A Fine-Grained Bird Classification Method Based on Attention and Decoupled Knowledge Distillation.

Authors :
Wang, Kang
Yang, Feng
Chen, Zhibo
Chen, Yixin
Zhang, Ying
Source :
Animals (2076-2615). Jan 2023, Vol. 13 Issue 2, p264. 17p.
Publication Year :
2023

Abstract

Simple Summary: Identifying bird species is crucial in various bird monitoring tasks. In this study, we implemented a bird species recognition method based on visual features extracted by convolutional neural networks. This paper proposes an attention-guided data enhancement method and a model compression method based on decoupled knowledge distillation. Using these two methods, we created a novel and efficient lightweight fine-grained bird classification model. The model not only achieves high accuracy in bird classification but is also well suited to edge devices such as mobile phones. Our work may be useful for bird species recognition and bird biodiversity monitoring.

Classifying birds accurately is essential for ecological monitoring. In recent years, bird image classification has become an emerging method for bird recognition. However, the task must contend with high intra-class variance and low inter-class variance among birds, as well as low model efficiency. In this paper, we propose a fine-grained bird classification method based on attention and decoupled knowledge distillation. First, we propose an attention-guided data augmentation method: attention maps are used to extract images of the object's key part regions, enabling the model to learn and distinguish fine-grained features. At the same time, following the localization–recognition approach, the bird category is predicted from the object image with finer features, which reduces the influence of background noise. In addition, we propose a model compression method based on decoupled knowledge distillation: the target-class and non-target-class knowledge are distilled separately, eliminating the influence of the target-class prediction on the transfer of non-target-class knowledge. This yields efficient model compression: with 67% fewer parameters and only 1.2 GFLOPs of computation, the proposed model still achieves 87.6% accuracy while improving inference speed. [ABSTRACT FROM AUTHOR]
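To make the two methods concrete, the sketches below give minimal PyTorch interpretations of what the abstract describes. They are not the authors' released code; all function names, tensor shapes, and hyperparameters (the attention threshold, the loss weights alpha and beta, and the temperature T) are assumptions for illustration.

The first sketch shows one plausible form of attention-guided augmentation: upsample a backbone attention map to image resolution, threshold it, take the bounding box of the attended region, and resize that crop back to the input size so the network sees the key parts at finer detail.

    import torch
    import torch.nn.functional as F

    def attention_crop(image, attn_map, threshold=0.5):
        # image:    (C, H, W) input tensor
        # attn_map: (h, w) attention map from the backbone, at any scale
        C, H, W = image.shape

        # Upsample the attention map to image resolution and normalize to [0, 1].
        attn = F.interpolate(attn_map[None, None].float(), size=(H, W),
                             mode='bilinear', align_corners=False)[0, 0]
        attn = (attn - attn.min()) / (attn.max() - attn.min() + 1e-8)

        # Bounding box of the region where attention exceeds the threshold.
        ys, xs = torch.nonzero(attn >= threshold, as_tuple=True)
        if ys.numel() == 0:
            return image  # nothing attended: fall back to the full image
        y0, y1 = ys.min().item(), ys.max().item() + 1
        x0, x1 = xs.min().item(), xs.max().item() + 1

        # Crop the key-part region and resize it back to the input size.
        crop = image[:, y0:y1, x0:x1]
        return F.interpolate(crop[None], size=(H, W),
                             mode='bilinear', align_corners=False)[0]

In training, such a crop can be fed through the network as an additional view of the sample, or serve as the input of the recognition stage in a localization–recognition pipeline, as the abstract describes.

The second sketch follows the general decoupled knowledge distillation formulation, which splits the classical distillation loss into a target-class term (TCKD) and a non-target-class term (NCKD) that are weighted independently; this matches the abstract's description of distilling target and non-target class knowledge separately.

    import torch
    import torch.nn.functional as F

    def dkd_loss(student_logits, teacher_logits, target,
                 alpha=1.0, beta=8.0, T=4.0):
        # Class probabilities at temperature T.
        p_s = F.softmax(student_logits / T, dim=1)
        p_t = F.softmax(teacher_logits / T, dim=1)
        gt_mask = F.one_hot(target, student_logits.size(1)).bool()

        # Probability assigned to the true class, and to "everything else".
        pt_s = p_s[gt_mask].clamp_min(1e-8)
        pt_t = p_t[gt_mask].clamp_min(1e-8)
        pnt_s = (1.0 - pt_s).clamp_min(1e-8)
        pnt_t = (1.0 - pt_t).clamp_min(1e-8)

        # TCKD: KL divergence between the binary target / non-target distributions.
        tckd = (pt_t * (pt_t / pt_s).log()
                + pnt_t * (pnt_t / pnt_s).log()).mean()

        # NCKD: KL divergence among non-target classes only; the true class is
        # suppressed with a large negative logit before renormalizing.
        s_nt = student_logits.masked_fill(gt_mask, -1e9)
        t_nt = teacher_logits.masked_fill(gt_mask, -1e9)
        nckd = F.kl_div(F.log_softmax(s_nt / T, dim=1),
                        F.softmax(t_nt / T, dim=1),
                        reduction='batchmean')

        # alpha and beta weight the two terms independently; T**2 rescales gradients.
        return (alpha * tckd + beta * nckd) * (T ** 2)

Weighting NCKD on its own (rather than letting the teacher's target-class confidence scale it, as in classical distillation) is what removes the target class's influence on the transfer of non-target knowledge.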

Details

Language :
English
ISSN :
20762615
Volume :
13
Issue :
2
Database :
Academic Search Index
Journal :
Animals (2076-2615)
Publication Type :
Academic Journal
Accession number :
161422715
Full Text :
https://doi.org/10.3390/ani13020264