Unequal-Training for Deep Face Recognition With Long-Tailed Noisy Data
- Source :
- CVPR
- Publication Year :
- 2019
- Publisher :
- IEEE, 2019.
Abstract
- Large-scale face datasets usually exhibit a massive number of classes, a long-tailed distribution, and severe label noise, which undoubtedly aggravate the difficulty of training. In this paper, we propose a training strategy that treats the head data and the tail data unequally, accompanied by noise-robust loss functions, to take full advantage of their respective characteristics. Specifically, the unequal-training framework provides two training data streams: the first stream applies the head data to learn a discriminative face representation supervised by the Noise Resistance loss; the second stream applies the tail data to learn auxiliary information by gradually mining stable discriminative information from confusing tail classes. Consequently, the two training streams offer complementary information to deep feature learning. Extensive experiments have demonstrated the effectiveness of the new unequal-training framework and loss functions. Better yet, our method saves a significant amount of GPU memory. With our method, we achieve the best result on MegaFace Challenge 2 (MF2) given a large-scale noisy training dataset.
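The abstract's core idea, splitting classes into head and tail by sample count and feeding them through two separate training streams, can be sketched in plain Python. This is a hypothetical illustration only: the `tail_max` threshold and the simple alternating batch schedule are assumptions for the sketch, not the paper's actual criteria or sampling strategy, and the Noise Resistance loss itself is not reproduced here.

```python
import random
from collections import Counter

def split_head_tail(labels, tail_max=5):
    """Partition class IDs into head and tail classes by per-class
    sample count. `tail_max` is an illustrative threshold, not the
    criterion used in the paper."""
    counts = Counter(labels)
    head = {c for c, n in counts.items() if n > tail_max}
    tail = {c for c, n in counts.items() if n <= tail_max}
    return head, tail

def two_stream_batches(labels, head, tail, batch_size=4, steps=6, seed=0):
    """Yield sample-index batches, alternating between the head stream
    (even steps) and the tail stream (odd steps). A real implementation
    would attach a different loss to each stream."""
    rng = random.Random(seed)
    head_idx = [i for i, y in enumerate(labels) if y in head]
    tail_idx = [i for i, y in enumerate(labels) if y in tail]
    for step in range(steps):
        pool = head_idx if step % 2 == 0 else tail_idx
        yield rng.sample(pool, min(batch_size, len(pool)))
```

In a full pipeline, each yielded head batch would drive the discriminative-representation loss and each tail batch the auxiliary mining step described in the abstract.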
- Subjects :
- Training set, Computer science, Deep learning, Machine learning, Facial recognition system, Discriminative model, Face (geometry), Artificial intelligence, Noise (video), Representation (mathematics), Set (psychology), Feature learning
Details
- Database :
- OpenAIRE
- Journal :
- 2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)
- Accession number :
- edsair.doi...........2dbe132094856ebe18790cae725055a0