Transfer Learning with EfficientNetV2S for Automatic Face Shape Classification.

Authors :
Grd, Petra
Tomičić, Igor
Barčić, Ena
Source :
Journal of Universal Computer Science (JUCS); 2024, Vol. 30 Issue 2, p153-178, 26p
Publication Year :
2024

Abstract

The classification of human face shapes, a pivotal aspect of one's appearance, plays a crucial role in diverse fields like beauty, cosmetics, healthcare, and security. In this paper, we present a multi-step methodology for face shape classification, harnessing the potential of transfer learning and a pretrained EfficientNetV2S neural network. Our approach comprises key phases, including preprocessing, augmentation, training, and testing, ensuring a comprehensive and reliable solution. The preprocessing step involves precise face detection, cropping, and image scaling, laying a solid foundation for accurate feature extraction. Our methodology utilizes a publicly available dataset of female celebrities, comprising five face shape classes: heart, oblong, oval, round, and square. By augmenting this dataset during training, we magnify its diversity, enabling better generalization and enhancing the model's robustness. With the EfficientNetV2S neural network, we employ transfer learning, leveraging pretrained weights to optimize accuracy, training speed, and parameter size. The result is a highly efficient and effective model, which outperforms state-of-the-art approaches on the same dataset, boasting an outstanding overall accuracy of 96.32%. Our findings demonstrate the efficiency of our approach, proving its potential in the field of face shape classification. The success of our methodology holds promise for various applications, offering valuable insights into beauty analysis, cosmetic recommendations, and personalized healthcare.
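The transfer-learning setup described in the abstract — a pretrained EfficientNetV2S backbone with a new five-class head for heart, oblong, oval, round, and square faces — can be sketched as below. This is a minimal illustrative sketch, not the authors' exact configuration: the input size, dropout rate, and head architecture are assumptions, and `weights=None` is used here only so the sketch builds without downloading the ImageNet weights (in practice one would pass `weights="imagenet"` to get the pretrained backbone the paper relies on).

```python
# Hedged sketch of transfer learning with EfficientNetV2S for 5-way
# face shape classification. Hyperparameters are assumptions.
import tensorflow as tf

NUM_CLASSES = 5  # heart, oblong, oval, round, square


def build_model(input_shape=(224, 224, 3), weights=None):
    # weights="imagenet" would load the pretrained backbone; None keeps
    # this sketch self-contained (no weight download).
    base = tf.keras.applications.EfficientNetV2S(
        include_top=False,       # drop the ImageNet classifier head
        weights=weights,
        input_shape=input_shape,
        pooling="avg",           # global average pooling of features
    )
    base.trainable = False       # freeze pretrained features initially

    inputs = tf.keras.Input(shape=input_shape)
    x = base(inputs, training=False)
    x = tf.keras.layers.Dropout(0.2)(x)  # assumed regularization choice
    outputs = tf.keras.layers.Dense(NUM_CLASSES, activation="softmax")(x)
    return tf.keras.Model(inputs, outputs)
```

A typical workflow would then compile the model with a categorical cross-entropy loss, train the new head on the augmented face-shape dataset, and optionally unfreeze part of the backbone for fine-tuning at a lower learning rate.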

Details

Language :
English
ISSN :
0948-695X
Volume :
30
Issue :
2
Database :
Complementary Index
Journal :
Journal of Universal Computer Science (JUCS)
Publication Type :
Academic Journal
Accession number :
176027639
Full Text :
https://doi.org/10.3897/jucs.104490