
A COMPARATIVE EXPLORATION OF ACTIVATION FUNCTIONS FOR IMAGE CLASSIFICATION IN CONVOLUTIONAL NEURAL NETWORKS.

Authors :
MAKHDOOM, FAIZA
RAHMAN, JAMSHAID UL
Source :
i-Manager's Journal on Artificial Intelligence & Machine Learning (JAIM); Jun 2024, Vol. 2 Issue 1, p9-17, 9p
Publication Year :
2024

Abstract

Activation functions play a crucial role in enabling neural networks to carry out tasks with increased flexibility by introducing non-linearity. The selection of appropriate activation functions becomes even more crucial in the context of deeper networks, where the objective is to learn more intricate patterns. Among various deep learning tools, Convolutional Neural Networks (CNNs) stand out for their exceptional ability to learn complex visual patterns. In practice, ReLU is commonly employed in the convolutional layers of CNNs, yet other activation functions such as Swish can demonstrate superior training performance while maintaining good testing accuracy on different datasets. This paper presents an optimally refined strategy for deep learning-based image classification tasks by incorporating CNNs with advanced activation functions and an adjustable setting of layers. A thorough analysis has been conducted to support the effectiveness of various activation functions when coupled with the softmax loss, rendering them suitable for ensuring a stable training process. The results obtained on the CIFAR-10 dataset demonstrate the favorability and stability of the adopted strategy throughout the training process. [ABSTRACT FROM AUTHOR]
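For readers unfamiliar with the functions the abstract contrasts, the following is a minimal, framework-free sketch of ReLU, Swish, and the softmax used in the "softmax loss". It is illustrative only and not taken from the paper; in particular, the `beta` parameter of Swish and the helper names are assumptions for this sketch.

```python
import math

def relu(x):
    """ReLU: max(0, x) -- the activation commonly used in CNN convolutional layers."""
    return max(0.0, x)

def swish(x, beta=1.0):
    """Swish: x * sigmoid(beta * x) -- a smooth, non-monotonic alternative to ReLU.
    beta=1.0 is the common default; the paper's exact setting is not stated here."""
    return x / (1.0 + math.exp(-beta * x))

def softmax(logits):
    """Softmax: turns raw class scores into probabilities; paired with
    cross-entropy it forms what the literature often calls the 'softmax loss'."""
    m = max(logits)                          # subtract max for numerical stability
    exps = [math.exp(z - m) for z in logits]
    s = sum(exps)
    return [e / s for e in exps]

# Unlike ReLU, Swish passes a small negative signal through for x < 0,
# which is one intuition for its smoother training behavior in deeper networks.
for x in (-2.0, -0.5, 0.5, 2.0):
    print(f"x={x:+.1f}  relu={relu(x):+.4f}  swish={swish(x):+.4f}")

print(softmax([2.0, 1.0, 0.1]))  # class probabilities summing to 1
```

Note that for x > 0 Swish approaches ReLU (swish(2.0) ≈ 1.76 vs relu(2.0) = 2.0), while for x < 0 it is small but nonzero rather than exactly zero.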

Details

Language :
English
ISSN :
25839128
Volume :
2
Issue :
1
Database :
Complementary Index
Journal :
i-Manager's Journal on Artificial Intelligence & Machine Learning (JAIM)
Publication Type :
Academic Journal
Accession number :
177940091
Full Text :
https://doi.org/10.26634/jaim.2.1.20225