A comparative analysis of activation functions in neural networks: unveiling categories.
- Source :
- Bulletin of Electrical Engineering & Informatics; Oct 2024, Vol. 13, Issue 5, p3301-3308, 8p
- Publication Year :
- 2024
Abstract
- Activation functions (AFs) play a critical role in artificial neural networks, enabling the modeling of complex, non-linear relationships in data. In this comparative review, we survey the AFs most commonly used in deep learning and artificial neural networks, aiming to provide insight into the strengths and weaknesses of each AF and guidance on selecting an appropriate AF for different types of problems. We evaluate the most widely used AFs, including sigmoid, tanh, rectified linear units (ReLUs) and their variants, the exponential linear unit (ELU), and softmax. For each activation category, we discuss its properties, its mathematical formulation (MF), and its benefits and drawbacks in terms of its ability to model complex, non-linear relationships in data. In conclusion, this comparative study provides a comprehensive overview of the properties and performance of different AFs and serves as a valuable resource for researchers and practitioners in deep learning and artificial neural networks.
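- The record does not reproduce the paper's mathematical formulations. As a quick reference, the sketch below gives the standard textbook definitions of the AFs named in the abstract, in Python with NumPy; the leaky-ReLU variant and the default slope/scale parameters are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def sigmoid(x):
    # sigma(x) = 1 / (1 + exp(-x)); squashes inputs into (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    # tanh(x) = (e^x - e^-x) / (e^x + e^-x); squashes inputs into (-1, 1)
    return np.tanh(x)

def relu(x):
    # ReLU(x) = max(0, x); zero for negative inputs, identity otherwise
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    # One common ReLU variant: small slope alpha for negative inputs
    # (alpha = 0.01 is a conventional default, not a value from the paper)
    return np.where(x > 0, x, alpha * x)

def elu(x, alpha=1.0):
    # ELU(x) = x for x > 0, alpha * (exp(x) - 1) otherwise
    return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

def softmax(x):
    # softmax(x)_i = exp(x_i) / sum_j exp(x_j); subtracting max(x)
    # leaves the result unchanged but avoids overflow in exp
    z = x - np.max(x)
    e = np.exp(z)
    return e / e.sum()
```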
- Subjects :
- Artificial neural networks
Deep learning
Research personnel
Comparative studies
Details
- Language :
- English
- ISSN :
- 2089-3191
- Volume :
- 13
- Issue :
- 5
- Database :
- Complementary Index
- Journal :
- Bulletin of Electrical Engineering & Informatics
- Publication Type :
- Academic Journal
- Accession number :
- 180146327
- Full Text :
- https://doi.org/10.11591/eei.v13i5.7274