
Adaptive activation functions for predictive modeling with sparse experimental data

Authors:
Pourkamali-Anaraki, Farhad
Nasrin, Tahamina
Jensen, Robert E.
Peterson, Amy M.
Hansen, Christopher J.
Source:
Neural Computing & Applications. Oct 2024, Vol. 36, Issue 29, p18297-18311. 15p.
Publication Year:
2024

Abstract

A pivotal aspect of neural network design is the choice of activation functions, which introduce the nonlinear structure needed to capture intricate input–output patterns. While the effectiveness of adaptive or trainable activation functions has been studied in domains with ample data, such as image classification, significant gaps persist in understanding their influence on classification accuracy and predictive uncertainty when data are limited. This research addresses these gaps by investigating two types of adaptive activation functions, incorporating either shared or individual trainable parameters per hidden layer, on three testbeds derived from additive manufacturing problems with fewer than 100 training instances each. Our investigation reveals that adaptive activation functions such as the Exponential Linear Unit (ELU) and Softplus, equipped with individual trainable parameters, yield accurate and confident prediction models that outperform both fixed-shape activation functions and the less flexible approach of sharing a single trainable activation function across a hidden layer. This work therefore presents an elegant way of facilitating the design of adaptive neural networks for scientific and engineering problems.
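
To make the two variants concrete, the sketch below shows an adaptive ELU layer in PyTorch. This is a minimal illustration, not the authors' published code: the class name, constructor arguments, and exact parameterization are assumptions; the abstract specifies only that each hidden layer's trainable activation parameters are either shared across the layer or individual to each unit.

import torch
import torch.nn as nn

class AdaptiveELU(nn.Module):
    """ELU activation with a trainable alpha.

    shared=True  -> one alpha for the whole hidden layer (less flexible variant)
    shared=False -> one alpha per neuron (individual trainable parameters)
    """

    def __init__(self, num_neurons: int, shared: bool = True, init_alpha: float = 1.0):
        super().__init__()
        size = 1 if shared else num_neurons
        # Registering alpha as a Parameter lets backpropagation update it
        # alongside the network weights.
        self.alpha = nn.Parameter(torch.full((size,), init_alpha))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Standard ELU shape: identity for x > 0, alpha * (exp(x) - 1) otherwise.
        return torch.where(x > 0, x, self.alpha * torch.expm1(x))

# Usage: a small network of the scale suited to fewer than 100 training instances.
model = nn.Sequential(
    nn.Linear(4, 16),
    AdaptiveELU(16, shared=False),  # individual trainable parameters
    nn.Linear(16, 2),
)

One design note: the per-neuron variant adds only as many extra parameters as the layer has units, so the added flexibility comes at negligible cost in model size, which matters when training data are this scarce.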

Details

Language:
English
ISSN:
0941-0643
Volume:
36
Issue:
29
Database:
Academic Search Index
Journal:
Neural Computing & Applications
Publication Type:
Academic Journal
Accession Number:
179738855
Full Text:
https://doi.org/10.1007/s00521-024-10156-8