Elastic exponential linear units for convolutional neural networks
- Source :
- Neurocomputing. 406:253-266
- Publication Year :
- 2020
- Publisher :
- Elsevier BV, 2020.
Abstract
- Activation functions play an important role in determining the depth and non-linearity of deep learning models. Since the Rectified Linear Unit (ReLU) was introduced, many modifications in which noise is intentionally injected have been proposed to avoid overfitting. The Exponential Linear Unit (ELU) and its variants, with trainable parameters, have been proposed to reduce the bias-shift effect often observed in ReLU-type activation functions. In this paper, we propose a novel activation function, called the Elastic Exponential Linear Unit (EELU), which combines the advantages of both types of activation function in a generalized form. EELU has an elastic slope in the positive part and preserves the negative signal with a small non-zero gradient. We also present a new strategy for injecting neuronal noise drawn from a Gaussian distribution into the activation function to improve generalization. By visualizing the latent features of convolutional neural networks, we demonstrate that EELU, through its random noise, can represent a wider variety of features than other activation functions. We evaluated the effectiveness of EELU through extensive image-classification experiments on the CIFAR-10/CIFAR-100, ImageNet, and Tiny ImageNet datasets. Our experimental results show that EELU achieved better generalization performance and higher classification accuracy than conventional activation functions such as ReLU, ELU, ReLU- and ELU-like variants, Scaled ELU, and Swish. EELU also improved image classification with a smaller number of training samples, owing to its noise-injection strategy, which allows significant variation in function outputs, including deactivation.
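- The abstract does not give the exact formula, but the behavior it describes (a randomly perturbed "elastic" slope on the positive side, an ELU-style exponential on the negative side, and Gaussian noise injected only during training) can be sketched roughly as below. The parameter names alpha and sigma, and the choice of a Gaussian centered at 1 for the slope, are assumptions made here for illustration, not the paper's exact parameterization.

  import numpy as np

  def eelu(x, alpha=1.0, sigma=0.1, training=True, rng=None):
      """Rough sketch of an EELU-like activation (illustrative, not the paper's exact definition).

      Positive part: identity scaled by a random 'elastic' slope drawn from a
      Gaussian around 1 at each call during training; fixed to 1 at inference.
      Negative part: ELU-style exponential, preserving a small non-zero gradient.
      """
      rng = np.random.default_rng() if rng is None else rng
      # Noise is injected only during training, matching the abstract's generalization strategy.
      k = rng.normal(loc=1.0, scale=sigma) if training else 1.0
      # np.minimum(x, 0) keeps the exponential branch numerically safe for large positive x.
      return np.where(x >= 0, k * x, alpha * (np.exp(np.minimum(x, 0)) - 1.0))

  # Example: apply to a batch of pre-activations.
  z = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
  print(eelu(z, training=True))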
- Subjects :
- Contextual image classification
Computer science
Noise (signal processing)
Cognitive Neuroscience
Gaussian
Deep learning
Activation function
Rectifier (neural networks)
Overfitting
Convolutional neural network
Computer Science Applications
Neuronal noise
Artificial intelligence
Algorithm
Details
- ISSN :
- 0925-2312
- Volume :
- 406
- Database :
- OpenAIRE
- Journal :
- Neurocomputing
- Accession number :
- edsair.doi...........efc444a85644d00a4f71e7277fe7c8ba