PFLU and FPFLU: Two novel non-monotonic activation functions in convolutional neural networks
- Source :
- Neurocomputing. 429:110-117
- Publication Year :
- 2021
- Publisher :
- Elsevier BV, 2021.
-
Abstract
- The choice of activation function in Convolutional Neural Networks (CNNs) is very important. The Rectified Linear Unit (ReLU) has been widely used in most CNNs. Recently, a series of non-monotonic activation functions has gradually become the new standard for enhancing CNN performance. Inspired by them, this paper first proposes a novel non-monotonic activation function called the Power Function Linear Unit (PFLU). The negative part of PFLU is non-monotonic and approaches zero as the negative input decreases, which maintains sparsity of the negative part while introducing negative activation values and non-zero derivative values. The positive part of PFLU does not use the identity mapping but approaches it as the positive input increases, which brings non-linearity to the positive part. This paper then proposes a faster variant, FPFLU. A wide range of classification experiments shows that PFLU tends to work better than current state-of-the-art non-monotonic activation functions, and that FPFLU runs faster than most non-monotonic activation functions.
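- The abstract does not state the PFLU formula, but the shape it describes (a non-monotonic negative part that decays toward zero, and a positive part that converges to the identity) can be illustrated with a minimal NumPy sketch. The formula below, `f(x) = x * (1 + x / sqrt(1 + x^2)) / 2`, is an assumption chosen to match those stated properties, not necessarily the paper's exact definition:

```python
import numpy as np

def pflu(x):
    """Illustrative PFLU-like activation (assumed form, matching the
    properties described in the abstract, not confirmed from the paper).

    As x -> -inf the factor (1 + x/sqrt(1+x^2))/2 -> 0, so the output
    decays toward zero (sparse, non-monotonic negative part); as
    x -> +inf the factor -> 1, so the output approaches identity.
    """
    x = np.asarray(x, dtype=float)
    return x * (1.0 + x / np.sqrt(1.0 + x * x)) / 2.0

# Negative part dips below zero, then returns toward it (non-monotonic):
print(pflu([-100.0, -2.0, -1.0, 0.0, 10.0]))
```

A quick check of the described behavior: the output at x = -100 is nearly zero, the value at x = -1 is more negative than at x = -2 (the non-monotonic dip), and the value at x = 10 is already close to 10 (near-identity positive part).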
- Subjects :
- 0209 industrial biotechnology
Current (mathematics)
Series (mathematics)
Property (programming)
Computer science
Cognitive Neuroscience
Activation function
Monotonic function
02 engineering and technology
Rectifier (neural networks)
Topology
Convolutional neural network
Computer Science Applications
Range (mathematics)
020901 industrial engineering & automation
Artificial Intelligence
0202 electrical engineering, electronic engineering, information engineering
020201 artificial intelligence & image processing
Power function
Details
- ISSN :
- 0925-2312
- Volume :
- 429
- Database :
- OpenAIRE
- Journal :
- Neurocomputing
- Accession number :
- edsair.doi...........6ea4be32638ffe307d88340389ae3aa7