Weighted sigmoid gate unit for an activation function of deep neural network
- Source :
- Pattern Recognition Letters. 135:354-359
- Publication Year :
- 2020
- Publisher :
- Elsevier BV, 2020.
Abstract
- The activation function plays a crucial role in a deep neural network, and the simple rectified linear unit (ReLU) is widely used for this purpose. In this paper, a weighted sigmoid gate unit (WiG) is proposed as an activation function. The proposed WiG consists of the multiplication of the input and a weighted sigmoid gate. It is shown that the WiG includes the ReLU and several other activation functions as special cases. Many activation functions have been proposed to surpass the performance of the ReLU, but in the literature their performance is mainly evaluated on object recognition tasks. The proposed WiG is evaluated on both an object recognition task and an image restoration task, and the experimental comparisons demonstrate that it outperforms existing activation functions, including the ReLU.
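- As described, the unit gates the input by a learnable weighted sigmoid, roughly y = x * sigmoid(Wx + b); when W reduces to a large positive scaling of the identity with zero bias, sigmoid(kx) approaches a step function and the unit approaches the ReLU, which is the sense in which WiG contains the ReLU as a special case. The record does not give the exact parameterization of W, so the following PyTorch sketch is an assumption: it uses a depth-preserving 2-D convolution as the weighted gate, matching the convolutional setting of the evaluated tasks.

```python
import torch
import torch.nn as nn

class WiG(nn.Module):
    """Weighted sigmoid gate unit (sketch): y = x * sigmoid(conv(x)).

    Assumption: the gate weights are realized as a channel-preserving
    2-D convolution; the paper may use a different parameterization.
    """

    def __init__(self, channels: int, kernel_size: int = 3):
        super().__init__()
        # Convolution producing the gate logits, same shape as the input.
        self.gate = nn.Conv2d(channels, channels, kernel_size,
                              padding=kernel_size // 2)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Elementwise product of the input and its weighted sigmoid gate.
        return x * torch.sigmoid(self.gate(x))

# Usage: drop-in replacement for a ReLU layer in a CNN.
x = torch.randn(8, 64, 32, 32)   # (batch, channels, height, width)
act = WiG(channels=64)
y = act(x)                        # same shape as x
```

- Note that if the gate convolution is replaced by a fixed scalar weight, the unit reduces to the Swish/SiLU form x * sigmoid(kx), so the learnable gate generalizes that family as well.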
- Subjects :
- FOS: Computer and information sciences
Artificial neural network
Computer science
Computer Vision and Pattern Recognition (cs.CV)
Activation function
Computer Science - Computer Vision and Pattern Recognition
Cognitive neuroscience of visual object recognition
Rectifier (neural networks)
Sigmoid function
Artificial Intelligence
Signal Processing
Multiplication
Unit (ring theory)
Algorithm
Software
Image restoration
Details
- ISSN :
- 0167-8655
- Volume :
- 135
- Database :
- OpenAIRE
- Journal :
- Pattern Recognition Letters
- Accession number :
- edsair.doi.dedup.....6262c3d082a6c3b93929a8fa664aefca