
Weighted sigmoid gate unit for an activation function of deep neural network

Authors :
Masayuki Tanaka
Source :
Pattern Recognition Letters. 135:354-359
Publication Year :
2020
Publisher :
Elsevier BV

Abstract

An activation function plays a crucial role in a deep neural network, and the simple rectified linear unit (ReLU) is widely used for this purpose. In this paper, a weighted sigmoid gate unit (WiG) is proposed as the activation function. The proposed WiG consists of a multiplication of the inputs and a weighted sigmoid gate. It is shown that the WiG includes the ReLU and some existing activation functions as special cases. Many activation functions have been proposed to surpass the performance of the ReLU, and in the literature their performance is mainly evaluated with an object recognition task. The proposed WiG is evaluated with both an object recognition task and an image restoration task, and the experimental comparisons demonstrate that the proposed WiG outperforms the existing activation functions, including the ReLU.
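The gating structure described in the abstract lends itself to a compact sketch. Below is a minimal NumPy illustration of an activation of this form, where the input is multiplied elementwise by a sigmoid gate computed from a weighted transform of the same input; the vector shapes, the initialization, and the ReLU remark in the final comment are illustrative assumptions, not the paper's exact configuration.

```python
# Minimal sketch of a weighted sigmoid gate unit (WiG), assuming the gate is
# a linear transform of the input followed by a sigmoid:
#   wig(x) = x * sigmoid(W @ x + b)
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def wig(x, W, b):
    """Input multiplied elementwise by a weighted sigmoid gate."""
    return x * sigmoid(W @ x + b)

rng = np.random.default_rng(0)
d = 4                                   # feature dimension (assumed)
x = rng.standard_normal(d)              # example input vector
W = 0.1 * rng.standard_normal((d, d))   # gate weights (assumed init)
b = np.zeros(d)                         # gate bias

print(wig(x, W, b))

# Special-case check: with W = alpha * I (alpha large) and b = 0, the gate
# sigmoid(alpha * x) approaches a per-coordinate step function, so the output
# approaches max(x, 0). This is consistent with the abstract's claim that the
# WiG includes the ReLU as a special case.
```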

Details

ISSN :
0167-8655
Volume :
135
Database :
OpenAIRE
Journal :
Pattern Recognition Letters
Accession number :
edsair.doi.dedup.....6262c3d082a6c3b93929a8fa664aefca