Comparative Evaluation of Various Activation Functions in the Recurrent Neurons of the LRPNN
- Source: Proceedings of the Twenty-First International Conference on Geometry, Integrability and Quantization, Ivaïlo M. Mladenov, Vladimir Pulov and Akira Yoshioka, eds. (Sofia: Avangard Prima, 2020)
- Publication Year: 2020
- Publisher: Avangard Prima, Sofia
Abstract
- In the present work we investigate the behavior of the Locally Recurrent Probabilistic Neural Network (LRPNN) with different activation functions in the recurrent-layer neurons. Specifically, we evaluate the performance of the modified activation function proposed here, which belongs to the family of Rectified Linear Units (ReLU), and compare it with other ReLU-based functions, the traditional sigmoid activation function, and the Swish and E-Swish activation functions. Furthermore, we investigate the efficiency of a training procedure that simultaneously adjusts the spread factor sigma and the weights in the recurrent layer of the LRPNN. This joint training helps in coping with practical tasks, such as the recognition of Parkinson's condition from speech signals, where only a limited amount of training data is available.
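
For reference, the standard activation functions named in the abstract can be written as short NumPy definitions. This is an illustrative sketch only: the modified ReLU-family activation proposed in the paper is not specified in this abstract, and the beta values shown are conventional defaults from the Swish and E-Swish literature, not the paper's settings.

```python
import numpy as np

def sigmoid(x):
    # Traditional logistic sigmoid: 1 / (1 + exp(-x)).
    return 1.0 / (1.0 + np.exp(-x))

def relu(x):
    # Rectified Linear Unit: max(0, x).
    return np.maximum(0.0, x)

def swish(x, beta=1.0):
    # Swish (Ramachandran et al.): x * sigmoid(beta * x).
    return x * sigmoid(beta * x)

def e_swish(x, beta=1.25):
    # E-Swish (Alcaide): beta * x * sigmoid(x), with beta > 1 scaling
    # the curve. beta = 1.25 is an illustrative choice, not the paper's.
    return beta * x * sigmoid(x)
```

Note that Swish with beta = 1 reduces to x * sigmoid(x) (the SiLU), while E-Swish rescales that same curve by a constant factor beta > 1.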
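
The abstract does not detail how sigma and the recurrent weights are adjusted simultaneously, so the following is only a hypothetical sketch under stated assumptions: a plain PNN pattern layer with Parzen-style Gaussian kernels, a toy single-weight recurrent smoothing standing in for the LRPNN's recurrent layer, and a joint grid search over (sigma, w) scored on held-out data. All function names and the smoothing rule are illustrative assumptions, not the paper's procedure.

```python
import numpy as np

def pnn_posteriors(x, train_X, train_y, sigma, n_classes):
    # Pattern layer of a PNN: a Gaussian kernel around every training
    # vector, averaged per class (Parzen-style density estimate).
    k = np.exp(-np.sum((train_X - x) ** 2, axis=1) / (2.0 * sigma ** 2))
    scores = np.array([k[train_y == c].mean() for c in range(n_classes)])
    return scores / (scores.sum() + 1e-12)

def recurrent_smooth(frame_posteriors, w):
    # Toy stand-in for the recurrent layer: each frame's posterior is
    # mixed with the previous output using a single feedback weight w.
    # Frames are assumed to be temporally ordered (e.g., speech frames).
    out, prev = [], np.zeros_like(frame_posteriors[0])
    for p in frame_posteriors:
        prev = (1.0 - w) * p + w * prev
        out.append(prev)
    return np.array(out)

def joint_grid_search(train, dev, sigmas, weights, n_classes):
    # Jointly select (sigma, w) by classification accuracy on a
    # held-out development set.
    (tr_X, tr_y), (dv_X, dv_y) = train, dev
    best = (None, None, -1.0)
    for sigma in sigmas:
        post = np.array([pnn_posteriors(x, tr_X, tr_y, sigma, n_classes)
                         for x in dv_X])
        for w in weights:
            pred = recurrent_smooth(post, w).argmax(axis=1)
            acc = float((pred == dv_y).mean())
            if acc > best[2]:
                best = (sigma, w, acc)
    return best  # (best sigma, best feedback weight, dev accuracy)
```

A coarse joint search of this kind is one plausible way to couple the spread factor and the recurrent weighting on small datasets, since it avoids gradient estimates that become unreliable with little training data.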
Details
- Language: English
- Database: OpenAIRE
- Accession number: edsair.doi.dedup.....0849746419a5894684e3a1581a13323a