1. Moderate Adaptive Linear Units (MoLU)
- Authors
Koh, Hankyul; Ko, Joon-hyuk; and Jhe, Wonho
- Subjects
Computer Science - Machine Learning, Computer Science - Artificial Intelligence, Computer Science - Computer Science and Game Theory, Computer Science - Neural and Evolutionary Computing
- Abstract
We propose a new high-performance activation function, Moderate Adaptive Linear Units (MoLU), for deep neural networks. MoLU is a simple, beautiful and powerful activation function that can serve as a good main activation function among the hundreds of activation functions. Because MoLU is built from elementary functions, not only is it an infinite diffeomorphism (i.e. smooth and infinitely differentiable over the whole domain), but it also decreases training time.
- Comment
4 pages, 5 figures
- Published
2023
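The abstract describes MoLU only qualitatively and does not reproduce its formula. As a hedged illustration of the property it claims (an activation built purely from elementary functions, hence smooth and infinitely differentiable everywhere), the sketch below defines a hypothetical tanh-based unit, `x * (1 + tanh(x)) / 2`. This is an assumed stand-in to show the idea, not MoLU's published definition.

```python
import numpy as np

def smooth_unit(x):
    # Hypothetical activation composed of elementary functions
    # (product of a polynomial and tanh), so it is infinitely
    # differentiable on the whole real line.
    # NOTE: an illustrative stand-in, NOT the published MoLU formula.
    return x * (1.0 + np.tanh(x)) / 2.0

x = np.linspace(-5.0, 5.0, 1001)
y = smooth_unit(x)

# A finite-difference derivative stays finite with no jumps,
# consistent with a numerically smooth function.
dy = np.gradient(y, x)
print(np.all(np.isfinite(dy)))               # True
print(abs(smooth_unit(5.0) - 5.0) < 0.01)    # ~identity for large positive x
```

Like other smooth gated units, such a function approaches the identity for large positive inputs and vanishes for large negative ones, which is typically what makes these activations easy to optimize.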