
Why KDAC? A general activation function for knowledge discovery.

Authors :
Wang, Zhenhua
Liu, Haozhe
Liu, Fanglin
Gao, Dong
Source :
Neurocomputing. Aug 2022, Vol. 501, p343-358. 16p.
Publication Year :
2022

Abstract

Deep-learning-oriented named entity recognition (DNER) has gradually become the paradigm of knowledge discovery, greatly advancing domain intelligence. However, the activation functions used in DNER fail to address vanishing gradients, the absence of negative outputs, or the presence of non-differentiable points, which may impede the exploration of knowledge through omission and incomplete representation of latent semantics. To break through this dilemma, we present a novel activation function termed KDAC. In detail, KDAC is an aggregation function with multiple conversion modes. Its backbone is the interaction between an exponential and a linear component, and both ends are extended through adaptive linear divergence, which overcomes vanishing gradients and the lack of negative outputs. Crucially, non-differentiable points can be detected and eliminated by an approximate smoothing algorithm. KDAC has a series of desirable properties, such as nonlinearity, a stable near-linear transformation and derivative, and a dynamic form. We perform experiments with a BERT-BiLSTM-CNN-CRF model on six benchmark datasets spanning different domains: Weibo, Clinical, E-commerce, Resume, HAZOP and People's Daily. The evaluation results show that KDAC is advanced and effective, providing more generalized activation that improves the performance of DNER. We hope that KDAC can be adopted as a promising activation function for the construction of knowledge. [ABSTRACT FROM AUTHOR]
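The abstract does not give KDAC's closed form, but the design it describes (an exponential component blended with a linear one to allow negative outputs and avoid gradient saturation, plus smoothing of non-differentiable points) can be illustrated with two well-known stand-ins. The sketch below uses ELU for the exponential/linear blend and softplus for kink-smoothing; these are illustrative assumptions, not the paper's KDAC formula.

```python
import numpy as np

def elu(x, alpha=1.0):
    """Illustrative exponential/linear blend (ELU, not the paper's KDAC):
    linear for x >= 0 (unit gradient, no saturation), and a scaled
    exponential for x < 0 (negative outputs, bounded below by -alpha)."""
    x = np.asarray(x, dtype=float)
    return np.where(x >= 0, x, alpha * (np.exp(x) - 1.0))

def softplus(x, beta=1.0):
    """Classic smooth approximation of ReLU: replaces the
    non-differentiable kink at 0 with an everywhere-smooth curve,
    analogous in spirit to the 'approximate smoothing' the abstract
    mentions for eliminating non-differentiable points."""
    x = np.asarray(x, dtype=float)
    return np.log1p(np.exp(beta * x)) / beta
```

For example, `elu(-1.0)` returns a negative value (exp(-1) - 1 ≈ -0.632), unlike ReLU, which zeros out all negative inputs; this is the "negative output" property the abstract argues matters for representing latent semantics.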

Details

Language :
English
ISSN :
0925-2312
Volume :
501
Database :
Academic Search Index
Journal :
Neurocomputing
Publication Type :
Academic Journal
Accession number :
157909841
Full Text :
https://doi.org/10.1016/j.neucom.2022.06.019