
Systematic Literature Review and Bibliometric Analysis on Addressing the Vanishing Gradient Issue in Deep Neural Networks for Text Data

Authors :
Haroon-Sulyman, Shakirat Oluwatosin
Mohammed, Ahmed Taiye
Kamaruddin, Siti Sakira
Ahmad, Farzana Kabir
Publication Year :
2024

Abstract

The ability of Deep Neural Networks (DNNs) to learn complex text representations has revolutionized Natural Language Processing (NLP) and several other fields. However, DNNs are not free of challenges; the vanishing gradient problem remains a major one. It hinders a model's ability to capture long-term dependencies in text data, limiting its capacity to understand context, implied meanings, and semantics, and to represent intricate patterns in text. This study aims to address the vanishing gradient problem encountered in DNNs when dealing with text data, whose inherent sparsity and heterogeneity exacerbate the issue, increasing computational complexity and processing time. To tackle this problem comprehensively, we will explore the existing literature and conduct a bibliometric analysis to identify potential solutions. The findings will contribute a comprehensive review of the existing literature and suggest effective strategies for mitigating the vanishing gradient problem in the context of NLP tasks. Ultimately, our study will pave the way for further advancements in this area of research.
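As a rough illustration of the phenomenon the abstract describes (not part of the paper itself), the following NumPy sketch unrolls a toy recurrent cell over a text-length sequence and measures how the gradient flowing back toward the earliest tokens shrinks. All dimensions, weight scales, and the tanh cell are assumptions made purely for demonstration.

```python
# Illustrative sketch: why gradients vanish when backpropagating
# through many time steps of a recurrent network over text.
import numpy as np

rng = np.random.default_rng(0)
hidden_size = 16
seq_len = 50

# Recurrent and input weights with small scale, so the backward
# Jacobians contract the gradient at each step (assumed values).
W = rng.normal(scale=0.1, size=(hidden_size, hidden_size))
U = rng.normal(scale=0.1, size=(hidden_size, hidden_size))

# Forward pass: h_t = tanh(W h_{t-1} + U x_t)
h = np.zeros(hidden_size)
states = []
for t in range(seq_len):
    x_t = rng.normal(size=hidden_size)   # stand-in for a token embedding
    h = np.tanh(W @ h + U @ x_t)
    states.append(h)

# Backward pass: propagate an upstream gradient from the last step
# back to the first and record its norm at each step.
grad = np.ones(hidden_size)              # d(loss)/d(h_T), arbitrary
norms = []
for t in reversed(range(seq_len)):
    grad = (1.0 - states[t] ** 2) * grad  # through the tanh nonlinearity
    grad = W.T @ grad                     # through the recurrence
    norms.append(np.linalg.norm(grad))

# The gradient norm decays by orders of magnitude over the sequence,
# so early tokens barely influence learning: the vanishing gradient problem.
print(f"grad norm after 1 step:   {norms[0]:.3e}")
print(f"grad norm after {seq_len} steps: {norms[-1]:.3e}")
```

Under these assumed weight scales, the printed norms drop by several orders of magnitude across the sequence, which is the behavior the surveyed mitigation strategies (gated architectures, residual connections, careful initialization, and so on) are designed to counter.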

Details

Database :
OAIster
Notes :
English
Publication Type :
Electronic Resource
Accession number :
edsoai.on1442914942
Document Type :
Electronic Resource
Full Text :
https://doi.org/10.1007/978-981-99-9589-9_13