Word Level LSTM and Recurrent Neural Network for Automatic Text Generation
- Source:
- 2021 International Conference on Computer Communication and Informatics (ICCCI)
- Publication Year:
- 2021
- Publisher:
- IEEE, 2021
Abstract
- Sequence prediction has long been a challenging problem. Recurrent Neural Networks (RNNs) have been a good solution for sequential prediction tasks, and this work aims to create a generative model for text. However, RNNs have limitations of their own, such as the vanishing and exploding gradient problems and an inability to keep track of long-term dependencies. To overcome these drawbacks, Long Short-Term Memory (LSTM) networks have proven a path-breaking solution for sequential data, and text data in particular. This paper delineates the design and working of text generation using a word-level LSTM-RNN.
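The record does not include the paper's implementation, but the mechanism the abstract credits for overcoming vanishing gradients is the gated LSTM cell. A minimal sketch of one LSTM step in plain Python follows; the scalar sizes, weight layout, and values are illustrative assumptions, not the authors' model:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def lstm_step(x, h_prev, c_prev, W):
    """One LSTM step with scalar input and state (toy sizes for clarity).

    W maps each gate name to an assumed (w_x, w_h, b) weight triple.
    """
    f = sigmoid(W["f"][0] * x + W["f"][1] * h_prev + W["f"][2])   # forget gate
    i = sigmoid(W["i"][0] * x + W["i"][1] * h_prev + W["i"][2])   # input gate
    o = sigmoid(W["o"][0] * x + W["o"][1] * h_prev + W["o"][2])   # output gate
    g = math.tanh(W["g"][0] * x + W["g"][1] * h_prev + W["g"][2]) # candidate

    # The additive cell-state update is what eases gradient flow over
    # long sequences, compared with a plain RNN's repeated squashing.
    c = f * c_prev + i * g
    h = o * math.tanh(c)
    return h, c
```

In a word-level generator, each word is mapped to a vector, fed through steps like this, and the hidden state is projected to a vocabulary distribution from which the next word is sampled; that surrounding pipeline is likewise only sketched here, as the record gives no details.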
Details
- Database:
- OpenAIRE
- Journal:
- 2021 International Conference on Computer Communication and Informatics (ICCCI)
- Accession number:
- edsair.doi...........b3ac000d6c465e192e5bcbe8c169ecff
- Full Text:
- https://doi.org/10.1109/iccci50826.2021.9402488