From Random to Supervised: A Novel Dropout Mechanism Integrated with Global Information
- Source :
- CoNLL
- Publication Year :
- 2018
- Publisher :
- Association for Computational Linguistics, 2018.
-
Abstract
- Dropout is used to avoid overfitting by randomly dropping units from a neural network during training. Inspired by dropout, this paper presents GI-Dropout, a novel dropout method that integrates global information to improve neural networks for text classification. Unlike traditional dropout, in which all units are dropped randomly with the same probability, we use explicit instructions based on global information about the dataset to guide the training process. With GI-Dropout, the model is encouraged to pay more attention to inapparent features or patterns. Experiments demonstrate the effectiveness of dropout with global information on seven text classification tasks, including sentiment analysis and topic classification.
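- The abstract contrasts standard dropout, where every unit shares one drop probability, with dropout guided by per-unit probabilities derived from global dataset statistics. A minimal sketch of that contrast is below; the per-unit `drop_probs` vector and the function names are illustrative assumptions, since the paper's exact weighting scheme is not given in this record:

```python
import numpy as np

def standard_dropout(x, p=0.5, rng=None):
    """Traditional dropout: every unit is dropped with the same probability p."""
    rng = rng or np.random.default_rng(0)
    mask = rng.random(x.shape) >= p
    # Inverted-dropout scaling keeps the expected activation unchanged.
    return x * mask / (1.0 - p)

def gi_dropout(x, drop_probs, rng=None):
    """Illustrative 'global information' dropout: unit i is dropped with its
    own probability drop_probs[i], e.g. set higher for salient features so
    the model must rely more on inapparent ones (a hypothetical weighting)."""
    rng = rng or np.random.default_rng(0)
    mask = rng.random(x.shape) >= drop_probs
    return x * mask / (1.0 - drop_probs)
```

For example, passing `drop_probs = np.array([0.8, 0.1, 0.1])` would suppress the first (most salient) feature far more often than the others, which is the kind of supervised, non-uniform dropping the abstract describes.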
- Subjects :
- FOS: Computer and information sciences
  Computer Science - Machine Learning
  Machine Learning (cs.LG)
  Statistics - Machine Learning
  Machine Learning (stat.ML)
  Computer Science - Computation and Language
  Computation and Language (cs.CL)
  Artificial neural network
  Dropout (neural networks)
  Overfitting
  Machine learning
  Sentiment analysis
  Artificial intelligence
  Global information
Details
- Database :
- OpenAIRE
- Journal :
- Proceedings of the 22nd Conference on Computational Natural Language Learning
- Accession number :
- edsair.doi.dedup.....9f0dcb6c8b9521e397cd7dfce15610fb
- Full Text :
- https://doi.org/10.18653/v1/k18-1055