
From Random to Supervised: A Novel Dropout Mechanism Integrated with Global Information

Authors:
Renfen Hu
Shen Li
Si Li
Hengru Xu
Sheng Gao
Source:
CoNLL
Publication Year:
2018
Publisher:
Association for Computational Linguistics, 2018.

Abstract

Dropout helps avoid overfitting by randomly dropping units from a neural network during training. Inspired by dropout, this paper presents GI-Dropout, a novel dropout method that integrates global information to improve neural networks for text classification. Unlike the traditional dropout method, in which units are dropped randomly with the same probability, we use explicit instructions based on global information about the dataset to guide the training process. With GI-Dropout, the model is encouraged to pay more attention to inapparent features or patterns. Experiments demonstrate the effectiveness of dropout with global information on seven text classification tasks, including sentiment analysis and topic classification.
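To make the contrast concrete, the sketch below compares classic uniform dropout with a per-unit variant in which each unit carries its own drop probability, here derived from a hypothetical dataset-level saliency score. This is a minimal NumPy illustration of the general idea the abstract describes, not the paper's actual GI-Dropout formulation; the `saliency` scores and scaling scheme are assumptions for demonstration.

```python
import numpy as np

def standard_dropout(x, p, rng):
    # Classic (inverted) dropout: every unit is dropped with the same
    # probability p; survivors are scaled by 1/(1-p) to keep the
    # expected activation unchanged.
    mask = rng.random(x.shape) >= p
    return x * mask / (1.0 - p)

def guided_dropout(x, drop_probs, rng):
    # Guided variant: each unit i has its own drop probability
    # drop_probs[i], e.g. computed from global statistics of the
    # dataset so that highly salient features are dropped more often,
    # pushing the model to attend to inapparent ones.
    mask = rng.random(x.shape) >= drop_probs
    return x * mask / (1.0 - drop_probs)

rng = np.random.default_rng(0)
x = np.ones(8)
# Hypothetical per-feature saliency scores in (0, 1); in GI-Dropout
# these would come from global information about the dataset.
saliency = np.linspace(0.1, 0.8, 8)
out = guided_dropout(x, saliency, rng)
```

Each output unit is either zeroed or rescaled by its own factor 1/(1 - p_i), so over many samples the expectation of each unit is preserved while salient units are masked more frequently.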

Details

Database:
OpenAIRE
Journal:
Proceedings of the 22nd Conference on Computational Natural Language Learning
Accession number:
edsair.doi.dedup.....9f0dcb6c8b9521e397cd7dfce15610fb
Full Text:
https://doi.org/10.18653/v1/k18-1055