
Dropout - A Detailed Survey

Authors :
Sakshi S Lad
Source :
International Journal for Research in Applied Science and Engineering Technology. 9:1573-1578
Publication Year :
2021
Publisher :
International Journal for Research in Applied Science and Engineering Technology (IJRASET), 2021.

Abstract

Deep neural networks are highly complex and have a large number of parameters. Shortlisting the parameters that influence the model's prediction is not feasible, as each carries comparable significance. These networks have powerful learning capacity and can model the training data well; in most such cases, however, the models overfit. Combining predictions from large neural networks whose neurons are co-dependent degrades the performance of the model. Dropout addresses the problems of overfitting and slow convergence in deep neural networks. The core concept of the dropout technique is to randomly drop units and their connections from the neural network during the training phase. This prevents units from co-adapting and thus improves performance. The central mechanism behind dropout is to take a large model that overfits easily and repeatedly sample and train smaller sub-models from it. This paper provides an introduction to dropout, the history behind its design, and various dropout methods.
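The mechanism the abstract describes, randomly dropping units during training and keeping the full network at inference time, can be illustrated with a minimal NumPy sketch of "inverted" dropout. The function name `dropout_forward` and the parameter `p_drop` are illustrative choices, not from the paper; the sketch assumes the common variant in which surviving activations are scaled by 1/(1 - p_drop) during training so that no rescaling is needed at test time.

```python
import numpy as np

def dropout_forward(x, p_drop, training=True, rng=None):
    """Inverted dropout (illustrative sketch).

    During training, each unit is zeroed independently with probability
    p_drop, and survivors are scaled by 1 / (1 - p_drop) so the expected
    activation is unchanged. At inference time the input passes through
    untouched, which corresponds to using the full network.
    """
    if not training or p_drop == 0.0:
        return x, None
    rng = rng or np.random.default_rng(0)
    # Bernoulli keep-mask, pre-scaled so E[mask * x] == x.
    mask = (rng.random(x.shape) >= p_drop) / (1.0 - p_drop)
    return x * mask, mask
```

Each training step draws a fresh mask, so the procedure effectively samples and trains a different thinned sub-network every time, which is the "repeatedly sample smaller sub-models" view described above.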

Details

ISSN :
2321-9653
Volume :
9
Database :
OpenAIRE
Journal :
International Journal for Research in Applied Science and Engineering Technology
Accession number :
edsair.doi...........417fed6edaaa508b4a7f8658087cccc0