1. The Effect of Different Dimensionality Reduction Techniques on Machine Learning Overfitting Problem
- Author
- Ahmad Taher Azar, Khaled Mohamed Fouad, Mustafa Samy Elgendy, and Mustafa Abdul Salam
- Subjects
- General Computer Science, Machine Learning, Artificial Neural Network, Dimensionality Reduction, Overfitting, Linear Discriminant Analysis, Random Forest, Support Vector Machine, Principal Component Analysis, Feature (machine learning), Artificial Intelligence
- Abstract
Training a machine-learning model on a data record with many attributes is often problematic: as the number of model features increases, the model becomes increasingly susceptible to overfitting. This occurs because not all features are equally informative; some only make the data noisier. Dimensionality reduction techniques are used to overcome this problem. This paper presents a detailed comparative study of nine dimensionality reduction methods: missing-values ratio, low variance filter, high-correlation filter, random forest, principal component analysis, linear discriminant analysis, backward feature elimination, forward feature construction, and rough set theory. The effect of each method on both training and testing performance was compared on two different datasets and applied to three different models: Artificial Neural Network (ANN), Support Vector Machine (SVM), and Random Forest Classifier (RFC). The results showed that the RFC model achieved dimensionality reduction while limiting the overfitting problem, and exhibited a general improvement in both accuracy and efficiency over the compared approaches. Overall, the results revealed that dimensionality reduction can reduce overfitting while keeping performance close to, or better than, that of the original full feature set.
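As a concrete illustration of the idea behind the simplest of the nine methods, the low variance filter, the sketch below drops near-constant features whose variance falls under a threshold. This is a minimal, hypothetical example; the threshold, toy data, and function names are assumptions for illustration, not the paper's actual experimental setup.

```python
# Minimal sketch of a low variance filter, one of the nine
# dimensionality reduction methods the paper compares.
# The 0.01 threshold and toy data are illustrative assumptions.

def variance(values):
    """Population variance of a list of numbers."""
    n = len(values)
    mean = sum(values) / n
    return sum((v - mean) ** 2 for v in values) / n

def low_variance_filter(rows, threshold=0.01):
    """Return indices of columns whose variance exceeds threshold.

    rows: list of equal-length feature vectors (one per sample).
    Near-constant columns carry little information and mainly add
    noise, so dropping them can help limit overfitting.
    """
    n_cols = len(rows[0])
    kept = []
    for j in range(n_cols):
        col = [row[j] for row in rows]
        if variance(col) > threshold:
            kept.append(j)
    return kept

# Toy data: column 1 is almost constant, so it is filtered out.
X = [
    [1.0, 0.50, 3.2],
    [2.0, 0.50, 1.1],
    [3.0, 0.51, 4.7],
    [4.0, 0.50, 2.9],
]
print(low_variance_filter(X))  # → [0, 2]
```

In practice the retained column indices would then be used to project both the training and testing sets before fitting a classifier such as an ANN, SVM, or RFC.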
- Published
- 2021