7 results for "Mustafa Abdul Salam"
Search Results
2. A Novel Intelligent Approach for Dynamic Data Replication in Cloud Environment
- Author
Ahmed Awad, Mustafa Abdul Salam, Rashed Salem, and Hatem Abdelkader
- Subjects
CloudSim, ant colony optimization, General Computer Science, Computer science, Distributed computing, Cloud computing, cloud environments, Genetic algorithm, General Materials Science, multi-objective optimization, particle swarm optimization, Ant colony optimization algorithms, Replica, General Engineering, Dynamic replication, Replication (computing), Data center
- Abstract
In recent years, cloud computing research, specifically on data replication techniques and their applications, has been growing. As the number of replicas increases and they are placed in multiple locations, maintaining data usability, performance, and application stability becomes expensive. In this paper, two bio-inspired algorithms are proposed to improve both the selection and the placement of data replicas in the cloud environment. The suggested algorithms for dynamic data replication are multi-objective particle swarm optimization (MO-PSO) and multi-objective ant colony optimization (MO-ACO). The first algorithm, MO-PSO, selects the best data replicas based on how frequently they are accessed, while the second, MO-ACO, determines the best replica placement based on the shortest distance and replica availability. The suggested strategy was simulated in CloudSim. Each data center (DC) contains hosts with a set of virtual machines (VMs); the data replication order is determined at random from a thousand cloudlets, and all replication files are randomly distributed in the proposed architecture. The performance of the suggested techniques was evaluated against several approaches, including Adaptive Replica Dynamic Strategy (ARDS), Enhanced Fast Spread (EFS), Genetic Algorithm (GA), Replica Selection and Placement (RSP), Popular File Replication First (PFRF), and Dynamic Cost-aware Re-replication and Re-balancing Strategy (DCR2S). The simulation results show that MO-PSO gives improved data replication compared with the other algorithms, and that MO-ACO achieves higher data availability, lower cost, and lower bandwidth consumption. (An illustrative sketch of the replica-selection idea appears after this record.)
- Published
- 2021
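Below is a minimal, illustrative Python sketch of the replica-selection step described in this record's abstract: a binary particle swarm picks which files to replicate by trading off access frequency against storage cost. It is not the authors' MO-PSO implementation; the file counts, the weights, and the single weighted-sum fitness (rather than a true multi-objective formulation) are assumptions made purely for illustration.

```python
# Toy binary PSO for replica selection (illustrative assumptions throughout).
import math
import random

N_FILES, N_PARTICLES, N_ITERS = 20, 15, 50
random.seed(0)
freq = [random.randint(1, 100) for _ in range(N_FILES)]    # access frequency per file (assumed)
cost = [random.uniform(1.0, 5.0) for _ in range(N_FILES)]  # storage cost per replica (assumed)

def fitness(sel):
    # Reward replicating frequently accessed files, penalize total storage cost.
    gain = sum(f for f, s in zip(freq, sel) if s)
    spend = sum(c for c, s in zip(cost, sel) if s)
    return gain - 10.0 * spend

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Initialise binary particle positions and real-valued velocities.
pos = [[random.randint(0, 1) for _ in range(N_FILES)] for _ in range(N_PARTICLES)]
vel = [[0.0] * N_FILES for _ in range(N_PARTICLES)]
pbest = [p[:] for p in pos]
gbest = max(pbest, key=fitness)[:]

for _ in range(N_ITERS):
    for i in range(N_PARTICLES):
        for d in range(N_FILES):
            r1, r2 = random.random(), random.random()
            vel[i][d] = (0.7 * vel[i][d]
                         + 1.5 * r1 * (pbest[i][d] - pos[i][d])
                         + 1.5 * r2 * (gbest[d] - pos[i][d]))
            # Binary PSO update: set the bit with probability sigmoid(velocity).
            pos[i][d] = 1 if random.random() < sigmoid(vel[i][d]) else 0
        if fitness(pos[i]) > fitness(pbest[i]):
            pbest[i] = pos[i][:]
    gbest = max(pbest + [gbest], key=fitness)[:]

print("files selected for replication:", [i for i, s in enumerate(gbest) if s])
```

A genuine multi-objective variant would maintain an archive of non-dominated solutions instead of collapsing both objectives into one weighted score.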
3. Federated Learning Approach for Measuring the Response of Brain Tumors to Chemotherapy
- Author
Omneya Atef, Mustafa Abdul Salam, and Hisham Abdelsalam
- Subjects
General Computer Science
- Published
- 2022
- Full Text
- View/download PDF
4. An Artificial Bee Colony Algorithm for Data Replication Optimization in Cloud Environments
- Author
Rashed Salem, Mustafa Abdul Salam, Hatem Abdelkader, and Ahmed Awad Mohamed
- Subjects
replication, Mathematical optimization, General Computer Science, Computer science, Cloud computing, knapsack problem, Swarm intelligence, General Materials Science, Selection (genetic algorithm), artificial bee colony, General Engineering, Process (computing), Replication (computing), Artificial bee colony algorithm, multi-objective optimization, CloudSim
- Abstract
Cloud computing is a modern technology for dealing with large-scale data, and it has been used to process the selection and placement of replicas on a large scale. Most previous studies of replication used mathematical models; few focused on artificial intelligence (AI). The Artificial Bee Colony (ABC) algorithm is a member of the family of swarm-intelligence-based algorithms; it simulates how bees converge on a route to a food source and has proven effective for optimization. In this paper, we model the costs and shortest routes involved in replication and its placement between data centers (DCs) as a multi-objective optimization (MOO) problem and evaluate the cost-distance trade-off using the knapsack problem. ABC is used to solve the shortest-route and lowest-cost problems and to identify the best replica placement according to distance and cost, with the knapsack formulation enforcing the capacity constraints. The resulting multi-objective artificial bee colony (MOABC) algorithm achieves high efficiency and low cost in the proposed system: it finds an optimal placement of data replicas according to the minimum distance and the number of data transmissions, while the knapsack approach keeps costs low and preserves replica availability. Low cost and fast access also guide the shortest-route computation in the CloudSim implementation. The experimental results show that the proposed MOABC places replicas more efficiently and effectively than the compared algorithms. (A toy knapsack-based placement sketch appears after this record.)
- Published
- 2020
- Full Text
- View/download PDF
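The following toy sketch illustrates the knapsack framing mentioned in the abstract above: each candidate data centre carries an assumed "availability gain" and an integer "distance cost", and a standard 0/1 knapsack dynamic program selects the placement that maximizes gain within a budget. It is not the paper's MOABC algorithm, and all numbers are invented for illustration.

```python
# 0/1 knapsack dynamic program used as a stand-in for the replica-placement budget problem.
def knapsack_placement(values, costs, budget):
    """Classic 0/1 knapsack DP; returns (best total value, chosen indices)."""
    n = len(values)
    dp = [[0] * (budget + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        for b in range(budget + 1):
            dp[i][b] = dp[i - 1][b]
            if costs[i - 1] <= b:
                dp[i][b] = max(dp[i][b], dp[i - 1][b - costs[i - 1]] + values[i - 1])
    # Backtrack to recover which data centres were selected.
    chosen, b = [], budget
    for i in range(n, 0, -1):
        if dp[i][b] != dp[i - 1][b]:
            chosen.append(i - 1)
            b -= costs[i - 1]
    return dp[n][budget], sorted(chosen)

# Example: 5 candidate data centres, integer distances as costs, budget of 10 (all assumed).
availability_gain = [6, 10, 12, 7, 4]
distance_cost = [3, 4, 5, 2, 3]
best, placed = knapsack_placement(availability_gain, distance_cost, budget=10)
print("placement value:", best, "replicas placed at DCs:", placed)
```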
5. The Effect of Different Dimensionality Reduction Techniques on Machine Learning Overfitting Problem
- Author
Ahmad Taher Azar, Khaled Mohamed Fouad, Mustafa Samy Elgendy, and Mustafa Abdul Salam
- Subjects
General Computer Science, Artificial neural network, Computer science, Dimensionality reduction, Overfitting, Machine learning, Linear discriminant analysis, Random forest, Support vector machine, Principal component analysis, Feature (machine learning), Artificial intelligence
- Abstract
Training a machine-learning model on a data record with many attributes is often problematic: as the number of model features grows, a susceptible model becomes more prone to overfitting, because not all features are informative and some only add noise to the data. Dimensionality reduction techniques are used to overcome this issue. This paper presents a detailed comparative study of nine dimensionality reduction methods: missing-values ratio, low-variance filter, high-correlation filter, random forest, principal component analysis, linear discriminant analysis, backward feature elimination, forward feature construction, and rough set theory. The effects of these methods on both training and testing performance were compared on two different datasets and three different models: an Artificial Neural Network (ANN), a Support Vector Machine (SVM), and a Random Forest Classifier (RFC). The results show that the RFC model achieved dimensionality reduction while limiting overfitting, and it showed general improvement in both accuracy and efficiency over the compared approaches. Overall, the results reveal that dimensionality reduction can reduce overfitting while keeping performance close to, or better than, that obtained with the original feature set. (A brief illustrative sketch appears after this record.)
- Published
- 2021
- Full Text
- View/download PDF
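A brief, hedged illustration of the effect this abstract describes: inserting a dimensionality-reduction step (here PCA, one of the nine methods studied) before a Random Forest classifier and comparing the train/test accuracy gap as a rough overfitting signal. The scikit-learn breast-cancer data stands in for the paper's datasets, and the component count and split size are arbitrary choices.

```python
# Compare a Random Forest with and without a PCA reduction step (illustrative setup).
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestClassifier
from sklearn.pipeline import make_pipeline

X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

for name, model in [
    ("RFC, all 30 features", RandomForestClassifier(random_state=0)),
    ("RFC, PCA to 5 components", make_pipeline(PCA(n_components=5),
                                               RandomForestClassifier(random_state=0))),
]:
    model.fit(X_tr, y_tr)
    gap = model.score(X_tr, y_tr) - model.score(X_te, y_te)  # train/test accuracy gap
    print(f"{name}: test acc={model.score(X_te, y_te):.3f}, train-test gap={gap:.3f}")
```

A smaller train-test gap at comparable test accuracy is the kind of behaviour the abstract reports for dimensionality reduction.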
6. Earthquake Prediction using Hybrid Machine Learning Techniques
- Author
Diaa Salama Abdelminaam, Mustafa Abdul Salam, and Lobna Ibrahim
- Subjects
General Computer Science, Mean squared error, Square root, Approximation error, Computer science, Earthquake prediction, Statistics, Magnitude (mathematics), Symmetric mean absolute percentage error, Root-mean-square deviation, Extreme learning machine
- Abstract
This research proposes two earthquake prediction models for the southern California region that use seismic indicators and hybrid machine learning techniques. Seven seismic indicators were calculated mathematically and statistically from previously recorded seismic events in the earthquake catalogue of that region. These indicators are: the time taken for n seismic events to occur (T); the average magnitude of the n events (M_mean); the magnitude deficit, i.e., the difference between the observed and expected magnitude (ΔM); the curve slope for n events from the Gutenberg-Richter inverse power law (b); the mean square deviation for n events from the Gutenberg-Richter inverse power law (η); the square root of the energy released during time T (dE^1/2); and the average time between events (µ). Two hybrid machine learning models are proposed to predict earthquake magnitude over the following fifteen days. The first model, FPA-ELM, is a hybrid of the flower pollination algorithm (FPA) and the extreme learning machine (ELM). The second, FPA-LS-SVM, is a hybrid of the FPA and the least-squares support vector machine (LS-SVM). The two models' performance is compared and assessed using four criteria: Root Mean Square Error (RMSE), Mean Absolute Error (MAE), Symmetric Mean Absolute Percentage Error (SMAPE), and Percent Mean Relative Error (PMRE). The simulation results show that the FPA-LS-SVM model outperformed the FPA-ELM, LS-SVM, and ELM models in terms of prediction accuracy. (A toy sketch of a few of the indicators appears after this record.)
- Published
- 2021
- Full Text
- View/download PDF
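A toy sketch of how a few of the seven indicators listed above could be computed from a small window of catalogued events. The magnitudes, timestamps, and completeness magnitude Mc are made up, and the Aki maximum-likelihood b-value estimator and the Gutenberg-Richter energy relation log10 E = 11.8 + 1.5 M are standard substitutes assumed here, not formulas taken from the paper.

```python
# Illustrative computation of a few seismic indicators from a toy event window.
import math

mags = [3.1, 3.4, 2.9, 4.0, 3.2, 3.7, 3.0, 3.5]   # magnitudes of n events (assumed)
times = [0, 2, 5, 9, 12, 16, 21, 30]               # event times in days (assumed)
Mc = 2.8                                            # catalogue completeness magnitude (assumed)

n = len(mags)
T = times[-1] - times[0]                            # time taken for the n events to occur
M_mean = sum(mags) / n                              # average magnitude of the n events
mu = T / (n - 1)                                    # average time between events (one convention)
# Maximum-likelihood Gutenberg-Richter b-value (Aki estimator), standing in for the curve slope.
b = math.log10(math.e) / (M_mean - Mc)
# Square root of released seismic energy per unit time, with log10 E = 11.8 + 1.5 M (energy in ergs).
dE_sqrt = sum(math.sqrt(10 ** (11.8 + 1.5 * m)) for m in mags) / T

print(f"T={T}, M_mean={M_mean:.2f}, mu={mu:.2f}, b={b:.2f}, dE^0.5={dE_sqrt:.3e}")
```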
7. From Linear Programming Approach to Metaheuristic Approach: Scaling Techniques
- Author
Mustafa Abdul Salam, Sultan Almotairi, E. M. Badr, and Hagar Ahmed
- Subjects
Normalization (statistics), Multidisciplinary, General Computer Science, Linear programming, Computer science, Support vector machine, Breast cancer, Hyperparameter optimization, Benchmark (computing), Geometric mean, Scaling, Algorithm, Metaheuristic, Arithmetic mean
- Abstract
The objective of this work is to propose ten efficient scaling techniques for the Wisconsin Diagnosis Breast Cancer (WDBC) dataset using the support vector machine (SVM). These scaling techniques, which are efficient for the linear programming approach, were applied to the WDBC dataset with the SVM. They are: arithmetic mean; de Buchet for the three cases p = 1, 2, and ∞; equilibration; geometric mean; IBM MPSX; and the Lp-norm for the three cases p = 1, 2, and ∞. The experimental results show that the equilibration scaling technique outperforms the benchmark normalization scaling technique used in many commercial solvers. Finally, the results also show the effectiveness of the grid search technique, which finds the optimal parameters (C and gamma) for the SVM classifier. (A minimal scaling-and-grid-search sketch appears after this record.)
- Published
- 2021
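A minimal sketch of the scaling-plus-grid-search pipeline this abstract describes, assuming scikit-learn is available and taking "equilibration" to mean dividing each feature column by its maximum absolute value (an assumption, not necessarily the paper's exact definition); the parameter grid and train/test split are arbitrary.

```python
# Equilibration-style column scaling followed by an SVM grid search on WDBC (illustrative).
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.svm import SVC

X, y = load_breast_cancer(return_X_y=True)          # WDBC dataset
X_scaled = X / np.abs(X).max(axis=0)                 # divide each column by its max absolute value

X_tr, X_te, y_tr, y_te = train_test_split(X_scaled, y, test_size=0.3, random_state=0)
grid = GridSearchCV(SVC(kernel="rbf"),
                    {"C": [0.1, 1, 10, 100], "gamma": [0.01, 0.1, 1]},
                    cv=5)
grid.fit(X_tr, y_tr)
print("best params:", grid.best_params_, "test accuracy:", grid.score(X_te, y_te))
```

GridSearchCV cross-validates every (C, gamma) pair on the training split and refits the best combination before scoring on the held-out set.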