6,374 results for "soft computing"
Search Results
2. Soft computing techniques in multi-criteria recommender systems: A comprehensive review
- Author
- Anwar, Khalid, Wasid, Mohammed, Zafar, Aasim, Ganaie, M.A., and Iqbal, Arshad
- Published
- 2025
- Full Text
- View/download PDF
3. SoftED: Metrics for soft evaluation of time series event detection
- Author
- Salles, Rebecca, Lima, Janio, Reis, Michel, Coutinho, Rafaelli, Pacitti, Esther, Masseglia, Florent, Akbarinia, Reza, Chen, Chao, Garibaldi, Jonathan, Porto, Fabio, and Ogasawara, Eduardo
- Published
- 2024
- Full Text
- View/download PDF
4. Intelligent reactive power control of a renewable integrated hybrid energy system model using static synchronous compensators and soft computing techniques
- Author
- Guchhait, Pabitra Kumar, Chakraborty, Samrat, Mukherjee, Debottam, and Banerjee, Ramashis
- Published
- 2024
- Full Text
- View/download PDF
5. Time series forecasting techniques applied to hydroelectric generation systems
- Author
- Barzola-Monteses, Julio, Gómez-Romero, Juan, Espinoza-Andaluz, Mayken, and Fajardo, Waldo
- Published
- 2025
- Full Text
- View/download PDF
6. Electric vehicle power system in intelligent manufacturing based on soft computing optimization
- Author
- Zhao, Shangyi and Guo, Ming
- Published
- 2024
- Full Text
- View/download PDF
7. Advanced seismic control strategies for smart base isolation buildings utilizing active tendon and MR dampers
- Author
- Akbari, Morteza, Zand, Javad Palizvan, Falborski, Tomasz, and Jankowski, Robert
- Published
- 2024
- Full Text
- View/download PDF
8. Observer-based type-3 fuzzy control for gyroscopes: Experimental/theoretical study
- Author
- Zhang, Chunwei, Du, Changdong, Sakthivel, Rathinasamy, and Mohammadzadeh, Ardashir
- Published
- 2025
- Full Text
- View/download PDF
9. Active Disturbance Rejection Control for an automotive suspension system based on parameter tuning using a fuzzy technique.
- Author
- Nguyen, Tuan Anh
- Subjects
- PAVEMENTS, MEMBERSHIP functions (Fuzzy logic), SPECIAL functions, SURFACE roughness, SOFT computing, MOTOR vehicle springs & suspension, SUSPENSION systems (Aeronautics)
- Abstract
Road surface roughness is the cause of vehicle vibration, which is considered a system disturbance. Previous studies on suspension system control often ignore the influence of disturbances while designing the controller, leading to system performance degradation under severe vibration conditions. In this work, we propose a control method to improve active suspension performance that reduces vehicle vibration by eliminating the influence of road disturbances. The proposed method is formed based on the combination of an Active Disturbance Rejection Control (ADRC) technique with control coefficients tuned by a dynamic fuzzy technique formed based on special membership functions called Active Disturbance Rejection Control Based on Fuzzy (ADRCBF). An Extended State Observer (ESO) estimates state variables and disturbances. The performance of the proposed controller is evaluated through the numerical simulation process with three different cases. According to the calculation results, the acceleration and displacement of the sprung mass are significantly reduced when the suspension system is controlled by the proposed technique, compared with the passive suspension system and the active suspension system controlled by a Proportional-Integral-Derivative (PID) technique. In addition, the suspension travel follows the road disturbance with a small error. The error estimated by the ESO does not exceed 3.5% (for sinusoidal and random excitation). In general, system adaptation is ensured under many investigated conditions based on tuning the controller parameters by the soft computing method. [ABSTRACT FROM AUTHOR]
- Published
- 2025
- Full Text
- View/download PDF
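The preceding entry (no. 9) builds its controller around an Extended State Observer (ESO) that estimates the road disturbance. As an illustrative aside only, the sketch below shows a generic discrete-time linear ESO for a second-order plant with the usual bandwidth-parameterized gains; the plant model, gains, and function names are assumptions for illustration and do not reproduce the paper's fuzzy-tuned ADRCBF design.

```python
# Generic linear ESO for a second-order plant y'' = f + b0*u, estimating
# z = [y_hat, ydot_hat, f_hat], where f_hat is the "total disturbance".
# Gains follow the common bandwidth parameterization (beta1, beta2, beta3).
def eso_step(z, y, u, dt, b0=1.0, omega=20.0):
    """One Euler update of the observer state z = [y_hat, ydot_hat, f_hat]."""
    beta1, beta2, beta3 = 3 * omega, 3 * omega**2, omega**3
    e = y - z[0]                                  # output estimation error
    z1 = z[0] + dt * (z[1] + beta1 * e)
    z2 = z[1] + dt * (z[2] + b0 * u + beta2 * e)
    z3 = z[2] + dt * (beta3 * e)                  # estimated total disturbance
    return [z1, z2, z3]

# Example: with a constant measurement and zero input, the estimate converges.
z = [0.0, 0.0, 0.0]
for _ in range(1000):
    z = eso_step(z, y=1.0, u=0.0, dt=0.001)
print([round(v, 3) for v in z])   # y_hat approaches 1.0, ydot_hat and f_hat approach 0
```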
10. Advanced Soft Computing Techniques for Monthly Streamflow Prediction in Seasonal Rivers.
- Author
- Achite, Mohammed, Katipoğlu, Okan Mert, Kartal, Veysi, Sarıgöl, Metin, Jehanzaib, Muhammad, and Gül, Enes
- Subjects
- LONG short-term memory, WATER management, GMDH algorithms, SOFT computing, DEEP learning, WATERSHEDS
- Abstract
The rising incidence of droughts in specific global regions in recent years, primarily attributed to global warming, has markedly increased the demand for reliable and accurate streamflow estimation. Streamflow estimation is essential for the effective management and utilization of water resources, as well as for the design of hydraulic infrastructure. Furthermore, research on streamflow estimation has gained heightened importance because water is essential not only for the survival of all living organisms but also for determining the quality of life on Earth. In this study, advanced soft computing techniques, including long short-term memory (LSTM), convolutional neural network–recurrent neural network (CNN-RNN), and group method of data handling (GMDH) algorithms, were employed to forecast monthly streamflow time series at two different stations in the Wadi Mina basin. The performance of each technique was evaluated using statistical criteria such as mean square error (MSE), mean bias error (MBE), mean absolute error (MAE), and the correlation coefficient (R). The results of this study demonstrated that the GMDH algorithm produced the most accurate forecasts at the Sidi AEK Djillali station, with metrics of MSE: 0.132, MAE: 0.185, MBE: −0.008, and R: 0.636. Similarly, the CNN-RNN algorithm achieved the best performance at the Kef Mehboula station, with metrics of MSE: 0.298, MAE: 0.335, MBE: −0.018, and R: 0.597. [ABSTRACT FROM AUTHOR]
- Published
- 2025
- Full Text
- View/download PDF
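Entry 10 scores its streamflow forecasts with MSE, MAE, MBE, and the correlation coefficient R. The snippet below is a minimal, generic NumPy implementation of those four metrics for an observed/forecast pair; the function name and the example numbers are illustrative and not taken from the paper.

```python
import numpy as np

def forecast_metrics(observed, forecast):
    """Return MSE, MAE, MBE and Pearson R for two equal-length 1-D series."""
    obs = np.asarray(observed, dtype=float)
    fc = np.asarray(forecast, dtype=float)
    err = fc - obs
    mse = np.mean(err ** 2)            # mean square error
    mae = np.mean(np.abs(err))         # mean absolute error
    mbe = np.mean(err)                 # mean bias error (sign shows over/under-forecasting)
    r = np.corrcoef(obs, fc)[0, 1]     # Pearson correlation coefficient
    return mse, mae, mbe, r

# Example with made-up values (not data from the paper):
print(forecast_metrics([1.0, 2.0, 3.0, 4.0], [1.1, 1.9, 3.2, 3.8]))
```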
11. ANFIS simulation integrated in FM/FM/1/(CV + WV) queue with Bernoulli service interruption and metaheuristic optimization for mathematical model.
- Author
- Dhibar, Sibasish and Jain, Madhu
- Abstract
The present investigation studies performance modeling and analysis of an M/M/1 queue with a hybrid vacation policy and Bernoulli service interruption. When the system becomes empty, the server has the choice to switch over to a complete vacation state or a working vacation state. During working vacations, the server also serves customers through a Bernoulli service process at a slower rate. After completing the service, the server either transitions to a normal busy state or remains in the working vacation state. To analyze the performance of the system, we have developed a Markovian queueing model by formulating the Chapman–Kolmogorov governing equations for the system states. Through an iterative and difference-differential equation approach, we derived the stationary probability distribution of the queue size, along with other key queueing metrics such as average queue length, average waiting time, and throughput. To deal with a more practical scenario of imprecise information about the system descriptors, the proposed crisp queueing is transformed into a fuzzy model by retaining the features. The soft computing approach based on adaptive neuro-fuzzy inference system (ANFIS), α-cut, and parametric nonlinear programming (PNLP) is employed to obtain various fuzzified indices. Moreover, to determine the optimal service rate, the differential evolution (DE) and golden section search (GSS) methods are used. By taking illustration, the proposed optimization techniques are implemented to develop a cost-effective service system and to examine the sensitivity of the key descriptors. [ABSTRACT FROM AUTHOR]
- Published
- 2025
- Full Text
- View/download PDF
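For orientation, entry 11 extends the classical M/M/1 queue with vacations and Bernoulli interruption. The baseline stationary results for the plain M/M/1 model (textbook relations, not the paper's extended model) are the following, with arrival rate λ and service rate μ.

```latex
% Standard M/M/1 stationary results, valid for \rho = \lambda/\mu < 1
% (baseline only; the paper's vacation/interruption model generalizes these):
\[
  P_n = (1-\rho)\,\rho^{\,n}, \qquad
  L = \frac{\rho}{1-\rho}, \qquad
  W = \frac{L}{\lambda} = \frac{1}{\mu-\lambda},
\]
% where $P_n$ is the probability of $n$ customers in the system, $L$ the mean
% number in system, and $W$ the mean time in system (linked by Little's law).
```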
12. Enhanced distribution system performance through optimal placement of hybrid dynamic voltage restorer.
- Author
- Sarker, Jayanti, Sarker, Krishna, Goswami, Swapan Kumar, and Chatterjee, Debashis
- Subjects
- FLEXIBLE AC transmission systems, OPTIMIZATION algorithms, IDEAL sources (Electric circuits), SOFT computing, TEST systems
- Abstract
Flexible AC Transmission System (FACTS) devices are getting increasing applications in Power Systems because of their several beneficial features. The present paper proposes a model of Hybrid-Dynamic Voltage Restorer (H-DVR) which is equivalent to a compound FACTS device. But unlike other compound FACTS devices, this H-DVR consists of only one voltage source converter (VSC). Besides, this H-DVR is having no series injection transformer. This H-DVR can operate either in series compensation mode or in shunt compensation mode which one is in priority. In this paper, it is also shown that H-DVR minimizes voltage Total Harmonic Distortion (THD) and load harmonics, improves node voltages and reduces power loss. The absence of transformer in H-DVR model improves the cost savings. The effectiveness of H-DVR has been tested on IEEE-123 node test system. For having optimal position, size and number of H-DVR in test network, Cuckoo Optimization Algorithm (COA) has been adopted. The comparative study between normal operating condition and fault condition along with proposed H-DVR placement has also been illustrated in this paper in brief. The results of H-DVR installation have been compared with those of DVR installation for having acceptance of H-DVR. To establish the effectiveness of H-DVR installation, a comparative study of different soft computing techniques have been furnished. [ABSTRACT FROM AUTHOR]
- Published
- 2025
- Full Text
- View/download PDF
13. Development of Comprehensive Predictive Models for Evaluating Böhme Abrasion Value (BAV) of Dimension Stones Using Non-Destructive Testing Methods.
- Author
- Köken, Ekin
- Subjects
- BUILDING stones, PULSE wave analysis, ARTIFICIAL neural networks, ABRASION resistance, ROCK properties
- Abstract
Due to the global demand for dimension stones, fast and reliable evaluation tools are essential for assessing the quality of dimension stones. For this reason, this study aims to develop comprehensive tools for estimating the abrasion resistance of various dimension stones from Turkey. Non-destructive rock properties, including dry density (ρd), water absorption by weight (wa), and pulse wave velocity (Vp), were determined to build a comprehensive database for soft computing analyses. Three predictive models were established using multivariate adaptive regression spline (MARS), M5P, and artificial neural networks (ANN) methodologies. The performance of the models was assessed through scatter plots and statistical indicators, showing that the ANN-based model outperforms those based on M5P and MARS. The applicability of the models was further validated with independent data from the existing literature, confirming that all models are suitable for estimating varying Böhme abrasion values (BAVs). A MATLAB-based software tool, called Böhme abrasion calculator (v1.00), was also developed, allowing users to estimate BAV values by inputting adopted non-destructive rock properties. This tool is available upon request, supporting the dimension stone industry and fostering future research in this field. [ABSTRACT FROM AUTHOR]
- Published
- 2025
- Full Text
- View/download PDF
14. DNCCLA: Discrete New Caledonian Crow Learning Algorithm for Solving Traveling Salesman Problem.
- Author
- Alsaidi, Ali H., Al-Sorori, Wedad, Mohsen, Abdulqader M., and Ashraf, Imran
- Subjects
- OPTIMIZATION algorithms, MODULAR arithmetic, METAHEURISTIC algorithms, MACHINE learning, TRAVELING salesman problem
- Abstract
The development of metaheuristic algorithms has led to the solution of various optimization problems. Bioinspired optimization algorithms like the New Caledonian crow learning algorithm (NCCLA) are primarily designed to address continuous problems. As most real‐world problems are discrete, some operators have been proposed to convert continuous algorithms into discrete ones to address these problems. These operators include evolutionary operators such as crossover and mutation, transformation operators such as symmetry, swap, and shift, and K‐opt algorithms such as 2‐opt, 2‐opt and a half, and 3‐opt. Employing some of these operators usually accompanies changing the algorithm's rules or the movement patterns of its search agents. However, mathematical operators such as modular arithmetic and set theory and random permutation provide an ability to keep the same algorithm's agent proposed in its continuous version and K‐opt algorithms usually balance the algorithm's exploration and exploitation capabilities. Thus, this paper converts the NCCLA into a discrete version by utilizing a combination of those mathematical operators and the 3‐opt algorithm. This combination allows the algorithm to maintain a balance between exploration and exploitation. The resulting algorithm, called the discrete New Caledonian crow learning algorithm (DNCCLA), is employed to solve the traveling salesman problem (TSP). In addition, the paper investigates the best combination of mathematical operators with K‐opt algorithms or the symmetry operator. The performance results demonstrate that DNCCLA outperforms state‐of‐the‐art discrete algorithms, exhibiting a good balance between exploration and exploitation. The algorithm successfully solves 20 TSP instances of varying scales, and it consistently achieved the top rank among the tested algorithms. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
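Entry 14 relies on K-opt operators (2-opt, 3-opt) to discretize a continuous metaheuristic for the TSP. The sketch below is a generic 2-opt local-improvement pass over a tour, included only to illustrate that operator family; it is not the DNCCLA algorithm, and the tiny instance is made up.

```python
# Generic 2-opt improvement for a TSP tour on small instances.
import math

def tour_length(tour, coords):
    """Total closed-tour length for city indices in `tour`."""
    return sum(math.dist(coords[tour[i]], coords[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def two_opt(tour, coords):
    """Repeatedly reverse tour segments while doing so shortens the tour."""
    best = list(tour)
    improved = True
    while improved:
        improved = False
        for i in range(1, len(best) - 1):
            for j in range(i + 1, len(best)):
                candidate = best[:i] + best[i:j][::-1] + best[j:]
                if tour_length(candidate, coords) < tour_length(best, coords):
                    best, improved = candidate, True
    return best

# Example on a tiny made-up instance (four corners of a unit square):
pts = [(0, 0), (0, 1), (1, 0), (1, 1)]
print(two_opt([0, 2, 1, 3], pts))   # returns a shorter, non-crossing tour
```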
15. Charging station planning scheme by making efficient use of soft computing techniques for low carbon transition.
- Author
- Zhang, Qing and Liang, Yi
- Subjects
- INFRASTRUCTURE (Economics), RENEWABLE energy transition (Government policy), ELECTRIC vehicles, CLEAN energy, CONSUMER preferences, ELECTRIC automobiles
- Abstract
For sustainable energy management, soft computing can play an important role in developing a charging station planning for low carbon transition to address the challenges faced by electric vehicle (EV) infrastructure system. This article proposes a planning scheme for charging stations of electronic vehicles for low carbon transition using soft computing-based computational techniques in the transportation sector. The existing charging stations have many flaws which are acting as hindrance in the development and acceptance of electronic vehicles by consumers. These flaws include the unreasonable layout of charging terminals, difficulty in charging, consumption of time in charging, and consumer convenience. This paper proposes a novel planning model for charging electronic vehicles as an innovative solution for electronic vehicles for low carbon transition. The protection of the environment is important in this current era and switching to electronic vehicles is a mandate for low-carbon transitions in the transportation sector. The proposed method begins with the collection of data from diverse sites of charging stations using the existing charging infrastructure, and this data also analyzed for ascertaining the energy consumption patterns, availability of renewable energy options, flow of traffic, and consumer preferences. The proposed method aims to augment the deployment of convenient charging infrastructure to promote consumers to switch to electronic vehicles from fuel-based vehicles and to promote renewable energy solutions for electric vehicles (EVs) at economic costs. The location and capacity of charging terminals is also optimized with the aid of hybrid algorithm based on catfish and PSO algorithms in this proposed research work. The effectiveness of the proposed soft computing-based hybrid algorithm is tested by using standard statistical methods and respective results are presented in the result section of this article. The proposed research has practical implications in a real-world scenario where consumers can be promoted to use electric vehicles by optimizing the capacity and availability of the charging station infrastructure which can eventually reduce the carbon footprints from this world. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
16. Optimization of fused deposition modelling printing parameters using hybrid GA-fuzzy evolutionary algorithm.
- Author
- Deswal, Sandeep, Kaushik, Ashish, Garg, Ramesh Kumar, Sahdev, Ravinder Kumar, and Chhabra, Deepak
- Subjects
- FUSED deposition modeling, FUZZY algorithms, COMPRESSIVE strength, SOFT computing, FUZZY logic
- Abstract
The present study investigates the compressive strength performance of polylactic acid (PLA) polymer material parts printed using the Fused Deposition Modelling (FDM) three-dimensional (3D) printing process, with a particular emphasis on various machine input parameters. The face centred central composite design matrix approach was employed for experimental modelling, which was subsequently utilised as a knowledge base for the fuzzy algorithm. A hybrid evolutionary algorithm, i.e., Genetic-Algorithm (GA) assisted with Fuzzy Logic Methodology (FLM), was used to optimize input process parameters and compressive strength of FDM technique fabricated polymer material parts. The study concluded that the maximum compressive strength observed with GA integrated FLM was 49.7303 MPa at input factors (layer thickness-0.16 mm, temperature 208°C, infill-pattern-Honeycomb, infill-density-60% and speed/extrusion velocity-41 mm/s) which is higher than the experimental (47.08 MPa) and fuzzy predicted (47.101 MPa) value. This evolutionary hybrid soft computing methodology has optimized the compressive strength of PLA polymer material parts at optimum parameters combination set. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
17. Impacts of electric vehicle charging stations and DGs on RDS for improving voltage stability using honey badger algorithm.
- Author
- Thiruveedula, Madhubabu, Asokan, K., and Subrahmanyam, J. B. V.
- Subjects
- ELECTRIC vehicle charging stations, TEST systems, MATHEMATICAL optimization, SOFT computing, LOW voltage systems, HONEY
- Abstract
The intelligent computational technique used in this research handles the multi-objective voltage stability optimization (MOVSO) problem in radial distribution systems (RDS). The objectives of the proposed research are to minimize network loss, lower the average voltage deviation index (AVDI), and improve the voltage stability index (VSI) of RDS by taking into account the recently created distributed generators (DGs) and electric vehicle charging stations (EVCSs). To address the MOVSO problem, a novel and innovative honey badger algorithm (HBA) optimization technique is put forth. The two stages of HBA, known as the "digging" and "honey" phases, are responsible for effectively identifying the ideal position and appropriate quantity of EVCSs and DGs. The standard IEEE 33 node test system with different case studies is considered to validate the performance of HBA. The simulation results of improved voltage profile, minimized power loss, AVDI and improved VSI are tabulated. The proposed HBA fine-tunes the ideal position and size of the EVCSs to significantly enhance RDS performance under higher loading circumstances. To demonstrate the efficacy and originality of the suggested HBA, the numerical results are contrasted with those of earlier soft computing techniques. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
18. Hyperspectral Method Integrated with Machine Learning to Predict the Acidity and Soluble Solid Content Values of Kiwi Fruit During the Storage Period.
- Author
- Mansourialam, Amir, Rasekh, Mansour, Ardabili, Sina, Dadkhah, Majid, and Mosavi, Amir
- Subjects
- MACHINE learning, FRUIT storage, SUPPORT vector machines, ARTIFICIAL intelligence, SOFT computing
- Abstract
Non-destructive evaluation is advancing in examining the properties of fruits. Kiwi fruit stands out as one of the popular fruits globally. Due to the influence of various environmental factors and storage conditions, diligent checking and storage of this fruit are essential. Therefore, monitoring changes in its properties during storage in cold storage facilities is crucial. One nondestructive method utilised in recent years to investigate changes in fruit texture is the hyperspectral method. This study uses the support vector machine (SVM) method to assess hyperspectral method's effectiveness in examining property changes in four kiwi varieties during storage in addition to predicting the properties such as acidity and soluble solid content. The evaluation of the predictive machine learning model revealed an accuracy of 95% in predicting acidity and soluble solid content (SSC) changes in kiwi fruit during storage. Further, investigations found that the support vector machine method provided relatively lower accuracy and sensitivity in identifying product variety during storage, with an average accuracy ranging from about 91% to 94%. These findings suggest that integrating machine learning methods with outputs from techniques like hyperspectral imaging enhances the non-destructive detection capability of fruits. This integration transforms obtained results into practical outcomes, serving as an interface between software and hardware. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
19. A survey on localization and energy efficiency in UWSN: bio-inspired approach.
- Author
- Murali, J. and Shankar, T.
- Abstract
The underwater wireless sensor networks (UWSNs) area is a developing area of research since there are tremendous opportunities like surveying marine life, installing and monitoring optical cables, detecting earthquakes, and surveillance of territorial borders. Though many applications exist, underwater research explored to date is less than five percent as it poses many issues and challenges like water currents, temperature, pressure, water salinity, disturbance by aquatic animals, and many more factors that affect the performance of sensors deployed inside water. A significant issue UWSNs face is focusing on energy efficiency to extend the life of submerged sensors placed in isolated areas. Resolving localization concerns is a primary additional concern. In this comprehensive survey, the basics of UWSNs are covered in the introduction, followed by a thorough literature review of the existing works mainly focusing on localization, energy efficiency, Bio-inspired algorithms (BIA), and the impact of implementing Machine Learning (ML) are discussed. In concurrent sections, we have discussed attributes, parameters useful for analysis, issues and challenges in UWSN, soft computing techniques, software and hardware tools available for extended research, and opportunities in UWSN. The researchers could gain perspective pathways at the end of this survey.Article highlights: Architectures, Applications, Issues, and Challenges of UWSN are mentioned in detail. Range-based, Range-free algorithms for Localization and various Energy Efficiency algorithms are analyzed. PSO, Artificial fish swarm optimization, Hybrid grass hopper algorithm, Firefly, Dolphin swarm, and twenty-three other Bio-inspired optimization algorithms are reviewed. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
20. Soft-Computing Analysis and Prediction of the Mechanical Properties of High-Volume Fly-Ash Concrete Containing Plastic Waste and Graphene Nanoplatelets.
- Author
- Adamu, Musa, Ibrahim, Yasser E., and Jibril, Mahmud M.
- Subjects
- ELASTIC modulus, KRIGING, PLASTICS, PLASTIC scrap, CONCRETE durability
- Abstract
The rising population and demand for plastic materials lead to increasing plastic waste (PW) annually, much of which is sent to landfills without adequate recycling, posing serious environmental risks globally. PWs are grinded to smaller sizes and used as aggregates in concrete, where they improve environmental and materials sustainability. On the other hand, PW causes a significant reduction in the mechanical properties and durability of concrete. To mitigate the negative effects of PW, highly reactive pozzolanic materials are normally added as additives to the concrete. In this study, PW was used as a partial substitute for coarse aggregate, and graphene nanoplatelets (GNPs) were used as additives to high-volume fly-ash concrete (HVFAC). Utilizing PW as aggregates and GNPs as additives has been found to enhance the mechanical properties of HVFAC. Hence, this study employed two machine-learning (ML) models, namely Gaussian Process Regression (GPR) and Elman Neural Network (ELNN), to forecast the mechanical properties of HVFAC. The study input variables were PW, FA, GNP, W/C, CP, density, and slump, where the target variables are compressive strength (CS), modulus of elasticity (ME), splitting tensile strength (STS), and flexural strength (FS). A total of 240 datasets were employed in this study and divided into calibration (70%) and validation (30%) sets. During the prediction of the CS, it was found that GPR-M3 outperforms all other models with an R-value equal to 0.9930 and PCC value of 0.9929 in the calibration phase, and R-value = 0.9505 and PCC = 0.9339 in the verification phase. Additionally, during the modeling of FS, it was also noticed that GPR-M3 surpasses all other combinations with R = 0.9973 and PCC = 0.9973 in calibration and R = 0.9684 and PCC = 0.9428 in the verification phase. Moreover, in ME modeling, GPR-M3 is the best modeling combination and shows high accuracy with R = 0.9945 and PCC = 0.9945 in calibration and R = 0.9665 and PCC = 0.9584 in the verification phase. On the other hand, GPR-M3 outperforms all other models during the modeling of STS with R = 0.9856 and PCC = 0.9855 in calibration, and R = 0.9482 and PCC = 0.9353 in the verification phase. Further quantitative analysis shows that, in the prediction of CS, the GPR improves the prediction accuracy of ELNN by 0.49%, while during the prediction of the splitting tensile strength, it was also found that the GPR improved the accuracy of ELNN by 1.54%. In FS prediction, it was also improved by 7.66%, while in ME, it was improved by 4.9%. In conclusion, this AI-based model proves how accurate and effective it was to employ an ML-based model in forecasting the mechanical properties of HVFAC. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
21. A Two-way Crossed Effects Fuzzy Panel Linear Regression Model
- Author
- Gholamreza Hesamian and Arne Johannssen
- Subjects
- Fuzzy panel data, Two-way crossed effects, LR Fuzzy numbers, Least absolute error, Soft computing, Electronic computers. Computer science, QA75.5-76.95
- Abstract
Abstract Over the last two decades, the panel data model has become a focus of applied research. While there are numerous proposals for soft regression models in the literature, only a few linear regression models have been proposed based on fuzzy panel data. However, these models have serious limitations. This study is an attempt to propose a kind of two-way fuzzy panel regression model with crossed effects, fuzzy responses and crisp predictors to overcome the shortcomings of these models in real applications. The corresponding parameter estimation is provided based on a three-step procedure. For this purpose, the conventional least absolute error technique is employed. Two real data sets are analyzed to investigate the fitting and predictive capabilities of the proposed fuzzy panel regression model. These real data applications demonstrate that our proposed model has good fitting accuracy and predictive performance.
- Published
- 2025
- Full Text
- View/download PDF
22. Predicting Max Scour Depths near Two-Pier Groups Using Ensemble Machine-Learning Models and Visualizing Feature Importance with Partial Dependence Plots and SHAP.
- Author
- Nandi, Buddhadev and Das, Subhasish
- Subjects
- MACHINE learning, STANDARD deviations, RANDOM forest algorithms, SOFT computing, BRIDGE foundations & piers, PIERS
- Abstract
Assessing scour depth (Sd) near side-by-side, tandem, and eccentric bridge piers is crucial for designing resilient structures. Researchers employed soft computing techniques to enhance Sd prediction models, focusing on ensemble machine learning (ML) methods for isolated piers. However, this research is limited on such two-pier groups, which necessitates a detailed study of how pier spacing and positioning collectively affect Sd predictions. A thorough examination is needed to analyze scouring patterns and the collective two-pier impact on estimating Sd using ensemble ML models. This study employs two ensemble ML models, random forest (RF) and extreme gradient boosting (XGBoost), to collectively predict circular two-pier Sd. Input parameters such as flow characteristics, sediment properties, time, pier gaps, and flow skew angle are rigorously evaluated to assess their combined impact on Sd. Partial dependence plots (PDP) and SHapley Additive exPlanations (SHAP) are used to visualize feature importance and effects on predicting Sd after training ML models, providing insights into individual features' influence on predictions. Performance indicators [coefficient of determination (R2), mean absolute error, and root mean squared error] assess the ML models' performance. Results demonstrate that XGBoost outperformed RF in training and testing phases with random search cross validation (CV) optimization for both piers. However, RF excelled over XGBoost in training using grid search CV and random search CV for both piers. Flow intensity was identified as the most influential variable, making the phenomenon highly vulnerable during model training with SFS and SHAP. Using ensemble ML models with detailed parameter evaluations and visualizations, engineers can predict Sd more effectively, thus enhancing scouring pattern understanding to design resilient structures. Practical Applications: The practical application of this study shows where new bridges are needed next to old bridges for traffic in rapidly populating cities. Bridge piers are placed side-by-side, in tandem, or eccentrically. Scour depth can increase or decrease due to dynamic interference if these piers are not studied properly. This study examines how interference impacts scour depth in various positions and estimates it using an ensemble ML model. This ensemble model accurately predicts scour depth around such interfering piers, which outperforms the classic model. Partial dependence plots show how parameters affect scour depth, considering interference effects. The model shows how interference impacts the scour depth for designing such piers. This model can be used to analyze the impact of such two-pier configurations after integrating the field data for studying field installation effects. Experts and practitioners can utilize the model to improve bridge placements by predicting how pier interference affects scour depth, thus enhancing safety in the design process. [ABSTRACT FROM AUTHOR]
- Published
- 2025
- Full Text
- View/download PDF
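Entry 22 pairs ensemble regressors with partial dependence plots and SHAP to explain scour-depth predictions. The following is a hedged sketch of that general workflow using scikit-learn and the shap package on synthetic stand-in data; the feature names and values are placeholders, not the authors' dataset or tuned models.

```python
# Fit a random forest on synthetic stand-ins for scour inputs, then inspect
# feature effects with partial dependence and SHAP values.
import numpy as np
import shap                                   # pip install shap
from sklearn.ensemble import RandomForestRegressor
from sklearn.inspection import PartialDependenceDisplay

rng = np.random.default_rng(0)
X = rng.random((300, 4))                      # placeholder columns: [flow_intensity, sediment_size, pier_gap, skew_angle]
y = 2.0 * X[:, 0] + 0.5 * X[:, 2] + 0.1 * rng.standard_normal(300)

model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)

# Partial dependence of the predicted target on the first feature.
PartialDependenceDisplay.from_estimator(model, X, features=[0])

# SHAP values quantify each feature's contribution to each prediction.
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X)
print(np.abs(shap_values).mean(axis=0))       # mean |SHAP| ~ global feature importance
```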
23. Enhanced operation of PVWPS based on advanced soft computing optimization techniques
- Author
- Mahmoud M. Elymany, Mohamed A. Enany, Hamid Metwally, and Ahmed A. Shaier
- Subjects
- Soft computing, Gorilla troop algorithm (GTO), Honey badger algorithm (HBA), Snake algorithm (SAO), ANFIS, MATLAB simulation, Medicine, Science
- Abstract
Abstract This study introduces three soft computing (SC) optimization algorithms aimed at enhancing the efficiency of photovoltaic water pumping systems (PVWPS). These algorithms include the Gorilla Troop Algorithm (GTO), Honey Badger Algorithm (HBA), and Snake Algorithm (SAO). The goal of the SC optimizers is to maximize the output power of the PV array (PPV) and enhance the efficiency of the DC motor (η), thereby optimizing the water flow rate (Q) of the pumping system. The analytical modeling approach proposed in this study involves forecasting the optimal duty cycle (Dop) for a buck-boost converter, taking into account variables such as solar radiation (G) and ambient temperature (T). A comparative analysis is conducted between the suggested SC optimizers and analytical modeling. MATLAB simulation is employed to explore an adaptive neuro-fuzzy inference system (ANFIS) trained for the proposed system. The objective is to assess system performance and accuracy. Findings indicate a strong convergence between the analytical model and the simulation model utilizing SC optimizers. Moreover, the neuro-fuzzy system trained offline, coupled with the proposed SC optimizers, demonstrates superior performance compared to traditional control methods like perturb and observe (P&O) and incremental conductance (IC). This superiority is evident across various metrics including motor efficiency (η), photovoltaic (PV) output power (PPV), water flow rate (Q), and time response.
- Published
- 2024
- Full Text
- View/download PDF
24. An advanced encryption system based on soft sets
- Author
- Erdal Bayram, Gülşah Çelik, and Mustafa Gezek
- Subjects
- encryption, maximum and minimum operators, soft computing, soft cryptosystem, soft sets, Mathematics, QA1-939
- Abstract
Given the application domains of soft set theory, such as decision-making processes, image processing, machine learning, and data mining, it is natural to consider that this theory could be utilized more effectively in encryption systems. A review of the literature reveals that soft set-based encryption systems have been explored in a limited number of studies. This study seeks to develop a new approach for soft sets in encryption systems by utilizing newly introduced algebraic and topological tools. In this system, parties will be able to generate encryption keys independently using soft sets they determine themselves rather than through prior mutual agreement. Additionally, the method of key generation and the size of the key space in the resulting encryption system provides a more secure and distinct alternative compared to existing soft set-based encryption systems.
- Published
- 2024
- Full Text
- View/download PDF
25. A survey on localization and energy efficiency in UWSN: bio-inspired approach
- Author
- J. Murali and T. Shankar
- Subjects
- Bio-inspired algorithms, Energy efficiency algorithms, Localization, Machine learning, Soft computing, Underwater wireless sensor networks, Science (General), Q1-390
- Abstract
Abstract The underwater wireless sensor networks (UWSNs) area is a developing area of research since there are tremendous opportunities like surveying marine life, installing and monitoring optical cables, detecting earthquakes, and surveillance of territorial borders. Though many applications exist, underwater research explored to date is less than five percent as it poses many issues and challenges like water currents, temperature, pressure, water salinity, disturbance by aquatic animals, and many more factors that affect the performance of sensors deployed inside water. A significant issue UWSNs face is focusing on energy efficiency to extend the life of submerged sensors placed in isolated areas. Resolving localization concerns is a primary additional concern. In this comprehensive survey, the basics of UWSNs are covered in the introduction, followed by a thorough literature review of the existing works mainly focusing on localization, energy efficiency, Bio-inspired algorithms (BIA), and the impact of implementing Machine Learning (ML) are discussed. In concurrent sections, we have discussed attributes, parameters useful for analysis, issues and challenges in UWSN, soft computing techniques, software and hardware tools available for extended research, and opportunities in UWSN. The researchers could gain perspective pathways at the end of this survey.
- Published
- 2024
- Full Text
- View/download PDF
26. Multiple objectives optimization of injection-moulding process for dashboard using soft computing and particle swarm optimization
- Author
- Mehdi Moayyedian, Mohammad Reza Chalak Qazani, Parisa Jourabchi Amirkhizi, Houshyar Asadi, and Mohsen Hedayati-Dezfooli
- Subjects
- Injection moulding, Warpage/shrinkage/sink mark, Soft computing, Multiple objectives particle swarm optimisation, Pareto front, Medicine, Science
- Abstract
Abstract This research focuses on utilizing injection moulding to assess defects in plastic products, including sink marks, shrinkage, and warpages. Process parameters, such as pure cooling time, mould temperature, melt temperature, and pressure holding time, are carefully selected for investigation. A full factorial design of experiments is employed to identify optimal settings. These parameters significantly affect the physical and mechanical properties of the final product. Soft computing methods, such as finite element (FE), help mitigate behaviour by considering different input parameters. A CAD model of a dashboard component integrates into an FE simulation to quantify shrinkage, warpage, and sink marks. Four chosen parameters of the injection moulding machine undergo comprehensive experimental design. Decision tree, multilayer perceptron, long short-term memory, and gated recurrent units models are explored for injection moulding process modelling. The best model estimates defects. Multiple objectives particle swarm optimisation extracts optimal process parameters. The proposed method is implemented in MATLAB, providing 18 optimal solutions based on the extracted Pareto-Front.
- Published
- 2024
- Full Text
- View/download PDF
27. Intelligent agricultural robotic detection system for greenhouse tomato leaf diseases using soft computing techniques and deep learning
- Author
- Thi Thoa Mac, Tien-Duc Nguyen, Hong-Ky Dang, Duc-Toan Nguyen, and Xuan-Thuan Nguyen
- Subjects
- Soft computing, Fuzzy control, Tomato plant disease classification, DCGAN, Precision agriculture, Medicine, Science
- Abstract
Abstract The development of soft computing methods has had a significant influence on the subject of autonomous intelligent agriculture. This paper offers a system for autonomous greenhouse navigation that employs a fuzzy control algorithm and a deep learning-based disease classification model for tomato plants, identifying illnesses using photos of tomato leaves. The primary novelty in this study is the introduction of an upgraded Deep Convolutional Generative Adversarial Network (DCGAN) that creates augmented pictures of disease tomato leaves from original genuine samples, considerably enhancing the training dataset. To find the optimum training model, four deep learning networks (VGG19, Inception-v3, DenseNet-201, and ResNet-152) were carefully compared on a dataset of nine tomato leaf disease classes. These models have validation accuracy of 92.32%, 90.83%, 96.61%, and 97.07%, respectively, when using the original PlantVillage dataset. The system then uses an enhanced dataset with ResNet-152 network design to achieve a high accuracy of 99.69%, as compared to the original dataset with ResNet-152’s accuracy of 97.07%. This improvement indicates the use of the proposed DCGAN in improving the performance of the deep learning model for greenhouse plant monitoring and disease detection. Furthermore, the proposed approach may have a broader use in various agricultural scenarios, potentially altering the field of autonomous intelligent agriculture.
- Published
- 2024
- Full Text
- View/download PDF
28. Optimizing Injection Molding for Propellers with Soft Computing, Fuzzy Evaluation, and Taguchi Method
- Author
- M. Hedayati-Dezfooli, Mehdi Moayyedian, Ali Dinc, Mostafa Abdrabboh, Ahmed Saber, and A. M. Amer
- Subjects
- injection molding, shrinkage, sink mark, soft computing, fahp, topsis, taguchi., Technology (General), T1-995, Social sciences (General), H1-99
- Abstract
This research explores multi-objective optimization in injection molding with a focus on identifying the optimal configuration for the moldability index in aviation propeller manufacturing. The study employs the Taguchi method and fuzzy analytic hierarchy process (FAHP) combined with the Technique for the Order Performance by Similarity to the Ideal Solution (TOPSIS) to systematically evaluate diverse objectives. The investigation specifically addresses two prevalent defects—shrinkage rate and sink mark—that impact the final quality of injection-molded components. Polypropylene is chosen as the injection material, and critical process parameters encompass melt temperature, mold temperature, filling time, cooling time, and pressure holding time. The Taguchi L25 orthogonal array is selected, considering the number of levels and parameters, and Finite Element Analysis (FEA) is applied to enhance precision in results. To validate both simulation outcomes and the proposed optimization methodology, Artificial Neural Network (ANN) analysis is conducted for the chosen component. The Fuzzy-TOPSIS method, in conjunction with ANN, is employed to ascertain the optimal levels of the selected parameters. The margin of error between the chosen optimization methods is found to be less than one percent, underscoring their suitability for injection molding optimization. The efficacy of the selected optimization method has been corroborated in prior research. Ultimately, employing the fuzzy-TOPSIS optimization method yields a minimum shrinkage value of 16.34% and a sink mark value of 0.0516 mm. Similarly, utilizing the ANN optimization method results in minimum values of 16.42% for shrinkage and 0.0519 mm for the sink mark. Doi: 10.28991/ESJ-2024-08-05-025 Full Text: PDF
- Published
- 2024
- Full Text
- View/download PDF
29. Passive earth pressure on vertical rigid walls with negative wall friction coupling statically admissible stress field and soft computing
- Author
- Jim Shiau, Tan Nguyen, and Tram Bui-Ngoc
- Subjects
- Negative wall friction, Admissible stress field, Passive earth pressure, Soft computing, Medicine, Science
- Abstract
Abstract It is well known that the roughness of a wall plays a crucial role in determining the passive earth pressure that is exerted on a rigid wall. While the effects of positive wall roughness have been extensively studied in the past few decades, the study of passive earth pressure with negative wall friction is rarely found in the literature. This study aims to provide a precise solution for negative friction walls under passive wall conditions. The research is initiated by adopting a radial stress field for the cohesionless backfill and employs the concept of stress self-similarity. The problem is then formulated in a way that a statically admissible stress field be developed throughout an analyzed domain using a two-step numerical framework. The framework involves the successful execution of numerical integration, which leads to the exploration of the statically admissible stress field in cohesionless backfills under negative wall friction. This, in turn, helps to shed light on the mechanism of load transfer in such situations so that reliable design charts and tables be provided for practical uses. The study continues with a soft computing model that leads to more robust and effective designs for earth-retaining structures under various negative wall frictions and sloping backfills.
- Published
- 2024
- Full Text
- View/download PDF
30. Comparing the Predictability of Soft Computing and Statistical Techniques for the Prediction of Tensile Strength of Additively Manufactured Carbon Fiber Polylactic Acid Parts.
- Author
- Raj, Abhishek, Tyagi, Bobby, Goyal, Ashish, Sahai, Ankit, and Sharma, Rahul Swarup
- Subjects
- ARTIFICIAL neural networks, FUSED deposition modeling, CARBON fibers, ARTIFICIAL intelligence, SOFT computing, POLYLACTIC acid
- Abstract
The objective of this study is to investigate the influence of input factors, namely layer height (LH), print speed (PS), and infill line direction (ID), on the tensile strength (TS) of polymer components fabricated using fused deposition modelling. The primary objective of this study is to construct a robust prediction model for TS utilising soft computing methodologies, namely a two-layered feed-forward backpropagation algorithm and a hybrid neural network-integrated fuzzy interface system (FIS). The specimens utilised for analysis were fabricated using carbon fibre fibre-reinforced polylactic acid (CF-PLA) composites per the ASTM D638 standard. A dataset is generated using a C27 orthogonal array to capture variations in LH, PS, and ID techniques. In this study, two soft computing methodologies, namely an artificial neural network (ANN) and an adaptive neuro-fuzzy inference system (ANFIS), are employed to effectively describe the fused deposition process and accurately predict the TS of the printed objects. The performance of these strategies is evaluated in comparison to the response surface methodology (RSM). The findings imply an inverse correlation between the TS and the LH, indicating that decreasing the LH can improve the structural integrity of the printed components. Furthermore, The ID region's effect depends on the tensile force's orientation. Infill lines aligned at 0° had the highest TS, while those at 90° had the lowest. The results of this study show how input variables affect the strength of additively produced (AM) polymer components. Soft computing methods enable AM parameter optimisation and accurate TS forecasts. The ANFIS method predicted tensile strength better than ANN and RSM. The negative relationship between LH and TS emphasises the importance of choosing the right LH for mechanical qualities. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
31. Prediction of crippling load of I-shaped steel columns by using soft computing techniques.
- Author
- Mustafa, Rashid
- Subjects
- IRON & steel columns, STANDARD deviations, SOFT computing, IMPACT loads, RANDOM forest algorithms
- Abstract
This study is primarily aimed at creating three machine learning models: artificial neural network (ANN), random forest (RF), and k-nearest neighbour (KNN), so as to predict the crippling load (CL) of I-shaped steel columns. Five input parameters, namely length of column (L), width of flange (bf), flange thickness (tf), web thickness (tw) and height of column (H), are used to compute the crippling load (CL). A range of performance indicators, including the coefficient of determination (R2), variance account factor (VAF), a-10 index, root mean square error (RMSE), mean absolute error (MAE) and mean absolute deviation (MAD), are used to assess the effectiveness of the established machine learning models. The results show that all of the three ML (machine learning) models can accurately predict the crippling load, but the performance of ANN is superior: it delivers the highest value of R2 = 0.998 and the lowest value of RMSE = 0.008 in the training phase, as well as the highest value of R2 = 0.996 and the smaller value of RMSE = 0.012 in the testing phase. Additional methods, including rank analysis, reliability analysis, regression plot, Taylor diagram and error matrix plot, are employed to assess the models' performance. The reliability index (β) of the models is calculated by using the first-order second moment (FOSM) technique, and the result is compared with the actual value. Additionally, sensitivity analysis is performed to check the impact of the input variables on the output (CL), finding that bf has the greatest impact on the crippling load, followed by tf, tw, H and L, in that order. This study demonstrates that ML techniques are useful for developing a reliable numerical tool for measuring the crippling load of I-shaped steel columns. It is found that the proposed techniques can also be used to predict other kinds of failures as well as different kinds of perforated columns. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
32. Optimizing network lifetime in wireless sensor networks: a hierarchical fuzzy logic approach with LEACH integration.
- Author
- Dadhirao, Chandrika, Reddy Sadi, Ram Prasad, Rao, Prabhakar, and Terlapu, Panduranga Vital
- Subjects
- COMPUTER network protocols, FUZZY logic, TELECOMMUNICATION systems, SOFT computing, ENERGY consumption, WIRELESS sensor networks
- Abstract
Wireless sensor networks (WSNs) are of significant importance in many applications; nevertheless, their operational efficiency and longevity might be impeded by energy limitations. The low energy adaptive clustering hierarchy (LEACH) protocol has been specifically developed with the objective of achieving energy consumption equilibrium and regularly rotating cluster heads (CHs). This study presents a novel technique, namely the hierarchical fuzzy logic controller (HFLC), which is integrated with the LEACH protocol to enhance the process of CH selection and effectively prolong the network's operational lifespan. The HFLC system employs fuzzy logic as a means to address the challenges posed by uncertainty and imprecision. It assesses many aspects, including residual energy, node proximity, and network density, in order to make informed decisions. The combination of HFLC with LEACH demonstrates superior performance compared to the conventional LEACH protocol in terms of energy efficiency, stability, and network durability. This study emphasizes the potential of intelligent and adaptive mechanisms in improving the performance of WSNs by improving the survivability of nodes by reducing the energy consumption of the nodes during the communication of network process. It also paves the way for future research that integrates soft computing approaches into network protocols. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
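Entry 32 selects LEACH cluster heads with a hierarchical fuzzy logic controller over residual energy, proximity, and density. The toy sketch below shows only the flavor of fuzzy scoring with triangular memberships over two such inputs; the membership ranges, weights, and function names are invented for illustration and do not reproduce the paper's HFLC rule base.

```python
# Toy fuzzy scoring of a node's cluster-head (CH) eligibility in a LEACH-style WSN.
def tri(x, a, b, c):
    """Triangular membership function peaking at b on support [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def ch_score(residual_energy, distance, e_max=1.0, d_max=100.0):
    """Higher score = better CH candidate (illustrative weights, not the paper's rules)."""
    energy_high = tri(residual_energy / e_max, 0.5, 1.0, 1.5)   # membership in "high energy"
    distance_low = tri(distance / d_max, -0.5, 0.0, 0.5)        # membership in "close to sink"
    return 0.7 * energy_high + 0.3 * distance_low

# Example: a node with 0.8 J left of a 1 J budget, 30 m from the sink.
print(round(ch_score(0.8, 30.0), 3))
```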
33. An advanced encryption system based on soft sets.
- Author
- Bayram, Erdal, Çelik, Gülşah, and Gezek, Mustafa
- Subjects
- SOFT computing, DATA mining, IMAGE processing, MACHINE learning, DECISION making
- Abstract
Given the application domains of soft set theory, such as decision-making processes, image processing, machine learning, and data mining, it is natural to consider that this theory could be utilized more effectively in encryption systems. A review of the literature reveals that soft set-based encryption systems have been explored in a limited number of studies. This study seeks to develop a new approach for soft sets in encryption systems by utilizing newly introduced algebraic and topological tools. In this system, parties will be able to generate encryption keys independently using soft sets they determine themselves rather than through prior mutual agreement. Additionally, the method of key generation and the size of the key space in the resulting encryption system provides a more secure and distinct alternative compared to existing soft set-based encryption systems. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
34. Soft Computing Techniques to Model the Compressive Strength in Geo-Polymer Concrete: Approaches Based on an Adaptive Neuro-Fuzzy Inference System.
- Author
- Chang, Zhiguo, Shi, Xuyang, Zheng, Kaidan, Lu, Yijun, Deng, Yunhui, and Huang, Jiandong
- Subjects
- SOFT computing, SUSTAINABLE construction, CARBON emissions, COMPRESSIVE strength, SUSTAINABILITY, POLYMER-impregnated concrete
- Abstract
Media visual sculpture is a landscape element with high carbon emissions. To reduce carbon emission in the process of creating and displaying visual art and structures (visual communication), geo-polymer concrete (GePC) is considered by designers. It has emerged as an environmentally friendly substitute for traditional concrete, boasting reduced carbon emissions and improved longevity. This research delves into the prediction of the compressive strength of GePC (CSGePC) employing various soft computing techniques, namely SVR, ANNs, ANFISs, and hybrid methodologies combining Genetic Algorithm (GA) or Firefly Algorithm (FFA) with ANFISs. The investigation utilizes empirical datasets encompassing variations in concrete constituents and compressive strength. Evaluative metrics including RMSE, MAE, R2, VAF, NS, WI, and SI are employed to assess predictive accuracy. The results illustrate the remarkable precision of all soft computing approaches in predicting CSGePC, with hybrid models demonstrating superior performance. Particularly, the FFA-ANFISs model achieves a MAE of 0.8114, NS of 0.9858, RMSE of 1.0322, VAF of 98.7778%, WI of 0.9236, R2 of 0.994, and SI of 0.0358. Additionally, the GA-ANFISs model records a MAE of 1.4143, NS of 0.9671, RMSE of 1.5693, VAF of 96.8278%, WI of 0.8207, R2 of 0.987, and SI of 0.0532. These findings underscore the effectiveness of soft computing techniques in predicting CSGePC, with hybrid models showing particularly promising results. The practical application of the model is demonstrated through its reliable prediction of CSGePC, which is crucial for optimizing material properties in sustainable construction. Additionally, the model's performance was compared with the existing literature, showing significant improvements in predictive accuracy and robustness. These findings contribute to the development of more efficient and environmentally friendly construction materials, offering valuable insights for real-world engineering applications. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
35. Enhanced Prediction and Evaluation of Hydraulic Concrete Compressive Strength Using Multiple Soft Computing and Metaheuristic Optimization Algorithms.
- Author
- Li, Tianyu, Hu, Xiamin, Li, Tao, Liao, Jie, Mei, Lidan, Tian, Huiwen, and Gu, Jinlong
- Subjects
- MACHINE learning, METAHEURISTIC algorithms, OPTIMIZATION algorithms, HYDRAULIC structures, SOFT computing
- Abstract
Concrete is the material of choice for constructing hydraulic structures in water-related buildings, and its mechanical properties are crucial for evaluating the structural damage state. Machine learning models have proven effective in predicting these properties. However, a single machine learning model often suffers from overfitting and low prediction accuracy. To address this issue, this study introduces a novel hybrid method for predicting concrete compressive strength by integrating multiple soft computing algorithms and the stacking ensemble learning strategy. In the initial stage, several classic machine learning models are selected as base models, and the optimal parameters of these models are obtained using the improved metaheuristic-based gray wolf algorithm. In the subsequent stage, the lightweight gradient boosting tree (LightGBM) model and the metaheuristic-based optimization algorithm are combined to integrate information from base models. This process identifies the primary factors affecting concrete compressive strength. The experimental results demonstrate that the hybrid ensemble learning and heuristic optimization algorithm achieve a regression coefficient of 0.9329, a mean absolute error (MAE) of 2.7695, and a mean square error (MSE) of 4.0891. These results indicate superior predictive performance compared to other advanced methods. The proposed method shows potential for application in predicting the service life and assessing the structural damage status of hydraulic concrete structures, suggesting broad prospects. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
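Entry 35 stacks several base learners under a LightGBM meta-model. Below is a minimal sketch of that stacking pattern with scikit-learn's StackingRegressor on synthetic data; the grey-wolf hyperparameter tuning described in the paper is deliberately omitted, and the particular base models chosen here are an assumption.

```python
# Stacking: out-of-fold predictions of base regressors feed a LightGBM meta-learner.
from lightgbm import LGBMRegressor            # pip install lightgbm
from sklearn.ensemble import RandomForestRegressor, StackingRegressor
from sklearn.svm import SVR
from sklearn.neighbors import KNeighborsRegressor
from sklearn.datasets import make_regression
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

X, y = make_regression(n_samples=400, n_features=8, noise=5.0, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

stack = StackingRegressor(
    estimators=[("rf", RandomForestRegressor(random_state=0)),
                ("svr", SVR()),
                ("knn", KNeighborsRegressor())],
    final_estimator=LGBMRegressor(random_state=0),   # meta-learner over base predictions
    cv=5,
)
stack.fit(X_tr, y_tr)
print("MAE:", mean_absolute_error(y_te, stack.predict(X_te)))
```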
36. Semi-Supervised Soft Computing for Ammonia Nitrogen Using a Self-Constructing Fuzzy Neural Network with an Active Learning Mechanism.
- Author
- Zhou, Hongbiao, Huang, Yang, Yang, Dan, Chen, Lianghai, and Wang, Le
- Subjects
- FUZZY neural networks, STANDARD deviations, SOFT computing, K-means clustering, WATER quality
- Abstract
Ammonia nitrogen (NH3-N) is a key water quality variable that is difficult to measure in the water treatment process. Data-driven soft computing is one of the effective approaches to address this issue. Since the detection cost of NH3-N is very expensive, a large number of NH3-N values are missing in the collected water quality dataset, that is, a large number of unlabeled data are obtained. To enhance the prediction accuracy of NH3-N, a semi-supervised soft computing method using a self-constructing fuzzy neural network with an active learning mechanism (SS-SCFNN-ALM) is proposed in this study. In the SS-SCFNN-ALM, firstly, to reduce the computational complexity of active learning, the kernel k-means clustering algorithm is utilized to cluster the labeled and unlabeled data, respectively. Then, the clusters with larger information values are selected from the unlabeled data using a distance metric criterion. Furthermore, to improve the quality of the selected samples, a Gaussian regression model is adopted to eliminate the redundant samples with large similarity from the selected clusters. Finally, the selected unlabeled samples are manually labeled, that is, the NH3-N values are added into the dataset. To realize the semi-supervised soft computing of the NH3-N concentration, the labeled dataset and the manually labeled samples are combined and sent to the developed SCFNN. The experimental results demonstrate that the test root mean square error (RMSE) and test accuracy of the proposed SS-SCFNN-ALM are 0.0638 and 86.31%, respectively, which are better than the SCFNN (without the active learning mechanism), MM, DFNN, SOFNN-HPS, and other comparison algorithms. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
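The cluster-then-select step of the active learning mechanism described above can be illustrated in a few lines. The sketch below is a simplified stand-in on synthetic water-quality features: unlabeled samples are clustered with k-means and, per cluster, the point farthest from the labeled set is proposed for manual NH3-N labeling; the Gaussian-regression redundancy filter and the fuzzy neural network itself are omitted.

```python
# Hedged sketch of active-learning sample selection: cluster the unlabeled pool with k-means
# and pick, from each cluster, the point farthest from the already-labeled data as a candidate
# for manual NH3-N labeling. Any regressor could consume the enlarged labeled set afterwards.
import numpy as np
from sklearn.cluster import KMeans
from scipy.spatial.distance import cdist

rng = np.random.default_rng(0)
X_labeled = rng.normal(size=(40, 5))       # stand-in water-quality features with NH3-N labels
X_unlabeled = rng.normal(size=(400, 5))    # stand-in pool where NH3-N was not measured

km = KMeans(n_clusters=8, n_init=10, random_state=0).fit(X_unlabeled)
selected = []
for c in range(km.n_clusters):
    members = np.where(km.labels_ == c)[0]
    # distance-based informativeness: farther from the labeled data = more informative
    d = cdist(X_unlabeled[members], X_labeled).min(axis=1)
    selected.append(int(members[np.argmax(d)]))
print("indices proposed for manual labeling:", selected)
```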
37. Intelligent agricultural robotic detection system for greenhouse tomato leaf diseases using soft computing techniques and deep learning.
- Author
- Mac, Thi Thoa, Nguyen, Tien-Duc, Dang, Hong-Ky, Nguyen, Duc-Toan, and Nguyen, Xuan-Thuan
- Subjects
- GENERATIVE adversarial networks, AGRICULTURE, SOFT computing, DEEP learning, PLANT classification, TOMATOES
- Abstract
The development of soft computing methods has had a significant influence on autonomous intelligent agriculture. This paper presents a system for autonomous greenhouse navigation that employs a fuzzy control algorithm together with a deep learning-based disease classification model that identifies tomato plant illnesses from photos of tomato leaves. The primary novelty of this study is an upgraded Deep Convolutional Generative Adversarial Network (DCGAN) that creates augmented pictures of diseased tomato leaves from original genuine samples, considerably enlarging the training dataset. To find the optimal training model, four deep learning networks (VGG19, Inception-v3, DenseNet-201, and ResNet-152) were carefully compared on a dataset of nine tomato leaf disease classes. These models achieved validation accuracies of 92.32%, 90.83%, 96.61%, and 97.07%, respectively, on the original PlantVillage dataset. With the DCGAN-enhanced dataset, the ResNet-152 architecture reaches an accuracy of 99.69%, compared with 97.07% on the original dataset. This improvement demonstrates the value of the proposed DCGAN in boosting the performance of the deep learning model for greenhouse plant monitoring and disease detection. Furthermore, the proposed approach may have broader use in various agricultural scenarios, potentially transforming the field of autonomous intelligent agriculture. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
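For the classification stage, a transfer-learning setup similar in spirit to the one compared in the abstract can be sketched as follows. It assumes a hypothetical data/train folder of (real plus GAN-augmented) leaf images organised one class per subfolder and nine disease classes; the DCGAN itself is not shown.

```python
# Minimal fine-tuning sketch: adapt a pretrained ResNet-152 head to a 9-class
# tomato-leaf-disease problem. The directory name and class count are assumptions.
import torch
import torch.nn as nn
from torchvision import models, datasets, transforms
from torch.utils.data import DataLoader

tfm = transforms.Compose([transforms.Resize((224, 224)), transforms.ToTensor()])
train_ds = datasets.ImageFolder("data/train", transform=tfm)   # hypothetical path
train_dl = DataLoader(train_ds, batch_size=32, shuffle=True)

model = models.resnet152(weights=models.ResNet152_Weights.IMAGENET1K_V1)
model.fc = nn.Linear(model.fc.in_features, 9)   # 9 leaf-disease classes
device = "cuda" if torch.cuda.is_available() else "cpu"
model.to(device)

opt = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()
model.train()
for epoch in range(5):
    for xb, yb in train_dl:
        xb, yb = xb.to(device), yb.to(device)
        opt.zero_grad()
        loss = loss_fn(model(xb), yb)
        loss.backward()
        opt.step()
    print(f"epoch {epoch}: last-batch loss {loss.item():.3f}")
```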
38. Multiple objectives optimization of injection-moulding process for dashboard using soft computing and particle swarm optimization.
- Author
- Moayyedian, Mehdi, Qazani, Mohammad Reza Chalak, Amirkhizi, Parisa Jourabchi, Asadi, Houshyar, and Hedayati-Dezfooli, Mohsen
- Subjects
- PARTICLE swarm optimization, SOFT computing, FACTORIAL experiment designs, PLASTICS, TIME pressure
- Abstract
This research uses injection moulding to assess defects in plastic products, including sink marks, shrinkage, and warpage. Process parameters, namely pure cooling time, mould temperature, melt temperature, and pressure holding time, are selected for investigation, and a full factorial design of experiments is employed to identify optimal settings, since these parameters significantly affect the physical and mechanical properties of the final product. Finite element (FE) simulation captures this behaviour across the different input parameters: a CAD model of a dashboard component is integrated into the FE simulation to quantify shrinkage, warpage, and sink marks, and the four chosen injection moulding parameters undergo a comprehensive experimental design. Decision tree, multilayer perceptron, long short-term memory, and gated recurrent unit models are explored for modelling the injection moulding process, and the best of these models is used to estimate the defects. Multi-objective particle swarm optimisation then extracts the optimal process parameters. The proposed method is implemented in MATLAB and provides 18 optimal solutions from the extracted Pareto front. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
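The final optimisation step searches for process settings that jointly minimise the predicted defects. As a simplified stand-in for the paper's multi-objective particle swarm optimisation, the sketch below samples candidate settings, scores them with a hypothetical surrogate, and keeps the non-dominated (Pareto-optimal) ones.

```python
# Illustrative multi-objective filtering: given a surrogate that predicts three defect
# measures from four normalized process parameters, keep only the non-dominated candidates.
# Random sampling plus a dominance filter replaces MOPSO purely to keep the example short.
import numpy as np

def predict_defects(params):
    # stand-in surrogate: (shrinkage, warpage, sink_mark) as smooth functions of the 4 inputs
    cool, mold_T, melt_T, hold_t = params.T
    shrinkage = (melt_T - 0.5) ** 2 + 0.1 * hold_t
    warpage = (mold_T - 0.3) ** 2 + 0.2 * cool
    sink = (hold_t - 0.6) ** 2 + 0.1 * melt_T
    return np.stack([shrinkage, warpage, sink], axis=1)

rng = np.random.default_rng(1)
candidates = rng.uniform(0, 1, size=(2000, 4))      # normalized process parameters
objs = predict_defects(candidates)

def is_dominated(i, objs):
    better_eq = (objs <= objs[i]).all(axis=1)
    strictly = (objs < objs[i]).any(axis=1)
    return bool(np.any(better_eq & strictly & (np.arange(len(objs)) != i)))

pareto = [i for i in range(len(objs)) if not is_dominated(i, objs)]
print(f"{len(pareto)} non-dominated process settings found")
```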
39. A scientometrics review of conventional and soft computing methods in the slope stability analysis.
- Author
- Ahmad, Feezan, Tang, Xiao-Wei, Ahmad, Mahmood, Najeh, Taoufik, Gamil, Yaser, Ebid, Ahmed M., and Song, Jinhu
- Subjects
- ARTIFICIAL neural networks, SLOPE stability, KRIGING, SOFT computing, SUPPORT vector machines
- Abstract
Predicting slope stability is important for preventing and mitigating landslide disasters. This paper examines existing approaches for analyzing slope stability. Several established conventional approaches can be applied in this context; in recent decades, however, soft computing methods have been extensively developed and employed in stochastic slope stability analysis, notably as surrogate models that improve computational efficiency compared with traditional approaches. Soft computing methods can deal with uncertainty and imprecision, and their performance may be quantified using indices such as the coefficient of determination in regression and accuracy in classification. This review covers conventional methods such as Bishop's method and Janbu's method, as well as soft computing models such as support vector machines, artificial neural networks, Gaussian process regression, and decision trees, among others. The advantages and limitations of soft computing techniques relative to conventional methods are also thoroughly covered. The achievements of soft computing methods are summarized from two aspects: predicting the factor of safety and classifying slope stability. Key research challenges and future prospects are also given. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
40. LSTM Gate Disclosure as an Embedded AI Methodology for Wearable Fall-Detection Sensors †.
- Author
- Correia, Sérgio D., Roque, Pedro M., and Matos-Carvalho, João P.
- Subjects
- *ARTIFICIAL intelligence, *WIRELESS sensor networks, *SOFT computing, *WEARABLE technology, *OLDER people
- Abstract
In this paper, the concept of symmetry is used to design efficient inference for a fall-detection algorithm for elderly people on embedded processors; that is, there is a symmetric relation between the model's structure and its memory footprint on the embedded processor. Artificial intelligence (AI), and more particularly Long Short-Term Memory (LSTM) neural networks, is commonly used to detect falls in the elderly population from acceleration measurements. Nevertheless, embedded systems deployed in wearables or wireless sensor networks struggle with the customarily large dimensions of these networks. Because of this, the most popular implementations rely on edge or cloud computing, which raises privacy concerns and presents challenges since large amounts of data must be sent over a communication channel. The current work proposes a memory occupancy model for LSTM-type networks to pave the way to more efficient embedded implementations. It also offers a sensitivity analysis of the network hyper-parameters through a grid search procedure to refine the LSTM topology under scrutiny. Lastly, it proposes a new methodology that acts on the quantization granularity for embedded AI implementation on wearable devices. Extensive simulation results demonstrate the effectiveness and feasibility of the proposed methodology: for the embedded implementation of the LSTM fall-detection model on a wearable platform, an STM8L low-power processor can support a 40-hidden-cell LSTM network with an accuracy of 96.52%. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
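The memory-footprint question at the heart of this record can be approximated with the standard LSTM parameter count. The sketch below is a back-of-the-envelope estimate, not the authors' memory occupancy model; the 3-axis accelerometer input and 40 hidden cells mirror the configuration quoted in the abstract, and the bit widths illustrate how quantization granularity scales the footprint.

```python
# Back-of-the-envelope LSTM footprint: count the parameters of a single-layer LSTM
# (four gates, combined input/recurrent weights plus biases) and convert to bytes
# under different quantization granularities.
def lstm_params(input_dim: int, hidden: int, double_bias: bool = False) -> int:
    """4 gates, each with (input_dim + hidden) weights and one bias per hidden unit."""
    bias_sets = 2 if double_bias else 1   # PyTorch/Keras keep two bias vectors per gate
    return 4 * ((input_dim + hidden) * hidden + bias_sets * hidden)

def footprint_bytes(n_params: int, bits_per_weight: int) -> int:
    return n_params * bits_per_weight // 8

# 3-axis accelerometer input, 40 hidden cells as in the reported configuration
n = lstm_params(input_dim=3, hidden=40)
for bits in (32, 16, 8):
    print(f"{n} params -> {footprint_bytes(n, bits)} bytes at {bits}-bit weights")
```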
41. APPLYING SOFT COMPUTING METHODS IN BUSINESS ANALYTICS USING HYBRID GENETIC ALGORITHMS.
- Author
- Pandian, P. Senthil, Anakal, Sudhir, Padmavathy, C., and Pitty, Nagarjuna
- Subjects
- FUZZY neural networks, BUSINESS analytics, SOFT computing, MARKETING costs, RATE of return
- Abstract
In the era of big data and advanced analytics, the application of soft computing techniques has emerged as a powerful tool in solving complex business problems. This paper presents the use of hybrid genetic algorithms (HGAs) in business analytics to address challenges related to optimization, prediction, and decision-making processes. Traditional algorithms often struggle with large, nonlinear, and dynamic datasets typical of business environments. The incorporation of soft computing techniques such as genetic algorithms (GAs) and their hybridization with other methods like fuzzy logic and neural networks can help overcome these limitations. The problem addressed in this research is optimizing decision-making in marketing strategies, focusing on maximizing return on investment (ROI). Standard methods face difficulties in navigating through vast datasets and discovering optimal solutions. The hybrid genetic algorithm proposed in this study combines the exploration strength of GAs with the exploitative precision of local search techniques. The model was tested using a real-world dataset of marketing expenditures and revenues from a retail company. The HGA achieved an ROI improvement of 25%, significantly outperforming standard GAs and traditional optimization methods, which yielded only a 12% improvement. The flexibility and efficiency of this approach make it ideal for various business applications, including supply chain optimization, customer segmentation, and product pricing. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
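A hybrid genetic algorithm of the kind described above pairs global GA search with a local refinement step. The sketch below is illustrative only: a toy diminishing-returns revenue model replaces the paper's marketing dataset, and a greedy budget-shifting routine plays the role of the local search applied to the best individual of each generation.

```python
# Hedged sketch of a hybrid GA: evolve budget allocations across marketing channels,
# then refine the best individual with a greedy local search each generation.
import numpy as np

rng = np.random.default_rng(0)
N_CHANNELS, BUDGET, POP, GENS = 5, 100.0, 40, 60
coeff = rng.uniform(5.0, 15.0, N_CHANNELS)            # stand-in channel effectiveness

def roi(alloc):                                        # concave revenue: diminishing returns
    return (coeff * np.sqrt(alloc)).sum() - BUDGET

def normalize(alloc):
    alloc = np.clip(alloc, 0.0, None)
    return alloc / alloc.sum() * BUDGET

def local_search(alloc, step=1.0, iters=50):           # shift budget between channel pairs
    best, best_f = alloc.copy(), roi(alloc)
    for _ in range(iters):
        i, j = rng.choice(N_CHANNELS, 2, replace=False)
        cand = best.copy()
        move = min(step, cand[i])
        cand[i] -= move
        cand[j] += move
        if roi(cand) > best_f:
            best, best_f = cand, roi(cand)
    return best

pop = np.array([normalize(rng.uniform(size=N_CHANNELS)) for _ in range(POP)])
for _ in range(GENS):
    fit = np.array([roi(ind) for ind in pop])
    parents = pop[np.argsort(fit)][-POP // 2:]                         # truncation selection
    kids = []
    for _ in range(POP - len(parents)):
        a, b = parents[rng.integers(len(parents), size=2)]
        kids.append(normalize((a + b) / 2 + rng.normal(0, 2.0, N_CHANNELS)))  # crossover + mutation
    pop = np.vstack([parents, kids])
    best_idx = int(np.argmax([roi(ind) for ind in pop]))
    pop[best_idx] = local_search(pop[best_idx])                        # hybrid refinement step

best = max(pop, key=roi)
print("best allocation:", np.round(best, 1), " ROI proxy:", round(float(roi(best)), 2))
```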
42. ADVANCING INFORMATION TECHNOLOGY WITH IMMUNOLOGICAL COMPUTING - SOFT COMPUTING TECHNIQUES FOR ADAPTIVE AND ROBUST SYSTEMS.
- Author
- Kaliswaran, S., Sivasankari, R., Hanumantharao, Attru, Saravanan, V., and Kumari, G. Gokul
- Subjects
- INFORMATION technology, SOFT computing, IMMUNOLOGIC memory, IMMUNE system, ADAPTIVE computing systems, IMMUNOCOMPUTERS
- Abstract
The field of Information Technology (IT) is evolving rapidly, and with this growth comes the need for systems that are both adaptive and robust. Biological systems, especially the human immune system, demonstrate remarkable adaptability and resilience, inspiring the development of Immunological Computing (IC). This paper explores the application of immunological principles in Soft Computing techniques to create systems capable of responding to dynamic environments. Current IT systems often face challenges such as handling unpredictable changes, scalability, and security threats. Traditional computing approaches struggle to address these issues efficiently due to their rigid structures and limited adaptability. Immunological Computing, inspired by the immune system's ability to learn, remember, and adapt, offers a promising solution. The proposed method integrates immune system mechanisms like clonal selection, immune memory, and self/non-self-recognition into computational models. These models are coupled with soft computing techniques such as fuzzy logic, genetic algorithms, and neural networks, enhancing the system's ability to adapt to changing environments and uncertainties. In simulated tests, this approach demonstrated a significant improvement in robustness and adaptability compared to traditional IT systems. For instance, in a cybersecurity application, the immunological-based system detected and neutralized 94.6% of threats, a notable improvement over the 82.3% detected by conventional systems. Similarly, in a resource optimization scenario, the system adapted to dynamic workloads with an efficiency increase of 15% compared to static systems. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
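One of the immune mechanisms mentioned in the abstract, self/non-self recognition, is commonly implemented as a negative-selection algorithm. The sketch below is a minimal version on synthetic data: random detectors that overlap "self" (normal) samples are discarded, and the survivors flag anomalous inputs. All radii, dimensions, and data are illustrative assumptions, not the paper's cybersecurity testbed.

```python
# Minimal negative-selection sketch: generate random detectors, discard any that match
# "self" (normal) samples, then flag new samples covered by a surviving detector.
import numpy as np

rng = np.random.default_rng(42)
DIM, R_SELF = 4, 0.6                                 # feature dimension and matching radius

self_samples = rng.normal(0.0, 0.3, size=(200, DIM))  # "normal" behaviour cluster

# Keep only candidate detectors that lie far from every self sample.
candidates = rng.uniform(-2.0, 2.0, size=(3000, DIM))
dist_to_self = np.linalg.norm(candidates[:, None, :] - self_samples[None, :, :], axis=2).min(axis=1)
detectors = candidates[dist_to_self > R_SELF]

def is_anomalous(x, radius=R_SELF):
    """A sample is flagged as non-self if any surviving detector covers it."""
    return bool((np.linalg.norm(detectors - x, axis=1) < radius).any())

normal_probe = rng.normal(0.0, 0.3, size=DIM)
attack_probe = rng.uniform(1.0, 2.0, size=DIM)
print("normal probe flagged:", is_anomalous(normal_probe))
print("attack probe flagged:", is_anomalous(attack_probe))
```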
43. Advanced Soft Computing Ensemble for Modeling Contaminant Transport in River Systems: A Comparative Analysis and Ecological Impact Assessment.
- Author
- Chabokpour, Jafar
- Subjects
- SOFT computing, WATERSHEDS, ARTIFICIAL neural networks, ENVIRONMENTAL protection
- Abstract
The paper applies soft computing techniques to contaminant transport modeling in river systems, focusing on the Monocacy River. The research employed various techniques, including Artificial Neural Networks (ANN), Adaptive Neuro-Fuzzy Inference Systems (ANFIS), Support Vector Regression (SVR), and Genetic Algorithms (GA), to predict pollutant concentrations and estimate transport parameters. The ANN, particularly the Long Short-Term Memory architecture, delivered the best performance, with the lowest RMSE of 0.37 and the highest R-squared of 0.958. The ANFIS model obtained an RMSE of 0.40 and an R-squared of 0.945, providing a balance between accuracy and interpretability. SVR with an RBF kernel was also robust, attaining an RMSE of 0.42 and an R-squared of 0.940 with very fast training times. Across the studied reaches, the average flow velocity was estimated at 0.30 to 0.42 m/s and the longitudinal dispersion coefficient at 0.18 to 0.31 m²/s. In addition, the potentially affected fraction of species at peak concentrations, used to assess ecological impact, ranged from 0.07 to 0.35. The time-varying estimation shows the dispersion coefficient and the decay rate varying over 48 hours from 0.75 to 0.89 m²/s and from 0.10 to 0.13 day⁻¹, respectively. The research demonstrates the potential of soft computing approaches for modeling complex pollutant dynamics and provides valuable insights for river management and environmental protection strategies. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
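The physics that such surrogate models approximate is the one-dimensional advection-dispersion equation with first-order decay, whose closed-form solution for an instantaneous release of mass M at x = 0 into a river of cross-section A is C(x, t) = M / (A√(4πDt)) · exp(−(x − vt)² / (4Dt)) · exp(−kt). The sketch below evaluates this classical benchmark with parameter values drawn from the ranges quoted in the abstract; the release mass, cross-section, and observation point are illustrative assumptions.

```python
# Evaluate the classical 1-D advection-dispersion-decay solution for an instantaneous release.
# v and D are taken from the abstract's quoted ranges; M, A, and x_obs are assumptions.
import numpy as np

def concentration(x, t, M=5000.0, A=20.0, v=0.35, D=0.25, k=0.11 / 86400.0):
    """x in m, t in s; v in m/s, D in m^2/s, k converted from day^-1 to s^-1; C in mg/L."""
    return (M / (A * np.sqrt(4 * np.pi * D * t))
            * np.exp(-(x - v * t) ** 2 / (4 * D * t))
            * np.exp(-k * t))

x_obs = 2000.0                                    # observation station 2 km downstream
t = np.linspace(1.0, 12 * 3600.0, 6)              # first 12 hours after release
for ti, c in zip(t, concentration(x_obs, t)):
    print(f"t = {ti/3600:5.2f} h   C = {c:.4f} mg/L")
```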
44. A Prediction of Manning's nr in Compound Channels with Converged and Diverged Floodplains using GMDH Model.
- Author
- Bijanvand, Sajad, Mohammadi, Mirali, and Parsaie, Abbas
- Subjects
- WATERWAYS, FLOODPLAINS, SEDIMENTATION & deposition, FLOODS, SOFT computing
- Abstract
In the study of natural waterways, formulas such as Manning's equation are widely used to analyze flow structure characteristics. Typically, floodplains exhibit greater roughness than the main river channel, which results in higher flow velocities within the main channel. This difference in velocity can increase the sedimentation potential within the floodplains. Accurately determining Manning's roughness coefficient for compound channels, particularly during flood events, is therefore of significant interest to researchers. This study models the Manning roughness coefficient in compound channels with both converging and diverging floodplains using advanced soft computing techniques: a multi-layer perceptron artificial neural network (MLPNN), the Group Method of Data Handling (GMDH), and the Neuro-Fuzzy Group Method of Data Handling (NF-GMDH). A dataset from 196 laboratory experiments was used and divided into training and testing subsets. Input variables included the longitudinal slope (So), relative hydraulic radius (Rr), relative depth (Dr), relative dimension of flow aspects (δ*), and the convergent or divergent angle (θ) of the floodplain; the relative Manning roughness coefficient (nr) was the output variable. The results showed that all the models performed well, with the MLPNN model achieving the highest accuracy, characterized by R² = 0.99, RMSE = 0.001, SI = 0.0015, and DDR = 0.0233 during the testing phase. Further analysis of the soft computing models indicated that the most critical parameters influencing the results were So, Rr, Dr, δ*, and θ. These findings highlight the effectiveness of soft computing techniques in accurately modeling the Manning roughness coefficient in complex channel conditions and provide valuable insights for future research and practical applications in flood management and waterway analysis. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
45. Evaluating Volatility Using an ANFIS Model for Financial Time Series Prediction.
- Author
- Orozco-Castañeda, Johanna M., Alzate-Vargas, Sebastián, and Bedoya-Valencia, Danilo
- Subjects
- BOX-Jenkins forecasting, TIME series analysis, FUZZY systems, MATHEMATICAL optimization, PRICES
- Abstract
This paper develops and implements an Autoregressive Integrated Moving Average model with an Adaptive Neuro-Fuzzy Inference System (ARIMA-ANFIS) for BTCUSD price prediction and risk assessment. The goal of these forecasts is to identify patterns in past data and to understand the future behavior of the price and its volatility. The proposed ARIMA-ANFIS model is compared with a benchmark ARIMA-GARCH model. To evaluate the adequacy of the models in terms of risk assessment, we compare the confidence intervals of the price and accuracy measures for the testing sample. Additionally, we apply the Diebold-Mariano test to compare the accuracy of the two volatility forecasts. The results revealed that each volatility model focuses on different aspects of the data dynamics. The ANFIS model, while effective in certain scenarios, may expose one to unexpected risks due to its underestimation of volatility during turbulent periods. On the other hand, the GARCH(1,1) model, by producing higher volatility estimates, may lead to excessive caution, potentially reducing returns. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
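The ARIMA-GARCH benchmark referred to in the abstract can be reproduced in outline with standard libraries. The sketch below fits an ARIMA mean model and a GARCH(1,1) volatility model to a synthetic heavy-tailed return series; real BTCUSD data, the ANFIS counterpart, and the Diebold-Mariano comparison are outside its scope.

```python
# Hedged benchmark sketch: ARIMA for the price level, GARCH(1,1) for the return volatility.
# The random-walk series below is a stand-in, not BTCUSD data.
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA
from arch import arch_model

rng = np.random.default_rng(0)
returns = rng.standard_t(df=5, size=1000) * 0.02           # heavy-tailed stand-in returns
log_price = pd.Series(np.log(100.0) + np.cumsum(returns))  # synthetic log-price path

# Mean model: ARIMA(1,1,1) on the log-price level
arima_res = ARIMA(log_price, order=(1, 1, 1)).fit()
price_forecast = arima_res.forecast(steps=1)

# Volatility model: GARCH(1,1) on percentage returns
garch_res = arch_model(pd.Series(returns) * 100, vol="GARCH", p=1, q=1).fit(disp="off")
vol_forecast = np.sqrt(garch_res.forecast(horizon=1).variance.iloc[-1, 0]) / 100

print("next log-price estimate:", float(price_forecast.iloc[0]))
print("next-step volatility estimate:", float(vol_forecast))
```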
46. Passive earth pressure on vertical rigid walls with negative wall friction coupling statically admissible stress field and soft computing.
- Author
- Shiau, Jim, Nguyen, Tan, and Bui-Ngoc, Tram
- Subjects
- EARTH pressure, SOFT computing, RADIAL stresses, NUMERICAL integration
- Abstract
It is well known that the roughness of a wall plays a crucial role in determining the passive earth pressure exerted on a rigid wall. While the effects of positive wall roughness have been studied extensively over the past few decades, studies of passive earth pressure with negative wall friction are rarely found in the literature. This study aims to provide a precise solution for negative-friction walls under passive conditions. The research starts by adopting a radial stress field for the cohesionless backfill and employs the concept of stress self-similarity. The problem is then formulated so that a statically admissible stress field is developed throughout the analyzed domain using a two-step numerical framework. The framework involves numerical integration, which enables the exploration of the statically admissible stress field in cohesionless backfills under negative wall friction. This, in turn, sheds light on the mechanism of load transfer in such situations, so that reliable design charts and tables can be provided for practical use. The study concludes with a soft computing model that leads to more robust and effective designs for earth-retaining structures under various negative wall frictions and sloping backfills. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
47. Reliability‐based state parameter liquefaction probability prediction using soft computing techniques.
- Author
- Kumar, Kishan, Samui, Pijush, and Choudhary, S. S.
- Subjects
- *CONE penetration tests, *KRIGING, *SOFT computing, *ENGINEERING design, *CYCLIC loads
- Abstract
The state parameter (ψ) accounts for both relative density and effective stress, which significantly influence the cyclic stress or liquefaction characteristics of the soil. This study presents a ψ-based probabilistic liquefaction evaluation method using six soft computing (SC) techniques. The liquefaction probability of failure (PL) is calculated using the first-order second-moment (FOSM) method based on a cone penetration test (CPT) database. Then, six SC techniques, namely Gaussian process regression (GPR), relevance vector machine (RVM), functional network (FN), genetic programming (GP), minimax probability machine regression (MPMR), and multivariate adaptive regression splines (MARS), are used to predict PL. The performance of these models is examined using nine statistical indices, and regression plots, Taylor diagrams, an error matrix, and a rank analysis are presented to assess the SC models' performance. Finally, sensitivity analysis is performed using the cosine amplitude method (CAM) to assess the influence of the input parameters on the output. The study demonstrates that state-parameter-based SC models predict PL effectively: the GPR model performs best, closely followed by the RVM and MPMR models. Notably, two explicit equations for predicting PL are generated with the GP and MARS models. The sensitivity analysis reveals the earthquake magnitude (Mw) as the most sensitive parameter. The outcomes of this research will offer risk evaluations for geotechnical engineering designs and expand the use of state-parameter-based SC models in liquefaction analysis. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
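The FOSM step that produces the liquefaction probability PL can be stated compactly: treating the factor of safety FS as a normally distributed random variable with mean μ and standard deviation σ, the reliability index is β = (μ − 1)/σ and PL = Φ(−β). The sketch below evaluates this for a few illustrative μ/σ pairs; the values are not from the paper's CPT database.

```python
# Worked FOSM example: probability that the factor of safety drops below 1,
# assuming FS is normally distributed with the given mean and standard deviation.
from scipy.stats import norm

def liquefaction_probability(mu_fs: float, sigma_fs: float) -> float:
    beta = (mu_fs - 1.0) / sigma_fs          # first-order second-moment reliability index
    return float(norm.cdf(-beta))            # PL = P(FS < 1)

for mu, sigma in [(1.4, 0.2), (1.1, 0.25), (0.95, 0.15)]:
    print(f"mu_FS={mu:.2f}, sigma_FS={sigma:.2f}  ->  PL={liquefaction_probability(mu, sigma):.3f}")
```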
48. Stochastic debugging based reliability growth models for Open Source Software project.
- Author
- Singhal, Shakshi, Kapur, P. K., Kumar, Vivek, and Panwar, Saurabh
- Subjects
- *OPEN source software, *SOFT computing, *DISTRIBUTION (Probability theory), *SOFTWARE reliability, *COMPUTER software development
- Abstract
Open Source Software (OSS) is one of the most trusted technologies for implementing Industry 4.0 solutions. The study aims to assist a community of OSS developers in quantifying the product's reliability. This research proposes reliability growth models for OSS that incorporate dynamicity in the debugging process. To this end, stochastic differential equation-based analytical models are developed to represent the instantaneous rate of error generation, and the fault introduction rate is modeled using exponential and Erlang distribution functions. The empirical applicability of the proposed methodology is verified using real-life failure data from two Open Source Software projects, the GNU Network Object Model Environment (GNOME) and Eclipse. A soft computing technique, the genetic algorithm, is applied to estimate model parameters, and cross-validation is performed to examine the forecasting efficacy of the models. The predictive power of the developed models is compared with various benchmark studies, with data analysis conducted using the R statistical computing software. The results demonstrate the proposed models' efficacy in parameter estimation and predictive performance. In addition, an optimal release time policy based on the proposed mathematical models is presented by formulating an optimization model that minimizes the total cost of software development under reliability constraints. The numerical illustration and sensitivity analysis exhibit the proposed approach's practical significance, and the findings exemplify its capability for decision-making under uncertainty. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
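The parameter-estimation step can be illustrated with a classical stand-in: fitting the Goel-Okumoto mean-value function m(t) = a(1 − e^(−bt)) to cumulative failure counts with an evolutionary optimizer. The paper's SDE-based formulation and its genetic algorithm are replaced here by this simpler model and SciPy's differential evolution, and the failure data are synthetic.

```python
# Hedged sketch: estimate reliability-growth parameters by minimizing the squared error
# between observed cumulative failures and the Goel-Okumoto mean-value function.
import numpy as np
from scipy.optimize import differential_evolution

t = np.arange(1, 31)                                   # weeks since release
true_a, true_b = 120.0, 0.12
observed = true_a * (1 - np.exp(-true_b * t)) + np.random.default_rng(0).normal(0, 2, t.size)

def sse(params):
    a, b = params
    return float(np.sum((observed - a * (1 - np.exp(-b * t))) ** 2))

result = differential_evolution(sse, bounds=[(1.0, 500.0), (0.001, 1.0)], seed=0)
a_hat, b_hat = result.x
print(f"estimated a={a_hat:.1f}, b={b_hat:.3f}  (SSE={result.fun:.1f})")
```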
49. Canopy Temperature Estimation Using Gene Expression Programming Models and Artificial Neural Networks.
- Author
- Saeidinia, Mehri and Haghiabi, AmirHamzeh
- Subjects
- SATURATION vapor pressure, IRRIGATION scheduling, VAPOR pressure, SOFT computing, GENE expression
- Abstract
Canopy temperature (Tc) is one of the essential variables for irrigation scheduling, but measuring it is expensive and time-consuming. Simple approaches such as soft computing can be a good tool for this purpose, and there has been no documented research in this field. In this study, an ANN (an MLP with two hidden layers) and GEP models were used to estimate Tc from limited data, namely the dry-bulb (Ta) and wet-bulb (Tw) temperatures, saturation vapor pressure (es), actual vapor pressure (ea), and vapor-pressure deficit (VPD). Six combinations of input variables were investigated, and the best model was selected based on statistical indices during training and testing. Results showed that the performance of the models was influenced by the number of input variables, and the MLP models outperformed the GEP models during training and testing. MLP7 (input variables: es and ea), with an MSE of 1.08°C, RMSE of 1.04°C, and R² of 0.92 in the training phase and an MSE of 1.02, RMSE of 1.00, and R² of 0.95 in the validation phase, was selected as the best MLP model. GEP11 (input variables: Ta, Tw, es, ea, and VPD), with an MSE of 1.32, RMSE of 1.15, and R² of 0.89 in the training phase and an MSE of 0.91, RMSE of 0.95, and R² of 0.95 in the validation phase, was the best GEP model. Accordingly, the proposed GEP and MLP models can be relied on for estimating Tc. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
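The meteorological inputs listed in the abstract are straightforward to derive from dry- and wet-bulb readings: the Tetens formula gives es, the psychrometric relation gives ea, and VPD = es − ea. The sketch below computes these features, which would then feed an MLP or GEP model; the psychrometric constant assumes sea-level pressure, and the temperatures are illustrative.

```python
# Compute es, ea, and VPD from dry- and wet-bulb temperatures.
import math

GAMMA = 0.066  # psychrometric constant, kPa/degC (approximate, sea-level assumption)

def tetens_es(temp_c: float) -> float:
    """Saturation vapor pressure in kPa (Tetens formula)."""
    return 0.6108 * math.exp(17.27 * temp_c / (temp_c + 237.3))

def actual_vapor_pressure(t_dry: float, t_wet: float) -> float:
    """ea from dry- and wet-bulb temperatures via the psychrometric equation."""
    return tetens_es(t_wet) - GAMMA * (t_dry - t_wet)

t_a, t_w = 32.0, 24.0                      # illustrative dry- and wet-bulb readings, degC
es = tetens_es(t_a)
ea = actual_vapor_pressure(t_a, t_w)
vpd = es - ea
print(f"es={es:.3f} kPa  ea={ea:.3f} kPa  VPD={vpd:.3f} kPa")
```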
50. A systematic review of applications of machine learning and other soft computing techniques for the diagnosis of tropical diseases
- Author
- Attai, Kingsley, Amannejad, Yasaman, Pour, Maryam Vahdat, Obot, Okure, and Uzoka, Faith-Michael
- Published
- 2022