1,789 results
Search Results
2. Parameter optimization of electromagnetic suspension-type maglev train control system based on a multi-objective grey wolf–non-dominated sorting genetic algorithm-Ⅱ hybrid algorithm.
- Author
-
Wang, Meiqi, Zeng, Siheng, Liu, Pengfei, He, Yixin, and Chen, Enli
- Subjects
MAGNETIC levitation vehicles ,WOLVES ,OPTIMIZATION algorithms ,ALGORITHMS ,SEARCH algorithms ,STANDARD deviations ,BUOYANCY - Abstract
This paper presents a novel hybrid algorithm based on CMOGWO-ADNSGA-II to solve the vibration stability problem during the operation of an EMS-type maglev train dynamics model subjected to strong non-linear magnetic buoyancy. The proposed algorithm optimizes the control system parameters of EMS-type maglev train suspensions by combining an improved multi-objective chaotic grey wolf algorithm (CMOGWO) with an improved non-dominated sorting genetic algorithm-II (ADNSGA-II) to enhance the search capability of the algorithm and ensure population diversity. The efficacy of the algorithm is demonstrated by applying it to the EMS-type maglev train suspension frame control system to find the optimal control parameters. Experimental results show that the system with the optimal parameters applied significantly reduces the suspension gap amplitude and the corresponding standard deviation, as well as the vertical acceleration amplitude and the corresponding standard deviation during operation. The proposed algorithm provides a good solution for EMS-type maglev train suspension vibration control, which can improve its performance and safety. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
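The ADNSGA-II half of the hybrid above rests on NSGA-II's fast non-dominated sorting, which ranks candidate parameter sets into successive Pareto fronts. A minimal sketch of that ranking step, assuming minimization (function names are illustrative, not from the paper):

```python
def dominates(a, b):
    """a dominates b iff a is no worse in every objective and strictly better in one (minimization)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def non_dominated_sort(points):
    """Split objective vectors into successive Pareto fronts (lists of indices), NSGA-II style."""
    n = len(points)
    dominated_by = [[] for _ in range(n)]   # indices that i dominates
    dom_count = [0] * n                     # how many solutions dominate i
    for i in range(n):
        for j in range(n):
            if i != j and dominates(points[i], points[j]):
                dominated_by[i].append(j)
                dom_count[j] += 1
    fronts, current = [], [i for i in range(n) if dom_count[i] == 0]
    while current:
        fronts.append(current)
        nxt = []
        for i in current:
            for j in dominated_by[i]:
                dom_count[j] -= 1
                if dom_count[j] == 0:
                    nxt.append(j)
        current = nxt
    return fronts
```

Front 0 holds the mutually non-dominated trade-offs (e.g. gap deviation vs. acceleration deviation); later fronts are successively worse.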
3. Defects Detection of Lithium-Ion Battery Electrode Coatings Based on Background Reconstruction and Improved Canny Algorithm.
- Author
-
Wang, Xianju, Liu, Shanhui, Zhang, Han, Li, Yinfeng, and Ren, Huiran
- Subjects
MAXIMUM entropy method ,COATING processes ,ELECTRODES ,SEARCH algorithms ,CORRECTION factors ,ALGORITHMS - Abstract
Aiming to address the problems of uneven brightness and small defects of low contrast on the surface of lithium-ion battery electrode (LIBE) coatings, this study proposes a defect detection method that combines background reconstruction with an enhanced Canny algorithm. Firstly, we acquire and pre-process the electrode coating image, considering the characteristics of the electrode coating process and defects. Secondly, background reconstruction and the difference method are introduced to achieve the rough localization of coating defects. Furthermore, the image with potential defects undergoes enhancement through improved Gamma correction, and the PSO-OTSU algorithm with adaptive searching is applied to determine the optimal segmentation. Finally, precise defect detection is accomplished using the improved Canny algorithm and morphological processing. The experimental results show that, compared with the maximum entropy method, the region growth method, and the traditional Canny algorithm, the algorithm in this paper has a higher segmentation accuracy for defects. It better retains defect edge features and provides a more accurate detection effect for defects like scratches, dark spots, bright spots, metal leakage, and decarburization, which are difficult to recognize on the background of coating areas of electrodes. The proposed method is suitable for the online real-time defect detection of LIBE coating defects in actual lithium-ion battery industrial production. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
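The rough-localization step above (background reconstruction plus the difference method) can be sketched on a single image scan line; a sliding-window median stands in for the paper's reconstruction step, and the threshold is a placeholder:

```python
from statistics import median

def reconstruct_background(line, win=5):
    """Estimate the slowly varying coating background of one scan line with a sliding median."""
    half = win // 2
    padded = [line[0]] * half + list(line) + [line[-1]] * half
    return [median(padded[i:i + win]) for i in range(len(line))]

def defect_mask(line, win=5, thresh=10):
    """Flag pixels whose absolute difference from the reconstructed background exceeds a threshold."""
    bg = reconstruct_background(line, win)
    return [abs(p - b) > thresh for p, b in zip(line, bg)]
```

A small bright spot on an otherwise uniform coating survives the difference while the smooth background cancels out, which is exactly what makes the subsequent Canny stage easier.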
4. An improved quantum particle swarm algorithm for solving multi-objective fuzzy flexible job shop scheduling problem.
- Author
-
Liu, Weiling, Xu, Jinliang, Ren, Guoqing, and Xiao, Yanjun
- Subjects
PRODUCTION scheduling ,FLOW shops ,SIMULATED annealing ,GATES ,ALGORITHMS ,SEARCH algorithms ,INDUSTRIAL costs - Abstract
Due to the dynamic nature of work conditions in the manufacturing plant, it is difficult to obtain accurate information on process processing time and energy consumption, affecting the implementation of scheduling solutions. The fuzzy flexible job shop scheduling problem with uncertain production parameters has not yet been well studied. In this paper, a scheduling optimization model with the objectives of maximum completion time, production cost and delivery satisfaction loss is developed using triangular fuzzy numbers to characterize the time parameters, and an improved quantum particle swarm algorithm is proposed to solve it. The innovations of this paper lie in designing a neighborhood search strategy based on machine-code variation for deep search, using crossover to maintain the diversity of elite individuals, and combining it with a simulated annealing strategy for local search. While retaining the global search capability of the quantum particle swarm algorithm, the comprehensive search capability is further enhanced by improving the mean best position of the particles. In addition, a gray target decision model is introduced to make the optimal decision on the scheduling scheme by comprehensively considering the fuzzy production cost. Finally, simulation experiments are conducted on test and engineering cases and compared with various advanced algorithms. The experimental results show that the proposed algorithm significantly outperforms the compared algorithms in convergence speed and search precision. The method provides a more reliable solution to the problem and has practical application value. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
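The triangular fuzzy time parameters mentioned above support simple interval-style arithmetic: completion times are propagated with component-wise addition and max, then ranked by a defuzzification rule. A sketch using the common graded-mean defuzzification (the paper's exact ranking rule may differ):

```python
def tfn_add(x, y):
    """Add two triangular fuzzy numbers (a, m, b) component-wise."""
    return tuple(xi + yi for xi, yi in zip(x, y))

def tfn_max(x, y):
    """Component-wise max: a common approximation used in the fuzzy makespan recursion."""
    return tuple(max(xi, yi) for xi, yi in zip(x, y))

def defuzzify(x):
    """Graded-mean defuzzification (a + 2m + b) / 4, one common crisp ranking choice."""
    a, m, b = x
    return (a + 2 * m + b) / 4
```

With these three operations, the fuzzy makespan of a schedule is computed exactly like the crisp one, with each operation's duration replaced by its (optimistic, most likely, pessimistic) triple.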
5. Fault diagnosis model of rolling bearing based on parameter adaptive AVMD algorithm.
- Author
-
Li, Meixuan, Yan, Chun, Liu, Wei, Liu, Xinhong, Zhang, Mengchao, and Xue, Jiankai
- Subjects
ROLLER bearings ,FAULT diagnosis ,ALGORITHMS ,SEARCH algorithms ,FEATURE extraction ,HILBERT-Huang transform ,INDEX numbers (Economics) - Abstract
Because the weak features carried by early fault information of rolling bearings are not easy to extract, a parameter-adaptive Variational Modal Decomposition (AVMD) algorithm is proposed for bearing fault signal feature extraction. Since the number of decomposition modes and the penalty factor play an important role in the VMD decomposition effect, the irregularities in selecting these two parameters are analyzed. We exploit the stronger global search capability of an improved sparrow search algorithm (LSSA) for adaptive parameter selection in the VMD algorithm. Lévy flight is introduced and chaotic initialization of the sparrow population positions is added to prevent the search from falling into local optima. In addition, the maximum kurtosis index, the minimum envelope entropy index and the number of VMD iterations are combined to form the objective function of the LSSA. The parameter-optimized VMD decomposes the signal under test, the decomposed IMFs are reconstructed, and 20 time-domain and frequency-domain features of the reconstructed signal are calculated as the input vector of an SVM classifier to verify the validity of the model. Finally, the feasibility of this model for rolling bearing fault diagnosis is verified by simulation and example experiments. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
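The LSSA fitness described above rewards impulsive, fault-rich modes: large kurtosis and small envelope entropy. A simplified sketch in which |x| stands in for the Hilbert envelope and the exact combination of the two indices is illustrative, not the paper's formula:

```python
import math

def kurtosis(x):
    """Sample kurtosis (non-excess): E[(x - mu)^4] / sigma^4."""
    n = len(x)
    mu = sum(x) / n
    var = sum((v - mu) ** 2 for v in x) / n
    return sum((v - mu) ** 4 for v in x) / (n * var ** 2)

def envelope_entropy(x):
    """Shannon entropy of the normalized envelope; |x| stands in for the Hilbert envelope here."""
    env = [abs(v) for v in x]
    s = sum(env)
    p = [e / s for e in env if e > 0]
    return -sum(pi * math.log(pi) for pi in p)

def lssa_objective(x):
    """Lower is better: small envelope entropy and large kurtosis favour impulsive modes."""
    return envelope_entropy(x) - math.log(kurtosis(x))
```

An isolated impulse scores far better (lower) than a flat oscillation, which is the property the optimizer exploits when tuning the VMD mode count and penalty factor.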
6. An ISSA-RF Algorithm for Prediction Model of Drug Compound Molecules Antagonizing ERα Gene Activity.
- Author
-
Minxi Rong, Yong Li, Xiaoli Guo, Tao Zong, Zhiyuan Ma, and Penglei Li
- Subjects
DOSAGE forms of drugs ,PREDICTION models ,STANDARD deviations ,SEARCH algorithms ,ALGORITHMS - Abstract
Objectives: The ERα biological activity prediction model is constructed from compound molecular data for the anti-breast-cancer therapeutic target ERα and its biological activity data, which improves the screening efficiency of anti-breast-cancer drug candidates and saves drug development time and cost. Methods: In this paper, a Ridge model is used to screen out molecular descriptors with a high degree of influence on the biological activity of ERα and to divide datasets with different numbers of molecular descriptors according to the screening results. Random Forest (RF) is trained with Root Mean Square Error (RMSE) and the coefficient of determination (R²) to determine the parameter range of the RF optimized by the Improved Sparrow Search Algorithm (ISSA-RF), which adds adaptive weights compared with the ordinary Sparrow Search Algorithm (SSA). The divided datasets were then put into the ISSA-RF with defined parameter ranges to construct a regression prediction model of compound biological activity against ERα, which was analyzed and compared with a Genetic Algorithm Optimized Support Vector Machine (GA-SVM), a Back Propagation Neural Network (BP) and Extreme Gradient Boosting (XGBoost). Results: We tried various combinations of molecular descriptors, and all four models achieved their best accuracy on the dataset constructed with 100 molecular descriptors. The ISSA-RF model proposed in this paper shows a high degree of agreement between the predicted and actual biological activity values of ERα, with a prediction error (RMSE) of 0.6876389. Conclusions: In the training model, ISSA-RF is proposed, and it is shown that adding adaptive weights can greatly improve the fitness accuracy of the sparrow algorithm. 
In the experimental part, this paper trains with a variety of molecular descriptor counts, which reduces the chance that training accuracy is an artifact of a particular number of descriptors, and limits the search range of the ISSA-RF model to avoid falling into local optima. Secondly, the parameter optimization time is greatly reduced. In conclusion, the proposed prediction model of drug compound molecules antagonizing ERα gene activity (ISSA-RF) improves the accuracy and efficiency of screening anti-breast-cancer drug candidates and provides a new idea for building quantitative structure–activity relationship models. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
7. Autofocus algorithm using optimized Laplace evaluation function and enhanced mountain climbing search algorithm.
- Author
-
Jia, Dongyao, Zhang, Chuanwang, Wu, Nengkai, Zhou, Jialin, and Guo, Zhigang
- Subjects
SEARCH algorithms ,MOUNTAINEERING ,LAPLACIAN operator ,ALGORITHMS ,IMAGING systems ,IMAGE recognition (Computer vision) - Abstract
In the field of digital imaging systems, autofocus plays an increasingly vital role as a key technology, but it remains challenging due to noisy backgrounds and slow focusing speed. This paper presents a new focusing algorithm based on an improved Laplacian operator and a mountain-climb (hill-climbing) search algorithm. Because a well-focused image shows larger gray-scale differences than an unfocused one, an image definition evaluation function combining local variance and the Laplacian operator is proposed. Borrowing from the advantages of two-stage recognition in deep-learning image recognition, a two-stage search algorithm based on mountain-climb search is designed to better fit the focusing curve near the extremum of the focus evaluation function; the improved mountain-climb search is divided into rough focusing and fine focusing. Rough focusing is used to determine a small focus area, and fine focusing based on function approximation then greatly improves the efficiency of locating the focus position. The experimental results indicate that the proposed algorithm is superior to the traditional algorithm in both time and accuracy, and the autofocus time is reduced by 76%. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
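The rough-then-fine idea above can be sketched as a two-stage search over lens positions; the sharpness function and step sizes below are placeholders, and the real algorithm climbs incrementally rather than sweeping:

```python
def coarse_to_fine_focus(sharpness, lo, hi, coarse_step, fine_step=1):
    """Two-stage focus search: a coarse sweep brackets the peak of the
    focus-evaluation curve, then a fine sweep refines inside that bracket.
    sharpness(p) returns the image definition value at lens position p."""
    # Stage 1: rough focusing - evaluate a coarse grid of positions.
    coarse = range(lo, hi + 1, coarse_step)
    best = max(coarse, key=sharpness)
    # Stage 2: fine focusing - search densely around the coarse winner.
    flo, fhi = max(lo, best - coarse_step), min(hi, best + coarse_step)
    return max(range(flo, fhi + 1, fine_step), key=sharpness)
```

The coarse stage touches only (hi − lo)/coarse_step positions, and the fine stage only 2·coarse_step more, which is where the reported speed-up comes from relative to a full single-step sweep.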
8. Enhancing Medical Image Classification with an Advanced Feature Selection Algorithm: A Novel Approach to Improving the Cuckoo Search Algorithm by Incorporating Caputo Fractional Order.
- Author
-
Habeb, Abduljlil Abduljlil Ali Abduljlil, Taresh, Mundher Mohammed, Li, Jintang, Gao, Zhan, and Zhu, Ningbo
- Subjects
IMAGE recognition (Computer vision) ,FEATURE selection ,SEARCH algorithms ,MEDICAL coding ,ALGORITHMS - Abstract
Glaucoma is a chronic eye condition that seriously impairs vision and requires early diagnosis and treatment. Automated detection techniques are essential for obtaining a timely diagnosis. In this paper, we propose a novel method for feature selection that integrates the cuckoo search algorithm with Caputo fractional order (CFO-CS) to enhance the performance of glaucoma classification. However, when using the infinite series, the Caputo definition has memory-length truncation issues; therefore, we suggest a fixed memory step and an adjustable term count for optimization. We conducted experiments integrating various feature extraction techniques, including histograms of oriented gradients (HOGs), local binary patterns (LBPs), and deep features from MobileNet and VGG19, to create a unified vector. We evaluate the informative features selected by the proposed method using a k-nearest-neighbor classifier. Furthermore, we use data augmentation to enhance the diversity and quantity of the training set. The proposed method enhances convergence speed and the attainment of optimal solutions during training. The results demonstrate superior performance on the test set, achieving 92.62% accuracy, 94.70% precision, 93.52% F1-Score, 92.98% specificity, 92.36% sensitivity, and 85.00% Matthews correlation coefficient. The results confirm the efficiency of the proposed method, rendering it a generalizable and applicable technique in ophthalmology. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
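The fixed-memory truncation suggested above is usually implemented with recursively computed fractional binomial weights. A sketch using the Grünwald–Letnikov form, a close relative of the Caputo definition (the update rule shown is illustrative, not the paper's exact CFO-CS step):

```python
def gl_coefficients(alpha, memory_len):
    """Grünwald-Letnikov binomial weights w_k = (-1)^k * C(alpha, k),
    computed recursively and truncated to a fixed memory length
    (the 'short-memory' idea applied to the fractional derivative)."""
    w = [1.0]
    for k in range(1, memory_len):
        w.append(w[-1] * (1 - (alpha + 1) / k))
    return w

def fractional_update(history, alpha, memory_len):
    """Fractional-order accumulation of a bounded history of values (newest first)."""
    w = gl_coefficients(alpha, memory_len)
    return sum(wk * xk for wk, xk in zip(w, history[:memory_len]))
```

For alpha = 1 the weights collapse to (1, −1, 0, 0, …), recovering the ordinary first difference; fractional alpha spreads weight over past positions, which is what gives the cuckoo update its memory.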
9. A simple weighting method for inverting earthquake source parameters using geodetic multisource data under Bayesian algorithm.
- Author
-
Xi, Can, Wang, Leyang, Zhao, Xiong, Sun, Zhanglin, Zhao, Weifeng, Pang, Ming, and Wu, Fei
- Subjects
EARTHQUAKES ,STANDARD deviations ,CONSTRAINT algorithms ,GEODESICS ,SEARCH algorithms ,ANGLES ,ALGORITHMS - Abstract
More accurate inversion of source fault geometry and slip parameters under the constraint of the Bayesian algorithm has become a research hotspot in geodetic inversion in recent years. In nonlinear inversion, determining the weight ratio for the joint inversion of multisource data is complicated. In this context, this paper proposes a simple and easily generalized weighting method for inverting source fault parameters from joint geodetic multisource data under the Bayesian framework. The method determines the relative weight ratio of multisource data from root mean square error (RMSE) values and can be extended to other nonlinear search algorithms. To verify its validity, four sets of simulated seismic experiments are first set up. The inversion results show that the proposed joint inversion weighting method significantly reduces the large residual values compared with equal-weight joint inversion and single-data-source inversion. The east–west deformation RMSE is 0.1458 mm, the north–south deformation RMSE is 0.2119 mm and the vertical deformation RMSE is 0.2756 mm; the RMSEs in all three directions are lower than those of the other schemes, indicating that the proposed method is suitable for the joint inversion of source parameters under the Bayesian algorithm. To further verify its applicability to complex earthquakes, the source parameters of the Maduo earthquake were inverted using the proposed method. The inverted focal depth is closer to the focal depth released by the GCMT agency, and the strike and dip angles of the joint inversion also lean toward the GCMT results. The joint inversion results generally conform to the characteristics of left-lateral strike-slip faulting, which shows the adaptability of this method to complex earthquakes. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
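The RMSE-based weighting above can be sketched as inverse-RMSE normalization; the 1/RMSE² form below is one common variance-style choice, and the paper's exact ratio may differ:

```python
def rmse(residuals):
    """Root mean square error of a list of residuals."""
    return (sum(r * r for r in residuals) / len(residuals)) ** 0.5

def relative_weights(rmse_by_source):
    """Weight each geodetic data source inversely to its RMSE (here 1/RMSE^2),
    normalized so the weights sum to 1 for the joint misfit function."""
    inv = {k: 1.0 / v ** 2 for k, v in rmse_by_source.items()}
    total = sum(inv.values())
    return {k: v / total for k, v in inv.items()}
```

A source whose fit is twice as noisy receives a quarter of the weight, so the joint posterior is not dragged toward the poorer dataset.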
10. Two-Stage Probe-Based Search Optimization Algorithm for the Traveling Salesman Problems.
- Author
-
Rahman, Md. Azizur and Ma, Jinwen
- Subjects
OPTIMIZATION algorithms ,SEARCH algorithms ,COMBINATORIAL optimization ,OPERATIONS research ,ARTIFICIAL intelligence ,ALGORITHMS - Abstract
As a classical combinatorial optimization problem, the traveling salesman problem (TSP) has been extensively investigated in the fields of Artificial Intelligence and Operations Research. Due to being NP-complete, it is still rather challenging to solve both effectively and efficiently. Because of its high theoretical significance and wide practical applications, great effort has been undertaken to solve it from the point of view of intelligent search. In this paper, we propose a two-stage probe-based search optimization algorithm for solving both symmetric and asymmetric TSPs through the stages of route development and a self-escape mechanism. Specifically, in the first stage, a reasonable proportion threshold filter of potential basis probes or partial routes is set up at each step during the complete route development process. In this way, the poor basis probes with longer routes are filtered out automatically. Moreover, four local augmentation operators are further employed to improve these potential basis probes at each step. In the second stage, a self-escape mechanism or operation is further implemented on the obtained complete routes to prevent the probe-based search from being trapped in a locally optimal solution. The experimental results on a collection of benchmark TSP datasets demonstrate that our proposed algorithm is more effective than other state-of-the-art optimization algorithms. In fact, it achieves the best-known TSP benchmark solutions in many datasets, while, in certain cases, it even generates solutions that are better than the best-known TSP benchmark solutions. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
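The proportion-threshold filtering of partial routes ("basis probes") in the first stage resembles a beam search over growing tours. A toy sketch under that reading (the filtering rule is a simplified stand-in, and the paper's four local augmentation operators are omitted):

```python
def route_development(dist, beam_frac=0.5):
    """Grow partial routes city by city, keeping only the best fraction of
    partial routes (probes) at each step, then close and return the best tour."""
    n = len(dist)
    probes = [([0], 0.0)]                      # (partial route, length), start at city 0
    for _ in range(n - 1):
        nxt = []
        for route, length in probes:
            for c in range(n):
                if c not in route:
                    nxt.append((route + [c], length + dist[route[-1]][c]))
        nxt.sort(key=lambda rl: rl[1])         # shorter partial routes rank higher
        keep = max(1, int(len(nxt) * beam_frac))
        probes = nxt[:keep]                    # proportion-threshold filter
    return min(((r, l + dist[r[-1]][0]) for r, l in probes), key=lambda rl: rl[1])
```

With beam_frac = 1.0 this degenerates to exhaustive enumeration; smaller fractions trade optimality guarantees for speed, which is what the self-escape stage then compensates for.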
11. LTHS: A heuristic algorithm based on local two-hop search strategy for influence maximization in social networks.
- Author
-
Qiu, Liqing, Yang, Zhongqi, Zhu, Shiwei, Gu, Chunmei, and Tian, Xiangbo
- Subjects
SOCIAL influence ,SOCIAL networks ,ALGORITHMS ,SEARCH algorithms ,VIRAL marketing - Abstract
Influence maximization is a classic network optimization problem that has been widely used in viral marketing. It aims to find a fixed number of seed nodes such that, after propagation under a specific model, the number of activated nodes is maximized. However, existing influence maximization algorithms tend to pursue either efficiency or accuracy alone, which limits their acceptance. This paper proposes an effective algorithm, the local two-hop search algorithm (LTHS), to balance the accuracy and efficiency of the influence maximization problem. Its core idea is that a node is affected not only by its one-hop neighbors but also by its two-hop neighbors. Firstly, initial seed nodes are selected according to node degree, since high-degree nodes are generally regarded as influential. Secondly, a two-hop influence evaluation function called the two-hop diffusion value (THDV) is proposed, which evaluates node influence more accurately. Furthermore, to achieve higher efficiency, a method to reduce the network scale is proposed. Full experiments were conducted on five real-world social network datasets against four well-known algorithms. The experimental results show that the LTHS algorithm outperforms the comparison algorithms in both efficiency and accuracy. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
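A THDV-style score counts a node's one-hop neighbours plus its two-hop neighbours discounted by a propagation probability. An illustrative version under an independent-cascade-style probability p (not the paper's exact formula):

```python
def two_hop_value(graph, node, p=0.1):
    """Score a node by its one-hop reach plus discounted two-hop reach.
    graph maps each node to its list of neighbours; p is the activation probability."""
    one_hop = set(graph[node])
    two_hop = set()
    for u in one_hop:
        # neighbours-of-neighbours, excluding the node itself and direct neighbours
        two_hop.update(v for v in graph[u] if v != node and v not in one_hop)
    return p * len(one_hop) + p * p * len(two_hop)
```

Because each node only inspects its local two-hop ball, scoring the whole graph is near-linear in the edge count, which is the efficiency argument behind local heuristics like LTHS.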
12. A Novel Study on Localization in Scene Text Detection.
- Author
-
Sonsare, Pravinkumar, Jain, Rushabh, Runwal, Rutuj, Dave, Kunal, and Banode, Ashutosh
- Subjects
DEEP learning ,TEXT recognition ,COMPUTER vision ,SEARCH algorithms ,ALGORITHMS - Abstract
Scene text detection has been one of the most important research topics in computer vision. With the constant development and rise of deep learning, computer vision technology has undergone an impactful transformation. Before deep learning, algorithms and technologies for scene text detection existed, but their performance was mediocre. In recent years, deep learning has remarkably transformed scene text detection, and researchers have witnessed notable advancements in the approach, methodology, and overall performance of newly discovered techniques. In this paper, the predominant focus is on summarizing and analysing the significant progress in scene text detection through deep learning. This paper covers an introduction to scene text detection, the steps to perform scene text recognition and detection, techniques before deep learning, recent techniques and their insights, some results, and an overview comparing the algorithms. We also emphasize the criteria that make a search algorithm a good choice for scene text detection and recognition, the notable differences introduced by deep learning, and analyse the drawbacks of pre-deep-learning techniques. This paper should help the reader understand the key differences that have changed this field, as well as some remaining challenges. [ABSTRACT FROM AUTHOR]
- Published
- 2023
13. An Improved Adaptive Sparrow Search Algorithm for TDOA-Based Localization.
- Author
-
Dong, Jiaqi, Lian, Zengzeng, Xu, Jingcheng, and Yue, Zhe
- Subjects
SEARCH algorithms ,OPTIMIZATION algorithms ,SWARM intelligence ,SPARROWS ,MEASUREMENT errors ,LEAST squares ,ALGORITHMS - Abstract
The Ultra-Wideband (UWB) indoor positioning method is widely used in areas where no satellite signals are available. However, during the measurement process of UWB, the collected data contain random errors. To alleviate the effect of random errors on positioning accuracy, an improved adaptive sparrow search algorithm (IASSA) based on the sparrow search algorithm (SSA) is proposed in this paper by introducing three strategies, namely, the two-step weighted least squares algorithm, adaptive adjustment of search boundary, and producer–scrounger quantity adaptive adjustment. The simulation and field test results indicate that the IASSA algorithm achieves significantly higher localization accuracy than previous methods. Meanwhile, the IASSA algorithm requires fewer iterations, which overcomes the problem of the long computation time of the swarm intelligence optimization algorithm. Therefore, the IASSA algorithm has advantages in indoor positioning accuracy and robustness performance. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
14. Detection of crowdedness in bus compartments based on ResNet algorithm and video images.
- Author
-
Zhao, Jiandong, Lei, Wei, Li, Zijian, Zhao, Dongfeng, Han, Mingmin, and Hou, Xiaoqing
- Subjects
PARTICLE swarm optimization ,SUPPORT vector machines ,SEARCH algorithms ,GENETIC algorithms ,BUS transportation ,ALGORITHMS - Abstract
Crowding in buses is an important factor affecting passenger satisfaction and dispatching quality; however, detecting crowding accurately from video images is a difficult problem. In this paper, firstly, an image sample library containing 16,346 sample images is established based on the evaluation standard of in-bus crowding. Then, Local Binary Pattern (LBP) and Gray Level Co-occurrence Matrix (GLCM) features are used to extract the texture of in-bus images, and a rough crowding classification method based on a Support Vector Machine (SVM) is proposed. To improve the accuracy of this rough classification, the optimization effects of the grid search algorithm, particle swarm optimization and the genetic algorithm on the SVM parameters are compared; the genetic algorithm performs best, with an accuracy of 93.20%. Finally, because the SVM method is not ideal for the fine classification of crowding, this paper proposes a new method based on ResNet. SGD, Adadelta and Adam are selected to optimize the parameters of the ResNet model; the optimal Adam algorithm reaches an accuracy of 96.22%, which effectively solves the fine classification of in-bus crowding. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
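The LBP texture feature used above assigns each pixel an 8-bit code from its 3×3 neighbourhood; histograms of these codes are what feed the SVM. A minimal sketch (the "neighbour ≥ centre" comparison and clockwise bit order are one common convention):

```python
def lbp_code(img, r, c):
    """8-neighbour local binary pattern of pixel (r, c): bit k is set when
    the k-th neighbour (clockwise from the top-left) is >= the centre pixel."""
    centre = img[r][c]
    neighbours = [img[r-1][c-1], img[r-1][c], img[r-1][c+1], img[r][c+1],
                  img[r+1][c+1], img[r+1][c], img[r+1][c-1], img[r][c-1]]
    code = 0
    for bit, n in enumerate(neighbours):
        if n >= centre:
            code |= 1 << bit
    return code
```

Because the code depends only on sign comparisons, it is invariant to monotonic illumination changes, which is why it suits unevenly lit bus interiors.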
15. Otsu's Image Segmentation Algorithm with Memory-Based Fruit Fly Optimization Algorithm.
- Author
-
Chai, Ruishuai
- Subjects
MATHEMATICAL optimization ,INTERPOLATION algorithms ,FRUIT flies ,IMAGE segmentation ,GENETIC algorithms ,ALGORITHMS ,SEARCH algorithms ,THRESHOLDING algorithms ,INTERPOLATION - Abstract
In this paper, salt-and-pepper noise, the most common noise in grayscale images, is investigated in depth in the context of median filtering, and the improved median filtering algorithm, the adaptive switching median filtering algorithm, and the adaptive polar median filtering algorithm are applied to the OTSU algorithm. Two improved OTSU algorithms are obtained: the adaptive-switching-median-filter-based OTSU algorithm and the polar-adaptive-median-filter-based OTSU algorithm. The experimental results show that these algorithms cope better with grayscale images contaminated by salt-and-pepper noise; the segmented images are not only clear but also retain the detailed features of the originals. A genetic algorithm is a search algorithm with strong adaptivity, fast operation and good global search ability, and it works well when applied to threshold finding for the OTSU algorithm; however, the traditional genetic algorithm falls into local optima to varying degrees when searching for the optimal threshold. The two interpolation methods proposed in this paper are an edge grayscale image interpolation algorithm using OTSU threshold adaptive segmentation and an edge grayscale image interpolation algorithm using local adaptive threshold segmentation; they can accurately divide grayscale image regions according to the characteristics of different images and effectively reduce the loss of edge detail and the jagged blur caused by classical interpolation algorithms. The visual effect is evaluated by selecting images from a standard grayscale test set and interpolating them with bilinear, bicubic, NEDI and FEOI interpolation for simulation validation. 
Subjective and objective evaluations, as well as running times, are compared, showing that the method of this paper can effectively improve the quality of grayscale image interpolation. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
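The OTSU threshold that the genetic (or fruit fly) optimizer searches for maximizes between-class variance over the gray-level histogram. An exhaustive reference implementation, useful as the ground truth a metaheuristic should reproduce:

```python
def otsu_threshold(pixels, levels=256):
    """Return the gray level t maximizing between-class variance, where
    class 0 holds pixels <= t and class 1 holds pixels > t."""
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    total = len(pixels)
    total_sum = sum(i * h for i, h in enumerate(hist))
    best_t, best_var = 0, -1.0
    w0, sum0 = 0, 0.0
    for t in range(levels):
        w0 += hist[t]
        if w0 == 0:
            continue                      # class 0 still empty
        w1 = total - w0
        if w1 == 0:
            break                         # class 1 empty: no valid split beyond here
        sum0 += t * hist[t]
        mu0, mu1 = sum0 / w0, (total_sum - sum0) / w1
        var_between = w0 * w1 * (mu0 - mu1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t
```

The search space is only 256 candidates, so exhaustive evaluation is cheap here; the metaheuristics in the paper pay off for multi-level thresholding, where the candidate space grows combinatorially.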
16. An Optimized Probabilistic Roadmap Algorithm for Path Planning of Mobile Robots in Complex Environments with Narrow Channels.
- Author
-
Qiao, Lijun, Luo, Xiao, and Luo, Qingsheng
- Subjects
POTENTIAL field method (Robotics) ,ROBOTIC path planning ,SEARCH algorithms ,ENERGY function ,ALGORITHMS - Abstract
In this paper, we propose a new path planning algorithm based on the probabilistic roadmap method (PRM) to effectively solve the autonomous path planning of mobile robots in complex environments with multiple narrow channels. The improved PRM algorithm mainly improves the density and distribution of sampling points in the narrow channel by combining the learning process of the PRM algorithm with the APF algorithm; we also shorten the required time and path length by optimizing the query process. The first key improvement optimizes the number and distribution of free points and collision-free lines in the free workspace. To ensure full visibility of the narrow channel, we extend the obstacles by the diagonal distance of the mobile robot while ignoring the safety distance. Considering the safety distance during movement, we re-classify all the sampling points obtained by the quasi-random sampling principle into three categories: free points, obstacle points, and adjacent points. Next, we transform obstacle points into free points of the narrow channel by combining the APF algorithm with the characteristics of the narrow channel, increasing the density of sampling points in the narrow space. Then, we include a potential energy judgment in the construction of collision-free lines, shortening the required time and reducing collisions with obstacles. Optimizing the query process of the PRM algorithm is the second key improvement: to reduce the query time, we adopt the bidirectional A* algorithm to query local paths and obtain an effective path to the target point, and we combine path pruning with the potential energy function to obtain a short, collision-free path. 
Finally, the experimental results demonstrate that the new PRM path planning technology can improve the density of free points in narrow spaces and achieve an optimized, collision-free path in complex environments with multiple narrow channels. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
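The PRM learning and query phases above can be miniaturized: take sample nodes, keep only collision-free edges within a connection radius, then search the roadmap. The sketch below uses a toy wall-with-a-gap obstacle and plain BFS; the paper's APF-based resampling and bidirectional A* are omitted:

```python
from collections import deque

WALL_X, GAP = 0.5, (0.45, 0.55)    # vertical wall with a narrow gap (toy obstacle)

def segment_free(p, q):
    """A segment collides iff it crosses the wall outside the gap."""
    if (p[0] - WALL_X) * (q[0] - WALL_X) >= 0:
        return True                            # both endpoints on one side of the wall
    t = (WALL_X - p[0]) / (q[0] - p[0])
    y = p[1] + t * (q[1] - p[1])               # crossing height
    return GAP[0] <= y <= GAP[1]

def prm_path(samples, start, goal, radius=0.3):
    """Learning phase: connect nearby samples by collision-free edges.
    Query phase: BFS from start (index 0) to goal (index 1) on the roadmap."""
    nodes = [start, goal] + samples
    def connected(i, j):
        d = ((nodes[i][0] - nodes[j][0]) ** 2 + (nodes[i][1] - nodes[j][1]) ** 2) ** 0.5
        return d <= radius and segment_free(nodes[i], nodes[j])
    adj = {i: [j for j in range(len(nodes)) if j != i and connected(i, j)]
           for i in range(len(nodes))}
    prev, queue = {0: None}, deque([0])
    while queue:
        i = queue.popleft()
        if i == 1:                             # goal reached: rebuild the path
            path = []
            while i is not None:
                path.append(nodes[i]); i = prev[i]
            return path[::-1]
        for j in adj[i]:
            if j not in prev:
                prev[j] = i; queue.append(j)
    return None
```

The example makes the narrow-channel failure mode visible: with no sample inside the gap the roadmap splits into two components and the query fails, which is exactly what the paper's channel-aware resampling is designed to prevent.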
17. Adaptive Fractional-Order Multi-Scale Optimization TV-L1 Optical Flow Algorithm.
- Author
-
Yang, Qi, Wang, Yilu, Liu, Lu, and Zhang, Xiaomeng
- Subjects
OPTICAL flow ,OPTIMIZATION algorithms ,ANT algorithms ,ALGORITHMS ,SWARM intelligence ,SEARCH algorithms - Abstract
We propose an adaptive fractional-order multi-scale optimization optical flow algorithm which, for the first time, mitigates the over-smoothing of optical flow estimation under the total variation model from the perspective of balancing global features and local texture, and solves the problem that the convergence of fractional optical flow algorithms depends on the order parameter. Specifically, a fractional-order discrete L1-regularized total variation optical flow model is constructed. On this basis, the Ant Lion algorithm is innovatively used to iterate the optical flow equation, and the fractional order is dynamically adjusted to obtain an adaptive optimization algorithm with strong search accuracy and high efficiency. The flexibility of optical flow estimation in weak-gradient texture scenes is increased, and the optical flow extraction rate of target features at multiple scales is greatly improved. We show excellent recognition performance and stability on the MPI_Sintel and Middlebury benchmarks. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
18. Feature selection algorithms in generalized additive models under concurvity.
- Author
-
Kovács, László
- Subjects
ALGORITHMS ,CONSTRAINT algorithms ,FEATURE selection ,SEARCH algorithms ,MULTICOLLINEARITY - Abstract
In this paper, the properties of 10 different feature selection algorithms for generalized additive models (GAMs) are compared on one simulated and two real-world datasets under concurvity. Concurvity can be interpreted as a redundancy in the feature set of a GAM. Like multicollinearity in linear models, concurvity causes unstable parameter estimates in GAMs and makes the marginal effects of features harder to interpret. Feature selection algorithms for GAMs can be separated into four clusters: stepwise, boosting, regularization and concurvity-controlled methods. Our numerical results show that algorithms with no constraints on concurvity tend to select a large feature set, without significant improvements in predictive performance compared to a more parsimonious feature set; a large feature set is accompanied by harmful concurvity in the proposed models. To tackle the concurvity phenomenon, recent feature selection algorithms such as the mRMR and the HSIC-Lasso incorporate constraints on concurvity in their objective function. However, these algorithms interpret concurvity as a pairwise non-linear relationship between features, so they do not account for the case where a feature can be accurately estimated as a multivariate function of several other features. This is confirmed by our numerical results. Our own solution to the problem, a hybrid genetic–harmony search algorithm (HA), introduces constraints on multivariate concurvity directly. Due to this constraint, the HA proposes a small, non-redundant feature set with predictive performance similar to that of models with far more features. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
19. An efficient algorithm with fast convergence rate for sparse graph signal reconstruction.
- Author
-
Cao, Yuting, Jiang, Xue-Qin, Wang, Jian, Zhou, Shubo, and Hou, Xinxin
- Subjects
SPARSE graphs ,SIGNAL reconstruction ,COMPRESSED sensing ,ALGORITHMS ,SEARCH algorithms ,IMAGE reconstruction algorithms ,THRESHOLDING algorithms - Abstract
In this paper, we consider graph signals that are sparse in the graph Fourier domain and propose an iterative threshold compressed sensing reconstruction (ITCSR) algorithm to reconstruct sparse graph signals in the graph Fourier domain. The proposed ITCSR algorithm derives from well-known compressed sensing by considering a threshold for sparsity-promoting reconstruction of the underlying graph signals. The proposed ITCSR algorithm enhances the performance of sparse graph signal reconstruction by introducing a threshold function to determine a suitable threshold. Furthermore, we demonstrate that suitable parameters for the threshold can be automatically determined by leveraging the sparrow search algorithm. Moreover, we analytically prove the convergence property of the proposed ITCSR algorithm. In the experiments, numerical tests with synthetic as well as 3D point cloud data demonstrate the merits of the proposed ITCSR algorithm relative to the baseline algorithms. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
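The ITCSR entry above builds on classical iterative thresholding for sparse recovery. As a minimal, generic sketch (not the authors' algorithm: the threshold here is fixed by hand instead of being tuned by a sparrow search, and the sensing operator is a plain dense matrix rather than a graph Fourier transform):

```python
def soft_threshold(v, t):
    """Element-wise soft-thresholding: shrink each entry toward zero by t."""
    return [max(abs(x) - t, 0.0) * (1 if x > 0 else -1) for x in v]

def ist_reconstruct(A, y, lam=0.1, step=0.5, iters=200):
    """Iterative soft-thresholding for min ||y - Ax||^2 + lam*||x||_1.
    A: list of rows; y: list of measurements; returns a sparse estimate x."""
    n = len(A[0])
    x = [0.0] * n
    for _ in range(iters):
        # residual r = y - A x
        r = [yi - sum(Ai[j] * x[j] for j in range(n)) for Ai, yi in zip(A, y)]
        # gradient step toward the data: x + step * A^T r
        g = [x[j] + step * sum(A[i][j] * r[i] for i in range(len(y)))
             for j in range(n)]
        # thresholding promotes sparsity of the estimate
        x = soft_threshold(g, step * lam)
    return x
```

The interplay shown here (data-fidelity gradient step followed by a threshold) is the part ITCSR refines by learning the threshold automatically.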
20. Data transmission path planning method for wireless sensor network in grounding grid area based on MM‐DPS hybrid algorithm.
- Author
-
Xiao, Xianghui, Huang, Longsheng, Zhang, Zhenshan, Huang, Mingxian, Guan, Luchang, and Song, Yunhao
- Subjects
WIRELESS sensor networks ,DATA transmission systems ,SEARCH algorithms ,MULTICASTING (Computer networks) ,NONDESTRUCTIVE testing ,ALGORITHMS ,ENERGY consumption - Abstract
At present, in order to conduct non-destructive testing on the grounding grid of substations under the condition of continuous power supply and no excavation, researchers have applied wireless technology based on electrochemical methods to remotely monitor the corrosion state of grounding conductors online. Nevertheless, wireless signals are affected by the environment when they are transmitted underground. In the field of grounding grid wireless monitoring, how to plan the data transmission path of a wireless sensor network (WSN) with high data-transfer accuracy and low energy consumption has earned growing research attention. To address the problem of WSN path planning in the grounding grid area, a path planning method for WSNs based on a hybrid of the map-matching algorithm and the double-pole search algorithm (MM-DPS) is proposed in this paper. The map-matching algorithm is employed to calculate the optimal sampling node number of the data transmission path. On the basis of the optimal sampling node number, the double-pole search algorithm is employed to seek out each sensor node of the path, and two groups of path plans are obtained. In the simulation experiment, compared with the A-star algorithm, the MM-DPS algorithm shortens the data transmission path length by about 39% and reduces the energy consumption by about 57%. The research work brings a method to alleviate the problem of underground data transmission of WSNs in the grounding grid area. The method not only ensures the accuracy of data transmission but also shortens the transmission distance and reduces energy consumption. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
21. A New Hybrid Three-Term LS-CD Conjugate Gradient In Solving Unconstrained Optimization Problems.
- Author
-
Ishak, M. A. I. and Marjugi, S. M.
- Subjects
- *
SEARCH algorithms , *QUASI-Newton methods , *TECHNOLOGY convergence , *ALGORITHMS - Abstract
The Conjugate Gradient (CG) method is renowned for its rapid convergence in optimization applications. Over the years, several modifications to CG methods have emerged to improve computational efficiency and tackle practical challenges. This paper presents a new three-term hybrid CG method for solving unconstrained optimization problems. The algorithm utilizes a search direction that combines the Liu-Storey (LS) and Conjugate Descent (CD) CG coefficients and standardizes it using a spectral parameter, which acts as a scheme for choosing the conjugate parameters. The resulting direction closely approximates the memoryless Broyden-Fletcher-Goldfarb-Shanno (BFGS) quasi-Newton direction, known for its bounded nature and compliance with the sufficient descent condition. The paper establishes global convergence under standard Wolfe conditions and some appropriate assumptions. Additionally, numerical experiments were conducted to emphasize the robustness and superior efficiency of this hybrid algorithm in comparison to existing approaches. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
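The LS and CD coefficients named in the entry above have standard closed forms. A rough sketch of a nonlinear CG loop that mixes them (illustrative only: the paper's method uses a specific three-term direction and spectral scaling, whereas here the two betas are simply averaged and an Armijo line search replaces the Wolfe conditions):

```python
def hybrid_cg(f, grad, x0, iters=500):
    """Nonlinear CG mixing the Liu-Storey and Conjugate Descent betas:
    beta_LS = g+^T (g+ - g) / (-d^T g),  beta_CD = ||g+||^2 / (-d^T g)."""
    x = list(x0)
    g = grad(x)
    d = [-gi for gi in g]
    for _ in range(iters):
        slope = sum(gi * di for gi, di in zip(g, d))
        if slope >= 0:                       # safeguard: restart with steepest descent
            d = [-gi for gi in g]
            slope = sum(gi * di for gi, di in zip(g, d))
            if slope == 0:                   # zero gradient: converged
                break
        # backtracking (Armijo) line search
        t, fx, bt = 1.0, f(x), 0
        while bt < 40 and f([xi + t * di for xi, di in zip(x, d)]) > fx + 1e-4 * t * slope:
            t *= 0.5
            bt += 1
        x = [xi + t * di for xi, di in zip(x, d)]
        g_new = grad(x)
        denom = -slope                       # -g^T d > 0 on a descent direction
        beta_ls = sum(gn * (gn - gi) for gn, gi in zip(g_new, g)) / denom
        beta_cd = sum(gn * gn for gn in g_new) / denom
        beta = 0.5 * (beta_ls + beta_cd)     # illustrative hybrid choice
        d = [-gn + beta * di for gn, di in zip(g_new, d)]
        g = g_new
    return x
```

The averaging of the two betas is a placeholder for the paper's spectral selection scheme.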
22. Best Principal Submatrix Selection for the Maximum Entropy Sampling Problem: Scalable Algorithms and Performance Guarantees.
- Author
-
Li, Yongchun and Xie, Weijun
- Subjects
ARTIFICIAL intelligence ,APPROXIMATION algorithms ,ENTROPY ,SEARCH algorithms ,ALGORITHMS ,SURETYSHIP & guaranty - Abstract
This paper studies a classic maximum entropy sampling problem (MESP), which aims to select the most informative principal submatrix of a prespecified size from a covariance matrix. MESP is widely applied to many areas, including healthcare, power systems, manufacturing, and data science. By investigating its Lagrangian dual and primal characterization, we derive a novel convex integer program for MESP and show that its continuous relaxation yields a near-optimal solution. The results motivate us to develop a sampling algorithm and derive its approximation bound for MESP, which improves the best known bound in the literature. We then provide an efficient deterministic implementation of the sampling algorithm with the same approximation bound. Besides, we investigate the widely used local search algorithm and prove its first known approximation bound for MESP. The proof techniques further inspire an efficient implementation of the local search algorithm. Our numerical experiments demonstrate that these approximation algorithms can efficiently solve medium-size and large-scale instances to near optimality. Finally, we extend the analyses to the A-optimal MESP, for which the objective is to minimize the trace of the inverse of the selected principal submatrix.
Funding: This work was supported by the National Science Foundation Division of Information and Intelligent Systems [Grant 2246417] and Division of Civil, Mechanical and Manufacturing Innovation [Grant 2246414]. Supplemental Material: The e-companion is available at https://doi.org/10.1287/opre.2023.2488. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
23. A Population-Based Search Approach to Solve Continuous Distributed Constraint Optimization Problems.
- Author
-
Liao, Xin and Hoang, Khoi D.
- Subjects
UTILITY functions ,DISTRIBUTED algorithms ,SEARCH algorithms ,BENCHMARK problems (Computer science) ,HEURISTIC ,RESEARCH personnel ,ALGORITHMS - Abstract
Distributed Constraint Optimization Problems (DCOPs) are an efficient framework widely used in multi-agent collaborative modeling. The traditional DCOP framework assumes that variables are discrete and constraint utilities are represented in tabular form. However, in many practical applications the variables are continuous and constraint utilities are in functional form. To overcome this limitation, researchers have proposed Continuous DCOPs (C-DCOPs), which can model DCOPs with continuous variables. However, most existing C-DCOP algorithms rely on gradient information for optimization, which means that they cannot handle cases where the utility function is non-differentiable. Although the Particle Swarm-Based C-DCOP (PCD) and Particle Swarm with Local Decision-Based C-DCOP (PCD-LD) algorithms can handle non-differentiable utility functions, they need to implement Breadth First Search (BFS) pseudo-trees for message passing. Unfortunately, employing the BFS pseudo-tree results in expensive computational overheads and agent privacy leakage, as messages are aggregated at the root node of the BFS pseudo-tree. Therefore, this paper aims to propose a fully distributed C-DCOP algorithm that handles utility functions of any form and avoids the disadvantages caused by the BFS pseudo-tree. Inspired by population-based algorithms, we propose a fully decentralized local search algorithm, named the Population-based Local Search Algorithm (PLSA), for solving C-DCOPs with three-fold advantages: (i) PLSA adopts a heuristic method to guide the local search toward a fast search for high-quality solutions; (ii) in contrast to conventional C-DCOP algorithms, PLSA can handle utility functions of any form; and (iii) compared to PCD and PCD-LD, PLSA avoids complex message passing to achieve efficient computation and agent privacy protection.
In addition, we implement an extended version of PLSA, named Population-based Global Search Algorithm (PGSA), and empirically show that our algorithms outperform the state-of-the-art C-DCOP algorithms on three types of benchmark problems. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
24. An Improved Gravitational Search Algorithm for Task Offloading in a Mobile Edge Computing Network with Task Priority.
- Author
-
Xu, Ling, Liu, Yunpeng, Fan, Bing, Xu, Xiaorong, Mei, Yiguo, and Feng, Wei
- Subjects
EDGE computing ,MOBILE computing ,SEARCH algorithms ,GENETIC algorithms ,ENERGY consumption ,ALGORITHMS - Abstract
Mobile edge computing (MEC) distributes computing and storage resources to the edge of the network, closer to the user, and significantly reduces user task completion latency and system energy consumption. This paper investigates the problem of computation offloading in a three-tier mobile edge computing network composed of multiple users, multiple edge servers, and a cloud server. In this network, each user's task can be divided into multiple subtasks, with serial and parallel priority relationships existing among these subtasks. An optimization model is established with the objective of minimizing the total user delay and processor cost under constraints such as the available resources of users and servers and the interrelationships among the subtasks. An improved gravitational search algorithm (IGSA) is proposed to solve this optimization model. In contrast with the original gravitational search algorithm, a convergence factor is introduced in the calculation of the resultant force, and the crossover operation of a genetic algorithm is performed when generating new particles during each iteration. The simulation results show that the proposed IGSA greatly improves system performance compared with existing algorithms. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
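The IGSA entry above extends the baseline gravitational search algorithm (GSA). A minimal sketch of that baseline (continuous minimization, without the paper's convergence factor or GA crossover; the decay constant and population settings are arbitrary choices):

```python
import math
import random

def gsa(f, bounds, pop=20, iters=100, g0=100.0, seed=1):
    """Baseline gravitational search algorithm: agents attract each other
    with forces proportional to fitness-derived masses."""
    random.seed(seed)
    dim = len(bounds)
    X = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop)]
    V = [[0.0] * dim for _ in range(pop)]
    best = None
    for t in range(iters):
        fit = [f(x) for x in X]
        b, w = min(fit), max(fit)
        i_best = fit.index(b)
        if best is None or b < best[1]:
            best = (list(X[i_best]), b)
        # masses: better (smaller) fitness -> larger mass
        m = [(w - fi) / (w - b + 1e-12) for fi in fit]
        s = sum(m) + 1e-12
        M = [mi / s for mi in m]
        G = g0 * math.exp(-20.0 * t / iters)  # decaying gravitational constant
        for i in range(pop):
            acc = [0.0] * dim
            for j in range(pop):
                if i == j:
                    continue
                R = math.dist(X[i], X[j])
                for d in range(dim):
                    # randomized pull of agent i toward agent j
                    acc[d] += random.random() * G * M[j] * (X[j][d] - X[i][d]) / (R + 1e-12)
            for d in range(dim):
                V[i][d] = random.random() * V[i][d] + acc[d]
                X[i][d] = min(max(X[i][d] + V[i][d], bounds[d][0]), bounds[d][1])
    return best
```

The paper's convergence factor would enter the force accumulation, and its crossover step would replace some agents after each position update.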
25. Periodic Communities Mining in Temporal Networks: Concepts and Algorithms.
- Author
-
Qin, Hongchao, Li, Rong-Hua, Yuan, Ye, Wang, Guoren, Yang, Weihua, and Qin, Lu
- Subjects
TIME-varying networks ,MINES & mineral resources ,SEARCH algorithms ,ALGORITHMS ,SOCIAL interaction ,COMMUNITIES - Abstract
Periodicity is a frequently occurring phenomenon for social interactions in temporal networks. Mining periodic communities is essential to understanding periodic group behaviors in temporal networks. Unfortunately, most previous studies of community mining in temporal networks ignore the periodic patterns of communities. In this paper, we study the problem of seeking periodic communities in a temporal network, where each edge is associated with a set of timestamps. We propose novel models, including the σ-periodic k-core and the σ-periodic k-clique, that represent periodic communities in temporal networks. Specifically, a σ-periodic k-core (or σ-periodic k-clique) is a k-core (or a clique of size larger than k) that appears at least σ times periodically in the temporal graph. Searching periodic cores is efficient, but the resulting communities may not be cohesive enough; enumerating all periodic cliques is not efficient (NP-hard), but the resulting communities are very cohesive. To compute all of them efficiently, we first develop two effective graph reduction techniques to significantly prune the temporal graph. Then, we transform the temporal graph into a static graph and prove that mining the periodic communities in the temporal graph equals mining communities in the transformed graph. Subsequently, we propose a decomposition algorithm to search for the maximal σ-periodic k-core, a Bron-Kerbosch style algorithm to enumerate all maximal σ-periodic k-cliques, and a branch-and-bound style algorithm to find the maximum σ-periodic clique. The results of extensive experiments on five real-life datasets demonstrate the efficiency, scalability, and effectiveness of our algorithms. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
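The "appears at least σ times periodically" condition in the entry above amounts to finding an arithmetic progression of length σ among the timestamps at which a structure occurs. A small sketch of that check for a single occurrence list (the paper's models apply it to cores and cliques, not shown here):

```python
def is_sigma_periodic(timestamps, sigma):
    """Return True if the timestamp set contains an arithmetic progression
    of length sigma, i.e. the structure occurs at least sigma times at a
    fixed period."""
    ts = sorted(set(timestamps))
    if sigma <= 1:
        return len(ts) >= sigma
    tset = set(ts)
    for i in range(len(ts)):
        for j in range(i + 1, len(ts)):
            d = ts[j] - ts[i]
            # try to extend the progression ts[i], ts[i]+d, ts[i]+2d, ...
            count, cur = 2, ts[j]
            while count < sigma and cur + d in tset:
                cur += d
                count += 1
            if count >= sigma:
                return True
    return False
```

This brute-force check is quadratic in the number of timestamps; the paper's reduction techniques exist precisely to avoid running such tests on the full graph.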
26. An Advanced Crow Search Algorithm for Solving Global Optimization Problem.
- Author
-
Lee, Donwoo, Kim, Jeonghyun, Shon, Sudeok, and Lee, Seungjae
- Subjects
SEARCH algorithms ,METAHEURISTIC algorithms ,GLOBAL optimization ,TABU search algorithm ,ALGORITHMS - Abstract
The conventional crow search (CS) algorithm is a swarm-based metaheuristic algorithm that has few parameters, is easy to apply to problems, and is utilized in various fields. However, it has the disadvantage of easily falling into local minima because it relies mainly on exploitation to find approximate solutions. Therefore, in this paper, we propose the advanced crow search (ACS) algorithm, which improves the conventional CS algorithm and solves the global optimization problem. The ACS algorithm differs from the conventional CS algorithm in three ways. First, we propose using a dynamic AP (awareness probability) to perform exploration of the global region for the selection of the initial population. Second, we improve exploitation performance by introducing a formula that probabilistically selects the best crows instead of selecting them at random. Third, we improve the exploration phase by adding an equation for local search. The ACS algorithm proposed in this paper shows improved exploitation and exploration performance over other metaheuristic algorithms on both unimodal and multimodal benchmark functions, and it found the best solutions on five engineering problems. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
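For reference, the conventional CS algorithm that the ACS entry above modifies is compact. A minimal sketch (minimization; the ACS improvements — dynamic AP, probabilistic best-crow selection, and the local-search equation — are deliberately absent, and the AP/flight-length values are just typical defaults):

```python
import random

def crow_search(f, bounds, n=20, iters=200, ap=0.1, fl=2.0, seed=3):
    """Conventional crow search: each crow follows a random crow's memory
    unless that crow is 'aware', in which case it flies to a random spot."""
    random.seed(seed)
    def clip(x):
        return [min(max(xi, lo), hi) for xi, (lo, hi) in zip(x, bounds)]
    X = [clip([random.uniform(lo, hi) for lo, hi in bounds]) for _ in range(n)]
    mem = [list(x) for x in X]          # each crow's best-known hiding place
    mem_f = [f(x) for x in X]
    for _ in range(iters):
        for i in range(n):
            j = random.randrange(n)      # crow i picks a random crow j to follow
            if random.random() >= ap:    # j unaware: move toward j's memory
                r = random.random()
                X[i] = clip([xi + r * fl * (mj - xi) for xi, mj in zip(X[i], mem[j])])
            else:                        # j aware: fly to a random position
                X[i] = [random.uniform(lo, hi) for lo, hi in bounds]
            fx = f(X[i])
            if fx < mem_f[i]:            # memory update only on improvement
                mem[i], mem_f[i] = list(X[i]), fx
    k = mem_f.index(min(mem_f))
    return mem[k], mem_f[k]
```

The fixed `ap` here is exactly what the ACS paper replaces with a dynamic awareness probability.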
27. Crow Search Algorithm with Improved Objective Function for Test Case Generation and Optimization.
- Author
-
Sharma, Meena and Pathik, Babita
- Subjects
SEARCH algorithms ,MATHEMATICAL functions ,FLOWGRAPHS ,ALGORITHMS ,MATHEMATICAL optimization - Abstract
Test case generation and optimization is the foremost requirement of software evolution and test automation. In this paper, a bio-inspired Crow Search Algorithm (CSA) is suggested with an improved objective function to fulfill this requirement. CSA is a nature-inspired optimization method. The improved objective function combines branch distance and predicate distance to cover the critical path on the control flow graph. CSA is a search-based technique that uses heuristic information for test automation, and the CSA optimizer minimizes the test cases generated by satisfying the objective function. This paper focuses on generating test cases for all paths, including critical paths. The control flow graph covers the information flow among all the classes, functions, and conditional statements and provides test paths. The number of test cases is examined through graph path-coverage analysis. The minimum number of test paths is counted through complexity metrics using the cyclomatic complexity of the constructed graph. The proposed method is evaluated on mathematical optimization functions to validate its effectiveness in locating optimal solutions. Python code is used for evaluation, revealing that our approach is time-efficient and outperforms various optimization algorithms. The proposed approach achieved 100% path coverage, and the algorithm executes and gives optimal results in approximately 0.2745 seconds. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
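The "branch distance" component of the objective function in the entry above has a standard (Tracey-style) form in search-based testing: zero when a predicate already holds, otherwise a measure of how far the inputs are from making it hold. A sketch of that measure, assuming numeric operands and the common k-offset and normalization (the paper's exact combination with predicate distance is not reproduced):

```python
def branch_distance(a, op, b, k=1.0):
    """Branch distance for a relational predicate `a op b`: 0 when the
    predicate is satisfied, otherwise a positive distance, normalized
    into [0, 1) so distances from different branches are comparable."""
    if op == "==":
        d = abs(a - b)
    elif op == "!=":
        d = 0.0 if a != b else k
    elif op == "<":
        d = 0.0 if a < b else a - b + k
    elif op == "<=":
        d = 0.0 if a <= b else a - b
    elif op == ">":
        d = 0.0 if a > b else b - a + k
    elif op == ">=":
        d = 0.0 if a >= b else b - a
    else:
        raise ValueError(op)
    return d / (d + 1.0)   # normalize into [0, 1)
```

A search algorithm such as CSA then minimizes the sum of these distances along the target path.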
28. A novel enhanced flow regime algorithm using opposition-based learning.
- Author
-
Lv, Zhaoming
- Subjects
NUMERICAL functions ,ALGORITHMS ,SEARCH algorithms ,METAHEURISTIC algorithms ,PARTICLE swarm optimization ,LEARNING strategies ,SCIENCE & industry - Abstract
Metaheuristics are widely used in science and industry because, as high-level heuristic techniques, they can provide robust or advanced solutions compared to classical search algorithms. The Flow Regime Algorithm is a recently proposed physics-based optimization approach, and it is a candidate algorithm for solving complex optimization problems because of its few parameter configurations, simple coding, and good performance. However, a randomly initialized population may suffer from poor diversity, resulting in insufficient global search and premature convergence to a local optimum. To solve this problem, this paper proposes a novel enhanced Flow Regime Algorithm based on an opposition-based learning scheme. The proposed algorithm introduces the opposition-based learning strategy into the generation of some populations to enhance global search performance while maintaining a fast convergence rate. To verify the performance of the proposed algorithm, 23 benchmark numerical optimization functions were studied experimentally in detail and compared with six well-known algorithms. Experimental results show that the proposed algorithm outperforms all other metaheuristic algorithms on all unimodal functions with higher accuracy, and obtains competitive results on most multimodal cases. A statistical comparison shows the superiority of the proposed algorithm. Finally, the proposed algorithm achieves higher-quality alignment compared to most other metaheuristic-based systems and OAEI ontology alignment systems. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
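Opposition-based learning, the key ingredient in the entry above, has a very simple core: for a point x in [lo, hi], its opposite is lo + hi - x, and evaluating both doubles the chance of starting near a good region. A minimal sketch of opposition-based population initialization (the paper applies the idea during the search as well, which is not shown):

```python
import random

def opposition_init(f, bounds, pop_size, seed=7):
    """Generate a random population, form each point's opposite
    x' = lo + hi - x per dimension, and keep the fitter half of the
    combined set (minimization)."""
    random.seed(seed)
    P = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    O = [[lo + hi - xi for xi, (lo, hi) in zip(x, bounds)] for x in P]
    both = sorted(P + O, key=f)     # evaluate points and their opposites together
    return both[:pop_size]          # keep the best pop_size of the 2*pop_size points
```

Because the kept population is drawn from twice as many candidates, its diversity and initial fitness are typically better than a purely random start.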
29. Sleep-wakeup scheduling algorithm for lifespan maximization of directional sensor networks: a discrete cuckoo search optimization algorithm.
- Author
-
Mortazavi, Mir Gholamreza, Hosseini Shirvani, Mirsaeid, Dana, Arash, and Fathy, Mahmood
- Subjects
OPTIMIZATION algorithms ,SEARCH algorithms ,WIRELESS sensor networks ,METAHEURISTIC algorithms ,SENSOR networks ,POLYNOMIAL time algorithms ,ALGORITHMS ,ENERGY harvesting - Abstract
Directional sensor networks (DSNs) are ad-hoc networks utilized in different industrial applications. They are typically deployed to permanently monitor and cover all specific targets in the observed field. These networks include numerous configurable directional sensors, each of which can operate in one of several possible directions with one of its adjustable ranges. Although energy harvesting is being applied to these battery-hungry applications, battery management and network lifetime maximization are still prominent challenges. In this paper, network lifetime extension is formulated as a discrete optimization problem, a well-known NP-hard problem. To solve this combinatorial problem, a discrete cuckoo search algorithm (D-CSA) is designed and invoked over several rounds. A cover is a subset of configured sensors capable of monitoring all targets in the observed field. In each round, the most efficient cover is constructed along with its activation time. During the determined activation time, the sensors in the cover are scheduled in wake-up mode, whereas the others are set to sleep mode to save energy. Unlike other meta-heuristic algorithms, the proposed algorithm utilizes newly defined discrete walking-around procedures that achieve a good balance between exploration and exploitation in this complex search space. The proposed algorithm has been tested and evaluated in different scenarios. Simulation results in a variety of circumstances show that the proposed algorithm improves average lifespan by about 20.29%, 19.55%, 14.40%, 14.51%, 7.70%, and 8.03% over the H-MNLAR, Hm-LifMax-BC, GA, ACOSC, H-GATS, and HDPSO algorithms, respectively. The results also show the high scalability potential of the proposed algorithm. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
30. Innovative Bacterial Colony Detection: Leveraging Multi-Feature Selection with the Improved Salp Swarm Algorithm.
- Author
-
Ihsan, Ahmad, Muttaqin, Khairul, Fajri, Rahmatul, Mursyidah, Mursyidah, and Fattah, Islam Md Rizwanul
- Subjects
BACTERIAL colonies ,FEATURE selection ,BACTERIA classification ,ALGORITHMS ,CLASSIFICATION algorithms ,SEARCH algorithms - Abstract
In this paper, we introduce a new and advanced multi-feature selection method for bacterial classification that uses the salp swarm algorithm (SSA). We improve the SSA's performance by using opposition-based learning (OBL) and a local search algorithm (LSA). The proposed method has three main stages, which automate the categorization of bacteria based on their unique characteristics. The method uses a multi-feature selection approach augmented by an enhanced version of the SSA. The enhancements include using OBL to increase population diversity during the search process and LSA to address local optimization problems. The improved salp swarm algorithm (ISSA) is designed to optimize multi-feature selection by increasing the number of selected features and improving classification accuracy. We compare the ISSA's performance to that of several other algorithms on ten different test datasets. The results show that the ISSA outperforms the other algorithms in terms of classification accuracy on three datasets with 19 features, achieving an accuracy of 73.75%. Additionally, the ISSA excels at determining the optimal number of features and producing a better fit value, with a classification error rate of 0.249. Therefore, the ISSA method is expected to make a significant contribution to solving feature selection problems in bacterial analysis. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
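The ISSA entry above layers OBL and a local search on top of the plain salp swarm algorithm (SSA). A minimal sketch of that baseline for continuous minimization (the feature-selection use in the paper would binarize positions, which is omitted here):

```python
import math
import random

def salp_swarm(f, bounds, n=30, iters=200, seed=5):
    """Plain salp swarm algorithm: one leader explores around the best
    solution found (the 'food'), followers average with the salp ahead."""
    random.seed(seed)
    X = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(n)]
    food, food_f = None, float("inf")
    for t in range(iters):
        for x in X:                       # track the best solution so far
            fx = f(x)
            if fx < food_f:
                food, food_f = list(x), fx
        # c1 decays over time, shifting from exploration to exploitation
        c1 = 2.0 * math.exp(-(4.0 * (t + 1) / iters) ** 2)
        for i in range(n):
            if i == 0:                    # leader moves around the food source
                for d, (lo, hi) in enumerate(bounds):
                    step = c1 * ((hi - lo) * random.random() + lo)
                    X[i][d] = food[d] + step if random.random() < 0.5 else food[d] - step
                    X[i][d] = min(max(X[i][d], lo), hi)
            else:                         # followers average with the salp ahead
                for d, (lo, hi) in enumerate(bounds):
                    X[i][d] = min(max(0.5 * (X[i][d] + X[i - 1][d]), lo), hi)
    return food, food_f
```

ISSA's OBL step would diversify the initial `X`, and its local search would refine `food` each iteration.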
31. A fundamental approach to discover closed periodic-frequent patterns in very large temporal databases.
- Author
-
Pamalla, Veena, Rage, Uday Kiran, Penugonda, Ravikumar, Palla, Likhitha, Hayamizu, Yuto, Goda, Kazuo, Toyoda, Masashi, Zettsu, Koji, and Sourabh, Shrivastava
- Subjects
SEARCH algorithms ,DATABASES ,TEMPORAL databases ,ENERGY consumption ,ALGORITHMS - Abstract
Periodic frequent-pattern mining (PFPM) is a vital knowledge discovery technique that identifies periodically occurring patterns in a temporal database. Although traditional PFPM algorithms have many applications, they often produce a large set of periodic-frequent patterns (PFPs) from a database. As a result, analyzing PFPs can be very time-consuming for users. Moreover, a large set of PFPs makes PFPM algorithms less efficient in terms of runtime and memory consumption. This paper addresses this problem by proposing a novel model of closed periodic-frequent patterns (CPFPs) found in databases. CPFPs are less expensive to mine because they represent a concise and lossless subset that uniquely describes the entire set of PFPs. We also present an efficient depth-first search algorithm, called Closed Periodic-Frequent Pattern-Miner (CPFP-Miner), to discover the patterns. The proposed algorithm utilizes the concept of weighted ordering of patterns to reduce the search space, and the current periodicity concept is applied to prune aperiodic patterns from the search space. Extensive experiments on both real-world and synthetic databases demonstrate that the CPFP-Miner algorithm is efficient. It outperforms the state-of-the-art algorithms in terms of runtime, memory consumption, and energy consumption on several real-world and synthetic databases. Additionally, the CPFP-Miner algorithm is shown to scale more effectively than the state-of-the-art algorithms. Finally, we present two case studies to show the functionality of the proposed patterns. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
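The periodic-frequent criterion that CPFP-Miner builds on is usually defined by two thresholds: a pattern must occur often enough (support) and never disappear for too long (maximum period between occurrences, including the database boundaries). A small sketch of that membership test for one pattern's occurrence list (the mining of closed patterns themselves is not shown):

```python
def is_periodic_frequent(occurrences, db_size, min_sup, max_per):
    """A pattern is periodic-frequent if its support meets min_sup and the
    maximum gap between consecutive occurrences, counting the gaps from
    the database start and to the database end, does not exceed max_per."""
    if len(occurrences) < min_sup:
        return False
    ts = sorted(occurrences)
    gaps = [ts[0]]                               # gap from start of database
    gaps += [b - a for a, b in zip(ts, ts[1:])]  # gaps between occurrences
    gaps.append(db_size - ts[-1])                # gap to end of database
    return max(gaps) <= max_per
```

Closedness then means no super-pattern occurs in exactly the same transactions, which is what lets CPFPs losslessly summarize all PFPs.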
32. Unsupervised data normalization for continuous dynamic monitoring by an innovative hybrid feature weighting-selection algorithm and natural nearest neighbor searching.
- Author
-
Sarmadi, Hassan, Entezami, Alireza, and Magalhães, Filipe
- Subjects
SEARCH algorithms ,FEATURE selection ,ALGORITHMS ,ARCH bridges ,CONCRETE bridges ,STRUCTURAL health monitoring - Abstract
Continuous dynamic monitoring brings an important opportunity to evaluate the health and integrity of civil structures over the long term. However, the high dimensionality and sparsity of data caused by long-term monitoring, together with the negative influences of environmental and/or operational variability, are major challenges in this process. To address these important issues, this article proposes an innovative unsupervised data normalization method based on a novel hybrid feature weighting-selection algorithm and the idea of natural nearest neighbor (NN) searching, inspired by the theory of mutual friendships in human societies. The proposed hybrid algorithm combines global feature weighting, with a new weighting measure, and local feature selection. For this algorithm, the article leverages natural NN searching, which seeks to find adequate NNs automatically. The main objective of the proposed method is to remove environmental and/or operational effects and provide normalized weighted features for reliable continuous dynamic monitoring. Using such features, an anomaly detector based on the Mahalanobis-squared distance is developed to assess and detect structural damage. The key innovations of this paper are a fully nonparametric unsupervised learning technique covering both data normalization and anomaly detection, and a novel hybrid algorithm for removing environmental and/or operational variations. Long-term dynamic features (modal frequencies) of a three-span box-girder concrete bridge (Z24 Bridge) and a long-span concrete arch bridge (Infante Dom Henrique Bridge) are considered to verify the proposed technique with several comparisons. Results indicate that this technique is successful and reliable in mitigating environmental and/or operational effects and reporting accurate structural states. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
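The Mahalanobis-squared distance used as the anomaly score in the entry above is MD² = (x − μ)ᵀ S⁻¹ (x − μ), where μ and S are the mean and covariance of the training (healthy-state) features. A self-contained sketch, solving S z = (x − μ) by Gaussian elimination instead of inverting S (the paper's normalized weighted features are assumed to be computed beforehand):

```python
def mahalanobis_sq(x, data):
    """Squared Mahalanobis distance of point x from the training data."""
    n, d = len(data), len(data[0])
    mu = [sum(row[j] for row in data) / n for j in range(d)]
    # sample covariance S (small ridge on the diagonal for stability)
    S = [[sum((row[i] - mu[i]) * (row[j] - mu[j]) for row in data) / (n - 1)
          for j in range(d)] for i in range(d)]
    for i in range(d):
        S[i][i] += 1e-9
    diff = [x[j] - mu[j] for j in range(d)]
    # Gaussian elimination with partial pivoting to solve S z = diff
    A = [S[i][:] + [diff[i]] for i in range(d)]
    for c in range(d):
        p = max(range(c, d), key=lambda r: abs(A[r][c]))
        A[c], A[p] = A[p], A[c]
        for r in range(c + 1, d):
            m = A[r][c] / A[c][c]
            for k in range(c, d + 1):
                A[r][k] -= m * A[c][k]
    z = [0.0] * d
    for r in range(d - 1, -1, -1):
        z[r] = (A[r][d] - sum(A[r][k] * z[k] for k in range(r + 1, d))) / A[r][r]
    return sum(di * zi for di, zi in zip(diff, z))
```

Damage detection then flags a new observation whenever its MD² exceeds a threshold calibrated on the healthy training data.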
33. An Algorithm for Optimizing the Process Parameters of the Spindle Process of Universal CNC Machine Tools Based on the Most Probable Explanation of Bayesian Networks.
- Author
-
Zhang, Liyue, Liu, Haoran, Wang, Niantai, Qin, Yuhua, and Chen, Enping
- Subjects
SPINDLES (Machine tools) ,NUMERICAL control of machine tools ,BAYESIAN analysis ,ALGORITHMS ,PROCESS optimization ,SEARCH algorithms ,MACHINE parts - Abstract
As an essential component of a universal CNC machine tool, the spindle plays a critical role in determining the accuracy of machined parts. The three cutting process parameters (cutting speed, feed speed, and cutting depth) are the most important optimization inputs in process-optimization studies, and better processing quality is often achieved through their optimization. Therefore, it is necessary to study these three cutting process parameters of the CNC machine tool spindle. In this paper, we propose an improved algorithm that incorporates the beetle antennae search algorithm to compute the most probable explanation in Bayesian networks, achieving optimized calculation of process parameters. This work focuses on building adaptive dynamic step parameters to improve detection behavior. A discretized chaotic strategy is used to establish a dominant initial population during population initialization. The article uses four standard network datasets to compare the time and fitness values of the improved algorithm. The experimental results show that the proposed algorithm is superior in time and accuracy to similar algorithms. An optimization example for the actual machining of a universal CNC machine tool spindle is also provided; through the optimization of this algorithm, actual machining quality was improved. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
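The beetle antennae search underlying the entry above is a single-agent method: probe the objective at two "antennae" along a random direction and step toward the better probe. A basic continuous sketch (minimization; the paper's adaptive dynamic step and chaotic initialization are not included, and the decay rate and floors are arbitrary choices):

```python
import math
import random

def beetle_antennae_search(f, x0, iters=200, d0=1.0, step0=1.0, eta=0.95, seed=11):
    """Basic beetle antennae search with greedy acceptance."""
    random.seed(seed)
    x = list(x0)
    fx = f(x)
    d, step = d0, step0
    dim = len(x)
    for _ in range(iters):
        # random unit direction for the antennae
        b = [random.gauss(0, 1) for _ in range(dim)]
        norm = math.sqrt(sum(bi * bi for bi in b)) + 1e-12
        b = [bi / norm for bi in b]
        right = [xi + d * bi for xi, bi in zip(x, b)]
        left = [xi - d * bi for xi, bi in zip(x, b)]
        # step toward the antenna that sensed the better value
        sign = 1.0 if f(left) > f(right) else -1.0
        cand = [xi + step * sign * bi for xi, bi in zip(x, b)]
        fc = f(cand)
        if fc < fx:                  # greedy acceptance keeps the best-so-far
            x, fx = cand, fc
        d, step = max(d * eta, 0.01), max(step * eta, 0.01)  # shrink probes and step
    return x, fx
```

The fixed geometric decay of `d` and `step` is precisely the part the paper replaces with adaptive dynamic step parameters.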
34. Outlier Detection Algorithms for Open Environments.
- Author
-
Kou, Aijun, Huang, Xu, and Sun, Wenxue
- Subjects
OUTLIER detection ,SUPPORT vector machines ,CLASSIFICATION algorithms ,SEARCH algorithms ,ALGORITHMS ,HIGH-dimensional model representation - Abstract
The high dimensionality and massive amount of data in open environments make existing low-dimensional outlier detection methods time-consuming. The support vector machine (SVM) is a commonly used outlier detection method. However, the SVM still has difficulty obtaining optimal parameters quickly and effectively, resulting in low detection efficiency, poor stability, and difficulty in applying it to open-environment datasets. To improve the efficiency and stability of outlier detection, this paper proposes an improved sparrow search algorithm and uses it to optimize SVM parameters. First, the traditional sparrow search algorithm is improved using improved backtracking learning and variable logarithmic spirals. Then, the improved sparrow search method is used to optimize the SVM parameters, and the optimized support vector machine is applied to outlier detection. Simulation results show that the proposed method is significantly better than the compared classification algorithms on multiple evaluation indicators, with better detection efficiency, stability, and generalization ability. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
35. A novel marine predators algorithm with adaptive update strategy.
- Author
-
Chen, Tao, Chen, Yong, He, Zhicheng, Li, Eric, Zhang, Chenglin, and Huang, Yuanyi
- Subjects
METAHEURISTIC algorithms ,PARTICLE swarm optimization ,ALGORITHMS ,SEARCH engines ,REINFORCEMENT learning ,SEARCH algorithms ,MYXOMYCETES - Abstract
The marine predators algorithm (MPA) is a metaheuristic algorithm for solving optimization problems. MPA divides the whole optimization process evenly into three phases, and each phase corresponds to a different search-agent update strategy. Such a setup makes MPA inflexible when facing different optimization problems, which affects its optimization performance. In this paper, we propose a novel modified MPA hybridized with Q-learning (QMPA), which applies reinforcement learning to update-strategy selection, choosing the most appropriate position-update strategy for search agents at different iteration stages and states. This effectively compensates for MPA's limited adaptive ability when facing different optimization problems. The performance of QMPA is tested on classical benchmark functions, the CEC2014 test suite, and engineering problems. In the classical benchmark function tests, QMPA is compared with MPA in 10, 30, and 50 dimensions. QMPA performs better than MPA on seven of the ten functions at dimensions 10 and 30. The results at dimension 50 show that QMPA outperforms MPA on 5 functions and is close to it on 4 functions. Then, comparing QMPA with algorithms such as the grey wolf optimizer, particle swarm optimization, the slime mould algorithm, the sine cosine algorithm, the reptile search algorithm, and the aquila optimizer, the results show that QMPA has the best performance on 22 of the 30 functions in the CEC2014 test suite. Finally, QMPA is tested on two commonly used real-world engineering problems and gives the most optimal results. In general, the adaptive update strategy proposed in this paper improves the optimization performance of the MPA algorithm in terms of convergence and stability. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
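The QMPA record above hinges on one mechanism: a Q-learning agent that selects among several position-update strategies depending on the iteration stage. The paper's state and reward design is not given in the abstract, so the sketch below uses a toy reward (strategy i is assumed best in stage i) purely to show the tabular Q-learning selection loop; `q_strategy_selector` and all parameter values are illustrative, not the authors' code.

```python
import random

def q_strategy_selector(n_states=3, n_actions=3, episodes=2000,
                        alpha=0.1, gamma=0.9, eps=0.2, seed=0):
    """Tabular Q-learning that learns which of three MPA-style update
    strategies (actions) to apply in each iteration stage (state).
    The reward is a stand-in: strategy i is assumed best in stage i."""
    rng = random.Random(seed)
    Q = [[0.0] * n_actions for _ in range(n_states)]
    for _ in range(episodes):
        for state in range(n_states):
            # epsilon-greedy strategy selection
            if rng.random() < eps:
                action = rng.randrange(n_actions)
            else:
                action = max(range(n_actions), key=lambda a: Q[state][a])
            reward = 1.0 if action == state else 0.0   # toy reward signal
            nxt = (state + 1) % n_states               # stages advance cyclically
            Q[state][action] += alpha * (reward + gamma * max(Q[nxt])
                                         - Q[state][action])
    return Q

Q = q_strategy_selector()
policy = [max(range(3), key=lambda a: Q[s][a]) for s in range(3)]
```

After training, the greedy policy maps each stage to the strategy the toy reward favours, which is the role the learned Q-table plays inside QMPA's iteration loop.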
36. A self-training subspace clustering algorithm based on adaptive confidence for gene expression data.
- Author
-
Dan Li, Hongnan Liang, Pan Qin, and Jia Wang
- Subjects
GENE expression ,MACHINE learning ,SUPERVISED learning ,CONFIDENCE ,SEARCH algorithms ,ALGORITHMS - Abstract
Gene clustering is one of the important techniques for identifying co-expressed gene groups from gene expression data, providing a powerful tool for investigating functional relationships of genes in biological processes. Self-training is an important semi-supervised learning method and has exhibited good performance on the gene clustering problem. However, the self-training process inevitably suffers from mislabeling, the accumulation of which degrades semi-supervised learning performance on gene expression data. To address this problem, this paper proposes a self-training subspace clustering algorithm based on adaptive confidence for gene expression data (SSCAC), which combines the low-rank representation of gene expression data with adaptive adjustment of label confidence to better guide the partition of unlabeled data. The superiority of the proposed SSCAC algorithm is mainly reflected in the following aspects. 1) To improve the discriminative property of gene expression data, low-rank representation with a distance penalty is used to mine the potential subspace structure of the data. 2) Considering the problem of mislabeling in self-training, a semi-supervised clustering objective function with label confidence is proposed, and a self-training subspace clustering framework is constructed on this basis. 3) To mitigate the negative impact of mislabeled data, an adaptive adjustment strategy based on the gravitational search algorithm is proposed for label confidence. Compared with a variety of state-of-the-art unsupervised and semi-supervised learning algorithms, the SSCAC algorithm has demonstrated its superiority through extensive experiments on two benchmark gene expression datasets. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
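SSCAC's outer loop is self-training with a confidence gate: pseudo-label only the unlabeled samples whose confidence is high enough, then refit. The low-rank representation, the gravitational-search confidence adjustment, and the objective function are specific to the paper and not reproduced here; the sketch below is just the generic self-training skeleton that framework builds on, using a nearest-centroid classifier and a margin-based confidence (all names are illustrative).

```python
import numpy as np

def self_training(X, y_init, n_iter=10, conf_thresh=0.6):
    """Generic self-training loop with a confidence gate: fit centroids
    on the labeled samples, pseudo-label only those unlabeled samples
    whose margin-based confidence clears the threshold, and repeat.
    (SSCAC replaces each piece -- representation, objective, and the
    confidence update -- with its own machinery.)"""
    y = y_init.copy()                     # -1 marks unlabeled samples
    classes = np.unique(y[y >= 0])
    for _ in range(n_iter):
        centroids = np.array([X[y == c].mean(axis=0) for c in classes])
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        two = np.sort(d, axis=1)[:, :2]
        conf = (two[:, 1] - two[:, 0]) / (two[:, 1] + 1e-12)  # margin confidence
        pred = classes[np.argmin(d, axis=1)]
        newly = (y == -1) & (conf > conf_thresh)
        if not newly.any():               # nothing confident left to label
            break
        y[newly] = pred[newly]
    return y
```

On well-separated clusters a single labeled point per class is enough for the loop to confidently propagate labels to the rest.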
37. Algorithm for ponding effect detection considering amount of precipitation.
- Author
-
Zbyněk, Zajac, Michal, Jedlička, Rostislav, Lang, and Ivan, Němec
- Subjects
- *
SEARCH algorithms , *ALGORITHMS , *STRENGTH of materials , *DEFORMATION of surfaces , *TENSILE strength - Abstract
This paper deals with an unfavorable phenomenon called the ponding effect. This phenomenon can cause an increase in local deformations on a tensile surface and can therefore endanger the integrity of the structure if the tensile strength of the membrane material is exceeded. A search algorithm has been developed to analyze and prevent the occurrence of this phenomenon. The algorithm was further improved so that only the expected amount of precipitation is considered during the calculation. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
38. Crow search freeman chain code (CS-FCC) extraction algorithm for handwritten character recognition.
- Author
-
Mohamad, M. A., Ahmad, M. A., Mahmood, J., Daud, Kauthar Mohd, and Rahman, Azamuddin Ab
- Subjects
- *
PATTERN recognition systems , *FEATURE extraction , *ALGORITHMS , *SEARCH algorithms , *METAHEURISTIC algorithms , *PROBLEM solving - Abstract
In Handwritten Character Recognition (HCR), interest in feature extraction has been increasing, with an abundance of algorithms derived to improve classification accuracy. In this paper, a metaheuristic feature extraction technique for HCR based on the Crow Search Algorithm (CSA) is proposed, with the Freeman Chain Code (FCC) used as the data representation. The main problem in representing a character using FCC is that the extraction result depends on the starting point, which affects the route length of the chain code. To solve this problem, CSA is used to find the shortest route length and minimum computational time for HCR. The performance measures of the proposed CS-FCC extraction algorithm are route length and computation time. Experiments are performed on chain code representations derived from established previous work on the Center of Excellence for Document Analysis and Recognition (CEDAR) dataset, which consists of 126 upper-case letter characters. Based on the results, the proposed CS-FCC obtained a route length of 1880.28 and needed only 1.10 seconds to process the whole set of character images. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
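The quantity CS-FCC minimizes is the route length of the Freeman chain code, which varies with the starting point because traversing a character skeleton forces backtracking. The sketch below shows that dependence with a plain depth-first traversal and a brute-force search over starting points (the role the crow search algorithm fills heuristically); the skeleton, traversal rule, and function names are illustrative, not the paper's implementation.

```python
# 8-neighbourhood directions; index = Freeman code (0 = east, CCW, y up)
DIRS8 = [(1, 0), (1, 1), (0, 1), (-1, 1), (-1, 0), (-1, -1), (0, -1), (1, -1)]

def freeman_chain_code(path):
    """Encode a pixel path as an 8-direction Freeman chain code."""
    code = {d: i for i, d in enumerate(DIRS8)}
    return [code[(x2 - x1, y2 - y1)] for (x1, y1), (x2, y2) in zip(path, path[1:])]

def route_from(start, pixels):
    """Depth-first traversal of the skeleton's 8-connected pixel graph.
    Backtracking moves are recorded too, so the route length depends
    on the starting point -- the quantity the crow search minimizes."""
    visited, route, stack = {start}, [start], [start]
    while stack and len(visited) < len(pixels):
        x, y = stack[-1]
        nxt = next(((x + dx, y + dy) for dx, dy in DIRS8
                    if (x + dx, y + dy) in pixels
                    and (x + dx, y + dy) not in visited), None)
        if nxt:
            visited.add(nxt)
            route.append(nxt)
            stack.append(nxt)
        else:
            stack.pop()
            if stack:
                route.append(stack[-1])   # backtrack move counts too
    return route

def shortest_route(pixels):
    """Brute force over all starting points (CSA explores this
    search space heuristically instead of exhaustively)."""
    return min((route_from(s, pixels) for s in pixels), key=len)

# L-shaped toy skeleton: a horizontal bar with a short vertical tail
pixels = {(0, 0), (1, 0), (2, 0), (3, 0), (4, 0), (4, 1), (4, 2)}
best = shortest_route(pixels)
```

Starting at an endpoint of the L yields a backtrack-free route of 6 moves; interior starting points force revisits and longer chain codes.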
39. Landscape Perspective Distance-Included Angle Shape Distribution Analysis Based on 3D CAD Model Retrieval Algorithm.
- Author
-
Bi, Huijuan
- Subjects
DISTRIBUTION (Probability theory) ,ALGORITHMS ,LANDSCAPE gardening ,SEARCH algorithms ,LANDSCAPES - Abstract
Because the results of 3D CAD model retrieval can reveal differences in the local feature details of a model, this paper introduces a 3D CAD model retrieval algorithm into the analysis of the perspective distance-included angle shape distribution of garden landscapes. Random sampling is performed on the surface of the constructed 3D CAD model; combined with the test distance between each sampling point and its neighboring points, the corresponding perspective distance-angle shape distribution characteristics of the landscape in that area are calculated in order to achieve high-speed similarity retrieval of CAD models. Finally, experimental research shows that the proposed algorithm outperforms the overall shape distribution algorithm and the spherical harmonic algorithm in CAD model retrieval performance, and it can effectively improve recognition of the local detailed features of 3D CAD models. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
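The descriptor in the record above follows the shape-distribution family: sample random points on the model, measure a geometric quantity, and histogram it into a signature that can be compared quickly. The paper's exact distance-included angle definition is not given, so the sketch below uses a generic Osada-style variant (the angle at one sampled point plus the distance between the other two) on a point cloud; treat the feature choice and all names as assumptions.

```python
import numpy as np

def shape_signature(points, n_samples=2000, bins=8, seed=0):
    """Toy distance-plus-angle shape distribution: sample random point
    triples, record the angle at the first point together with the
    distance between the other two, and histogram the pairs into a
    normalized 2D signature (scale-invariant via max-distance scaling)."""
    rng = np.random.default_rng(seed)
    idx = rng.integers(0, len(points), (n_samples, 3))
    a, b, c = points[idx[:, 0]], points[idx[:, 1]], points[idx[:, 2]]
    u, v = b - a, c - a
    nu = np.linalg.norm(u, axis=1)
    nv = np.linalg.norm(v, axis=1)
    ok = (nu > 1e-9) & (nv > 1e-9)                    # drop degenerate triples
    cosang = np.clip((u[ok] * v[ok]).sum(axis=1) / (nu[ok] * nv[ok]), -1.0, 1.0)
    ang = np.arccos(cosang)
    dist = np.linalg.norm((b - c)[ok], axis=1)
    dist = dist / dist.max()                          # scale invariance
    hist, _, _ = np.histogram2d(ang, dist, bins=bins,
                                range=[[0.0, np.pi], [0.0, 1.0]])
    return hist.ravel() / hist.sum()

def signature_distance(h1, h2):
    """L1 distance between two normalized signatures."""
    return np.abs(h1 - h2).sum()
```

A uniformly scaled copy of a model should score near zero against the original, while a distorted model should score higher, which is the property retrieval relies on.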
40. A Dynamic Adaptive Firefly Algorithm for Flexible Job Shop Scheduling.
- Author
-
Devi, K. Gayathri, Mishra, R. S., and Madan, A. K.
- Subjects
PRODUCTION scheduling ,NP-hard problems ,SIMULATED annealing ,SEARCH algorithms ,ALGORITHMS ,TABU search algorithm - Abstract
An NP-hard problem like Flexible Job Shop Scheduling (FJSP) tends to be complex and requires considerable computational effort to optimize objectives with contradictory measures. This paper addresses the FJSP problem with combined and contradictory objectives: minimization of makespan, maximum workload, and total workload. It proposes the ‘Hybrid Adaptive Firefly Algorithm’ (HAdFA), a new enhanced version of the classic Firefly Algorithm (FA) embedded with adaptive parameters to optimize multiple objectives concurrently. The proposed algorithm adopts two adaptive strategies: an adaptive randomization parameter (α) and an effective heterogeneous update rule for fireflies. These adaptations help the optimization process strike a balance between diversification and intensification. Further, an enhanced local search algorithm, Simulated Annealing (SA), is hybridized with the adaptive FA to explore the local solution space more efficiently. This paper also attempts to solve FJSP by a rarely used integrated approach in which assignment and sequencing are done simultaneously. Empirical simulations on benchmark instances demonstrate the efficacy of the proposed algorithms, providing a competitive edge over other nature-inspired algorithms for solving FJSP. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
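HAdFA's two adaptations act on the classic firefly update x_i ← x_i + β0·exp(−γr²)·(x_j − x_i) + α·ε. The abstract does not give the adaptive rules, so the sketch below shows the baseline FA with the simplest adaptive α (geometric decay) on a toy sphere objective; HAdFA's heterogeneous update rule, the FJSP encoding, and the SA hybrid are not modeled, and all parameter values are illustrative.

```python
import math
import random

def firefly_sphere(n=15, dim=2, iters=100, beta0=1.0, gamma=0.01,
                   alpha0=0.5, decay=0.95, seed=1):
    """Classic firefly algorithm minimizing the sphere function, with a
    geometrically decaying randomization parameter alpha (a simple
    stand-in for HAdFA's adaptive alpha)."""
    rng = random.Random(seed)
    f = lambda x: sum(v * v for v in x)
    X = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n)]
    alpha = alpha0
    for _ in range(iters):
        for i in range(n):
            for j in range(n):
                if f(X[j]) < f(X[i]):          # move i toward brighter j
                    r2 = sum((a - b) ** 2 for a, b in zip(X[i], X[j]))
                    beta = beta0 * math.exp(-gamma * r2)
                    X[i] = [a + beta * (b - a) + alpha * (rng.random() - 0.5)
                            for a, b in zip(X[i], X[j])]
        alpha *= decay                          # adaptive step: shrink exploration
    return min(X, key=f)

best = firefly_sphere()
```

Decaying α is what lets the swarm explore early and settle late; HAdFA's contribution is making that schedule (and the attraction rule) adapt instead of following a fixed decay.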
41. RJA-Star Algorithm for UAV Path Planning Based on Improved R5DOS Model.
- Author
-
Li, Jian, Zhang, Weijian, Hu, Yating, Fu, Shengliang, Liao, Changyi, and Yu, Weilin
- Subjects
DRONE aircraft ,ALGORITHMS ,SEARCH algorithms ,COMPUTATIONAL complexity - Abstract
To improve the obstacle avoidance ability of agricultural unmanned aerial vehicles (UAVs) in farmland settings, a three-dimensional path planning model based on the R5DOS model is proposed in this paper. The direction layer of the R5DOS intersection model is improved, and the RJA-star algorithm is constructed from an improved jump point search A-star algorithm. The R5DOS model is simulated in MATLAB. The simulation results show that this model can reduce the computational complexity, computation time, number of corners, and maximum corner angle of the A-star algorithm. Compared with the traditional algorithm, the model avoids obstacles effectively and reduces the reaction times of the UAV. The final fitting results show that, compared with the A-star algorithm, the RJA-star algorithm reduced the total distance by 2.53%, the computation time by 97.65%, the number of nodes by 99.96%, and the number of corners by 96.08%, with the maximum corner angle reduced by approximately 63.30%. Compared with the geometric A-star algorithm, the running time of the RJA-star algorithm is reduced by 95.84%, the number of nodes is reduced by 99.95%, and the number of turns is reduced by 67.28%. In general, the experimental results confirm the effectiveness and feasibility of the RJA-star algorithm in three-dimensional obstacle avoidance. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
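RJA-star builds on the A-star graph search, adding R5DOS spatial reasoning and jump-point pruning that the abstract does not specify in detail. As a reference for what is being improved, here is a plain A-star on a 2D occupancy grid with a Manhattan heuristic; the 3D extension, the R5DOS layers, and the jump-point step are deliberately omitted.

```python
import heapq

def astar(grid, start, goal):
    """Plain A* on a 4-connected occupancy grid (0 = free, 1 = blocked)
    with an admissible Manhattan heuristic; returns an optimal path as
    a list of (row, col) cells, or None if the goal is unreachable."""
    rows, cols = len(grid), len(grid[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])
    open_heap = [(h(start), start)]
    g_best = {start: 0}
    came_from = {}
    closed = set()
    while open_heap:
        _, cur = heapq.heappop(open_heap)
        if cur in closed:
            continue
        closed.add(cur)
        if cur == goal:
            path = [cur]
            while cur in came_from:            # walk parents back to start
                cur = came_from[cur]
                path.append(cur)
            return path[::-1]
        r, c = cur
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (r + dr, c + dc)
            if (0 <= nxt[0] < rows and 0 <= nxt[1] < cols
                    and grid[nxt[0]][nxt[1]] == 0):
                ng = g_best[cur] + 1
                if ng < g_best.get(nxt, float("inf")):
                    g_best[nxt] = ng
                    came_from[nxt] = cur
                    heapq.heappush(open_heap, (ng + h(nxt), nxt))
    return None

grid = [[0, 0, 0, 0, 0],
        [1, 1, 1, 1, 0],
        [0, 0, 0, 0, 0]]
path = astar(grid, (0, 0), (2, 0))   # must detour through the gap at (1, 4)
```

Jump-point search accelerates exactly this loop by expanding only "jump points" instead of every grid cell, which is where RJA-star's large node-count reductions come from.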
42. Multi-Level Phase Noise Model for CO-OFDM Spatial-Division Multiplexed Transmission.
- Author
-
Jiang, Guozhou and Yang, Liu
- Subjects
PHASE noise ,OPTICAL communications ,TELECOMMUNICATION ,TELECOMMUNICATION systems ,MULTIPLEXING ,SEARCH algorithms - Abstract
Spatial division multiplexed (SDM) transmission systems with coherent communication technology have become an important means of meeting the demands on fiber capacity. However, research on laser phase noise has mainly focused on single-channel systems or single-carrier SDM systems. In this paper, a phase noise model comprising common laser phase noise together with the core phase drifts induced by SDM is introduced and analyzed for a coherent orthogonal frequency-division multiplexing (CO-OFDM) SDM transmission system. Based on the phase noise model, the applicability of the blind phase search algorithm and the pilot-aided phase estimation algorithm is discussed and demonstrated via simulation. The results show that these two algorithms work well when combined laser linewidths and core phase drifts are considered for CO-OFDM 7-core multi-core fiber (MCF). This means that, with the SDM phase noise model, the phase noise estimate from one core can be transferred to the other cores to lower complexity. This research provides a proper application of phase noise analysis for large-capacity optical communication based on a weakly coupled MCF. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
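The blind phase search algorithm the paper evaluates is standard: rotate the received block by a set of test phases, slice each rotated symbol to the nearest constellation point, and keep the phase with the smallest accumulated error. The sketch below does this for QPSK; the CO-OFDM/SDM setting, where one core's estimate is reused across cores, is not modeled, and the block size and candidate count are illustrative.

```python
import cmath
import math

# QPSK constellation on the unit circle
QPSK = [cmath.exp(1j * (math.pi / 4 + k * math.pi / 2)) for k in range(4)]

def blind_phase_search(symbols, n_test=32):
    """Blind phase search over QPSK's pi/2-wide phase ambiguity range:
    try n_test candidate phases, slice each de-rotated symbol to the
    nearest constellation point, and return the candidate phase with
    the smallest accumulated squared error."""
    best_phi, best_err = 0.0, float("inf")
    for b in range(n_test):
        phi = -math.pi / 4 + (math.pi / 2) * b / n_test
        rot = [s * cmath.exp(-1j * phi) for s in symbols]
        err = sum(min(abs(r - c) ** 2 for c in QPSK) for r in rot)
        if err < best_err:
            best_phi, best_err = phi, err
    return best_phi

# noiseless QPSK block rotated by a common phase offset of 0.2 rad
rx = [c * cmath.exp(1j * 0.2) for c in QPSK for _ in range(4)]
estimate = blind_phase_search(rx)
```

The estimate is quantized to the candidate grid (step π/2 ÷ n_test here), which is the resolution/complexity trade-off the paper's core-sharing idea helps amortize.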
43. Study on an Assembly Prediction Method of RV Reducer Based on IGWO Algorithm and SVR Model.
- Author
-
Jin, Shousong, Cao, Mengyi, Qian, Qiancheng, Zhang, Guo, and Wang, Yaliang
- Subjects
PARTICLE swarm optimization ,LATIN hypercube sampling ,BACK propagation ,ALGORITHMS ,SEARCH algorithms ,PRODUCT quality - Abstract
This paper proposes a new method for predicting rotation error based on improved grey wolf–optimized support vector regression (IGWO-SVR), because existing rotation error prediction methods are time-consuming and have poor calculation accuracy, and therefore cannot meet enterprises' production takt and product quality requirements. First, the grey wolf algorithm is improved with optimal Latin hypercube sampling initialization, a nonlinear convergence factor, and dynamic weights to improve its accuracy in optimizing the parameters of the support vector regression (SVR) model. Then, the IGWO-SVR prediction model relating the manufacturing error of critical parts to the rotation error is established, with the RV-40E reducer as a case study. The results show that the improved grey wolf algorithm delivers better parameter optimization performance, and the IGWO-SVR method shows better prediction performance than existing rotation error prediction methods based on a back propagation (BP) neural network and a BP neural network optimized by the sparrow search algorithm, as well as SVR models optimized by the particle swarm and grey wolf algorithms. The mean squared error of the IGWO-SVR model is 0.026, the running time is 7.843 s, and the maximum relative error is 13.5%, which meets the requirements of production takt and product quality. Therefore, the IGWO-SVR method can be well applied to the rotate vector (RV) reducer parts-matching model to improve product quality and reduce rework rate and cost. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
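The grey wolf optimizer that IGWO improves moves each wolf toward the three current leaders under a convergence factor a that shrinks over the run. The abstract names a nonlinear convergence factor among the improvements without giving its formula, so the sketch below uses one common choice, a(t) = 2·(1 − (t/T)²), on a toy sphere objective; the Latin hypercube initialization, dynamic weights, and the SVR hyperparameter search are not reproduced, and every parameter value is illustrative.

```python
import random

def gwo_sphere(n=20, dim=3, iters=200, seed=3):
    """Grey wolf optimizer on the sphere function with a *nonlinear*
    convergence factor a(t) = 2*(1 - (t/T)**2) in place of the
    classic linear decay (one plausible IGWO-style change)."""
    rng = random.Random(seed)
    f = lambda x: sum(v * v for v in x)
    wolves = [[rng.uniform(-10, 10) for _ in range(dim)] for _ in range(n)]
    for t in range(iters):
        ranked = sorted(wolves, key=f)
        leaders = ranked[:3]                     # alpha, beta, delta
        a = 2 * (1 - (t / iters) ** 2)           # nonlinear convergence factor
        for i in range(n):
            new = []
            for d in range(dim):
                est = 0.0
                for leader in leaders:
                    A = a * (2 * rng.random() - 1)
                    C = 2 * rng.random()
                    D = abs(C * leader[d] - wolves[i][d])
                    est += leader[d] - A * D     # pull toward this leader
                new.append(est / 3)              # average the three pulls
            wolves[i] = new
    return min(wolves, key=f)

best = gwo_sphere()
```

In IGWO-SVR, the position vector would encode SVR hyperparameters and f would be a cross-validation loss rather than the sphere function used here.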
44. Multi-UAV Path Planning Algorithm Based on BINN-HHO.
- Author
-
Li, Sen, Zhang, Ran, Ding, Yuanming, Qin, Xutong, Han, Yajun, and Zhang, Huiting
- Subjects
PARTICLE swarm optimization ,ALGORITHMS ,SEARCH algorithms ,ENERGY function ,DRONE aircraft - Abstract
Multiple unmanned aerial vehicles (multi-UAV) flying in three-dimensional (3D) mountain environments suffer from low stability, long planned paths, and low dynamic obstacle avoidance efficiency. Spurred by these constraints, this paper proposes a multi-UAV path planning algorithm that combines a bioinspired neural network with improved Harris hawks optimization using a periodic energy decline regulation mechanism (BINN-HHO) to solve the multi-UAV path planning problem in 3D space. Specifically, during global path planning, a periodic energy decline mechanism is introduced into HHO and embedded in the energy function, balancing the algorithm's multi-round dynamic iteration between global exploration and local search. Additionally, when the onboard sensors detect a dynamic obstacle during flight, the improved BINN algorithm conducts local path replanning for dynamic obstacle avoidance. Once the dynamic obstacles in the sensor detection area disappear, the local path planning is completed, and the UAV returns to the trajectory determined by the global planning. The simulation results show that the proposed Harris hawks algorithm has clear advantages in path planning and dynamic obstacle avoidance efficiency compared with basic Harris hawks optimization, particle swarm optimization (PSO), and the sparrow search algorithm (SSA). [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
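In Harris hawks optimization, the prey's escape energy E decides the mode: |E| ≥ 1 triggers exploration and |E| < 1 exploitation, with E decaying linearly over the run. The abstract's "periodic energy decline regulation mechanism" is not specified, so the variant below simply restarts the decay envelope every cycle, as a hypothetical illustration of how a periodic decline re-enables exploration late in a run.

```python
def hho_energy(t, T, E0, period=50, periodic=False):
    """Escape energy of the prey. Classic HHO: E = 2*E0*(1 - t/T).
    The periodic flag switches to an *assumed* cyclic envelope that
    restarts every `period` iterations (a stand-in for BINN-HHO's
    periodic energy decline regulation, whose formula the abstract
    does not give)."""
    decay = 1 - (t % period) / period if periodic else 1 - t / T
    return 2 * E0 * decay

def phase(E):
    """HHO mode switch driven by the escape energy."""
    return "explore" if abs(E) >= 1 else "exploit"
```

Late in a run (t = 150 of 200, say) the classic schedule is locked into exploitation, while the cyclic envelope has just reset and explores again, which is the balance between global exploration and local search the paper describes.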
45. An anytime Visibility–Voronoi graph-search algorithm for generating robust and feasible unmanned surface vehicle paths.
- Author
-
Schoener, Marco, Coyle, Eric, and Thompson, David
- Subjects
AUTONOMOUS vehicles ,REMOTELY piloted vehicles ,DOCKS ,COST functions ,ALGORITHMS ,VEHICLE models ,FLOOR plans ,SEARCH algorithms - Abstract
While path planning for Unmanned Surface Vehicles (USVs) is in many ways similar to path planning for ground vehicles, the lack of reliable USV models and significant maritime environmental uncertainties require an increased focus on robustness and safety. This paper presents a novel graph construction method based on Visibility–Voronoi diagrams that allows users to tune path optimality and path safety while considering vehicle dynamics and model uncertainty. The vehicle state is defined as both a 2D location and a heading. The method is based on a roadmap generated from a Visibility–Voronoi diagram and uses motion curves and path smoothing to ensure path feasibility. The roadmap can then be searched using any graph-search algorithm to return optimal paths subject to a cost function. This paper also shows how to generate and search this roadmap in an anytime fashion, which makes the method suitable for local planning where sensors are used to build a map of the environment in real time. The approach is demonstrated effectively on underactuated systems, with empirical results from USV docking and obstacle field navigation scenarios. These case studies show that the path maintains feasibility subject to a simplified vehicle model and maximizes safety when navigating close to obstacles. Simulation results are also used to analyze algorithm complexity, prove suitability for local planning, and demonstrate the benefits of anytime roadmap generation. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
46. An Enhanced Crow Search Inspired Feature Selection Technique for Intrusion Detection Based Wireless Network System.
- Author
-
Khanna, Ashish, Rani, Poonam, Garg, Puneet, Singh, Prakash Kumar, and Khamparia, Aditya
- Subjects
INTRUSION detection systems (Computer security) ,FEATURE selection ,COGNITIVE computing ,ALGORITHMS ,SEARCH algorithms ,WIRELESS communications - Abstract
Recent development of cognitive-computing-driven evolutionary techniques improves the overall quality of service and user experience in wireless communication networks. This paper presents a feature selection method based on an improved Crow Search Algorithm, used in an Intrusion Detection System (IDS) to limit the size of the dataset the system works with while obtaining better results. Since an IDS deals with large volumes of data, its crucial task is to keep efficient features that represent the whole data without duplication or irrelevance. The previously proposed model used the crow search algorithm in the intrusion detection system (CSA-IDS) to find the optimal feature subset, with random forest as a judge of the features produced by the CSA-IDS. The KDD and UNSW datasets are used to evaluate the earlier proposed model. The proposed model achieved an accuracy of 99.84% for attack detection on the UNSW dataset. Similarly, R2L and U2R attacks were detected with an accuracy of 99.97% on the NSL-KDD dataset. The proposed model improves overall communication services and feature selection in wireless communication networks. The outcome proves that the feature subset obtained using CSA-IDS achieves a higher accuracy rate with a smaller number of features. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
47. A new model to protect an important node against two threatening agents.
- Author
-
Maleki, Z., Maleki, H. R., and Akbari, R.
- Subjects
SIMULATED annealing ,SEARCH algorithms ,METAHEURISTIC algorithms ,LOCATION problems (Programming) ,NATURAL disasters ,PROBLEM solving ,ALGORITHMS - Abstract
One of the main goals of network planners is the protection of important nodes in a network against natural disasters, security threats, attacks, and so on. Given the importance of this issue, a new model is presented in this paper for protecting an important node in a typical network, based on a defensive location problem in which two agents threaten the node. The protecting-facilities location problem with two agents is formulated as a three-level programming problem. The decision maker at the upper level is a network planner agent, who wants to find the best possible locations of protecting facilities to defend the important node against the threatening agents. The second- and third-level problems are stated as shortest path problems in a network whose edges are weighted with positive values. In this work, genetic, variable neighborhood search, and simulated annealing algorithms are used to solve the problem. The performance of these metaheuristic algorithms on this class of problems is investigated on a randomly generated test problem, and t-tests are then used to compare their performance. The best results are obtained by the variable neighborhood search algorithm. [ABSTRACT FROM AUTHOR]
- Published
- 2022
48. Critical assessment of Shape Retrieval Tools (SRTs).
- Author
-
Xiao, Xinyi, Joshi, Sanjay, and Cecil, J.
- Subjects
SEARCH algorithms ,INFORMATION retrieval ,DATABASE searching ,KEY performance indicators (Management) ,ALGORITHMS ,EVALUATION methodology - Abstract
In today's design–manufacturing context, designers often modify existing 3D shapes (or design models) instead of creating a new design from scratch. This requires the ability to search an existing database of designs/3D models to identify and extract similar designs. Shape Retrieval Tools (SRTs) have been developed to play an essential role in saving the time and effort needed to retrieve and generate new designs. The capabilities of commercially available SRTs vary based on the form of the input design model, the search technique or algorithm used, the search/retrieval time, ease of use, and the quality of results. The focus of this paper is to study their capabilities, performance, and differences, and to develop criteria for comparing the effectiveness and performance of such Shape Retrieval Tools. Current search evaluation methods, such as precision and recall, are based on human interpretation of the results. This paper presents a holistic set of metrics for comparing the performance and effectiveness of SRTs, including data input options (to search), effectiveness of the search process, the associated retrieval time, overall ease of use, and additional data retrieval details. An algorithm is proposed to objectively analyze the search results based on the proposed Model Match Ratio (MMR), computed from the variance between the input and retrieved geometries. The search results are usually presented in a rank-ordered list; a Precision Sequence Metric (PSM) is developed to evaluate the quality of the retrieved list by ranking the retrieved results according to their MMR. The proposed evaluation algorithm was tested on several design models (and their subsequent retrieval results) involving three SRTs (Vizseek, Geolus, and CADENAS); the comparative performance of these SRTs is discussed in this paper. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
49. Multi-skill resource-constrained multi-modal project scheduling problem based on hybrid quantum algorithm.
- Author
-
Peng, Jun Long, Liu, Xiao, Peng, Chao, and Shao, Yu
- Subjects
ALGORITHMS ,PARTICLE swarm optimization ,SEARCH algorithms ,SCHEDULING ,MATHEMATICAL models ,PROJECT management - Abstract
Numerous studies on project scheduling consider only a single factor, which fails to reflect the actual environment of project operations. In light of this issue, the article synthesizes multiple perspectives and proposes a multi-skill resource-constrained multi-modal project scheduling problem (MRCMPSP). This problem is described, modeled, and solved using a resource capability matrix and other constraints to minimize the project duration. To solve the MRCMPSP effectively and enrich scheduling algorithms, the paper develops a hybrid quantum algorithm (HQPSO) based on the quantum particle swarm optimization algorithm (QPSO). The HQPSO introduces several improvements, such as the JAYA optimization search, to improve the algorithm's performance. To verify the generality, superiority, and effectiveness of the algorithm, independent-run comparison experiments and practical application experiments are designed based on different case sizes and resource quantities. The experimental results demonstrate that the proposed algorithm has superior convergence performance and solution accuracy and can provide an effective scheduling solution for real cases. Additionally, the article provides targeted management suggestions based on the research findings. Overall, this study contributes a novel mathematical model, solution algorithm, optimization strategies, and managerial insights, advancing the field of project management research. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
50. Solving Redundancy Allocation Problems using Jaya Algorithm.
- Author
-
Aswin, B., Lokhande, Tapan, and Gaonkar, Rajesh S. Prabhu
- Subjects
OPTIMIZATION algorithms ,METAHEURISTIC algorithms ,RELIABILITY in engineering ,SEARCH algorithms ,ALGORITHMS - Abstract
Reliability-based design is related to the performance analysis of engineering systems, and the redundancy allocation problem is one of the most common problems in this approach. The redundancy allocation problem determines the redundancy level of components in a system to maximize system reliability subject to several constraints. In recent years, solving reliability-related redundancy allocation problems with meta-heuristic algorithms has appealed to researchers due to several drawbacks of classical mathematical methods. Meta-heuristics have shown the potential to obtain precise solutions to optimization problems, and many such techniques have been applied in the literature for optimal redundancy allocation. In this paper, the recently developed Jaya optimization algorithm is applied to redundancy allocation to maximize system reliability. The Jaya algorithm is a simple, population-based meta-heuristic consisting of a single phase and requiring no algorithm-specific parameters. This paper presents an application of the Jaya algorithm to search for the optimal solutions of two redundancy allocation problems from the literature with nonlinear constraints so that system reliability is maximized. The first problem is the overspeed protection system of a gas turbine, whose control system is modelled as a four-stage series system; the objective is to determine the optimal level of redundancy of the protection system's valves under cost and weight constraints. The second is the redundancy allocation problem of a five-stage series system with volume, weight, and cost constraints. The results are validated by comparing them with two other meta-heuristics. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
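The claim that Jaya has a single phase and no algorithm-specific parameters refers to its one update rule: every candidate moves toward the current best and away from the current worst, using only random coefficients. The sketch below applies it to a toy unconstrained sphere minimization; the paper's redundancy allocation problems are integer-valued and constrained, which is not reproduced here, and all names and settings are illustrative.

```python
import random

def jaya_minimize(f, dim, lo, hi, pop=20, iters=300, seed=7):
    """Jaya: x' = x + r1*(best - |x|) - r2*(worst - |x|), followed by
    greedy acceptance. The only inputs are the common population size
    and iteration count -- no algorithm-specific tuning parameters."""
    rng = random.Random(seed)
    X = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(pop)]
    for _ in range(iters):
        vals = [f(x) for x in X]
        best = X[vals.index(min(vals))]
        worst = X[vals.index(max(vals))]
        for i in range(pop):
            cand = []
            for d in range(dim):
                v = (X[i][d]
                     + rng.random() * (best[d] - abs(X[i][d]))
                     - rng.random() * (worst[d] - abs(X[i][d])))
                cand.append(min(hi, max(lo, v)))   # clamp to the bounds
            if f(cand) <= f(X[i]):                 # greedy acceptance
                X[i] = cand
    return min(X, key=f)

best = jaya_minimize(lambda x: sum(v * v for v in x), 2, -10, 10)
```

For the paper's problems, f would be the (penalized) system reliability over integer redundancy levels rather than this continuous sphere function.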