92 results for "PSEUDOCODE (Computer program language)"
Search Results
2. PSEUDO CORE INVERTIBILITY AND DMP INVERTIBILITY IN TWO SEMIGROUPS OF A RING WITH INVOLUTION.
- Author
-
WENDE LI, JIANLONG CHEN, YUKUN ZHOU, and YUANYUAN KE
- Subjects
- *
PSEUDOCODE (Computer program language) , *MATHEMATICS , *FINITE element method , *SEMIGROUPS (Algebra) , *GROUP theory - Abstract
In 2004, Patrício and Puystjens characterized the relation between Drazin invertible elements (resp., Moore-Penrose invertible elements) of the two semigroups pRp and pRp + 1 − p of a ring R for some idempotent (resp., projection) p ∈ R. In this paper, we consider the analogous result for pseudo core invertible elements of these two semigroups of a ring for some projection, which is then applied to characterize the relation between pseudo core invertible elements of the matrix semigroup AA†R^(m×m)AA† + I_m − AA† and the matrix semigroup A†AR^(n×n)A†A + I_n − A†A, where A ∈ R^(m×n) with A† existing. Also, a similar equivalence involving DMP invertible elements is investigated. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
3. Remainder and quotient without polynomial long division.
- Author
-
Laudano, Francesco
- Subjects
- *
STUDY & teaching of polynomials , *MATHEMATICS education , *ALGEBRA education , *PSEUDOCODE (Computer program language) , *PROGRAMMING languages - Abstract
We propose an algorithm for calculating the remainder and the quotient of division between polynomials over commutative coefficient rings, without polynomial long division. We use these results to determine the quadratic factors of polynomials over commutative coefficient rings and, in particular, to completely factorize in Z[x] any integral polynomial with degree less than 6. The arguments are suitable for building classroom/homework activities in basic algebra courses. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
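The theme of this abstract, computing a quotient and remainder without carrying out polynomial long division, can be illustrated in its simplest classical special case: for a monic linear divisor x − a, Horner's scheme delivers both at once. This sketch is a familiar instance of the idea, not Laudano's general algorithm for arbitrary divisors over commutative rings:

```python
def horner_divmod(coeffs, a):
    """Divide p(x) by (x - a) via Horner's scheme, with no long division.
    coeffs: coefficients of p, highest degree first.
    Returns (quotient coefficients, remainder)."""
    q = []
    acc = 0
    for c in coeffs:
        acc = acc * a + c
        q.append(acc)
    r = q.pop()  # the final accumulated value is p(a), i.e. the remainder
    return q, r

# p(x) = x^3 - 6x^2 + 11x - 6 = (x - 1)(x - 2)(x - 3); divide by (x - 2):
q, r = horner_divmod([1, -6, 11, -6], 2)
# q = [1, -4, 3] represents x^2 - 4x + 3, and r = 0 since 2 is a root
```

The same single pass evaluates p(a) (the remainder, by the remainder theorem) and accumulates the quotient's coefficients as a side effect.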
4. A Look at Algorithm BEPtoPNST.
- Author
-
García-Ojeda, Juan C.
- Subjects
- *
BUILDING evacuation , *COMBINATORIAL optimization , *PSEUDOCODE (Computer program language) , *COMPUTATIONAL complexity , *HIGH performance computing , *VERY large scale circuit integration , *INTERNET of things , *GRAPH theory - Abstract
This work analyzes the computational complexity of algorithm BEPtoPNST, which transforms a building-evacuation problem (BEP) into a time-expanded process-network synthesis (PNST) problem. The solution of the latter is achieved by resorting to the P-graph method, which exploits the combinatorial nature of a BEP. Unlike other approaches, the P-graph method provides not only the optimal solution (best evacuation route as a function of egress time), but also the best n sub-optimal solutions. For the complexity analysis, a generic processor and a random-access machine (RAM) model were deployed, as well as a mathematical model to calculate the number and cost of the operations performed. It was observed that algorithm BEPtoPNST exhibits an asymptotic complexity of order O(|T||A|(|N|−k)). When solving a BEP, however, the total complexity grows exponentially, of order O(|T||A|(|N|−k) + 2^h) in the worst case, where h represents the total number of operating units specified in the corresponding PNST problem. Nevertheless, the computational complexity can be reduced significantly when the P-graph method is deployed. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
5. UbiSitePred: A novel method for improving the accuracy of ubiquitination sites prediction by using LASSO to select the optimal Chou's pseudo components.
- Author
-
Cui, Xiaowen, Yu, Zhaomin, Yu, Bin, Wang, Minghui, Tian, Baoguang, and Ma, Qin
- Subjects
- *
UBIQUITINATION , *LOGICAL prediction , *PSEUDOCODE (Computer program language) , *DNA damage , *COMPUTATIONAL complexity - Abstract
Abstract Ubiquitination is an essential process in protein post-translational modification, which plays a crucial role in cell life activities such as proteasomal degradation, transcriptional regulation, and DNA damage repair. Therefore, recognition of ubiquitination sites is a crucial step toward understanding the molecular mechanisms of ubiquitination. However, the experimental verification of numerous ubiquitination sites is time-consuming and costly. To alleviate these issues, a computational approach is needed to predict ubiquitination sites. This paper proposes a new method, called UbiSitePred, that combines least absolute shrinkage and selection operator (LASSO) feature selection with a support vector machine to predict ubiquitination sites. First, we use binary encoding (BE), pseudo-amino acid composition (PseAAC), the composition of k-spaced amino acid pairs (CKSAAP), and position-specific propensity matrices (PSPM) to extract sequence feature information, yielding the initial feature space. Secondly, LASSO is applied to remove redundant features and select the optimal feature subset. Finally, the optimal feature subset is input into the support vector machine (SVM) to predict the ubiquitination sites. Five-fold cross-validation shows that the UbiSitePred model achieves better prediction performance than other methods: the AUC values for Set1, Set2, and Set3 are 0.9998, 0.8887, and 0.8481, respectively, and UbiSitePred attains overall accuracy rates of 98.33%, 81.12%, and 76.90%, respectively. The results demonstrate that the proposed method is significantly superior to other state-of-the-art prediction methods and provides a new idea for the prediction of other post-translational modification sites of proteins. The source code and all datasets are available at https://github.com/QUST-AIBBDRC/UbiSitePred/. Highlights:
• A new method (UbiSitePred) to predict ubiquitination sites.
• Fusing the BE, PseAAC, CKSAAP, and PSPM methods to extract protein sequence feature information.
• The LASSO method can effectively remove redundant information in the protein sequences.
• We investigate the effect of six different classifiers on the results.
• The proposed method increases prediction performance over several methods. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
6. GLANCE ON PARALLELIZATION OF FFT ALGORITHMS.
- Author
-
FARHAN, Mhnd
- Subjects
- *
FAST Fourier transforms , *PARALLELIZING compilers , *DISCRETE Fourier transforms , *MULTIPROCESSORS , *PSEUDOCODE (Computer program language) - Abstract
This paper explores the parallelization of Fast Fourier Transform (FFT) algorithms and evaluates the resulting parallelized source codes. The FFT algorithm is considered to be among the most important algorithms of the digital era. There are various FFT algorithms, but just a few are considered in this paper. The Cooley-Tukey FFT is the most widely known and used; accordingly, the radix-2 Decimation in Time (DIT) and Decimation in Frequency (DIF) variants are studied and implemented in this paper. The Goertzel algorithm, another important FFT-related algorithm, is also considered. [ABSTRACT FROM AUTHOR]
- Published
- 2019
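A minimal pure-Python sketch of the radix-2 decimation-in-time Cooley-Tukey recursion studied in the paper (the serial algorithm only; the paper's parallelized implementations are not reproduced here):

```python
import cmath

def fft_dit(x):
    """Recursive radix-2 decimation-in-time Cooley-Tukey FFT.
    The length of x must be a power of two."""
    n = len(x)
    if n == 1:
        return list(x)
    # Decimate in time: split into even- and odd-indexed subsequences.
    even = fft_dit(x[0::2])
    odd = fft_dit(x[1::2])
    out = [0j] * n
    for k in range(n // 2):
        # Twiddle factor combines the two half-size transforms.
        t = cmath.exp(-2j * cmath.pi * k / n) * odd[k]
        out[k] = even[k] + t
        out[k + n // 2] = even[k] - t
    return out

# An impulse transforms to a flat spectrum of ones.
print(fft_dit([1, 0, 0, 0]))
```

The recursion halves the problem at each level, giving the O(n log n) cost that makes parallel decompositions of the butterfly stages attractive.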
7. Classification of some elements in pseudo BL-algebras.
- Author
-
Daneshpayeh, Roohallah, Borumand Saeid, Arsham, Mirvakili, Saeed, and Rezaei, Akbar
- Subjects
- *
CLASSIFICATION algorithms , *PSEUDOCODE (Computer program language) , *ALGEBRA , *ORTHOGONAL codes , *SMALL divisors - Abstract
In this paper, the notions of orthogonal, dense, regular, zero-divisor, strong and complemented elements in a pseudo BL-algebra are introduced and relation between the orthogonal and zero-divisor elements for perfect (good) pseudo BL-algebras is investigated. In particular, we get some results when a pseudo BL- algebra is good or perfect. Finally, a new characterization of these elements in a pseudo BL-algebra by a diagram is given. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
8. Revealing Privacy Vulnerabilities of Anonymous Trajectories.
- Author
-
Chang, Shan, Li, Chao, Zhu, Hongzi, Lu, Ting, and Li, Qiang
- Subjects
- *
TRAJECTORIES (Mechanics) , *MOBILE communication systems , *DIGITAL Object Identifiers , *CYBERTERRORISM , *PSEUDOCODE (Computer program language) , *VEHICULAR ad hoc networks , *GLOBAL Positioning System , *PRIVACY - Abstract
The proliferation of various mobile devices equipped with GPS positioning modules makes the collection of trajectories easier than ever before, and more and more trajectory datasets have become available for business applications or academic research. Normally, published trajectories are anonymized by replacing the real identities of mobile objects with pseudonyms (e.g., random identifiers); however, privacy leaks can hardly be prevented. In this paper, we introduce a novel paradigm of de-anonymization attack that re-identifies trajectories of victims from anonymous trajectory datasets. Different from existing attacks, no background knowledge or side-channel information about the target dataset is required. Instead, we claim that, for each moving object, there exist mobility patterns that reflect the preferences or usual behavior of the object and will not change dramatically over a period of time. As long as those relatively stable patterns can be extracted from trajectories and utilized as quasi-identifiers, trajectories can be linked to anonymous historical ones. To implement this kind of de-anonymization attack, an adversary only needs to collect a few trajectory segments of a victim, the durations of which do not necessarily overlap with those of the trajectories in the target dataset (in simple terms, those trajectory segments are not necessarily sub-trajectories included in the target dataset). Since the movements of victims in public areas can be observed openly, an adversary can obtain traces or locations of the victims either by directly monitoring them (e.g., tracking) or from third parties (e.g., social networks). The adversary then extracts useful patterns from both the historical trajectories in the accessible dataset and the newly obtained trajectory segments of victims; the historical trajectory with patterns most similar to a victim's is considered to belong to that victim.
To demonstrate the feasibility of such attacks, we conduct extensive trace-driven simulations. We extract road-segment preferences and stops of interest from trajectories of vehicles and construct feature vectors (mobility patterns) of vehicles from them, used for trajectory comparisons. Simulation results show that the adversary can re-identify anonymous trajectories effectively. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
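The linking step described in the abstract, matching a victim's extracted mobility-pattern vector against historical ones, can be sketched schematically. The feature construction below (plain road-segment visit counts compared by cosine similarity) is a stand-in assumption for illustration, not the authors' actual feature vectors:

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length feature vectors."""
    num = sum(a * b for a, b in zip(u, v))
    den = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return num / den if den else 0.0

def link_trajectory(victim_pattern, anonymous_patterns):
    """Return the id of the anonymous historical trajectory whose
    mobility-pattern vector is most similar to the victim's."""
    return max(anonymous_patterns,
               key=lambda pid: cosine(victim_pattern, anonymous_patterns[pid]))

# Toy quasi-identifiers: per-road-segment visit counts (hypothetical data).
historical = {"t1": [5, 0, 1, 2], "t2": [0, 4, 4, 0], "t3": [1, 1, 1, 1]}
observed = [4, 0, 2, 2]  # counts extracted from a freshly observed snippet
print(link_trajectory(observed, historical))  # links the snippet to "t1"
```

The point of the attack is that such pattern vectors are stable enough to act as quasi-identifiers even when the observed segments never overlap the published trajectories in time.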
9. Peeling the longest: A simple generalized curve reconstruction algorithm.
- Author
-
Parakkat, Amal Dev, Methirumangalath, Subhasree, and Muthuganapathy, Ramanathan
- Subjects
- *
PARAMETRIC processes , *VORONOI polygons , *WATER distribution , *TRIANGULATION , *PSEUDOCODE (Computer program language) - Abstract
Given a planar point set sampled from a curve, the curve reconstruction problem computes a polygonal approximation of the curve. In this paper, we propose a Delaunay triangulation-based algorithm for curve reconstruction, which removes the longest edge of each triangle to produce a graph. Further, each vertex of the graph is checked against a degree constraint to compute simple closed/open curves. Assuming ϵ-sampling, we provide a theoretical guarantee that the simple closed/open curve is a piecewise linear approximation of the original curve. Input point sets with outliers are handled as part of the algorithm, without pre-processing. We also propose strategies to identify the presence of noise and simplify a noisy point set, to identify self-intersections, and to enhance our algorithm to reconstruct such point sets. This is perhaps the first algorithm to identify the presence of noise in a point set. Our algorithm is able to detect closed/open curves, disconnected components, multiple holes and sharp corners. The algorithm is simple to implement, independent of the type of input, and non-feature-specific, and hence generalized. We have performed extensive comparative studies to demonstrate that our method is comparable to or better than other existing methods. Limitations of our approach are also discussed. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
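The core step the abstract describes, deleting the longest edge of every Delaunay triangle to obtain a graph, can be sketched as follows. The triangulation is assumed to be supplied (e.g., by a computational-geometry library), and the degree constraints, noise handling, and guarantees of the full algorithm are omitted:

```python
import math

def dist(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def peel_longest(points, triangles):
    """Given points and a triangulation (triples of point indices),
    drop the longest edge of each triangle and return the surviving
    edge set (edges as sorted index pairs)."""
    edges = set()
    longest = set()
    for tri in triangles:
        tri_edges = [tuple(sorted((tri[i], tri[(i + 1) % 3])))
                     for i in range(3)]
        edges.update(tri_edges)
        # Mark this triangle's longest edge for removal.
        longest.add(max(tri_edges,
                        key=lambda e: dist(points[e[0]], points[e[1]])))
    return edges - longest

# Unit square triangulated with its diagonal: peeling removes the
# diagonal (longest edge of both triangles), leaving the boundary curve.
pts = [(0, 0), (1, 0), (1, 1), (0, 1)]
print(peel_longest(pts, [(0, 1, 2), (0, 2, 3)]))
```

On this toy input, the surviving edges form exactly the closed square, which is the intuition behind recovering a curve by peeling off long edges.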
10. Hybrid non-parametric particle swarm optimization and its stability analysis.
- Author
-
Liu, Zhao-Guang, Ji, Xiu-Hua, and Liu, Yun-Xia
- Subjects
- *
PARTICLE swarm optimization , *MACHINE learning , *STOCHASTIC convergence , *WILCOXON signed-rank test , *PSEUDOCODE (Computer program language) - Abstract
As a population-based random search optimization technique, particle swarm optimization (PSO) has become an important branch of swarm intelligence (SI). The tuning of parameters in PSO has attracted the attention of many researchers. This study proposes an alternative technique called the hybrid non-parametric PSO (HNPPSO) algorithm. Other SI operations, including a multi-crossover operation, a vertical crossover, and an exemplar-based learning strategy, are combined with the proposed algorithm to balance the global and local search capabilities. The first- and second-order stability analyses conducted for the present study showed that the particle positions are expected to converge to a fixed point in the search space and that the variance of the particle positions converges to zero. In the experiments, the proposed algorithm was compared with 10 other advanced PSO techniques using 40 widely used benchmark functions. The experimental results indicated that the proposed algorithm yields better solution accuracy and convergence speed than the other PSO techniques, and it significantly outperformed the other PSO approaches in terms of convergence speed. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
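For orientation, here is a minimal global-best PSO loop of the kind HNPPSO builds on. This is the textbook baseline with conventional inertia and acceleration coefficients, not the proposed hybrid with crossover operations and exemplar-based learning:

```python
import random

def pso(f, dim, n_particles=20, iters=200, w=0.7, c1=1.5, c2=1.5, seed=1):
    """Minimal global-best PSO minimizing f over [-5, 5]^dim."""
    rng = random.Random(seed)
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]              # personal best positions
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]   # global best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                # Inertia + cognitive pull + social pull.
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

best, val = pso(lambda x: sum(v * v for v in x), dim=3)  # sphere function
```

The stability analyses the abstract mentions concern exactly this update rule: for suitable (w, c1, c2) the expected positions converge to a fixed point and their variance vanishes.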
11. Fast Scheduling of Robot Teams Performing Tasks With Temporospatial Constraints.
- Author
-
Gombolay, Matthew C., Wilcox, Ronald J., and Shah, Julie A.
- Subjects
- *
HUMAN-robot interaction , *CONSTRAINTS (Physics) , *METAHEURISTIC algorithms , *PSEUDOCODE (Computer program language) , *SCHEDULING - Abstract
The application of robotics to traditionally manual manufacturing processes requires careful coordination between human and robotic agents in order to support safe and efficient coordinated work. Tasks must be allocated to agents and sequenced according to temporal and spatial constraints. Also, systems must be capable of responding on-the-fly to disturbances and people working in close physical proximity to robots. In this paper, we present a centralized algorithm, named “Tercio,” that handles tightly intercoupled temporal and spatial constraints. Our key innovation is a fast, satisficing multi-agent task sequencer inspired by real-time processor scheduling techniques and adapted to leverage a hierarchical problem structure. We use this sequencer in conjunction with a mixed-integer linear program solver and empirically demonstrate the ability to generate near-optimal schedules for real-world problems an order of magnitude larger than those reported in prior art. Finally, we demonstrate the use of our algorithm in a multirobot hardware testbed. [ABSTRACT FROM PUBLISHER]
- Published
- 2018
- Full Text
- View/download PDF
12. Fourier-Based Shape Servoing: A New Feedback Method to Actively Deform Soft Objects into Desired 2-D Image Contours.
- Author
-
Navarro-Alarcon, David and Liu, Yun-Hui
- Subjects
- *
FOURIER series , *ADAPTIVE control systems , *IMAGE sensors , *PSEUDOCODE (Computer program language) , *COMPUTER simulation - Abstract
This paper addresses the design of a vision-based method to automatically deform soft objects into desired two-dimensional shapes with robot manipulators. The method presents an innovative feedback representation of the object's shape (based on a truncated Fourier series) and effectively exploits it to guide the soft object manipulation task. A new model calibration scheme that iteratively approximates a local deformation model from vision and motion sensory feedback is derived; this estimation method allows us to manipulate objects with unknown deformation properties. Pseudocode algorithms are presented to facilitate the implementation of the controller. Numerical simulations and experiments are reported to validate this new approach. [ABSTRACT FROM PUBLISHER]
- Published
- 2018
- Full Text
- View/download PDF
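The feedback representation at the heart of the method, a truncated Fourier series of the object's image contour, can be sketched with a naive DFT. This illustrates only the shape descriptor, not the authors' calibration scheme or controller:

```python
import cmath

def fourier_descriptor(contour, k):
    """Describe a closed 2-D contour (list of (x, y) points) by k complex
    Fourier coefficients, harmonics m = -(k//2) .. k//2 (naive O(n*k) DFT)."""
    z = [complex(x, y) for x, y in contour]
    n = len(z)
    return [(m, sum(z[j] * cmath.exp(-2j * cmath.pi * m * j / n)
                    for j in range(n)) / n)
            for m in range(-(k // 2), k // 2 + 1)]

def reconstruct(coeffs, n):
    """Evaluate the truncated Fourier series back into n contour points."""
    return [sum(c * cmath.exp(2j * cmath.pi * m * j / n) for m, c in coeffs)
            for j in range(n)]
```

Keeping only a few low-order coefficients yields a compact, smooth feedback vector for the servo loop; a circle, for instance, is captured exactly by its m = 1 coefficient alone.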
13. A Time-Efficient Pair-Wise Collision-Resolving Protocol for Missing Tag Identification.
- Author
-
Zhang, Lijuan, Xiang, Wei, Atkinson, Ian, and Tang, Xiaohu
- Subjects
- *
PROTOCOL analyzers , *RADIO frequency identification systems , *PSEUDOCODE (Computer program language) , *REDUNDANCY in engineering , *TCP/IP - Abstract
Radio frequency identification (RFID) technology has been employed in wide-ranging application domains. In most RFID applications, time-efficient identification of missing tags is one of the most fundamental objectives, especially for asset management and anti-theft purposes. In this paper, we propose a time-efficient pair-wise collision-resolving missing tag identification (PCMTI) protocol for large-scale RFID systems. In the protocol, two novel strategies are proposed: the pair-reply strategy and the two-collision slot (i.e., a slot with exactly two tag responses) resolving strategy. The pair-reply strategy can verify two tags in one short response slot simultaneously, while the two-collision slot resolving strategy further increases the number of tags verified in each frame. Both theoretical analysis and simulation results are presented to demonstrate the superiority of the proposed PCMTI protocol, which outperforms the state-of-the-art comparative protocols with at least a 30% reduction in the average identification time for verifying one tag. [ABSTRACT FROM PUBLISHER]
- Published
- 2017
- Full Text
- View/download PDF
14. The connected-component labeling problem: A review of state-of-the-art algorithms.
- Author
-
He, Lifeng, Ren, Xiwei, Gao, Qihang, Zhao, Xiao, Yao, Bin, and Chao, Yuyan
- Subjects
- *
PATTERN recognition systems , *PIXELS , *COMPUTER vision , *PSEUDOCODE (Computer program language) , *IMAGE analysis - Abstract
This article addresses the connected-component labeling problem, which consists in assigning a unique label to all pixels of each connected component (i.e., each object) in a binary image. Connected-component labeling is indispensable for distinguishing different objects in a binary image, and a prerequisite for image analysis and object recognition in the image. Therefore, connected-component labeling is one of the most important processes for image analysis, image understanding, pattern recognition, and computer vision. In this article, we review state-of-the-art connected-component labeling algorithms presented in the last decade, explain the main strategies and algorithms, present their pseudocodes, and give experimental results in order to rank the algorithms. Moreover, we also discuss parallel implementations and hardware implementations of connected-component labeling algorithms and extensions to n-D images, and try to indicate future work on the connected-component labeling problem. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
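As a baseline for the algorithms surveyed here, the simplest labeling strategy, a BFS flood fill, fits in a few lines; the reviewed algorithms improve on exactly this kind of approach with label-equivalence tables, run-based scans, and cache-friendly traversal orders:

```python
from collections import deque

def label_components(img):
    """4-connected component labeling of a binary image given as a
    list of lists of 0/1. Returns a label image; labels start at 1."""
    h, w = len(img), len(img[0])
    labels = [[0] * w for _ in range(h)]
    next_label = 0
    for y in range(h):
        for x in range(w):
            if img[y][x] and not labels[y][x]:
                next_label += 1          # new object found: flood fill it
                labels[y][x] = next_label
                q = deque([(y, x)])
                while q:
                    cy, cx = q.popleft()
                    for ny, nx in ((cy - 1, cx), (cy + 1, cx),
                                   (cy, cx - 1), (cy, cx + 1)):
                        if (0 <= ny < h and 0 <= nx < w
                                and img[ny][nx] and not labels[ny][nx]):
                            labels[ny][nx] = next_label
                            q.append((ny, nx))
    return labels
```

Each foreground pixel is visited a constant number of times, so the fill is linear in image size; the survey's two-scan algorithms reach the same asymptotics with far better constants and memory behavior.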
15. Limited contiguous processor allocation mechanism in the mesh-connected multiprocessors using compaction.
- Author
-
Reza, Akram and Rafie, Mahnaz
- Subjects
- *
MULTIPROCESSORS , *MESH networks , *RESOURCE allocation , *PSEUDOCODE (Computer program language) , *TECHNOLOGICAL innovations - Abstract
In this paper, several efficient migration and allocation strategies have been compared on the mesh-based multiprocessor systems. The traditional non-preemptive submesh allocation strategies consist of two row boundary (TRB) and two column boundary (TCB). The existing migration mechanisms are online dynamic compaction-four corner (ODC-FC), limited top-down compaction (LTDC), TCB, and the combination of TCB and ODC-FC algorithms. Indeed, the new allocation method is presented in this paper. This mechanism has the benefits of two efficient traditional allocation algorithms. It is the combination of the TCB and TRB allocation methods. Also, in this process the impact of four key metrics on online mapping is considered. The parameters are average task execution time (ATET), average task system utilization (ATSU), average task waiting time (ATWT), and average task response time (ATRT). Using TCB and TRB mechanism with the migration strategies is shown that the new algorithm has better ATET, ATRT, ATWT, and ATSU. It has, respectively, 23.5494, 97.1216, 39.1291, and 4.142% improvements in comparison with the previous mechanisms. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
16. DuelMerge: Merging with Fewer Moves.
- Author
-
MERGEN, SERGIO L. S. and MOREIRA, VIVIANE P.
- Subjects
- *
COMPUTER algorithms , *PSEUDOCODE (Computer program language) , *C (Computer program language) , *PROGRAMMING languages , *COMPUTER science - Abstract
This work proposes DUELMERGE, a stable merging algorithm that is asymptotically optimal in the number of comparisons and performs O(n log2(n)) moves. Unlike other partition-based algorithms, we only allow blocks of equal sizes to be swapped, which reduces the number of moves required. We performed experiments comparing DUELMERGE against a number of baselines, including RECMERGE, the standard merging solution for programming languages such as C, and some more recent approaches. The results show that our proposed algorithm performs fewer moves than other stable solutions. Experiments employing DUELMERGE within MergeSort confirmed our positive results in terms of moves, comparisons, and runtime. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
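The cost metric DUELMERGE optimizes, element moves, can be made concrete with a plain stable two-way merge instrumented to count them. This is the ordinary out-of-place baseline (every element is moved once), not the DuelMerge algorithm itself:

```python
def merge_count_moves(a, b):
    """Stable two-way merge of sorted lists a and b, counting element
    moves (writes to the output). Returns (merged list, move count)."""
    out, moves = [], 0
    i = j = 0
    while i < len(a) and j < len(b):
        if a[i] <= b[j]:          # <= keeps equal elements in order (stable)
            out.append(a[i]); i += 1
        else:
            out.append(b[j]); j += 1
        moves += 1
    tail = a[i:] + b[j:]          # one of the two tails is empty
    out.extend(tail)
    moves += len(tail)
    return out, moves

merged, moves = merge_count_moves([1, 3, 5], [2, 4])
# merged == [1, 2, 3, 4, 5]; moves == 5, one write per element
```

This baseline always performs exactly n moves for n total elements but needs O(n) extra space; move-efficient in-place merges trade that space away, which is where the count of moves (and DuelMerge's equal-size block swaps) starts to matter.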
17. Lexicographic pseudo effect algebras.
- Author
-
Dvurečenskij, Anatolij
- Subjects
- *
LEXICOGRAPHICAL errors , *EFFECT algebras , *RIESZ spaces , *ASSOCIATIVE algebras , *PSEUDONOISE sequences (Digital communications) , *PSEUDOCODE (Computer program language) - Abstract
We characterize effect algebras and pseudo effect algebras that can be represented as an interval of the lexicographic product of an antilattice unital po-group (H, u) with a directed po-group G, both with the Riesz Decomposition Property (RDP). We show that a crucial condition is the existence of a lexicographic normal ideal. Finally, we present a categorical equivalence of the category of (H, u)-lexicographic pseudo effect algebras having RDP with the category of directed po-groups with RDP. In addition, a weaker form of the (H, u)-lexicographic pseudo effect algebra is studied. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
18. Parallel Pseudo Arc-Length Moving Mesh Schemes for Multidimensional Detonation.
- Author
-
Ning, Jianguo, Yuan, Xinpeng, Ma, Tianbao, and Li, Jian
- Subjects
- *
ARC length , *GEOMETRY , *PARALLEL algorithms , *COMPUTATIONAL geometry , *PSEUDOCODE (Computer program language) , *MATHEMATICAL models - Abstract
We discuss the multidimensional parallel computation for pseudo arc-length moving mesh schemes, which can be used to capture the strong discontinuities of multidimensional detonations. Different from traditional Euler numerical schemes, parallelizing pseudo arc-length moving mesh schemes raises the problems of diagonal processor communications and mesh point communications, which are illustrated by a schematic diagram and key pseudocodes. Finally, numerical examples are given to show that the pseudo arc-length moving mesh schemes are second-order convergent and can successfully capture the strong discontinuity of the detonation wave. In addition, our parallel methods are shown to be effective, and the computational time is clearly decreased. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
19. A computational strategy to establish algebraic parameters for the Reference Resistance Design of metal shell structures.
- Author
-
Sadowski, Adam J., Fajuyitan, O. Kunle, and Wang, Jie
- Subjects
- *
COMPRESSION loads , *PSEUDOCODE (Computer program language) , *MATERIAL plasticity , *MECHANICAL buckling , *FINITE element method - Abstract
The new Reference Resistance Design (RRD) method, recently developed by Rotter [1] for the manual dimensioning of metal shell structures, effectively permits an analyst working with only a calculator or spreadsheet to take full advantage of the realism and accuracy of an advanced nonlinear finite element (FE) calculation. The method achieves this by reformulating the outcomes of a vast programme of parametric FE calculations in terms of six algebraic parameters and two resistances, each representing a physical aspect of the shell's behaviour. The formidable challenge now is to establish these parameters and resistances for the most important shell geometries and load cases. The systems that have received by far the most research attention for RRD are those of a cylindrical shell under uniform axial compression and under uniform bending. Their partial algebraic characterisations required thousands of finite element calculations to be performed across a four-dimensional parameter hyperspace (i.e. length, radius-to-thickness ratio, imperfection amplitude, linear strain hardening modulus). Handling so many nonlinear finite element models is time-consuming, and the quantities of data generated can be overwhelming. This paper illustrates a computational strategy to deal with both issues that may help researchers establish sets of RRD parameters for other important shell systems with greater confidence and accuracy. The methodology involves full automation of model generation, submission, termination and processing with object-oriented scripting, illustrated using code and pseudocode fragments. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
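The automation strategy described, scripted generation, submission, and processing of FE models across a four-dimensional parameter hyperspace, amounts to driving a parameter grid. A schematic sketch, in which `run_model` is a hypothetical stand-in for the scripted generate/submit/process pipeline (the actual FE tooling is not reproduced here):

```python
import itertools

def run_parameter_sweep(run_model, lengths, r_over_t, imperfections, hardening):
    """Drive a full grid over the four parameter axes named in the
    abstract, collecting each run's result keyed by its combination."""
    results = {}
    for combo in itertools.product(lengths, r_over_t, imperfections, hardening):
        results[combo] = run_model(*combo)
    return results

# Hypothetical stand-in for a nonlinear FE run: returns a fake "resistance".
fake = lambda L, rt, imp, hard: 1.0 / (rt * (1 + imp)) + 0.01 * hard
res = run_parameter_sweep(fake, [1.0, 2.0], [100, 500], [0.0, 0.5], [0.0])
# res maps each (length, r/t, imperfection, hardening) tuple to its result
```

In practice each `run_model` call would generate an input deck, submit the job, monitor termination, and post-process the output; keying results by the full parameter tuple keeps the overwhelming data volumes the abstract mentions queryable.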
20. A comprehensive review of fuzzy-based clustering techniques in wireless sensor networks.
- Author
-
Singh, Manjeet and Soni, Surender Kumar
- Subjects
- *
WIRELESS sensor networks , *WIRELESS sensor nodes , *FUZZY logic , *PSEUDOCODE (Computer program language) , *ENERGY consumption - Abstract
Purpose – This paper aims to present a comprehensive survey of fuzzy-based clustering techniques. The determination of an appropriate sensor node as a cluster head directly affects a network's lifetime. Clustering often involves some uncertainty in determining suitable sensor nodes as cluster heads. Owing to various variables, the selection of a suitable node as a cluster head is a perplexing decision. Fuzzy logic is capable of handling uncertainties and improving decision-making processes even with insufficient information. State-of-the-art research in the field of clustering techniques is then reviewed.
Design/methodology/approach – The literature is presented in tabular form with the merits and limitations of each technique. Furthermore, the various techniques are compared graphically and classified in tabular form, and the flowcharts of important algorithms are presented with pseudocodes.
Findings – This paper conveys the importance of and distinctions among different fuzzy-based clustering methods, which are further supportive in designing more efficient clustering protocols.
Originality/value – This paper fulfills the need for a review paper in the field of fuzzy-based clustering techniques, because no other paper has reviewed all the fuzzy-based clustering techniques; furthermore, none has presented the literature in tabular form or presented flowcharts with pseudocodes of important techniques. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
21. Excited Muon Searches at the FCC-Based Muon-Hadron Colliders.
- Author
-
Caliskan, A., Kara, S. O., and Ozansoy, A.
- Subjects
- *
MUON spin rotation , *HADRON colliders , *COLLIDERS (Nuclear physics) , *HADRON facilities , *PSEUDOCODE (Computer program language) - Abstract
We study the excited muon production at the FCC-based muon-hadron colliders. We give the excited muon decay widths and production cross-sections. We deal with the μp→μ⋆q→μγq process and plot the transverse momentum and normalized pseudorapidity distributions of final state particles to define the kinematical cuts best suited for discovery. By using these cuts, we get the mass limits for excited muons. It is shown that the discovery limits obtained on the mass of μ⋆ are 2.2, 5.8, and 7.5 TeV for muon energies of 63, 750, and 1500 GeV, respectively. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
22. A box-particle implementation of standard PHD filter for extended target tracking.
- Author
-
Zhang, Yongquan, Ji, Hongbing, and Hu, Qi
- Subjects
- *
FILTERS (Mathematics) , *TRACKING algorithms , *SIMULATION methods & models , *PSEUDOCODE (Computer program language) , *BAYESIAN analysis - Abstract
This paper presents a box-particle implementation of the standard probability hypothesis density (PHD) filter for extended target tracking, called the extended target box-particle PHD (ET-Box-PHD) filter. The proposed filter can dynamically track multiple extended targets and estimate the unknown number of extended targets, in the presence of clutter measurements, false alarms and missed detections, where the extended targets are described as a Poisson model developed by Gilholm et al. To get the PHD recursion of the ET-Box-PHD filter, a suitable cell likelihood function for one given reliable partition is derived, and the main filter steps are presented along with the necessary box manipulations and approximations. The capabilities and limitations of the proposed ET-Box-PHD filter are illustrated both in linear simulation examples and in nonlinear ones. The simulation results show that the proposed ET-Box-PHD filter can effectively avoid the high number of particles and obviously reduce computational burden, compared to a particle implementation of the standard PHD filter for extended target tracking. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
23. GPU-LMDDA: a bit-vector GPU-based deadlock detection algorithm for multi-unit resource systems.
- Author
-
Abell, Stephen, Do, Nhan, and Lee, John Jaehwan
- Subjects
- *
GRAPHICS processing units , *COMPUTER algorithms , *PSEUDOCODE (Computer program language) , *COMPUTER software , *RESOURCE allocation - Abstract
This article presents a detailed description of a GPU-based multi-unit deadlock detection methodology, GPU-LMDDA, with 12 pieces of pseudocode. Our design utilises the massively parallel hardware of the GPU to perform deadlock detection computations in multi-unit resource systems. As a result, it is able to overcome the major limitations of prior software- and hardware-based solutions by handling thousands of processes and resources concurrently. GPU-LMDDA employs a bit-vector technique with a novel bit-matrix multiplication algorithm to store and perform computations on the algorithm's matrices, thus decreasing the memory footprint and maximising throughput. Our design treats deadlock detection as a service to the operating system, requiring minimal interaction with the CPU; all matrix management and algorithm computation are handled by the GPU, freeing CPU compute cycles. Our algorithm was implemented on three GPU cards (Tesla C2050, Tesla K20c, and Titan X) and showed speedups of 3-434X against single-threaded CPU equivalents. As an interactive service to the CPU, and with the bit-vector technique, GPU-LMDDA provides significant speedups over the CPU implementation for increasing numbers of resources and processes. [ABSTRACT FROM AUTHOR]
- Published
- 2016
- Full Text
- View/download PDF
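The bit-vector idea at the heart of the abstract above — pack each matrix row into a machine word so that one row of a Boolean matrix product is just an OR of selected rows — can be sketched on a CPU, with Python ints standing in for bit-vectors. The reachability example is illustrative, not taken from the paper:

```python
def bitmat_mul(a, b):
    """Boolean (AND/OR) product of two square bit-matrices.

    Each matrix is a list of Python ints; bit j of row i holds entry
    (i, j).  Row i of the product is the OR of the rows of b selected
    by the set bits of row i of a, so one word-wide OR replaces a whole
    row of scalar AND/OR operations.
    """
    out = []
    for row in a:
        acc, j = 0, 0
        while row:
            if row & 1:
                acc |= b[j]
            row >>= 1
            j += 1
        out.append(acc)
    return out

# Illustrative reachability example: edge i -> j stored as bit j of row i.
adj = [0b010, 0b100, 0b000]          # 0 -> 1, 1 -> 2
two_step = bitmat_mul(adj, adj)      # paths of length exactly two
```

On a GPU each output row can be produced by an independent thread, which is what makes the word-packed representation attractive there.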
24. Eye-tracking verification of the strategy used to analyse algorithms expressed in a flowchart and pseudocode.
- Author
-
Andrzejewska, Magdalena, Stolińska, Anna, Błasiak, Władysław, Pęczkowski, Paweł, Rosiek, Roman, Rożek, Bożena, Sajka, Mirosława, and Wcisło, Dariusz
- Subjects
- *
EYE tracking , *PROBLEM solving , *PSEUDOCODE (Computer program language) , *EDUCATIONAL planning , *FLOW charts - Abstract
The results of qualitative and quantitative investigations conducted with individuals who learned algorithms in school are presented in this article. In these investigations, eye-tracking technology was used to follow the process of solving algorithmic problems. The algorithmic problems were presented in two comparable variants: as pseudocode and as a flowchart. The eye-tracking data yielded both qualitative measures (recordings of gaze paths) and quantitative ones, which allowed the detection and interpretation of differences in task-solving strategies between participants who found the correct answer and those who did not. The results confirmed the hypothesis that the formal notation characteristic of a programming language, when used to present algorithms, is often a practical difficulty even in solving simple tasks. This study opens a new direction of research: it shows that eye-tracking technology can be used to optimise the educational process of learning programming. [ABSTRACT FROM AUTHOR]
- Published
- 2016
- Full Text
- View/download PDF
25. Integrated calibration and magnetic disturbance compensation of three-axis magnetometers.
- Author
-
Pang, Hongfeng, Pan, Mengchun, Chen, Jinfei, Li, Ji, Zhang, Qi, and Luo, Shitu
- Subjects
- *
CALIBRATION , *MATHEMATICAL models , *MAGNETIC fields , *MAGNETOMETERS , *MAGNETIC devices , *PSEUDOCODE (Computer program language) , *COMPUTER network resources - Abstract
Technological limitations in sensor manufacturing and unwanted magnetic fields corrupt the measurements of three-axis magnetometers. An experiment with four different magnetic disturbance situations is designed, and the influence of hard-iron and soft-iron material is analyzed. A calibration method with magnetic disturbance parameters is proposed for the calibration and magnetic disturbance compensation of three-axis magnetometers. It is not necessary to compute pseudo-linear parameters; instead, the integrated parameters are computed directly by solving nonlinear equations. To apply this method, nonmagnetic rotation equipment, a CZM-3 proton magnetometer, a DM-050 three-axis magnetometer, two magnets, and two steel tubes are used. Calibration performance is discussed for the four situations. Compared with several traditional calibration methods, experimental results show that the proposed method has better integrated compensation performance in all situations, reducing the error by several orders of magnitude. After compensation, the RMS error is reduced from 10797.962 nT to 15.309 nT when the large magnet and steel tube are deployed. This suggests a useful method for the calibration and magnetic disturbance compensation of three-axis magnetometers. [ABSTRACT FROM AUTHOR]
- Published
- 2016
- Full Text
- View/download PDF
26. Pseudo-differential operator associated with the fractional Fourier transform.
- Author
-
PRASAD, AKHILESH and KUMAR, PRAVEEN
- Subjects
- *
PSEUDOCODE (Computer program language) , *FOURIER transforms , *SCHWARTZ spaces , *CEPSTRUM analysis (Mechanics) , *FOURIER analysis - Abstract
The main goal of this paper is to study properties of the fractional Fourier transform on the Schwartz-type space ℒθ. The symbol class Sρ,σm,θ is introduced. The fractional pseudo-differential operators (f.p.d.o.) associated with a symbol a(x, ξ) define a continuous linear mapping of ℒ into ℒθ. Kernel and integral representations of the f.p.d.o. are obtained, and its boundedness property is studied. An application of the fractional Fourier transform to solving a generalized Fredholm integral equation is also given. [ABSTRACT FROM AUTHOR]
- Published
- 2016
27. Efficient editing and data abstraction by finding homogeneous clusters.
- Author
-
Ougiaroglou, Stefanos and Evangelidis, Georgios
- Subjects
- *
ABSTRACT data types (Computer science) , *DATA reduction , *DATA mining , *CLASSIFICATION algorithms , *PSEUDOCODE (Computer program language) - Abstract
The efficiency of the k-Nearest Neighbour classifier depends on the size of the training set as well as the level of noise in it. Large datasets with a high level of noise lead to less accurate classifiers with high computational cost and storage requirements. The goal of editing is to improve accuracy by improving the quality of the training datasets. To obtain such datasets, editing removes noise and mislabeled data and smooths the decision boundaries between the discrete classes. Prototype abstraction, on the other hand, aims to reduce the computational cost and storage requirements of classifiers by condensing the training data. This paper proposes an editing algorithm called Editing through Homogeneous Clusters (EHC). It then extends the idea by introducing a prototype abstraction algorithm that integrates the EHC mechanism and can create a small, noise-free representative set of the initial training data. This algorithm is called Editing and Reduction through Homogeneous Clusters (ERHC). Both are based on a fast, parameter-free iterative execution of k-means clustering that forms homogeneous clusters. Both treat clusters consisting of a single item as noise and remove them. In addition, ERHC summarizes the items of each remaining cluster by storing its mean in the representative set. EHC and ERHC are tested on several datasets. The results show that both run very fast and achieve high accuracy. In addition, ERHC achieves high reduction rates. [ABSTRACT FROM AUTHOR]
- Published
- 2016
- Full Text
- View/download PDF
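A minimal, stdlib-only sketch of the ERHC idea summarized above — k-means seeded with per-class means, singleton clusters dropped as noise, homogeneous clusters collapsed to a mean prototype. The helper names and details are ours, not the authors':

```python
def _mean(pts):
    return tuple(sum(p[d] for p in pts) / len(pts) for d in range(len(pts[0])))

def _dist2(p, q):
    return sum((a - b) ** 2 for a, b in zip(p, q))

def _kmeans(pts, seeds, iters=10):
    """Plain Lloyd's k-means; returns the non-empty index clusters."""
    centers = list(seeds)
    for _ in range(iters):
        clusters = [[] for _ in centers]
        for i, p in enumerate(pts):
            nearest = min(range(len(centers)), key=lambda c: _dist2(p, centers[c]))
            clusters[nearest].append(i)
        centers = [_mean([pts[i] for i in cl]) if cl else centers[c]
                   for c, cl in enumerate(clusters)]
    return [cl for cl in clusters if cl]

def erhc(points, labels):
    """ERHC sketch: recursively cluster the data with k-means seeded by
    per-class means; a homogeneous cluster is summarized by its mean,
    while a singleton cluster is discarded as noise."""
    prototypes, queue = [], [list(range(len(points)))]
    while queue:
        idx = queue.pop()
        classes = sorted({labels[i] for i in idx})
        if len(idx) == 1:
            continue                      # singleton cluster: treat as noise
        if len(classes) == 1:             # homogeneous: keep one prototype
            prototypes.append((_mean([points[i] for i in idx]), classes[0]))
            continue
        seeds = [_mean([points[i] for i in idx if labels[i] == c]) for c in classes]
        sub = _kmeans([points[i] for i in idx], seeds)
        if len(sub) < 2:                  # could not split further: drop
            continue
        queue.extend([idx[j] for j in cl] for cl in sub)
    return prototypes

# Two clearly separated classes collapse to two mean prototypes.
pts = [(0, 0), (0, 1), (1, 0), (1, 1), (10, 10), (10, 11), (11, 10), (11, 11)]
labs = ['a'] * 4 + ['b'] * 4
protos = erhc(pts, labs)
```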
28. A Fast Algorithm to Generate Feasible Solution of Production Facilities Layout Based on Plane Segmentation.
- Author
-
Hou, Shi-wang, Li, Zhibin, and Wang, Hui
- Subjects
- *
ALGORITHMS , *TOPOLOGICAL spaces , *NUMBER theory , *ARBITRARY constants , *PSEUDOCODE (Computer program language) , *DECISION support systems - Abstract
For the facility layout problem with continuous blocks and unequal areas, generating feasible layouts with arbitrary space forms is key to finding the optimal arrangement scheme under a given goal. According to the given slicing position and slicing mode, the plane to be arranged is divided by plane segmentation into as many block areas as there are facilities. The precise coordinates of the lower-left and top-right corners of each facility are calculated from its area, width, and length. The corresponding algorithm is given in pseudocode. The proposed procedure provides feasible facility layout solutions. Running it on an instance containing 14 facilities shows that the scheme outputs a plane layout quickly and provides decision support for facilities planning. [ABSTRACT FROM AUTHOR]
- Published
- 2016
- Full Text
- View/download PDF
29. Clustering based unit commitment with wind power uncertainty.
- Author
-
Shukla, Anup and Singh, S.N.
- Subjects
- *
WIND power plants , *UNIT commitment problem (Electric power systems) , *CLUSTERING of particles , *PUMPED storage power plants , *PSEUDOCODE (Computer program language) , *PARTICLE swarm optimization - Abstract
Wind power generation is continuously increasing around the world, but its uncertainty has made the unit commitment problem complex. In this paper, scenario generation and reduction techniques are used to account for wind power uncertainty in system operation. A new approach is developed for creating clusters of unit statuses, each associated with a probability of occurrence, from an initial large set of wind power generation scenarios. A model of the wind-hydro-thermal coordination problem including a pumped storage plant is then established. A combination of the proposed weighted-improved crazy particle swarm optimization, a pseudocode-based algorithm, and the scenario analysis method is used to solve this problem. The effectiveness and feasibility of the proposed method are tested on systems with and without pumped storage plant integration. The results, analyzed in detail, demonstrate that the model and the proposed method are practicable for solving the unit commitment problem. [ABSTRACT FROM AUTHOR]
- Published
- 2016
- Full Text
- View/download PDF
30. Efficient computation of Sommerfeld integral tails – methods and algorithms.
- Author
-
Michalski, Krzysztof A. and Mosig, Juan R.
- Subjects
- *
GREEN'S functions , *MATHEMATICAL analysis , *NUMERICAL analysis , *PSEUDOCODE (Computer program language) , *ALGORITHMS - Abstract
A review is presented of the most effective methods for the computation of Sommerfeld integral tails. Such integrals, which are often oscillatory, singular, and divergent, commonly arise in layered-media Green functions. The mathematical foundations of the pertinent methods are discussed in detail and their performance is illustrated by relevant numerical examples. The associated algorithms are given in pseudocode for easy computer implementation. [ABSTRACT FROM PUBLISHER]
- Published
- 2016
- Full Text
- View/download PDF
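The partition-extrapolation family of methods reviewed above integrates the oscillatory tail between successive zeros and then accelerates the resulting alternating sequence of partial sums. A toy version using iterated Aitken (Shanks) acceleration — a stand-in illustration, not the authors' weighted-averages algorithm — might look like:

```python
import math

def simpson(f, a, b, m=64):
    """Composite Simpson rule on [a, b] with m (even) subintervals."""
    h = (b - a) / m
    s = f(a) + f(b)
    for i in range(1, m):
        s += f(a + i * h) * (4 if i % 2 else 2)
    return s * h / 3

def tail_by_partition_extrapolation(f, a, breaks, levels=4):
    """Integrate an oscillatory tail: sum the pieces between successive
    sign changes, then accelerate the alternating partial sums with
    iterated Aitken (Shanks) transformations."""
    edges = [a] + breaks
    pieces = [simpson(f, lo, hi) for lo, hi in zip(edges[:-1], edges[1:])]
    sums, s = [], 0.0
    for p in pieces:
        s += p
        sums.append(s)
    for _ in range(levels):              # each pass applies one Shanks step
        sums = [sums[i + 2] - (sums[i + 2] - sums[i + 1]) ** 2
                / (sums[i + 2] - 2 * sums[i + 1] + sums[i])
                for i in range(len(sums) - 2)]
    return sums[-1]

# Toy example: the tail of the sinc integral,
#   int_pi^inf sin(x)/x dx = pi/2 - Si(pi), approximately -0.28114
f = lambda x: math.sin(x) / x
breaks = [k * math.pi for k in range(2, 17)]   # successive zeros of sin x
tail = tail_by_partition_extrapolation(f, math.pi, breaks)
```

The raw partial sums converge slowly (the piece magnitudes decay only like 1/x), which is exactly why the extrapolation step is what makes these tails computable in practice.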
31. Global Center Point Splitting: New Linear Node Splitting Algorithm for R-Trees.
- Author
-
Arafat, Manar
- Subjects
- *
PSEUDOCODE (Computer program language) , *SPLITTING extrapolation method , *QUADRATIC programming - Abstract
We introduce a new linear algorithm, called Global Center Point Splitting (GCPS), for splitting overflowed nodes of an R-tree index. The proposed method is an enhancement of the quadratic splitting algorithm of Guttman (1984, pp. 47-57). Most known algorithms do not exploit the fact that most spatial object data is known beforehand and that these objects are relatively easy to identify. In this paper we adopt an informed approach that uses spatial information provided by the problem space: the objects are scanned and the Global Center Point (GCP) around which they are concentrated is determined. The GCPS algorithm uses the proximity between the GCP and the remaining objects to select a splitting axis that produces the most even split. We conducted several experiments using both real and synthetic data sets. The results show that the proposed splitting method outperforms the quadratic version in construction time, especially for nodes with high capacity, while query performance remains approximately the same. [ABSTRACT FROM AUTHOR]
- Published
- 2016
- Full Text
- View/download PDF
32. Implementation-level verification of algorithms with KeY.
- Author
-
Bruns, Daniel, Mostowski, Wojciech, and Ulbrich, Mattias
- Subjects
- *
SOFTWARE verification , *BENCHMARKING (Management) , *COMPUTER algorithms , *JAVA programming language , *PSEUDOCODE (Computer program language) , *STRUCTURAL analysis (Engineering) - Abstract
We give an account of the authors' experience and results from the software verification competition held at the Formal Methods 2012 conference. Competitions like this are meant to provide benchmarks for verification systems. It consisted of three algorithms, which the authors implemented in Java, specified with the Java Modeling Language, and verified using the KeY system. Building on our solutions, we argue that verification systems targeting implementations in real-world programming languages need powerful abstraction capabilities. Regarding the KeY tool, we explain features which, driven by the competition, were freshly implemented to accommodate these demands. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
33. Specifying and Verifying External Behaviour of Fair Input/Output Automata by Using the Temporal Logic of Actions.
- Author
-
KAPUS, Tatjana
- Subjects
- *
DATA transmission systems , *DATA replication , *PROGRAMMING languages , *FINITE state machines , *PSEUDOCODE (Computer program language) - Abstract
Fair input/output (I/O) automata are a state-machine model for specifying and verifying reactive and concurrent systems. For verification purposes, one is usually interested only in the sequences of interactions that fair I/O automata offer to their environment. These sequences are called fair traces. The usual approach to verification consists of proving fair trace inclusion between fair I/O automata. This paper presents a simple approach to the specification of fair traces and shows how to establish a fair trace inclusion relation for a pair of fair I/O automata using the temporal logic of actions. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
34. ANALYSIS OF GNSS DATA USING PRECISE POINT POSITIONING TECHNIQUE FOR THE DETERMINATION OF PERMANENT STATION IN ROMANIA.
- Author
-
NISTOR, SORIN and BUDA, AURELIAN STELIAN
- Subjects
- *
PSEUDOCODE (Computer program language) , *EPHEMERIDES , *GLOBAL Positioning System , *IONOSPHERE - Abstract
To obtain coordinates by means of the precise point positioning (PPP) technique, undifferenced GPS pseudorange and carrier-phase observations are used, but "precise" positioning also requires precise orbit and clock data. These products, and the other information needed to obtain centimetre-level PPP results, can be downloaded from different sources, but the most reliable satellite ephemerides and clock corrections are available from the International GNSS Service (IGS). In the PPP analysis we determined parameters such as the receiver clock error, ionospheric delays, code biases, code multipath, and the total neutral-atmosphere delay of the observations. For the determination of the permanent station coordinates using the PPP technique, we used precise orbit and clock solutions to enable absolute positioning of a single receiver. In this article we present the results obtained by applying the PPP technique to the permanent station Oradea, from which we conclude that the PPP technique can be used for various GNSS applications. [ABSTRACT FROM AUTHOR]
- Published
- 2015
35. On Decoding of the (73, 37, 13) Quadratic Residue Code.
- Author
-
Li, Yong, Liu, Hongqing, Chen, Qianbin, and Truong, Trieu-Kien
- Subjects
- *
BERLEKAMP-Massey algorithm , *GAUSSIAN elimination , *LINEAR programming , *PSEUDOCODEWORDS , *PSEUDOCODE (Computer program language) - Abstract
In this paper, a method to search the sets of syndrome indices needed to compute the unknown syndromes for the (73, 37, 13) quadratic residue (QR) code is proposed. From the resulting index sets, one computes the unknown syndromes and thus finds the corresponding error-locator polynomial using an inverse-free Berlekamp–Massey (BM) algorithm. Based on the modified Chase-II algorithm, the performance of soft-decision decoding for the (73, 37, 13) QR code is given. This result is new. Moreover, the error-rate performance of linear programming (LP) decoding for the (73, 37, 13) QR code is also investigated, and LP-based decoding is shown to be significantly superior in performance to algebraic soft-decision decoding while requiring almost the same computational complexity. In fact, the algebraic hard-decision and soft-decision decoding of the (89, 45, 17) QR code outperforms that of the (73, 37, 13) QR code because the former has a larger minimum distance. However, experimental results indicate that the (73, 37, 13) QR code outperforms the (89, 45, 17) QR code with far fewer arithmetic operations when using the LP-based decoding algorithms. A pseudocodeword analysis partially explains this seemingly strange phenomenon. [ABSTRACT FROM PUBLISHER]
- Published
- 2014
- Full Text
- View/download PDF
36. Utilising the chaos-induced discrete self organising migrating algorithm to solve the lot-streaming flowshop scheduling problem with setup time.
- Author
-
Davendra, Donald, Senkerik, Roman, Zelinka, Ivan, Pluhacek, Michal, and Bialic-Davendra, Magdalena
- Subjects
- *
LOZI mapping , *EVOLUTIONARY algorithms , *MATHEMATICAL formulas , *QUANTITATIVE research , *DATA analysis , *PSEUDOCODE (Computer program language) - Abstract
The dissipative Lozi chaotic map is embedded in the discrete self-organising migrating algorithm (DSOMA) as a pseudorandom generator. This novel chaos-based algorithm is applied to the constrained lot-streaming flowshop scheduling problem. Two new and unique data sets, generated using the Lozi and Delayed Logistic maps, are used to compare the chaos-embedded DSOMA with the generic DSOMA, which uses the venerable Mersenne Twister. In total, 100 data sets were tested by the two algorithms, for both the idling and non-idling cases. The obtained results show that the chaos-based variant significantly improves on the performance of the generic DSOMA. [ABSTRACT FROM AUTHOR]
- Published
- 2014
- Full Text
- View/download PDF
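A chaotic map can serve as the deterministic pseudorandom stream the abstract above describes. A minimal sketch of a Lozi-map generator follows; the rescaling of the attractor into [0, 1) is our own illustrative choice, not the paper's:

```python
def lozi_stream(n, a=1.7, b=0.5, x=0.1, y=0.1):
    """Deterministic pseudo-random numbers in [0, 1) from the Lozi map
        x' = 1 - a*|x| + b*y,   y' = x
    (a = 1.7, b = 0.5 is the classic chaotic regime).  The initial
    state (x, y) plays the role of the seed."""
    out = []
    for _ in range(n):
        x, y = 1.0 - a * abs(x) + b * y, x
        # The attractor lives well inside (-2, 2); shift/scale into [0, 1).
        out.append(((x + 2.0) / 4.0) % 1.0)
    return out

vals = lozi_stream(1000)
```

Re-running with the same seed state reproduces the same stream, which is what lets a chaos-driven metaheuristic be compared reproducibly against a Mersenne Twister baseline.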
37. Incorporating Observation Quality Information into the Incremental LMS Adaptive Networks.
- Author
-
Rastegarnia, Amir and Khalili, Azam
- Subjects
- *
MEAN square algorithms , *ADAPTIVE filters , *HESSIAN matrices , *ELECTRIC network topology , *PSEUDOCODE (Computer program language) - Abstract
In this paper we investigate the effect of observation quality information (OQI) on the performance of a special class of adaptive networks known as the distributed incremental least-mean-square (DILMS) algorithm. We consider two cases: (1) a homogeneous environment, where all nodes have the same observation noise variance (ONV), and (2) an inhomogeneous environment, where different nodes have different ONVs. In the first case we show that, for the same steady-state error, the DILMS algorithm has a faster convergence rate than a non-cooperative scheme. In the second case, we first show that, regardless of the ONVs, the steady-state curves of mean-square deviation, excess mean-square error, and mean-square error (MSE) at each node are monotonically increasing functions of the step-size parameter. Then, to exploit the OQI, we reformulate parameter estimation as a constrained optimization problem with the MSE criterion as the cost function and the ONVs as constraints. Solving the resulting problem with the Robbins-Monro method yields a new algorithm, which we call the noise-constrained incremental LMS algorithm, with a faster convergence rate than the existing incremental LMS algorithm. Simulation results are provided to illustrate the performance of the proposed algorithm. [ABSTRACT FROM AUTHOR]
- Published
- 2014
- Full Text
- View/download PDF
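The incremental LMS recursion this paper builds on circulates an estimate around a ring of nodes, each applying one LMS step on its local data before passing the estimate on. A stdlib-only sketch over a hypothetical noise-free network (node count, step size, and data are illustrative):

```python
import random

def dilms(data, mu=0.1, cycles=300):
    """Distributed incremental LMS (sketch): the estimate psi travels
    around the ring; each node refines it with one LMS step on its
    local regressor/measurement pair (u, d), then hands it on."""
    dim = len(data[0][0])
    psi = [0.0] * dim
    for _ in range(cycles):
        for u, d in data:                       # one trip around the ring
            err = d - sum(ui * wi for ui, wi in zip(u, psi))
            psi = [wi + mu * err * ui for ui, wi in zip(u, psi)]
    return psi

# Hypothetical 8-node network with noise-free measurements d = u . w_true
random.seed(7)
w_true = [0.8, -0.3]
data = []
for _ in range(8):
    u = [random.uniform(-1, 1), random.uniform(-1, 1)]
    data.append((u, sum(ui * wi for ui, wi in zip(u, w_true))))
est = dilms(data)
```

With noisy measurements psi would hover around w_true instead of converging to it, and weighting each node's step by its noise variance is precisely the kind of refinement the OQI-aware variant targets.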
38. COSET: a program for deriving and testing merohedral and pseudo-merohedral twin laws.
- Author
-
Boyle, Paul D.
- Subjects
- *
COMPUTER software research , *POSIX (Computer software standard) , *PSEUDOCODE (Computer program language) , *DETECTORS , *SPREADSHEET software , *COMPUTER software - Abstract
COSET is a program, written in ISO C99 with POSIX extensions, which uses left coset decompositions to determine possible merohedral and pseudo-merohedral twin laws. Besides running stand-alone, the code may be compiled as a Python extension module. The program can create SHELXL instruction files which incorporate the appropriate TWIN and BASF instructions for the possible twin law(s). COSET may also be directed to execute a locally installed copy of the SHELXL binary executable to test the candidate twin laws in trial refinements. This facilitates the quick screening and assessment of possible twin laws. [ABSTRACT FROM AUTHOR]
- Published
- 2014
- Full Text
- View/download PDF
39. PSEUDO BEST ESTIMATOR BY A SEPARABLE APPROXIMATION OF SPATIAL COVARIANCE STRUCTURES.
- Author
-
Toshihiro Hirano
- Subjects
- *
PSEUDOCODE (Computer program language) , *SPATIAL analysis (Statistics) , *STATISTICAL correlation , *SPATIAL systems , *MONTE Carlo method - Abstract
We consider a linear regression model with a spatially correlated error term on a lattice. When estimating the coefficients of the linear regression model, the generalized least squares estimator (GLSE) is used if the covariance structure is known. However, the GLSE for large spatial data sets is computationally expensive because of the matrix inversion. To reduce the computational complexity, we propose a pseudo best estimator (PBE) using spatial covariance structures approximated by separable covariance functions, and derive its asymptotic covariance matrix. Monte Carlo simulations demonstrate that the proposed PBE performs well. [ABSTRACT FROM AUTHOR]
- Published
- 2014
- Full Text
- View/download PDF
40. ON THE DISAMBIGUATION OF FINITE AUTOMATA AND FUNCTIONAL TRANSDUCERS.
- Author
-
MOHRI, MEHRYAR
- Subjects
- *
FINITE state machines , *FUNCTIONAL analysis , *ALGORITHMS , *PSEUDOCODE (Computer program language) , *MATHEMATICAL models , *NUMERICAL analysis - Abstract
This paper introduces a new disambiguation algorithm for finite automata and functional finite-state transducers. It gives a full description of this algorithm, including detailed pseudocode and analysis, and several illustrative examples. The algorithm is often more efficient, and its result dramatically smaller, than what is obtained using determinization for finite automata or the construction of Schützenberger. The unambiguous automata or transducers created by our algorithm are never larger than those generated by the construction of Schützenberger. In fact, in a variety of cases, the size of the unambiguous transducer returned by our algorithm is only linear in that of the input transducer, while the transducer created by the construction of Schützenberger is exponentially larger. Our algorithm can be used effectively in many applications to make automata and transducers more efficient to use. [ABSTRACT FROM AUTHOR]
- Published
- 2013
- Full Text
- View/download PDF
41. Leveraging data lineage to infer logical relationships between astronomical catalogs.
- Author
-
Buddelmeijer, Hugo and Valentijn, Edwin
- Subjects
- *
ASTRONOMY databases , *CELESTIAL cartography , *PSEUDOCODE (Computer program language) , *INFORMATION storage & retrieval systems , *ASTRONOMICAL observatories , *DATA mining , *ASTRONOMY - Abstract
A novel method to infer logical relationships between sets is presented. These sets can be any collection of elements, for example astronomical catalogs of celestial objects. The method does not require the contents of the sets to be known explicitly. It combines incomplete knowledge about the relationships between sets to infer a priori unknown relationships. Relationships between sets are represented by sets of Boolean hypercubes. This leads to deductive reasoning by applying logical operators to these sets of hypercubes. Pseudocode for an efficient implementation is given. The method is used in the Astro-WISE information system to infer relationships between catalogs of astronomical objects. These catalogs can be very large and, more importantly, their contents do not have to be available at all times. Science products are stored in Astro-WISE with references to other science products from which they are derived, i.e., their dependencies. This creates a full data lineage that links every science product all the way back to the raw data. Catalogs are created in a way that maximizes knowledge about their relationship with their dependencies. The presented algorithm leverages this information to determine which objects a catalog represents. [ABSTRACT FROM AUTHOR]
- Published
- 2013
- Full Text
- View/download PDF
42. Optimal design of the synthetic chart for the process mean based on median run length.
- Author
-
Khoo, MichaelB. C., Wong, V.H., Wu, Zhang, and Castagliola, Philippe
- Subjects
- *
OPTIMAL designs (Statistics) , *APPROXIMATION theory , *PSEUDOCODE (Computer program language) , *MARKOV processes , *WOLFRAM language (Computer program language) , *STOCHASTIC processes - Abstract
Control charts are usually designed using the average run length as the criterion to be optimized. The shape of the run length distribution changes with the shift in the mean, from highly skewed for an in-control process to approximately symmetric when the mean shift is large. Since the shape of the run length distribution changes with the mean shift, the Median Run Length (MRL) provides a more meaningful interpretation of the in-control and out-of-control performances of the charts, and it is readily understood by practitioners. This article presents an optimal design procedure, based on the MRL, for a synthetic chart for monitoring the mean under the zero-state and steady-state modes. The synthetic chart integrates the X̄ chart and the conforming run length chart. Pseudocode and Mathematica programs are presented for the computation of the optimal parameters of the synthetic chart based on a desired in-control MRL (MRL(0)), a given sample size, and a specified mean shift for which quick detection is needed. [Supplementary materials are available for this article. Go to the publisher's online edition of IIE Transactions for additional example, additional performance study, proof, tables, and figures.] [ABSTRACT FROM PUBLISHER]
- Published
- 2012
- Full Text
- View/download PDF
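The synthetic chart itself requires the paper's Markov-chain treatment, but the MRL criterion it optimizes is easy to illustrate on a memoryless (Shewhart-type) chart, where the run length is geometric:

```python
import math

def median_run_length(p):
    """Median of the geometric run-length distribution of a memoryless
    chart with per-sample signal probability p: the smallest n with
    P(RL <= n) = 1 - (1 - p)**n >= 0.5."""
    return math.ceil(math.log(0.5) / math.log(1.0 - p))

# In-control example with 3-sigma limits: p is roughly 0.0027, so the
# familiar ARL0 = 1/p is about 370 while the MRL0 is noticeably smaller;
# the gap reflects the strong right-skew of the in-control run-length
# distribution, which is the article's motivation for using the MRL.
mrl0 = median_run_length(0.0027)
```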
43. Domain Coloring of Complex Functions: An Implementation-Oriented Introduction.
- Author
-
Poelke, Konstantin and Polthier, Konrad
- Subjects
- *
MATHEMATICAL complexes , *INTERPOLATION , *JAVA programming language , *INTERPOLATION algorithms , *CODING theory , *DIMENSIONAL analysis , *PSEUDOCODE (Computer program language) - Abstract
This article gives a short overview of domain coloring for complex functions that have four-dimensional function graphs and therefore can't be visualized traditionally. The authors discuss several color schemes, focus on various aspects of complex functions, and provide Java-like pseudocode examples explaining the crucial ideas of the coloring algorithms to allow for easy reproduction. [ABSTRACT FROM AUTHOR]
- Published
- 2012
- Full Text
- View/download PDF
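The basic domain-coloring recipe sketched in the abstract above — hue from the argument, brightness from the modulus — fits in a few lines of Python; the particular brightness mapping below is one of many possible choices:

```python
import cmath
import colorsys

def domain_color(z):
    """Map a complex value to an (r, g, b) triple: hue encodes arg(z),
    brightness encodes |z| (black at zeros, brighter toward poles)."""
    if z == 0:
        return (0.0, 0.0, 0.0)
    hue = (cmath.phase(z) / (2 * cmath.pi)) % 1.0
    val = abs(z) / (abs(z) + 1.0)        # squash the modulus into (0, 1)
    return colorsys.hsv_to_rgb(hue, 1.0, val)

# Sample f(z) = z**2 on a small grid; each row of `grid` is ready to be
# written out as one scanline of an image (e.g. a plain PPM file).
f = lambda z: z * z
grid = [[domain_color(f(complex(x, y) / 10.0))
         for x in range(-20, 21)] for y in range(-20, 21)]
```

Around a zero of f the full hue circle is traversed as many times as the zero's multiplicity, which is what makes these plots informative beyond mere decoration.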
44. SPEED: An Inhabitant Activity Prediction Algorithm for Smart Homes.
- Author
-
Alam, Muhammad Raisul, Reaz, M. B. I., and Mohd Ali, M. A.
- Subjects
- *
HOME automation , *PREDICTION theory , *ALGORITHMS , *HOUSEHOLD appliances , *MARKOV processes , *DECISION trees , *PSEUDOCODE (Computer program language) , *COMPLEXITY (Philosophy) - Abstract
This paper proposes an algorithm, called sequence prediction via enhanced episode discovery (SPEED), to predict inhabitant activity in smart homes. SPEED is a variant of the sequence prediction algorithm. It works with episodes of smart home events that have been extracted based on the on-off states of home appliances. An episode is a set of sequential user activities that periodically occur in smart homes. The extracted episodes are processed and arranged in a finite-order Markov model. A method based on the prediction by partial matching (PPM) algorithm is applied to predict the next activity from the previous history. The results show that SPEED achieves an 88.3% prediction accuracy, which is better than LeZi Update, Active LeZi, IPAM, and C4.5. [ABSTRACT FROM PUBLISHER]
- Published
- 2012
- Full Text
- View/download PDF
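A compact stand-in for the PPM-style prediction step described above: count the successors of every context up to length k, and predict with the longest context seen in training, backing off to shorter ones. The event names are invented for illustration:

```python
from collections import defaultdict

class EpisodePredictor:
    """PPM-flavoured predictor: maintain successor counts for every
    context of length 0..k and predict from the longest matching one."""

    def __init__(self, k=3):
        self.k = k
        self.counts = defaultdict(lambda: defaultdict(int))

    def train(self, events):
        for i, e in enumerate(events):
            # Register e as a successor of every context ending at i.
            for j in range(max(0, i - self.k), i + 1):
                self.counts[tuple(events[j:i])][e] += 1

    def predict(self, history):
        for l in range(min(self.k, len(history)), -1, -1):
            ctx = tuple(history[len(history) - l:])
            if ctx in self.counts:            # longest known context wins
                succ = self.counts[ctx]
                return max(succ, key=succ.get)
        return None

# Invented appliance-event episode repeated in the training stream
events = ['kitchen_on', 'coffee_on', 'kitchen_off'] * 5
p = EpisodePredictor(k=2)
p.train(events)
```

Real PPM also blends the counts from different context lengths via escape probabilities; the hard back-off above is the simplest version of that idea.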
45. Computing the moments of k-bounded pseudo-Boolean functions over Hamming spheres of arbitrary radius in polynomial time
- Author
-
Sutton, Andrew M., Darrell Whitley, L., and Howe, Adele E.
- Subjects
- *
BOOLEAN functions , *POLYNOMIALS , *RADIUS (Geometry) , *PSEUDOCODE (Computer program language) , *ALGEBRAIC functions , *ACCESS to information - Abstract
Abstract: We show that, given a k-bounded pseudo-Boolean function f, any constant-order moment of f over regions of arbitrary radius in Hamming space can be computed in polynomial time using algebraic information from the adjacency structure. This result has implications for evolutionary algorithms and local search algorithms, because information about promising regions of the search space can be retrieved efficiently even when the cardinality of the region is exponential in the problem size. Finally, we use these results to introduce a method for efficiently calculating the expected fitness of mutations for evolutionary algorithms. [Copyright Elsevier]
- Published
- 2012
- Full Text
- View/download PDF
46. Quasi-Cyclic LDPC Codes: Influence of Proto- and Tanner-Graph Structure on Minimum Hamming Distance Upper Bounds.
- Author
-
Smarandache, Roxana and Vontobel, Pascal O.
- Subjects
- *
PROGRAMMING languages , *GRAPHIC methods , *MATRICES (Mathematics) , *PSEUDOCODE (Computer program language) , *ISOMORPHISM (Mathematics) - Abstract
Quasi-cyclic (QC) low-density parity-check (LDPC) codes are an important instance of proto-graph-based LDPC codes. In this paper we present upper bounds on the minimum Hamming distance of QC LDPC codes and study how these upper bounds depend on graph structure parameters (like variable degrees, check node degrees, girth) of the Tanner graph and of the underlying proto-graph. Moreover, for several classes of proto-graphs we present explicit QC LDPC code constructions that achieve (or come close to) the respective minimum Hamming distance upper bounds. Because of the tight algebraic connection between QC codes and convolutional codes, we can state similar results for the free Hamming distance of convolutional codes. In fact, some QC code statements are established by first proving the corresponding convolutional code statements and then using a result by Tanner that says that the minimum Hamming distance of a QC code is upper bounded by the free Hamming distance of the convolutional code that is obtained by “unwrapping” the QC code. [ABSTRACT FROM AUTHOR]
- Published
- 2012
- Full Text
- View/download PDF
47. Coding and Counting Arrangements of Pseudolines.
- Author
-
Felsner, Stefan and Valtr, Pavel
- Subjects
- *
DISCRETE geometry , *COMPUTATIONAL mathematics , *PSEUDOCODE (Computer program language) , *NUMERICAL analysis , *MATHEMATICAL analysis - Abstract
Arrangements of lines and pseudolines are important and appealing objects for research in discrete and computational geometry. We show that there are at most $2^{0.657\> n^{2}}$ simple arrangements of n pseudolines in the plane. This improves on previous work by Knuth who proved an upper bound of $3^{\binom{n}{2}} \cong 2^{0.792\> n^{2}}$ in 1992 and the first author, who obtained $2^{0.697\> n^{2}}$ in 1997. The argument uses surprisingly little geometry. The main ingredient is a lemma that was already central to the argument given by Knuth. [ABSTRACT FROM AUTHOR]
- Published
- 2011
- Full Text
- View/download PDF
48. Pseudo-cyclic renewal systems
- Author
-
Shin, Sujin
- Subjects
- *
MODULES (Algebra) , *ALPHABETS , *COMPUTER systems , *SET theory , *PSEUDOCODE (Computer program language) , *COMPUTER algorithms , *COMPUTER programming - Abstract
Abstract: A finite set of words over an alphabet is cyclic if, whenever and , we have . If it is only assumed that the property holds for all with a large length, then is called pseudo-cyclic, that is, there is such that, whenever with , and , we have . We analyze the class of pseudo-cyclic sets and describe how it is related to the open question which asks whether every irreducible shift of finite type is conjugate to a renewal system. [Copyright Elsevier]
- Published
- 2011
- Full Text
- View/download PDF
49. Faster Recursions in Sphere Decoding.
- Author
-
Ghasemmehdi, Arash and Agrell, Erik
- Subjects
- *
DECODERS (Electronics) , *RECURSION theory , *SPHERES , *LATTICE theory , *MAXIMUM likelihood statistics , *MIMO systems , *PSEUDOCODE (Computer program language) , *FLOATING-point arithmetic - Abstract
Most of the calculations in standard sphere decoders are redundant in the sense that they either calculate quantities that are never used or calculate some quantities more than once. A new method, which is applicable to lattices as well as finite constellations, is proposed to avoid these redundant calculations while still returning the same result. Pseudocode is given to facilitate immediate implementation. Simulations show that the speed gain with the proposed method increases linearly with the lattice dimension. At dimension 60, the new algorithms avoid about 75% of all floating-point operations. [ABSTRACT FROM AUTHOR]
- Published
- 2011
- Full Text
- View/download PDF
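For orientation, a minimal depth-first sphere decoder (the baseline whose redundant calculations the paper eliminates) can be sketched as follows. This is a plain radius-shrinking search under the standard upper-triangular lattice model, not the optimized recursion of the paper:

```python
import numpy as np

def _spiral(c):
    """Integers in nondecreasing order of distance from the real number c."""
    k = int(round(c))
    yield k
    d = 1
    while True:
        lo, hi = k - d, k + d
        near, far = (lo, hi) if abs(c - lo) <= abs(c - hi) else (hi, lo)
        yield near
        yield far
        d += 1

def sphere_decode(R, y):
    """Return the integer vector u minimizing ||y - R u|| for an
    upper-triangular R with nonzero diagonal."""
    n = R.shape[0]
    # Seed the search radius with Babai's successive-rounding estimate.
    u0 = np.zeros(n, dtype=int)
    for i in range(n - 1, -1, -1):
        u0[i] = round(float(y[i] - R[i, i + 1:] @ u0[i + 1:]) / R[i, i])
    best = {"d2": float(np.sum((y - R @ u0) ** 2)), "u": u0.copy()}
    u = np.zeros(n, dtype=int)

    def search(i, d2):
        if i < 0:
            if d2 < best["d2"]:
                best["d2"], best["u"] = d2, u.copy()
            return
        c = float(y[i] - R[i, i + 1:] @ u[i + 1:]) / R[i, i]
        for cand in _spiral(c):
            inc = (R[i, i] * (c - cand)) ** 2
            if d2 + inc > best["d2"]:
                break  # every later candidate at this level is farther
            u[i] = cand
            search(i - 1, d2 + inc)

    search(n - 1, 0.0)
    return best["u"]

# Toy 2-D example: transmit u = (1, -1), add small noise.
R = np.array([[2.0, 1.0], [0.0, 2.0]])
y = R @ np.array([1, -1]) + np.array([0.1, -0.1])
print(sphere_decode(R, y).tolist())  # [1, -1]
```

Note how `c` is recomputed from scratch at every node; caching such partial sums across sibling nodes is exactly the kind of redundancy the abstract refers to.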
50. Character confusion versus focus word-based correction of spelling and OCR variants in corpora.
- Author
-
Reynaert, Martin W. C.
- Subjects
- *
ORTHOGRAPHY & spelling , *OPTICAL character recognition , *PSEUDOCODE (Computer program language) , *CORPORA , *LEXICON , *ANAGRAMS - Abstract
We present a new approach based on anagram hashing to handle globally the lexical variation in large and noisy text collections. The lexical variation addressed by spelling correction systems is primarily typographical variation. This is typically handled in a local fashion: given one particular text string, some system of retrieving near-neighbors is applied, where near-neighbors are other text strings that differ from the particular string by a given number of characters. The difference in characters between the original string and one of its retrieved near-neighbors constitutes a particular character confusion. We present a global way of performing this action: for all possible particular character confusions given a particular edit distance, we sequentially identify all the pairs of text strings in the text collection that display a particular confusion. We work on large digitized corpora, which contain lexical variation due to both the OCR process and typographical or typesetting errors, and show that all these types of variation can be handled equally well in the framework we present. The character confusion-based prototype of Text-Induced Corpus Clean-up ( ticcl) is compared to its focus word-based counterpart and evaluated on 6 years' worth of digitized Dutch Parliamentary documents. The character confusion approach is shown to gain an order of magnitude in speed over its word-based counterpart on large corpora. Insights gained about the useful contribution of global corpus variation statistics are shown to also benefit the more traditional word-based approach to spelling correction. Final tests on a held-out set comprising the 1918 edition of the Dutch daily newspaper 'Het Volk' show that the system is not sensitive to domain variation. [ABSTRACT FROM AUTHOR]
- Published
- 2011
- Full Text
- View/download PDF
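The global character-confusion idea above can be sketched with an order-independent (anagram) hash: substituting one character for another changes the hash by a fixed, precomputable delta, so all pairs displaying a given confusion fall out of one pass over the hash index. The exponent and the toy vocabulary below are assumptions for illustration, not details from the paper:

```python
from collections import defaultdict

POWER = 5  # exponent of the anagram hash (an illustrative assumption)

def anagram_hash(word):
    """Order-independent hash: anagrams collide by construction."""
    return sum(ord(ch) ** POWER for ch in word)

def confusion_pairs(vocab, char_out, char_in):
    """Globally find word pairs differing by substituting one occurrence
    of char_out with char_in (e.g. an OCR confusion e -> c). Anagram
    collisions can yield false positives; a real system would filter."""
    delta = ord(char_in) ** POWER - ord(char_out) ** POWER
    by_hash = defaultdict(list)
    for w in vocab:
        by_hash[anagram_hash(w)].append(w)
    pairs = []
    for w in vocab:
        if char_out in w:
            for cand in by_hash.get(anagram_hash(w) + delta, []):
                pairs.append((w, cand))
    return pairs

vocab = ["scan", "sean", "clean", "clcan", "corpus"]
print(confusion_pairs(vocab, "e", "c"))  # [('sean', 'scan'), ('clean', 'clcan')]
```

One hash-delta lookup per confusion replaces per-word near-neighbor retrieval, which is the source of the speed gain the abstract reports.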