397 results for "COMPUTER software correctness"
Search Results
2. An architecture refactoring approach to reducing software hierarchy complexity.
- Author
- Zhao, Yongxin, Wu, Wenhan, Fei, Yuan, Liu, Zhihao, Li, Yang, Yang, Yilong, Shi, Ling, and Zhang, Bo
- Subjects
- *SOFTWARE refactoring, *COMPUTER software correctness, *COMPUTER software quality control, *SOFTWARE architecture, *COMPUTER programming, *BATTERY management systems
- Abstract
Summary: Software complexity is the very essence of computer programming. As complexity increases, the potential risks and defects of software systems increase with it, making software correctness analysis and software quality improvement more difficult. In this paper, we present a quantitative metric to describe the complexity of hierarchical software and a Complexity-oriented Software Architecture Refactoring (CoSSR) approach to reduce that complexity. The main idea is to identify subcomponents and reassemble them into one hierarchical component that achieves minimum complexity according to the solution algorithm. Moreover, the algorithm can be improved by introducing partition constraints, a heuristic search strategy, and spectral clustering. We implement the proposed method as an automated refactoring tool and demonstrate the algorithm through a case study of a battery management system (BMS). The results show that our approach is more efficient and effective at reducing the complexity of hierarchical software systems. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
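The abstract does not give CoSSR's actual metric, but the reassembly idea can be sketched with a stand-in complexity measure (all names and the metric itself are illustrative, not the paper's): score a grouping of modules into components by how many dependency edges cross component boundaries, so that reassembling tightly coupled modules into one component lowers the score.

```python
# Toy sketch (not the paper's CoSSR metric): hierarchy complexity as the
# number of dependency edges that cross component boundaries.
def cross_edges(grouping, deps):
    """grouping: dict module -> component id; deps: set of (a, b) edges."""
    return sum(1 for a, b in deps if grouping[a] != grouping[b])

deps = {("ui", "core"), ("core", "db"), ("ui", "db"), ("log", "core")}
flat = {m: m for m in ("ui", "core", "db", "log")}   # every module alone
merged = {"ui": "app", "core": "app", "db": "store", "log": "app"}

assert cross_edges(flat, deps) == 4
assert cross_edges(merged, deps) == 2   # reassembling reduces crossings
```

A refactoring search in this spirit would enumerate or heuristically explore groupings and keep the one with the lowest score; the paper's approach additionally handles partition constraints and uses spectral clustering to guide that search.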
3. LoRe: A Programming Model for Verifiably Safe Local-first Software.
- Author
- Haas, Julian, Mogk, Ragnar, Yanakieva, Elena, Bieniusa, Annette, and Mezini, Mira
- Subjects
- *COMPUTER software correctness, *COMPILERS (Computer programs), *REACTIVE flow, *COMPUTER software
- Abstract
Local-first software manages and processes private data locally while still enabling collaboration between multiple parties connected via partially unreliable networks. Such software typically involves interactions with users and the execution environment (the outside world). The unpredictability of such interactions, paired with their decentralized nature, makes reasoning about the correctness of local-first software a challenging endeavor. Yet, existing solutions for developing local-first software do not provide support for automated safety guarantees and instead expect developers to reason about concurrent interactions in an environment with unreliable network conditions. We propose LoRe, a programming model and compiler that automatically verifies developer-supplied safety properties for local-first applications. LoRe combines the declarative data flow of reactive programming with static analysis and verification techniques to precisely determine concurrent interactions that violate safety invariants and to selectively employ strong consistency through coordination where required. We propose a formalized proof principle and demonstrate how to automate the process in a prototype implementation that outputs verified executable code. Our evaluation shows that LoRe simplifies the development of safe local-first software when compared to state-of-the-art approaches and that verification times are acceptable. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
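LoRe establishes its safety guarantees statically, at compile time. As a purely dynamic stand-in for the idea, the sketch below attaches a developer-supplied safety invariant to a piece of reactive state and rejects updates that would violate it; the class and all names are illustrative, not LoRe's API.

```python
# Toy sketch of a safety invariant on reactive state (illustrative only;
# LoRe checks such invariants statically rather than at run time).
class Source:
    def __init__(self, value, invariant):
        self._observers, self.invariant = [], invariant
        self.set(value)

    def set(self, value):
        assert self.invariant(value), "safety invariant violated"
        self.value = value
        for f in self._observers:          # push change through data flow
            f(value)

    def observe(self, f):
        self._observers.append(f)

budget = Source(100, invariant=lambda v: v >= 0)   # e.g. never overspend
seen = []
budget.observe(seen.append)
budget.set(40)
assert seen == [40]
try:
    budget.set(-10)                        # would break the safety property
except AssertionError:
    pass
assert budget.value == 40                  # bad update was rejected
```

In a replicated local-first setting the hard part is keeping such an invariant across concurrent updates on different devices; that coordination question is precisely what LoRe reasons about statically.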
4. A Deep Learning-Based Consistency Test Approach for Earth System Models on Heterogeneous Many-Core Systems.
- Author
- Yangyang Yu, Shaoqing Zhang, Haohuan Fu, Dexun Chen, Yang Gao, Xiaopei Lin, Zhao Liu, and Xiaojing Lv
- Subjects
- *DEEP learning, *COMPUTER software correctness, *HETEROGENEOUS computing, *SOFTWARE verification, *TEMPORAL integration, *HUMAN error
- Abstract
Physical and heat limits of semiconductor technology require the adoption of heterogeneous architectures in supercomputers to maintain a continuous increase in computing performance. The coexistence of general-purpose cores and accelerator cores, which usually employ different hardware architectures, can lead to bit-level differences, especially when we try to maximize performance on both kinds of cores. Such differences further lead to unavoidable computational perturbations through temporal integration, which can blend with software or human errors. Software correctness verification in the form of quality assurance is a critically important step in the development and optimization of Earth system models (ESMs) on heterogeneous many-core systems with mixed perturbations from software changes and hardware updates. We have developed a deep learning-based consistency test approach for Earth system models, referred to as ESM-DCT. The ESM-DCT is based on the unsupervised bidirectional gated recurrent unit autoencoder (BGRU-AE) model, which can still detect the existence of software or human errors when taking hardware-related perturbations into account. We use the Community Earth System Model (CESM) on the new Sunway system as an example of a large-scale ESM to evaluate the ESM-DCT. The results show that, faced with the mixed perturbations caused by hardware designs and software changes in heterogeneous computing, the ESM-DCT can detect software or human errors when determining whether or not a model simulation is consistent with the original results in homogeneous computing. Our ESM-DCT tool provides an efficient and objective approach for verifying the reliability of the development and optimization of scientific computing models on heterogeneous many-core systems. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
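ESM-DCT builds on a BGRU-AE, but the underlying consistency-test principle (train a reconstruction model on accepted runs, then flag runs whose reconstruction error is abnormally large) can be sketched with a much simpler rank-1 PCA stand-in for the autoencoder; the data, the model, and the threshold below are all illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
base = np.sin(np.linspace(0.0, 6.0, 50))              # a reference simulation
train = base + 1e-3 * rng.standard_normal((100, 50))  # accepted runs

# Rank-1 PCA as a stand-in "autoencoder": encode = project on the top
# principal direction, decode = reconstruct from that projection.
mean = train.mean(axis=0)
_, _, vt = np.linalg.svd(train - mean, full_matrices=False)
top = vt[:1]

def recon_error(x):
    code = (x - mean) @ top.T                          # encode
    return float(np.linalg.norm((x - mean) - code @ top))  # decode residual

threshold = 2.0 * max(recon_error(x) for x in train)   # illustrative margin

ok = base + 1e-3 * rng.standard_normal(50)   # hardware-level perturbation only
bug = base + 0.2                             # software/human error: a bias
assert recon_error(ok) <= threshold          # consistent with accepted runs
assert recon_error(bug) > threshold          # inconsistent: flagged
```

The point carried over from the abstract is that the model is trained only on runs containing acceptable (hardware-related) perturbations, so larger structured deviations from software changes stand out as reconstruction failures.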
5. Verifying Correctness.
- Author
- Hoffmann, Leah
- Subjects
- *CRYPTOGRAPHERS, *CRYPTOGRAPHY, *MATHEMATICAL proofs, *MATHEMATICAL logic, *COMPUTER software correctness
- Abstract
An interview is presented with cryptographer Yael Tauman Kalai. She discusses her career as a Senior Principal Researcher at Microsoft Research and an adjunct professor at the Massachusetts Institute of Technology (MIT), her work on proof systems, specifically the Fiat-Shamir paradigm, and her development of certificates that would certify correctness of a computation.
- Published
- 2024
- Full Text
- View/download PDF
6. Model Transformation Testing and Debugging: A Survey.
- Author
- TROYA, JAVIER, SEGURA, SERGIO, BURGUEÑO, LOLA, and WIMMER, MANUEL
- Subjects
- *DEBUGGING, *COMPUTER software correctness, *SYSTEMS software, *COMMUNITIES
- Abstract
Model transformations are the key technique in Model-Driven Engineering (MDE) for manipulating and constructing models. As a consequence, the correctness of software systems built with MDE approaches relies mainly on the correctness of model transformations, and thus detecting and locating bugs in model transformations have been popular research topics in recent years. This surge of work has led to a vast literature on model transformation testing and debugging, which makes it challenging to gain a comprehensive view of the current state of the art. This is an obstacle both for newcomers to the topic and for MDE practitioners wishing to apply these approaches. This article presents a survey of testing and debugging model transformations based on an analysis of 140 papers on the topics. We explore the trends, advances, and evolution over the years, bringing together previously disparate streams of work and providing a comprehensive view of these thriving areas. In addition, we present a conceptual framework to understand and categorize the different proposals. Finally, we identify several open research challenges and propose specific action points for the model transformation community. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
7. Modeling of processes of technological preparation of additive manufacturing based on synthetic and analytical models of surfaces.
- Author
- Anamova, R. R. and Nartova, L. G.
- Subjects
- *COMPUTER software correctness, *SURFACE roughness, *COMPUTER-aided design software, *SURFACE properties, *MECHANICAL engineering
- Abstract
In aircraft and mechanical engineering, parts with complex internal cavities (channels) are common. Manufacturing such parts with additive technologies is easier than with traditional methods. However, this raises the problem of achieving the required surface quality, namely, specific values of the surface roughness parameters. Predicting the roughness of a part's channel surface is one of the most important stages in the technological preparation of additive manufacturing. The aim of the study is to model a channel surface based on its analytical description for subsequent forecasting of its roughness. The article systematizes the basic concepts and constructive properties of surfaces of various classes. The features of designing surfaces defined by a synthetic (graphic) method and by means of a mathematical description are studied. A metric is introduced for analytically defined surfaces based on the properties of their internal geometry. Specific examples of surfaces are considered. An algorithm is given for modeling tangent planes to channel surfaces in order to further predict the roughness of the channel surface during manufacture by additive technology. Further research should address the development of algorithmic and CAD software, as well as verification of the developed software for the correctness of the results obtained. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
8. Quantum software testing: State of the art.
- Author
- García de la Barrera, Antonio, García‐Rodríguez de Guzmán, Ignacio, Polo, Macario, and Piattini, Mario
- Subjects
- *COMPUTER software correctness, *SOFTWARE engineering, *COMPUTER software testing, *QUANTUM computing, *QUANTUM theory, *SOFTWARE engineers
- Abstract
Quantum computing is expected to exponentially outperform classic computing on a broad set of problems, including encryption, machine learning, and simulations. Its impact on all of the software lifecycle's processes and techniques is yet to be explored. Testing quantum software raises a significant number of challenges due to the unique properties of quantum physics—such as superposition and entanglement—and the stochastic behavior of quantum systems. It is, therefore, an open research issue. In this work, we offer a systematic mapping study of quantum software testing engineering, presenting a comprehensive view of the current state of the art. The main identified trends in testing techniques are (1) statistical approaches based on repeated measurements and (2) the use of Hoare-like logics to reason about software correctness. Another relevant line of research is reversible circuit testing, which is partially applicable to unitary testing of quantum software. Finally, we have observed a flourishing of secondary studies and frameworks supporting testing processes from 2018 onwards. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
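The first trend the abstract identifies, statistical testing by repeated measurement, can be sketched classically: run the program many times and chi-square-test the observed counts against the distribution the specification predicts. The "programs" and the critical value below are illustrative stand-ins, with classical randomness in place of actual quantum measurement.

```python
import random

random.seed(42)

def run_and_count(prog, shots):
    """Repeat the program and tally its measured outcomes."""
    counts = {0: 0, 1: 0}
    for _ in range(shots):
        counts[int(prog())] += 1
    return counts

def chi_square(counts, expected):
    return sum((counts[k] - e) ** 2 / e for k, e in expected.items())

correct = lambda: random.random() < 0.5   # fair coin, like measuring H|0>
buggy = lambda: random.random() < 0.7     # biased amplitudes: a defect

shots = 10_000
expected = {0: shots / 2, 1: shots / 2}   # spec: 50/50 outcome statistics
crit = 3.84                               # chi-square, 1 dof, alpha = 0.05

c_stat = chi_square(run_and_count(correct, shots), expected)
b_stat = chi_square(run_and_count(buggy, shots), expected)
assert b_stat > crit      # the biased program is reliably rejected
assert c_stat < b_stat    # the fair program looks far less suspicious
```

This is exactly why such approaches are statistical rather than exact: a single run reveals one sample, so correctness claims come with a significance level rather than a proof.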
9. Sports training auxiliary decision support system based on neural network algorithm.
- Author
- Wang, Tianyi
- Subjects
- *DECISION support systems, *PHYSICAL training & conditioning, *MACHINE learning, *COMPUTER software correctness, *STIMULUS & response (Psychology), *INTELLIGENT tutoring systems
- Abstract
To improve the effect of auxiliary decision-making in sports training, this paper analyzes the functional needs of a sports training auxiliary system and improves a traditional machine learning algorithm. The domain-adversarial neural network based on maximum entropy loss combines the ability of the maximum entropy loss to handle misclassified samples with classification loss and domain-adversarial loss, addressing the inconsistent marginal distributions of category features between domains. Moreover, this paper takes sports decision-making as its core and introduces tasks of varying difficulty and video training into the research. In addition, it uses simulation software to measure the correctness of sports training in different scenarios along with response-latency data, and applies the neural network algorithm to the construction of the sports training auxiliary decision system. Finally, this paper designs experiments to study sports training recognition and sports training decision-making and builds an intelligent system on a simulation platform. The experimental results show that the system constructed in this paper provides a good sports training auxiliary decision function. The reliability of the method can be verified in practice in the future. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
10. Electromagnetic control based on Lie symmetry transformation.
- Author
- Zheng Mingliang
- Subjects
- *MAXWELL equations, *TRANSFORMATION optics, *COORDINATE transformations, *SYMMETRY, *COMPUTER software correctness
- Abstract
A design method for electromagnetic metamaterials based on the Lie symmetry of Maxwell's equations is proposed and applied to the modulation of electromagnetic waves/light. Firstly, the electromagnetic control model based on metamaterials is introduced. Then, according to the theory of Transformation Optics (TO), Lie symmetry analysis is applied to the coordinate transformation of the material's physical space, and the determining equations of the Lie symmetry, the key core of the method, are derived. Secondly, analytical forms of the constitutive parameters (permittivity and permeability) of metamaterials are introduced, which can be used to design all kinds of electromagnetic metamaterials. Finally, the Lie symmetry method is applied to the control of electromagnetic beam width. The results show that the metamaterial based on the Lie symmetry of Maxwell's equations has a good field distribution, and the method overcomes the subjectivity of the single coordinate transformation used in traditional transformation optics. Wave simulation in COMSOL Multiphysics software verifies the correctness of the Lie symmetry method. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
11. An anonymous verifiable random function with unbiasability and constant size proof.
- Author
- Yao, Shuang and Zhang, Dawei
- Subjects
- *RANDOM functions (Mathematics), *UNIQUENESS (Mathematics), *COMPUTER software correctness, *ANONYMITY, *BILINEAR forms
- Abstract
Recently, verifiable random functions (VRFs) have been frequently applied in secure consensus protocols and e-lottery systems to achieve random selection. In this context, how to build various secure verifiable random functions to suit increasing application requirements has received much attention. Therefore, in this paper, we propose a new type of anonymous verifiable random function (AVRF). Concretely, the proposed AVRF achieves identity anonymity of the prover based on the Decision Linear assumption, which is secure in bilinear groups, in contrast to the known anonymous verifiable random function, which relies on the Decisional Diffie–Hellman (DDH) assumption. Anonymity means that the verifier cannot recognize the prover's identity. In addition, our proposed AVRF is unbiasable: it provides unpredictability even under malicious key generation. Security analysis shows that the proposed scheme satisfies correctness, uniqueness, pseudorandomness, anonymity, and unbiasability. Theoretical analysis indicates that our AVRF is relatively efficient, and its proof size is constant regardless of input size. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
12. Mathematical model and motion analysis of a wheeled vibro-impact locomotion system.
- Author
- Korendiy, Vitaliy, Gursky, Volodymyr, Kachur, Oleksandr, Dmyterko, Petro, Kotsiumbas, Oleh, and Havrylchenko, Oleksandr
- Subjects
- *MOTION analysis, *MATHEMATICAL models, *COMPUTER software correctness, *MATHEMATICAL optimization
- Abstract
The paper is aimed at investigating the motion conditions of the wheeled vibro-impact locomotion system equipped with the twin crank-slider excitation mechanism and the additional braking mechanisms allowing only one-way rotation of the wheels. The novelty of the present research consists in the improved mathematical model describing the motion conditions of the vibro-impact system and the proposed parameters optimization technique that allows for maximizing the average translational velocity of the wheeled platform. The main idea of this technique is to provide the maximal velocities of internal bodies when they get in contact with the corresponding impact plates. The numerical modeling results describing the dynamic behavior of the vibro-impact system are obtained in Mathematica software and substantiate the correctness of the developed mathematical model and of the proposed parameters optimization technique. The paper can be of significant practical and scientific interest for researchers and engineers studying and improving the vibratory locomotion systems, e.g., for inspecting and cleaning the pipelines. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
13. [formula omitted]: A template to build verified thread-local interfaces with software scheduler abstractions.
- Author
- Kim, Jieung, Koenig, Jérémie, Chen, Hao, Gu, Ronghui, and Shao, Zhong
- Subjects
- *COMPUTER software, *COMPUTER software correctness, *SOFTWARE reliability, *RESOURCE allocation, *SOFTWARE engineering, *SOFTWARE verification
- Abstract
This paper presents ThreadAbs, an extension of the layer-based software formal verification toolkit CCAL (Gu et al., 2018). ThreadAbs is specifically designed to provide better expressiveness and proof management for thread abstraction in multithreaded libraries. Thread abstraction isolates the behavior of each thread from the others when providing a top-level formal specification for software. Compared to the original CCAL, ThreadAbs offers significant improvements in this regard. CCAL is a verification framework that enables a layered approach to building certified software, as demonstrated by multiple examples (Gu et al. 2016; Li et al. 2021; Shin et al. 2019). Its main targets usually include multithreaded libraries, which significantly improve the utilization and isolation of system resources but pose new challenges for formal verification. Firstly, they require a sudden change in the granularity of concurrency during the implementation and verification of the target software. Typically, systems are associated with software schedulers that are built on top of several underlying components in the system (e.g., thread spawn, yield, sleep, and wake-up). Due to the software scheduler, these systems can be divided into low-level components consisting of modules that the software scheduler depends on (e.g., allocators for shared resources and scheduling queues) and high-level components that use software schedulers (e.g., condition variables, semaphores, and IPCs). Therefore, formal verification of such systems has to provide a proper method for dealing with these distinct features, usually by abstracting other threads' behavior as much as possible to provide an independent thread model and its formal specification. Secondly, it requires handling side effects from other threads, such as dynamic resource allocation from parents, with proper isolation of all threads from each other.
CCAL has limited support for two crucial aspects of formal verification in multithreaded systems. Firstly, its previous thread abstraction method does not properly handle the side effects caused by a parent thread during dynamic initial-state allocation. Secondly, the proofs produced by CCAL are tied to CertiKOS, which makes it challenging to reuse them for similar proofs that use CCAL as their verification toolkit. To address these issues, we introduce ThreadAbs, a new mechanized methodology that provides proper thread abstraction to reason about multithreaded programs in conjunction with CCAL. We also extend the previous CertiKOS proof with ThreadAbs to demonstrate its usability and expressiveness. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
14. On methods and tools for rigorous system design.
- Author
- Bliudze, Simon, Katsaros, Panagiotis, Bensalem, Saddek, and Wirsing, Martin
- Subjects
- *SYSTEMS design, *COMPUTER software correctness, *SYSTEMS software, *SOFTWARE verification
- Abstract
Full a posteriori verification of the correctness of modern software systems is practically infeasible due to the sheer complexity resulting from their intrinsic concurrent nature. An alternative approach consists of ensuring correctness by construction. We discuss the Rigorous System Design (RSD) approach, which relies on a sequence of semantics-preserving transformations to obtain an implementation of the system from a high-level model while preserving all the properties established along the way. In particular, we highlight some of the key requirements for the feasibility of such an approach, namely the availability of (1) methods and tools for the design of correct-by-construction high-level models and (2) definitions and proofs of the validity of suitable domain-specific abstractions. We summarise the results of the extended versions of seven papers selected among those presented at the 1st and 2nd International Workshops on Methods and Tools for Rigorous System Design (MeTRiD 2018–2019), indicating how they contribute to the advancement of the RSD approach. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
15. A Unified Analysis for the Free Vibration of the Sandwich Piezoelectric Laminated Beam with General Boundary Conditions under the Thermal Environment.
- Author
- Gao, Guohua, Sun, Ningze, Shao, Dong, Tao, Yongqiang, and Wu, Wei
- Subjects
- *LAMINATED composite beams, *FREE vibration, *HAMILTON'S principle function, *SHEAR (Mechanics), *COMPUTER software correctness
- Abstract
This article analyzes the free vibration characteristics of the sandwich piezoelectric beam under elastic boundary conditions and a thermal environment. According to first-order shear deformation theory and Hamilton's principle, the thermo-electro-elastic coupling equations of the sandwich piezoelectric beam are obtained. Meanwhile, elastic boundary conditions composed of an array of springs are introduced, and the displacement variables and external potential energy of the beam are expressed as wave functions. Using the method of reverberation-ray matrices to integrate and solve the governing equations, a search algorithm based on golden-section search is introduced to calculate the required frequency parameters. A series of numerical results are compared with those reported in the literature and those obtained by simulation software to verify the correctness and versatility of the search algorithm. In addition, three parametric research cases are proposed to investigate the frequency parameters of sandwich piezoelectric beams with various elastic restraint conditions, material parameters, thickness ratios, temperature rises, and external electric potentials. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
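The golden-section step the abstract mentions has a standard shape: shrink a bracket around a unimodal minimum using the golden ratio, with no derivatives needed. A minimal sketch follows, with an illustrative objective standing in for the paper's characteristic-equation search.

```python
import math

def golden_section_min(f, a, b, tol=1e-8):
    """Locate the minimizer of a unimodal f on [a, b] by golden-section search."""
    inv_phi = (math.sqrt(5) - 1) / 2            # 1/phi ~= 0.618
    c, d = b - inv_phi * (b - a), a + inv_phi * (b - a)
    while b - a > tol:
        if f(c) < f(d):                         # minimum lies in [a, d]
            b, d = d, c
            c = b - inv_phi * (b - a)
        else:                                   # minimum lies in [c, b]
            a, c = c, d
            d = a + inv_phi * (b - a)
    return (a + b) / 2

# Illustrative objective: a frequency search would minimize the magnitude
# of a characteristic determinant; here a simple parabola stands in.
freq = golden_section_min(lambda w: (w - 3.3) ** 2, 0.0, 10.0)
assert abs(freq - 3.3) < 1e-6
```

The bracket shrinks by the golden ratio each iteration while reusing one interior point, which is why the method is a common choice when each objective evaluation (here, assembling and solving the governing equations) is expensive.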
16. Empirical studies on software traceability: A mapping study.
- Author
- Charalampidou, Sofia, Ampatzoglou, Apostolos, Karountzos, Evangelos, and Avgeriou, Paris
- Subjects
- *CARTOGRAPHY software, *COMPUTER software correctness, *EMPIRICAL research, *MAINTAINABILITY (Engineering)
- Abstract
During the last decades, software traceability has been studied in a large number of studies from different perspectives (e.g., how to create traces and what its benefits are). This large body of knowledge needs to be better explored and exploited by both practitioners and researchers: we need an overview of the different aspects of traceability and a structured way to assess and compare existing work in order to extend it with new research or apply it in practice. Thus, we have conducted a secondary study on this large corpus of primary studies, focusing on empirical studies on software traceability, without setting any further restrictions in terms of investigating a specific domain or concrete artifacts. The study explores the goals of existing approaches and the empirical methods used for their evaluation. Its main contributions are the investigation of (a) the types of artifacts linked through traceability approaches; (b) the benefits of using artifact traceability approaches; (c) the ways of measuring their benefit; and (d) the research methods used. The results of the study suggest that (i) requirements artifacts are dominant in traceability; (ii) the research corpus focuses on the proposal of novel techniques for establishing traceability; and (iii) the main benefits are the improvement of software correctness and maintainability. Finally, although many studies include some empirical validation, there is still room for improvement, and some research methods could be used more extensively. The obtained results are discussed from the perspective of both researchers and practitioners and are compared against the state of the art. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
17. A virtual inertial control strategy for bidirectional interface converters in hybrid microgrid.
- Author
- Ren, Mingwei, Sun, Xu, Sun, Yuxin, Shi, Kai, and Xu, Peifeng
- Subjects
- *MICROGRIDS, *AC DC transformers, *SYNCHRONOUS generators, *COMPUTER software correctness, *POWER transmission, *DYNAMICAL systems
- Abstract
• Based on virtual synchronous generator (VSG) technology, virtual inertia equations are established on the AC and DC sides of the AC/DC hybrid microgrid.
• Accurate power sharing among parallel BICs is achieved by making BICs share active power equally under stable conditions through virtual resistors.
• A virtual inertia control strategy is proposed, and simulations verify the validity of the analysis.
Insufficient inertia is one of the urgent problems to be solved for the stability of AC-DC hybrid microgrids. In order to improve AC bus frequency and DC bus voltage inertia in a hybrid microgrid, a virtual inertia control strategy for the bidirectional interface converter (BIC) based on the virtual synchronous generator (VSG) is proposed. The virtual inertia equations on both the AC and DC sides are used to slow down the variation of the DC voltage and the AC frequency. When the load fluctuates, the dynamic response of the DC voltage, the AC frequency, and the power transmission between the AC and DC subnets is improved, thus improving the stability of the system. This control strategy enables BICs to share active power equally under stable conditions, prevents circulating power, and achieves accurate power sharing among parallel BICs. The small-signal model of the BIC under virtual inertia control is established, the dynamic characteristics of the system under different control parameters are analyzed, and the influence of the control parameters on the system is discussed. A simulation model is established in PSCAD/EMTDC software to verify the correctness of the proposed control strategy. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
18. Blame and coercion: Together again for the first time.
- Author
- SIEK, JEREMY G., THIEMANN, PETER, and WADLER, PHILIP
- Subjects
- *PROGRAMMING languages, *COMPUTER software correctness, *BISIMULATION, *MATHEMATICAL optimization, *PROOF theory
- Abstract
C#, Dart, Pyret, Racket, TypeScript, VB: many recent languages integrate dynamic and static types via gradual typing. We systematically develop four calculi for gradual typing and the relations between them, building on and strengthening previous work. The calculi are as follows: $\lambda{B}$ , based on the blame calculus of Wadler and Findler (2009); $\lambda{C}$ , inspired by the coercion calculus of Henglein (1994); $\lambda{S}$ inspired by the space-efficient calculus of Herman, Tomb, and Flanagan (2006); and $\lambda{T}$ based on the threesome calculus of Siek and Wadler (2010). While $\lambda{B}$ and $\lambda{T}$ are little changed from previous work, $\lambda{C}$ and $\lambda{S}$ are new. Together, $\lambda{B}$ , $\lambda{C}$ , $\lambda{S}$ , and $\lambda{T}$ provide a coherent foundation for design, implementation, and optimization of gradual types. We define translations from $\lambda{B}$ to $\lambda{C}$ , from $\lambda{C}$ to $\lambda{S}$ , and from $\lambda{S}$ to $\lambda{T}$. Much previous work lacked proofs of correctness or had weak correctness criteria; here we demonstrate the strongest correctness criterion one could hope for, that each of the translations is fully abstract. Each of the calculi reinforces the design of the others: $\lambda{C}$ has a particularly simple definition, and the subtle definition of blame safety for $\lambda{B}$ is justified by the simple definition of blame safety for $\lambda{C}$. Our calculus $\lambda{S}$ is implementation-ready: the first space-efficient calculus that is both straightforward to implement and easy to understand. We give two applications: first, using full abstraction from $\lambda{C}$ to $\lambda{S}$ to establish an equational theory of coercions; and second, using full abstraction from $\lambda{B}$ to $\lambda{S}$ to easily establish the Fundamental Property of Casts, which required a custom bisimulation and six lemmas in earlier work. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
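As a loose illustration of the blame idea underlying these calculi (a toy dynamic cast, not the paper's $\lambda{B}$ or its coercions): a cast checks a value against an expected type and carries a label naming which side of the typed/untyped boundary to blame when the check fails. All names here are illustrative.

```python
# Toy blame-tracking cast (illustrative of gradual typing's run-time
# checks; the paper's calculi are far more refined than this sketch).
class Blame(Exception):
    def __init__(self, label):
        super().__init__(f"blame {label}")
        self.label = label

def cast(value, type_tag, label):
    """Check value against type_tag; on failure, blame the labeled party."""
    if not isinstance(value, type_tag):
        raise Blame(label)
    return value

assert cast(42, int, "p") == 42        # cast succeeds, value passes through
try:
    cast("hi", int, "p")               # untyped side supplied a bad value
except Blame as b:
    assert b.label == "p"              # ...so label p is blamed
```

Blame safety in the calculi is the guarantee that well-typed code is never the party blamed; the space-efficient variants additionally collapse chains of such casts so they cannot pile up at runtime.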
19. Beyond Relational Databases: Preserving the Data.
- Author
- Ramalho, José Carlos, Ferreira, Bruno, Faria, Luis, and Ferreira, Miguel
- Subjects
- *RELATIONAL databases, *INFORMATION storage & retrieval systems, *ELECTRONIC information resources, *DATABASES, *COMPUTER software correctness
- Abstract
Relational databases are one of the main technologies supporting information assets in today's organizations. They are designed to store, organize, and retrieve digital information, and are such a fundamental part of information systems that most would not be able to function without them. Very often, the information contained in databases is irreplaceable or prohibitively expensive to reacquire; therefore, steps must be taken to ensure that the information within databases is preserved. This paper describes a methodology for long-term preservation of relational databases based on information extraction and format migration to a preservation format. It also presents a tool developed to support this methodology, the Database Preservation Toolkit (DBPTK), as well as the processes and formats needed to preserve databases. The DBPTK connects to live relational databases and extracts information into formats more adequate for long-term preservation. Supported preservation formats include SIARD 2, created through a cooperation between the Swiss Federal Archives and the E-ARK project, which is becoming a standard in the area. DBPTK has a flexible plugin-based architecture, enabling its use for other purposes such as database upgrade and database migration between different systems. The real-world case scenarios presented demonstrate the usefulness, correctness, and performance of the tool. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
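DBPTK targets SIARD 2, but the basic extract-and-migrate idea can be sketched with the Python standard library, using CSV as a stand-in preservation format; the table and columns below are illustrative.

```python
import csv, io, sqlite3

# Toy sketch of extraction to a long-lived plain-text format (not
# DBPTK's SIARD 2 output): pull a table out of a live relational
# database, schema header included.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE records (id INTEGER, name TEXT)")
conn.executemany("INSERT INTO records VALUES (?, ?)",
                 [(1, "alpha"), (2, "beta")])

out = io.StringIO()
writer = csv.writer(out)
cur = conn.execute("SELECT id, name FROM records ORDER BY id")
writer.writerow([col[0] for col in cur.description])  # header from schema
writer.writerows(cur)                                 # data rows

assert out.getvalue().splitlines() == ["id,name", "1,alpha", "2,beta"]
```

A real preservation format must also capture types, keys, constraints, and documentation, which is why SIARD 2 wraps the extracted data and its full schema description together rather than emitting bare rows.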
20. Why There is no General Solution to the Problem of Software Verification.
- Author
- Symons, John and Horner, Jack K.
- Subjects
- *SOFTWARE engineering, *SOFTWARE verification, *COMPUTER software correctness, *PHILOSOPHY of science
- Abstract
How can we be certain that software is reliable? Is there any method that can verify the correctness of software for all cases of interest? Computer scientists and software engineers have informally assumed that there is no fully general solution to the verification problem. In this paper, we survey approaches to the problem of software verification and offer a new proof for why there can be no general solution. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
21. Properties and numerical simulation for self-weight consolidation of the dredged material.
- Author
-
Wang, Liang, Sun, Jinshan, Zhang, Minsheng, Yang, Lijing, Li, Lei, and Yan, Jinhui
- Subjects
- *
DREDGING spoil , *PORE water pressure , *COMPUTER software correctness , *PARTICLE size distribution , *EXTRACTION apparatus , *COMPOSITE columns - Abstract
Large strain self-weight consolidation is widely used for the management of dredged material (DM) in dredged material disposal sites. A multilayer vacuum extraction method, which consists of a settling column, a pore pressure measurement apparatus and a multilayer vacuum extraction apparatus, is developed. The interface height, water content, excess pore water pressure and grain size distribution, as well as the compressibility and permeability relationships involved in self-weight consolidation, are determined. Experimental results show that the measured data are reasonable and the method is feasible. Gibson's governing equation for one-dimensional finite strain consolidation is discretised using a modified upwind difference form, and a corresponding computer program is developed. The computed interface settlement, void ratio and excess pore water pressure approximate the laboratory experimental results. Therefore, the rationality of the analytical model and the finite difference numerical solution, as well as the correctness of the computer program, are validated. The model can potentially provide a satisfactory prediction for the self-weight consolidation of the DM. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
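The abstract above discretises Gibson's one-dimensional finite strain consolidation equation with a modified upwind difference form. As a hedged illustration of the basic idea only: the sketch below is a generic first-order upwind step for linear advection (u_t + a * u_x = 0), not Gibson's nonlinear equation, and the grid and speed values are invented.

```python
def upwind_step(u, a, dx, dt):
    """One explicit time step of the first-order upwind scheme for
    u_t + a * u_x = 0 (assumes a > 0, hence a backward difference)."""
    new = u[:]                       # boundary value u[0] is held fixed
    for i in range(1, len(u)):
        new[i] = u[i] - a * dt / dx * (u[i] - u[i - 1])
    return new

u0 = [1.0] * 5 + [0.0] * 5          # step-shaped initial profile
u1 = upwind_step(u0, a=1.0, dx=1.0, dt=0.5)
print(u1)                           # the step smears one cell downstream
```

The upwind choice (differencing against the flow direction) is what keeps such explicit schemes stable when the CFL condition a * dt / dx <= 1 holds, as in this example.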
22. A Survey of Timing Verification Techniques for Multi-Core Real-Time Systems.
- Author
-
MAIZA, CLAIRE, RIHANI, HAMZA, RIVAS, JUAN M., GOOSSENS, JOËL, ALTMEYER, SEBASTIAN, and DAVIS, ROBERT I.
- Subjects
- *
SCIENTIFIC literature , *COMPUTER software correctness - Abstract
This survey provides an overview of the scientific literature on timing verification techniques for multi-core real-time systems. It reviews the key results in the field from its origins around 2006 to the latest research published up to the end of 2018. The survey highlights the key issues involved in providing guarantees of timing correctness for multi-core systems. A detailed review is provided covering four main categories: full integration, temporal isolation, integrating interference effects into schedulability analysis, and mapping and allocation. The survey concludes with a discussion of the advantages and disadvantages of these different approaches, identifying open issues, key challenges, and possible directions for future research. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
23. Field distribution and dispersion characteristics of a coaxial oversized slow wave structure with deep corrugation operating on high-order mode.
- Author
-
Deng, Bingfang, He, Juntao, Ling, Junpu, and Wang, Lei
- Subjects
- *
SLOW wave structures , *COMPUTER software correctness , *DISPERSION (Chemistry) , *SIMULATION software - Abstract
Compared with the traditional slow wave structure (SWS) operating in a fundamental mode, an oversized SWS with deep corrugation operating in a high-order mode has a higher power handling capacity and a lower beam density. However, for these new structures, the traditional Floquet harmonic expansion method is not applicable for analyzing the dispersion characteristics. In this paper, we present a new analysis method based on the mode matching method, which is suitable for an arbitrary corrugation depth of the SWS. Moreover, because the fields are matched at the corrugated surfaces of the inner and outer conductors, the method is applicable whether the conductors are corrugated or uniform. The method is validated by software simulation: the simulation results verify the correctness of the theory and show that the theoretical calculation is more convenient. Using the presented method, the dispersion characteristics, field distribution, and coupling impedance are theoretically derived and analyzed. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
24. Approximation algorithms for querying incomplete databases.
- Author
-
Greco, Sergio, Molinaro, Cristian, and Trubitsyna, Irina
- Subjects
- *
APPROXIMATION algorithms , *POLYNOMIAL time algorithms , *DATABASES , *COMPUTER software correctness , *ALGORITHMS - Abstract
Certain answers are a widely accepted semantics of query answering over incomplete databases. As their computation is a coNP-hard problem, recent research has focused on developing (polynomial time) evaluation algorithms with correctness guarantees, that is, techniques computing a sound but possibly incomplete set of certain answers. The aim is to make the computation of certain answers feasible in practice, settling for under-approximations. In this paper, we present novel evaluation algorithms with correctness guarantees, which provide better approximations than current techniques, while retaining polynomial time data complexity. The central tools of our approach are conditional tables and the conditional evaluation of queries. We propose different strategies to evaluate conditions, leading to different approximation algorithms—more accurate evaluation strategies have higher running times, but they pay off with more certain answers being returned. Thus, our approach offers a suite of approximation algorithms enabling users to choose the technique that best meets their needs in terms of balance between efficiency and quality of the results. • Algorithms to compute sound sets of certain query answers over incomplete databases. • Approximation algorithms trading off evaluation time vs. quality of the approximation. • Suite of algorithms enabling users to choose the method that best meets their needs. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
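The certain-answers semantics underlying the abstract above can be made concrete with a tiny brute-force sketch: an answer is certain if it is returned by every completion of the incomplete database. The table, labelled nulls, domain and query below are invented for illustration, and the exhaustive enumeration is exactly the exponential computation that the paper's polynomial-time approximation algorithms are designed to avoid.

```python
from itertools import product

NULL1, NULL2 = "_n1", "_n2"          # labelled nulls (unknown values)
table = [("alice", NULL1), ("bob", "math"), (NULL2, "math")]
domain = ["math", "cs", "alice", "bob"]
nulls = [NULL1, NULL2]

def query(rows):
    """Invented example query: names enrolled in 'math'."""
    return {name for name, course in rows if course == "math"}

def certain_answers(rows):
    """Intersect the query answers over every completion of the nulls."""
    result = None
    for values in product(domain, repeat=len(nulls)):
        valuation = dict(zip(nulls, values))
        completion = [tuple(valuation.get(v, v) for v in row) for row in rows]
        answers = query(completion)
        result = answers if result is None else result & answers
    return result

print(certain_answers(table))  # {'bob'}: the only name returned in every completion
```

"alice" is not certain because her course is a null that only sometimes completes to "math", which is the kind of distinction a sound under-approximation must preserve.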
25. Secure and trusted partial grey-box verification.
- Author
-
Cai, Yixian, Karakostas, George, and Wassyng, Alan
- Subjects
- *
SOFTWARE architecture , *SYSTEMS development , *COMPUTER software correctness , *SOFTWARE verification - Abstract
A crucial aspect in the development of software-intensive systems is verification. This is the process of checking whether the system has been implemented in compliance with its specification. In many situations, the manufacture of one or more components of the system is outsourced. We study the case of how a third party (the verifier) can verify an outsourced component effectively, without access to all the details of the internal design of that component built by the developer. We limit the design detail that is made available to the verifier to a diagram of interconnections between the different design units within the component, but encrypt the design details within the units and also the intermediate values passed between the design units. We formalize this notion of limited information using tabular expressions to describe the functions in both the specifications and the design. The most common form of verification is testing, and it is known that black-box testing of the component is not effective enough in deriving test cases that will adequately determine the correctness of the implementation, and the safety of its behaviour. We have developed protocols that allow for the derivation of test cases that take advantage of the design details disclosed as described above. We can regard this as partial grey-box testing that does not compromise the developer's secret information. Our protocols work with both trusted and untrusted developers, as well as trusted and untrusted verifiers, and allow for the checking of the correctness of the verification process itself by any third party, and at any time. Currently our results are derived under the simplifying assumption that the software design units are linked acyclically. We leave the lifting of this assumption as an open problem for future research. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
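The abstract above formalises specifications with tabular expressions. A hedged, minimal illustration of the idea (the table is an invented example, not one from the paper): a one-dimensional function table whose rows pair conditions with results, together with sample-based checks of the two classic well-formedness properties of such tables, disjointness and completeness.

```python
# Each row is (condition on x, result for x); the rows below define |x|.
table = [
    (lambda x: x < 0,  lambda x: -x),   # row 1: negative inputs
    (lambda x: x == 0, lambda x: 0),    # row 2: zero
    (lambda x: x > 0,  lambda x: x),    # row 3: positive inputs
]

def check_well_formed(table, samples):
    """Check, on sampled inputs, that exactly one row applies to each input."""
    for x in samples:
        hits = [i for i, (cond, _) in enumerate(table) if cond(x)]
        assert len(hits) >= 1, f"incomplete at {x}"   # completeness
        assert len(hits) <= 1, f"overlap at {x}"      # disjointness
    return True

def evaluate(table, x):
    for cond, res in table:
        if cond(x):
            return res(x)

check_well_formed(table, samples=range(-5, 6))
print(evaluate(table, -3))  # 3, since the table computes |x|
```

In a full verification setting these checks would be discharged symbolically over all inputs rather than on samples; sampling is used here only to keep the sketch runnable.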
26. An expert system for checking the correctness of memory systems using simulation and metamorphic testing.
- Author
-
Cañizares, Pablo C., Núñez, Alberto, and de Lara, Juan
- Subjects
- *
EXPERT systems , *COMPUTER memory management , *MEMORY testing , *COMPUTER performance , *SIMULATION methods & models , *COMPUTING platforms , *COMPUTER software correctness - Abstract
• A novel expert system for checking the correctness of memory systems. • The expert system combines simulation and metamorphic testing. • The expert system automatically generates test cases to check memory models. • Mutation testing has been applied to check the effectiveness of the ES. • The ES provides promising results, detecting over 99% of the injected faults. During the last few years, computer performance has reached a turning point where computing power is no longer the only important concern. The emphasis is thus shifting from an exclusive focus on optimising the computing system to optimising other systems, like the memory system. Broadly speaking, testing memory systems entails two main challenges: the oracle problem and the reliable test set problem. The former consists in deciding if the outputs of a test suite are correct. The latter refers to providing an appropriate test suite for determining the correctness of the system under test. In this paper we propose an expert system for checking the correctness of memory systems. In order to face these challenges, our proposed system combines two orthogonal techniques – simulation and metamorphic testing – enabling the automatic generation of appropriate test cases and deciding if their outputs are correct. In contrast to conventional expert systems, our system includes a factual database containing the results of previous simulations, and a simulation platform for computing the behaviour of memory systems. The knowledge of the expert is represented in the form of metamorphic relations, which are properties of the analysed system involving multiple inputs and their outputs. Thus, the main contribution of this work is two-fold: a method to automatise the testing process of memory systems, and a novel expert system design focused on increasing the overall performance of the testing process.
To show the applicability of our system, we have performed a thorough evaluation using 500 memory configurations and 4 different memory management algorithms, which entailed the execution of more than one million of simulations. The evaluation used mutation testing, injecting faults in the memory management algorithms. The developed expert system was able to detect over 99% of the critical injected faults, hence obtaining very promising results, and outperforming other standard techniques like random testing. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
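The expert system above encodes the expert's knowledge as metamorphic relations: properties linking multiple inputs and their outputs, which sidestep the oracle problem because no single expected output is needed. A hedged, generic illustration (not the paper's memory-system relations): testing a sort routine via two classic metamorphic relations.

```python
import random

def system_under_test(xs):
    return sorted(xs)            # stand-in for the system being checked

def mr_permutation(xs):
    """MR1: permuting the input must not change the output."""
    shuffled = xs[:]
    random.shuffle(shuffled)
    return system_under_test(xs) == system_under_test(shuffled)

def mr_offset(xs, c=7):
    """MR2: adding a constant to every element shifts the output elementwise."""
    out = system_under_test(xs)
    out_shifted = system_under_test([x + c for x in xs])
    return out_shifted == [x + c for x in out]

cases = [[random.randint(-50, 50) for _ in range(20)] for _ in range(100)]
print(all(mr_permutation(xs) and mr_offset(xs) for xs in cases))  # True
```

A mutated (faulty) implementation of `system_under_test` would typically violate at least one relation on some generated case, which is how the abstract's mutation-testing evaluation measures effectiveness.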
27. An optimized RGA supporting selective undo for collaborative text editing systems.
- Author
-
Lv, Xiao, He, Fazhi, Cai, Weiwei, and Cheng, Yuan
- Subjects
- *
CLOUD computing , *COMPUTER software correctness - Abstract
Collaboration plays a key role in distributed applications. As a fundamental vehicle for collaboration, collaborative text editing systems have been an important field within CSCW. More recently, with the increasing popularity of cloud computing, collaborative text editing systems have moved towards large-scale collaboration on cloud platforms. Computing performance is a key success factor for large-scale collaboration. CRDT algorithms have been shown in the literature to outperform traditional algorithms. However, supporting selective undo has been a challenging issue for existing CRDT algorithms. This paper proposes an efficient CRDT algorithm called ORGAU that provides integrated do and selective undo efficiently. The correctness and intention preservation of the proposed algorithm under an integrated do/undo framework are formally proved. Compared with typical CRDT algorithms, the proposed algorithm has better computing performance in both theoretical analysis and experimental evaluation while keeping the same space complexity. • An efficient algorithm called ORGAU with integrated support for do and selective undo. • It has a time complexity lower than typical CRDT algorithms while keeping the same space complexity. • The experimental evaluations show that the proposed algorithm has better computing performance. • Both the correctness and the intention preservation of the proposed algorithm have been formally proved. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
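ORGAU, per the title above, optimises the Replicated Growable Array (RGA), a sequence CRDT. Below is a hedged, minimal RGA core (insert-after with tombstone deletion and a timestamp tie-break for concurrent inserts), not the paper's optimised ORGAU structure and without its selective-undo support; the identifier scheme is an illustrative assumption.

```python
class RGA:
    def __init__(self):
        self.elems = []          # list of [id, char, visible]; id = (counter, site)

    def _index_of(self, elem_id):
        for i, (eid, _, _) in enumerate(self.elems):
            if eid == elem_id:
                return i
        return -1                # None (head) maps to position before index 0

    def insert_after(self, prev_id, elem_id, char):
        i = self._index_of(prev_id) + 1
        # RGA tie-break: skip concurrent siblings with larger ids
        while i < len(self.elems) and self.elems[i][0] > elem_id:
            i += 1
        self.elems.insert(i, [elem_id, char, True])

    def delete(self, elem_id):
        self.elems[self._index_of(elem_id)][2] = False   # tombstone, not removal

    def text(self):
        return "".join(c for _, c, vis in self.elems if vis)

doc = RGA()
doc.insert_after(None, (1, "a"), "H")
doc.insert_after((1, "a"), (2, "a"), "i")
doc.delete((2, "a"))
doc.insert_after((1, "a"), (3, "a"), "!")
print(doc.text())  # "H!"
```

Tombstones are what make concurrent insert/delete commute across replicas; they are also the space overhead that optimised variants work to reduce.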
28. CHERI Concentrate: Practical Compressed Capabilities.
- Author
-
Woodruff, Jonathan, Joannou, Alexandre, Xia, Hongyan, Fox, Anthony, Norton, Robert M., Chisnall, David, Davis, Brooks, Gudka, Khilan, Filardo, Nathaniel W., Markettos, A. Theodore, Roe, Michael, Neumann, Peter G., Watson, Robert N. M., and Moore, Simon W.
- Subjects
- *
IMAGE compression , *COMPUTER systems , *COMPUTER architecture , *PIPELINES , *COMPUTER software correctness - Abstract
We present CHERI Concentrate, a new fat-pointer compression scheme applied to CHERI, the most developed capability-pointer system at present. Capability fat pointers are a primary candidate for enforcing fine-grained and non-bypassable security properties in future computer systems, although the increased pointer size can severely affect performance. Several capability-compression schemes have therefore been proposed elsewhere, but they do not support legacy instruction sets, ignore features critical to the existing software base, and introduce design inefficiencies into RISC-style processor pipelines. CHERI Concentrate improves on the state-of-the-art region-encoding efficiency, solves important pipeline problems, and eases the semantic restrictions of compressed encoding, allowing it to protect a full legacy software stack. We present the first quantitative analysis of compiled capability code, which we use to guide the design of the encoding format. We analyze and extend logic from the open-source CHERI prototype processor design on FPGA to demonstrate encoding efficiency, minimize the delay of pointer arithmetic, and eliminate additional load-to-use delay. To verify the correctness of our proposed high-performance logic, we present a HOL4 machine-checked proof of the decode and pointer-modify operations. Finally, we measure a 50 to 75 percent reduction in L2 misses for many compiled C-language benchmarks running under a commodity operating system using the compressed 128-bit and 64-bit formats, demonstrating both compatibility with and increased performance over the uncompressed, 256-bit format. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
29. Analysis of Errors in Priority Vector Estimation and Their Relationship with the Correctness of the Final Ranking of Decision Alternatives.
- Author
-
Grzybowski, Andrzej Z. and Starczewski, Tomasz
- Subjects
- *
ERROR analysis in mathematics , *ANALYTIC hierarchy process , *DECISION making , *COMPUTER software correctness , *PULSE-code modulation - Abstract
Ranking creation is the main purpose of multiple-criteria decision analysis (MCDA). In practice, it is typically achieved by estimating priority weights that reflect the importance of each of the available alternatives - a process called prioritization. One of the most popular MCDA methodologies is the Analytic Hierarchy Process (AHP). All priority-weights-estimation techniques used under the AHP scheme are based on the so-called pairwise comparison matrix (PCM). The PCM elements represent the decision maker's judgments about the priority-weight ratios. They are typically expressed in values from a predefined set of numbers called a scale. Because of human brain limitations, it is natural that these judgments are usually erroneous, and consequently the estimates of the priority weights are erroneous as well. It is well understood, however, that serious errors in judgments make the information contained in the PCM worthless. Thus decision makers need a procedure that enables them to accept a given PCM or reject it as useless. This paper is devoted to a simulation analysis of the prioritization errors and their relationship with the correctness of the final ranking of decision alternatives. Our simulation experiments reveal some interesting facts about the impact of the adopted scale on the priority-weight-estimation errors and allow us to formulate important remarks about the PCM acceptance procedure. [ABSTRACT FROM AUTHOR]
- Published
- 2019
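The prioritization step described above estimates priority weights from a pairwise comparison matrix (PCM). A hedged sketch of one standard estimator, the geometric-mean (logarithmic least squares) method; the AHP literature also uses the principal eigenvector. The PCM below is an invented, perfectly consistent example, so the recovered weights match the true ratios.

```python
import math

def geometric_mean_weights(pcm):
    """Estimate priority weights as normalised row geometric means of the PCM."""
    n = len(pcm)
    gms = [math.prod(row) ** (1.0 / n) for row in pcm]
    total = sum(gms)
    return [g / total for g in gms]

# Consistent PCM for true weights (0.5, 0.25, 0.25): entry [i][j] = w_i / w_j
pcm = [
    [1.0, 2.0, 2.0],
    [0.5, 1.0, 1.0],
    [0.5, 1.0, 1.0],
]
print(geometric_mean_weights(pcm))  # approximately [0.5, 0.25, 0.25]
```

With an erroneous (inconsistent) PCM, the judgments no longer satisfy pcm[i][j] = w_i / w_j exactly, and the estimated weights, and hence the final ranking, can deviate, which is the effect the paper's simulations quantify.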
30. Generalized Meeting Businessmen Problem.
- Author
-
Challita, Khalil
- Subjects
- *
BUSINESSMEN , *COMPUTER software correctness , *EVIDENCE , *INTERNET , *IMAGE encryption , *ENCODING - Abstract
In this paper we address the problem where a distributed group of people wish to meet securely over the Internet. Obviously, basic encryption techniques fail to achieve this aim since they involve the encoding or decoding of messages between two parties only. For this purpose, we propose a cryptographic protocol that allows any number of people to meet remotely while keeping their discussions secure from any potential eavesdropper. Our solution uses a trusted third party and is based on a refined version of the original Otway-Rees protocol. Prior to the establishment of a secure communication channel, we start by authenticating all the users involved. We also provide a formal proof of the correctness of the suggested protocol. [ABSTRACT FROM AUTHOR]
- Published
- 2019
31. Automated synthesis of application-layer connectors from automata-based specifications.
- Author
-
Autili, Marco, Inverardi, Paola, Spalazzese, Romina, Tivoli, Massimo, and Mignosi, Filippo
- Subjects
- *
UBIQUITOUS computing , *MACHINE theory , *TECHNICAL specifications , *INTERNET of things , *COMPUTER software correctness , *ROBOTS - Abstract
Ubiquitous and Pervasive Computing, and the Internet of Things, promote dynamic interaction among heterogeneous systems. To achieve this vision, interoperability among heterogeneous systems is a key enabler, and mediators are often built to solve protocol mismatches. Many approaches propose the synthesis of mediators. Unfortunately, a rigorous characterization of the concept of interoperability is still lacking, hence making it hard to assess their applicability and soundness. In this paper, we provide a framework for the synthesis of mediators that allows us to: (i) characterize the conditions for mediator existence and correctness; and (ii) establish the applicability boundaries of the synthesis method. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
32. Performance monitoring beyond choice tasks: The time course of force execution monitoring investigated by event-related potentials and multivariate pattern analysis.
- Author
-
Siswandari, Yohana, Bode, Stefan, and Stahl, Jutta
- Subjects
- *
MULTIVARIATE analysis , *TASK forces , *COMPUTER software correctness , *TASKS - Abstract
Accurate force production is an essential motor function which, in most cases, requires continuous performance monitoring. Unlike choice-response tasks with two response alternatives, accuracy in a force production paradigm is defined as an area between an upper and a lower limit on the force continuum. In the present study, we investigated the neural mechanisms underlying force production. We used a force production task in which the participants (n = 48) were asked to exert a brief force pulse within a specific force range. This allowed: (1) investigation of action monitoring activity during force execution using response-locked and feedback-locked event-related potential (ERP) components known to be involved in error monitoring; (2) multivariate pattern analysis (MVPA) for ERPs. We found that the different force production ranges (characterised as too low, correct, and too high with respect to the target force range) showed no clear error-specific variations in the ERP components of interest. MVPA, on the other hand, allowed for successful classification, not only between the correct and the incorrect outcome conditions, but also between the two incorrect outcome conditions. This suggests that the classifier identified neural patterns reflecting the force magnitude rather than the correctness of a response. Moreover, additional support-vector regression (SVR) analyses showed that single-trial response parameters (i.e. peak force and time-to-peak) could be decoded from the brain activity pattern starting from 140 ms (for peak force) and 270 ms (for time-to-peak) before the response onset. These results indicate that the motor program defined the magnitude and timing of the force pulse before response execution, while the correctness of that response (in relation to the "default force" required) was not yet foreshadowed in neural signals.
Finally, this study presents the first evidence of a post-error force adjustment mechanism, for which participants produced a higher force in trials after under-producing the required force, and a lower force in trials after over-producing the required force. • Neural activity predicts performance parameters in force execution. • Distributed patterns of ERPs predictive for specific force rather than correctness. • Peak force (PF) and time-to-peak (TTP) were decodable already before response onset. • Systematic force adjustments were observed after committing errors. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
33. The Influence of Aggregate Diameter Description on the Correctness of the Sedimentation Model – CFD Investigations.
- Author
-
Moskal, A. and Fus, A. A.
- Subjects
- *
MASS transfer coefficients , *COMPUTATIONAL fluid dynamics , *SEDIMENTATION & deposition , *DIAMETER , *COMPUTER software correctness - Abstract
The results of numerical simulations of aggregates subjected to sedimentation show that the settling of particles is influenced by many parameters, including the assumed aggregate diameter. The often-used equivalent-sphere approach does not fully reflect the movement of the particles. CFD (Computational Fluid Dynamics) simulations were performed for falling aggregates, and the diameters most representative of each particle were identified. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
34. Cryptography Miracles, Secure Auctions, Matching Problem Verification.
- Author
-
MICALI, SILVIO and RABIN, MICHAEL O.
- Subjects
- *
PRICE fixing , *CRYPTOGRAPHY , *BIDDING strategies , *AUCTIONS , *ZERO-knowledge proofs , *SECRECY , *COMPUTER software correctness - Abstract
The article discusses a solution to the persistent problem of preventing collusion in Vickrey sealed-bid auctions. The authors extend the methods used by M. Rabin et al. for proving the correctness of announced results of computations, where zero knowledge proofs (ZKP) concealed input values and intermediate results of the computation, offering examples of various applications used by Rabin et al. Topics include the main application for the extended method for secrecy preserving proofs of correctness, how the use of cryptography enables the prevention of collusion in one-time Vickrey auctions, and the properties of the authors' auction mechanism, designed using the methods of Rabin et al. and so-called tools of deniable revelation of a secret value and uncontrollable deniable bidding.
- Published
- 2014
- Full Text
- View/download PDF
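The article above concerns collusion prevention and secrecy-preserving proofs of correctness for Vickrey auctions. The cryptographic machinery (zero-knowledge proofs, deniable bidding) is beyond a short sketch, but the underlying Vickrey rule whose announced outcome must be proven correct is simple: the highest bidder wins yet pays only the second-highest bid, which makes truthful bidding a dominant strategy. The bidder names and amounts below are invented.

```python
def vickrey_outcome(bids):
    """bids: dict bidder -> sealed bid (at least two bidders).
    Returns (winner, price paid) under the second-price rule."""
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner, _ = ranked[0]
    _, second_price = ranked[1]
    return winner, second_price

winner, price = vickrey_outcome({"ann": 120, "bob": 100, "cal": 90})
print(winner, price)  # ann 100
```

The collusion risk the article addresses arises precisely because the price is set by the losing bids: if losers coordinate to underbid, the winner's payment drops, so the announced result must be verifiable without revealing the sealed bids.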
35. A hybrid universal blind quantum computation.
- Author
-
Zhang, Xiaoqian, Luo, Weiqi, Zeng, Guoqiang, Weng, Jian, Yang, Yaxi, Chen, Minrong, and Tan, Xiaoqing
- Subjects
- *
QUANTUM computing , *BIG data , *COMPUTER software correctness , *QUANTUM information science , *QUANTUM computers - Abstract
• Our protocol has stronger privacy for big data, since all quantum inputs and algorithms are encrypted by one-time pad in both the measurement-based process and the circuit-based process. This is why our protocol is called 'blind'. • Our protocol is the first to merge the two processes, the measurement-based process and the circuit-based process. The cluster states only need to realize entangled gates deterministically, so we do not need a large-scale entangled state. • In our protocol, Alice has a smaller workload than in other protocols, since she only needs to measure the trap qubits appearing in the final column of the graph state in Fig. 4. Therefore, our protocol can theoretically tackle both problems: generating a large-scale entangled state in experiment, and entangled gates being only probabilistically successful. In blind quantum computation (BQC), a client delegates her quantum computation to a server with universal quantum computers, who learns nothing about the client's private information. In the measurement-based BQC model, entangled states are generally used to realize quantum computing. However, generating a large-scale entangled state in experiment remains a challenging issue. In the circuit-based BQC model, single-qubit gates can be realized precisely, but entangled gates are only probabilistically successful; realizing entangled gates deterministically remains a challenge in some systems. To solve these two problems, we propose the first hybrid universal BQC protocol based on measurements and circuits, where the client prepares single-qubit states and the server performs universal quantum computing. We analyze and prove the correctness, blindness and verifiability of the proposed protocol. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
36. Formal Verification of Automated Teller Machine Systems using SPIN.
- Author
-
Iqbal, Ikhwan Mohammad, Adzkiya, Dieky, and Mukhlash, Imam
- Subjects
- *
AUTOMATED teller machines , *SPIN (Computer program language) , *COMPUTER software correctness , *ML (Computer program language) , *MATHEMATICAL formulas - Abstract
Formal verification is a technique for ensuring the correctness of systems. This work focuses on verifying a model of an Automated Teller Machine (ATM) system against some specifications. We construct the model as a state transition diagram that is suitable for verification. The specifications are expressed as Linear Temporal Logic (LTL) formulas. We use the Simple Promela Interpreter (SPIN) model checker to check whether the model satisfies the formulas. This model checker accepts models written in the Process Meta Language (PROMELA), with specifications given as LTL formulas. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
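SPIN verifies a PROMELA model by exhaustively exploring its state space. A hedged Python miniature of the same idea: an invented ATM transition system and a breadth-first search verifying a safety invariant ("cash is dispensed only after a correct PIN") over all reachable states. Real SPIN models, and LTL properties beyond simple invariants, are far richer; the states and transitions here are illustrative assumptions, not the paper's model.

```python
from collections import deque

# states: (screen, pin_ok); the transition relation is invented for illustration
transitions = {
    ("idle", False):    [("card_in", False)],
    ("card_in", False): [("pin_ok", True), ("pin_bad", False)],
    ("pin_bad", False): [("idle", False)],
    ("pin_ok", True):   [("dispense", True)],
    ("dispense", True): [("idle", False)],
}

def safety(state):
    screen, pin_ok = state
    return screen != "dispense" or pin_ok   # never dispense without a valid PIN

def check(initial):
    """Explore all reachable states; return (holds, counterexample_state)."""
    seen, queue = {initial}, deque([initial])
    while queue:
        s = queue.popleft()
        if not safety(s):
            return False, s
        for t in transitions.get(s, []):
            if t not in seen:
                seen.add(t)
                queue.append(t)
    return True, None

print(check(("idle", False)))  # (True, None): the invariant holds
```

Adding an unsafe transition such as ("card_in", False) to ("dispense", False) would make the search return that state as a counterexample, which mirrors SPIN's error-trail output.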
37. MCS2: minimal coordinated supports for fast enumeration of minimal cut sets in metabolic networks.
- Author
-
Miraskarshahi, Reza, Zabeti, Hooman, Stephen, Tamon, and Chindelevitch, Leonid
- Subjects
- *
LISTS , *INTERNET servers , *METABOLIC models , *FLUX (Energy) , *COMPUTER software correctness - Abstract
Motivation: Constraint-based modeling of metabolic networks helps researchers gain insight into the metabolic processes of many organisms, both prokaryotic and eukaryotic. Minimal cut sets (MCSs) are minimal sets of reactions whose inhibition blocks a target reaction in a metabolic network. Most approaches for finding the MCSs in constraint-based models require, either as an intermediate step or as a byproduct of the calculation, the computation of the set of elementary flux modes (EFMs), a convex basis for the valid flux vectors in the network. Recently, Ballerstein et al. proposed a method for computing the MCSs of a network without first computing its EFMs, by creating a dual network whose EFMs are a superset of the MCSs of the original network. However, their dual network is always larger than the original network and depends on the target reaction. Here we propose the construction of a different dual network, which is typically smaller than the original network and is independent of the target reaction, for the same purpose. We prove the correctness of our approach, minimal coordinated support (MCS2), and describe how it can be modified to compute the few smallest MCSs for a given target reaction. Results: We compare MCS2 to the method of Ballerstein et al. and two other existing methods. We show that MCS2 succeeds in calculating the full set of MCSs in many models where other approaches cannot finish within a reasonable amount of time. Thus, in addition to its theoretical novelty, our approach provides a practical advantage over existing methods. Availability and implementation: MCS2 is freely available at https://github.com/RezaMash/MCS under the GNU 3.0 license. Supplementary information: Supplementary data are available at Bioinformatics online. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
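A hedged brute-force illustration of the definition that MCS2 computes efficiently: a minimal cut set is a minimal set of reactions whose removal blocks the target reaction. Here "blocks" is simplified to graph reachability on an invented toy pathway, ignoring stoichiometry and flux balance, which the real constraint-based setting requires.

```python
from itertools import combinations

# reaction -> (substrate, product); the target consumes metabolite "E"
reactions = {"r1": ("A", "B"), "r2": ("A", "C"), "r3": ("B", "E"),
             "r4": ("C", "E"), "target": ("E", "P")}

def target_active(removed):
    """Can the target's substrate still be produced from 'A' without `removed`?"""
    kept = {r: sp for r, sp in reactions.items() if r not in removed}
    reachable, changed = {"A"}, True
    while changed:
        changed = False
        for s, p in kept.values():
            if s in reachable and p not in reachable:
                reachable.add(p)
                changed = True
    return reactions["target"][0] in reachable

def minimal_cut_sets():
    candidates = [r for r in reactions if r != "target"]
    mcss = []
    for k in range(1, len(candidates) + 1):
        for cut in combinations(candidates, k):
            # keep only cuts that block the target and contain no smaller MCS
            if not target_active(set(cut)) and \
               not any(set(m) <= set(cut) for m in mcss):
                mcss.append(cut)
    return mcss

print(minimal_cut_sets())  # [('r1', 'r2'), ('r1', 'r4'), ('r2', 'r3'), ('r3', 'r4')]
```

Each MCS picks one reaction from each of the two production routes to "E", so neither route can supply the target. The exponential enumeration over subsets is exactly what the dual-network constructions in the abstract are designed to replace.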
38. Decentralized tube‐based model predictive control of uncertain nonlinear multiagent systems.
- Author
-
Nikou, Alexandros and Dimarogonas, Dimos V.
- Subjects
- *
MULTIAGENT systems , *PREDICTIVE control systems , *NONLINEAR systems , *PREDICTION models , *SYSTEM dynamics , *COMPUTER software correctness - Abstract
Summary: This paper addresses the problem of decentralized tube‐based nonlinear model predictive control (NMPC) for a general class of uncertain nonlinear continuous‐time multiagent systems with additive and bounded disturbance. In particular, the problem of robust navigation of a multiagent system to predefined states of the workspace while using only local information is addressed under certain distance and control input constraints. We propose a decentralized feedback control protocol that consists of two terms: a nominal control input, which is computed online and is the outcome of a decentralized finite horizon optimal control problem that each agent solves at every sampling time, for its nominal system dynamics; and an additive state‐feedback law which is computed offline and guarantees that the real trajectories of each agent will belong to a hypertube centered along the nominal trajectory, for all times. The volume of the hypertube depends on the upper bound of the disturbances as well as the bounds of the derivatives of the dynamics. In addition, by introducing certain distance constraints, the proposed scheme guarantees that the initially connected agents remain connected for all times. Under standard assumptions that arise in nominal NMPC schemes, controllability assumptions, communication capabilities between the agents, it is guaranteed that the multiagent system is input‐to‐state stable with respect to the disturbances, for all initial conditions satisfying the state constraints. Simulation results verify the correctness of the proposed framework. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
39. Cloud Manufacturing Service Composition Optimization with Improved Genetic Algorithm.
- Author
-
Li, Yongxiang, Yao, Xifan, and Liu, Min
- Subjects
- *
GENETIC algorithms , *MATHEMATICAL optimization , *MATHEMATICAL models , *MODEL theory , *DEFINITIONS , *COMPUTER software correctness - Abstract
Multiobjective cloud manufacturing service composition optimization, which must match candidate services to manufacturing tasks, seldom considers the synergy degree of composite cloud services or the complexity of the service composition. To address this, a novel service composition optimization approach, called improved genetic algorithm based on entropy (IGABE), is put forward. First, the mathematical expressions of the service collocation degree, the composition synergy degree, the composition entropy, and their related influence factors are analyzed, and their definitions and calculation methods are given. Then, a multiobjective cloud manufacturing service composition optimization mathematical model is established. Moreover, the crossover and mutation operators are improved by introducing normal cloud model theory and a piecewise function, and an improved roulette selection method is used to perform the selection operation. The fitness function of the proposed IGABE is designed by combining Euclidean deviation with angular deviation. Finally, the manufacturing task of a wheeled cleaning robot is used as a case study to verify the correctness of the proposed multiobjective optimization model for cloud manufacturing service composition and the effectiveness of the proposed algorithm, compared with the Standard Genetic Algorithm (SGA), a Hybrid Genetic Algorithm (HGA), and a Cloud-entropy Enhanced Genetic Algorithm (CEGA). The results show that IGABE converges faster than SGA and HGA and, compared with CEGA, more comprehensively reflects the differences among the objective functions of a service composition scheme and its degree of approximation to the corresponding dimensions of the ideal-point vector. As such, the optimal service composition obtained by the IGABE algorithm can better meet the complex needs of users. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
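Taking the abstract's fitness design at face value, the sketch below combines Euclidean and angular deviation from an ideal point vector into a fitness value and feeds it to classic roulette-wheel selection. The paper's "improved" roulette variant and its exact weighting are not specified in the abstract, so the equal weighting here is a placeholder:

```python
import math
import random

def fitness(obj, ideal):
    """Combine Euclidean deviation and angular deviation from the ideal
    point vector into one scalar (smaller deviation -> higher fitness).
    The equal weighting is a placeholder, not the paper's formula."""
    euclid = math.dist(obj, ideal)
    dot = sum(a * b for a, b in zip(obj, ideal))
    norms = math.hypot(*obj) * math.hypot(*ideal)
    angular = math.acos(max(-1.0, min(1.0, dot / norms)))  # angle in radians
    return 1.0 / (1.0 + euclid + angular)

def roulette_select(population, fitnesses, rng=random):
    """Classic roulette-wheel selection: pick an individual with
    probability proportional to its fitness."""
    total = sum(fitnesses)
    r = rng.uniform(0.0, total)
    acc = 0.0
    for ind, f in zip(population, fitnesses):
        acc += f
        if acc >= r:
            return ind
    return population[-1]
```

A composition scheme whose objective vector coincides with the ideal point gets the maximum fitness of 1.0; both larger distance and larger angle to the ideal vector reduce it.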
40. Red–black trees with constant update time.
- Author
-
Elmasry, Amr, Kahla, Mostafa, Ahdy, Fady, and Hashem, Mahmoud
- Subjects
- *
TREES , *COMPUTER software correctness , *EVIDENCE , *TIME , *COLORS - Abstract
We show how a few modifications to red–black trees allow for constant worst-case update time (once the position of the element to be inserted or deleted is known). The resulting structure is based on relaxing some of the properties of red–black trees while guaranteeing that the height remains logarithmic with respect to the number of nodes. Compared to other search trees with constant worst-case update time, ours is the first to provide a tailored deletion procedure without using the global rebuilding technique. In addition, our procedures are simpler and allow for an easier proof of correctness than those of the alternative trees. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
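For context, the classic invariants that such relaxed variants weaken can be checked mechanically. Below is a small validator for the standard red–black properties (no red node with a red child, equal black-height on every root-to-nil path); it checks the textbook invariants, not the paper's relaxed structure, whose exact rules the abstract does not give:

```python
class Node:
    def __init__(self, key, color, left=None, right=None):
        self.key, self.color = key, color        # color: 'R' or 'B'
        self.left, self.right = left, right

def check_rb(node):
    """Return the black-height of the subtree rooted at `node`,
    raising ValueError if a classic red-black property is violated.
    BST key ordering is assumed but not checked here."""
    if node is None:
        return 1                                 # nil leaves count as black
    if node.color == 'R':
        for child in (node.left, node.right):
            if child is not None and child.color == 'R':
                raise ValueError("red node with red child")
    lh = check_rb(node.left)
    rh = check_rb(node.right)
    if lh != rh:
        raise ValueError("unequal black-heights")
    return lh + (1 if node.color == 'B' else 0)
```

Equal black-heights are what bound the tree height at roughly twice the black-height, i.e. logarithmic in the number of nodes.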
41. Partial Correctness of a Factorial Algorithm.
- Author
-
Jaszczak, Adrian and Korniłowicz, Artur
- Subjects
- *
FACTORIALS , *COMPUTER software correctness , *NATURAL numbers , *ALGORITHMS , *LOGIC - Abstract
In this paper we present a formalization in the Mizar system [3],[1] of the partial correctness of the algorithm:

i := val.1
j := val.2
n := val.3
s := val.4
while (i <> n)
  i := i + j
  s := s * i
return s

computing the factorial of a given natural number n, where variables i, n, s are located as values of a V-valued Function, loc, as: loc/.1 = i, loc/.3 = n and loc/.4 = s, and the constant 1 is located in the location loc/.2 = j (set V represents simple names of the considered nominative data [16]). This work continues a formal verification of algorithms written in terms of simple-named complex-valued nominative data [6],[8],[14],[10],[11],[12]. The validity of the algorithm is presented in terms of semantic Floyd-Hoare triples over such data [9]. Proofs of the correctness are based on an inference system for an extended Floyd-Hoare logic [2],[4] with partial pre- and post-conditions [13],[15],[7],[5]. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
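The abstract's algorithm transcribes directly into runnable code. Under the precondition j = 1, s = i! and i ≤ n, the loop maintains the invariant s = i!, which is the heart of the partial-correctness argument:

```python
def factorial_prog(i, j, n, s):
    """The abstract's algorithm, verbatim. Partial correctness: if
    initially j == 1, s == i! and i <= n, then on exit s == n!;
    the loop invariant is s == i!."""
    while i != n:
        i = i + j      # step the counter (j == 1 in the intended use)
        s = s * i      # maintain the invariant s == i!
    return s
```

Starting from i = 0, s = 1 (so s = 0! holds initially), the postcondition s = n! follows when the loop exits with i = n.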
42. Partial Correctness of a Power Algorithm.
- Author
-
Jaszczak, Adrian
- Subjects
- *
COMPUTER software correctness , *COMPLEX numbers , *ALGORITHMS , *LOGIC , *MATHEMATICAL complexes - Abstract
This work continues a formal verification of algorithms written in terms of simple-named complex-valued nominative data [6],[8],[15],[11],[12],[13]. In this paper we present a formalization in the Mizar system [3],[1] of the partial correctness of the algorithm:

i := val.1
j := val.2
b := val.3
n := val.4
s := val.5
while (i <> n)
  i := i + j
  s := s * b
return s

computing the n-th power, for a natural number n, of a given complex number b, where variables i, b, n, s are located as values of a V-valued Function, loc, as: loc/.1 = i, loc/.3 = b, loc/.4 = n and loc/.5 = s, and the constant 1 is located in the location loc/.2 = j (set V represents simple names of the considered nominative data [17]). The validity of the algorithm is presented in terms of semantic Floyd-Hoare triples over such data [9]. Proofs of the correctness are based on an inference system for an extended Floyd-Hoare logic [2],[4] with partial pre- and post-conditions [14],[16],[7],[5]. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
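As with the factorial entry, the algorithm runs as written. Under the precondition i = 0, j = 1, s = 1, the loop invariant s = b^i delivers the postcondition s = b^n:

```python
def power_prog(i, j, b, n, s):
    """The abstract's algorithm, verbatim. With i == 0, j == 1 and
    s == 1, the loop invariant s == b**i yields s == b**n on exit;
    b may be any complex number, n a natural number."""
    while i != n:
        i = i + j      # step the counter (j == 1 in the intended use)
        s = s * b      # maintain the invariant s == b**i
    return s
```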
43. Design and Virtual Simulation of a Microcontroller Clock System Based on MDK Software.
- Author
-
漆强 and 刘爽
- Subjects
- *
VIRTUAL design , *COMPUTER software correctness , *CLOCKS & watches , *COMPUTER software , *MICROCONTROLLERS - Abstract
This paper analyzes the basic structure of a microcontroller's clock system and the difficulties in its design. A method is proposed to complete the clock system's initialization using MDK software's initialization code and graphical configuration interface. With MDK's virtual simulation function, the user can inspect the configuration values of the internal registers and the clock-generation schematic to verify the correctness of the clock system design. This design method and the virtual simulation function reduce development difficulty and improve development efficiency. [ABSTRACT FROM AUTHOR]
- Published
- 2019
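The register-level verification the abstract describes amounts to recomputing clock frequencies from the configured divider and multiplier fields. A minimal sketch with a typical PLL formula; the field names, formula, and example values below are illustrative (STM32-style), not taken from the paper:

```python
def sysclk_from_pll(f_in_hz, m, n, p):
    """Compute a PLL-derived system clock as f_in / M * N / P.
    This mirrors a common MCU clock tree (e.g., STM32-style); the
    field names and formula are illustrative, not tied to any
    specific device discussed in the paper."""
    return f_in_hz / m * n / p

# e.g., an 8 MHz crystal with M=8, N=336, P=2 gives a 168 MHz SYSCLK
clk = sysclk_from_pll(8_000_000, 8, 336, 2)
```

Comparing such a recomputed frequency against the value shown by the simulator's register view is exactly the kind of correctness check the design flow relies on.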
44. Revisiting inaccuracies of time series averaging under dynamic time warping.
- Author
-
Jain, Brijnesh
- Subjects
- *
COMPUTER software correctness , *TRIANGLES - Abstract
• Dispels misconceptions about the correctness and drift-out phenomenon of time series averaging.
• Shows that the correctness-criterion is unsatisfiable and drift-out is inconclusive.
• Shows that Fréchet means never drift out.
• Suggests coherence of approximations as a special version of drift-outs.
• Shows that state-of-the-art averaging methods are incoherent.
This article revisits an analysis of (in)accuracies of time series averaging under dynamic time warping (dtw) conducted by Niennattrakul and Ratanamahatana [16]. They proposed a correctness-criterion for dtw-averages and postulated that dtw-averages can drift out of the cluster of time series to be averaged. They claimed that dtw-averages are inaccurate if they violate the correctness-criterion or suffer from the drift-out phenomenon, and conjectured that such inaccuracies are caused by the lack of a triangle inequality. In this article, we show that a rectified version of the correctness-criterion is unsatisfiable and that the concept of drift-out is geometrically and operationally inconclusive. Satisfying the triangle inequality is insufficient to achieve correctness and unnecessary to overcome the drift-out phenomenon. We place the concept of drift-out on a principled basis and show that Fréchet means never drift out. The adjusted drift-out is a way to test to what extent an approximated dtw-average is coherent. Empirical results show that approximations obtained by state-of-the-art averaging methods are incoherent in over a third of all cases. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
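For reference, the dtw distance the article builds on is the textbook dynamic program below. Note that dtw is not a metric — it can violate the triangle inequality — which is the property the original correctness conjecture leaned on:

```python
import math

def dtw(x, y):
    """Textbook dynamic-programming DTW distance with |x_i - y_j| as
    the local cost; O(len(x) * len(y)) time and space."""
    n, m = len(x), len(y)
    D = [[math.inf] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(x[i - 1] - y[j - 1])
            D[i][j] = cost + min(D[i - 1][j],      # insertion
                                 D[i][j - 1],      # deletion
                                 D[i - 1][j - 1])  # match
    return D[n][m]
```

Because a point may warp onto several points of the other series, `dtw([1, 2, 3], [1, 2, 2, 3])` is 0 even though the series differ — one reason metric intuitions (and criteria built on them) break down for dtw-averages.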
45. Live interactive queries to a software application's memory profile.
- Author
-
Fragkoulis, Marios, Spinellis, Diomidis, and Louridas, Panos
- Subjects
- *
APPLICATION software , *MEMORY , *INTERACTIVE computer systems , *SQL , *DATA structures , *COMPUTER software correctness , *PYTHON programming language - Abstract
Memory operations are critical to an application's reliability and performance. To reason about their correctness and track opportunities for optimisation, sophisticated instrumentation frameworks, such as Valgrind and Pin, have been developed; both provide only limited facilities for analysing the collected data. This work presents a Valgrind extension for examining a software application's dynamic memory profile through live interactive analysis with SQL. The Pico COllections Query Library (PICO QL) module maps Valgrind's data structures, which contain the instrumented application's memory-operation metadata, to a relational interface. Queries are type-safe and the module imposes only a trivial overhead when idle. The authors evaluate the proposed approach on ten applications and through a qualitative study. They find 900 kb of undefined bytes in bzip2 that account for 12% of its total memory use, and a performance-critical code execution path in the Unix commands sort and uniq; the referenced functions are part of glibc and have been independently modified to boost the library's performance. The qualitative study has users rate the usefulness, usability, effort, correctness, and expressiveness of PICO QL queries compared to Python scripts. The findings indicate that querying with PICO QL incurs lower user effort. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
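The idea of exposing memory-operation metadata through SQL can be sketched with an in-memory SQLite table. The schema and figures below are illustrative (sized so the undefined share comes to 12%, echoing the abstract's bzip2 finding); this is not PICO QL's actual relational mapping of Valgrind's data structures:

```python
import sqlite3

# Hypothetical allocation metadata, roughly as an instrumentation
# framework might expose it; schema and numbers are illustrative.
rows = [
    ("bzip2", 0x1000, 6_758_400, 0),   # defined bytes
    ("bzip2", 0x3000,   921_600, 1),   # undefined bytes (~900 kb)
    ("sort",  0x7000,       512, 0),
]

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE alloc "
           "(prog TEXT, addr INTEGER, size INTEGER, undefined INTEGER)")
db.executemany("INSERT INTO alloc VALUES (?, ?, ?, ?)", rows)

# Live-style query: what share of bzip2's memory is undefined?
pct_undefined = db.execute(
    "SELECT 100.0 * SUM(size * undefined) / SUM(size) "
    "FROM alloc WHERE prog = 'bzip2'"
).fetchone()[0]
```

The appeal of the relational interface is that ad-hoc questions like this become one declarative query instead of a bespoke analysis script.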
46. Quantum Protocol for Millionaire Problem.
- Author
-
Liu, Wen, Wang, Yong-Bin, Sui, Ai-Na, and Ma, Min-Yao
- Subjects
- *
MILLIONAIRES , *QUANTUM measurement , *QUANTUM states , *QUANTUM noise , *COMPUTER software correctness , *QUANTUM computing - Abstract
A quantum protocol for the millionaire problem based on commutative encryption is proposed. In our protocol, Alice and Bob need neither entangled states nor joint measurements of quantum states. They encrypt their private information and privately obtain the comparison result with the help of a third party (TP). Correctness analysis shows that the proposed protocol yields the correct result. The proposed protocol also resists various attacks and overcomes the problem of information leakage with acceptable efficiency. In principle, our protocol can be used to build complex secure protocols for other multiparty computation problems and also has many other important applications in distributed networks. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
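The classical commutative-encryption idea the protocol builds on can be illustrated with Pohlig-Hellman-style modular exponentiation, where E_a(E_b(m)) = E_b(E_a(m)). This sketches only that classical kernel — the actual protocol operates on quantum states with a third party:

```python
# Commutative encryption via exponentiation: E_k(m) = m^k mod p.
# Since (m^a)^b == (m^b)^a (mod p), doubly encrypted values can be
# compared no matter who encrypted first.
p = 1019                       # toy prime; real use needs a large one

def enc(m, k):
    """Encrypt m under exponent key k (gcd(k, p-1) must be 1)."""
    return pow(m, k, p)

alice_key, bob_key = 3, 5      # both coprime to p - 1 = 2 * 509
alice_val, bob_val = 7, 7      # toy private inputs

# Equality of the inputs survives the commuting encryption layers:
same = enc(enc(alice_val, alice_key), bob_key) == \
       enc(enc(bob_val, bob_key), alice_key)
```

Because exponentiation with a key coprime to p − 1 is a bijection mod p, equal double ciphertexts imply equal plaintexts, and unequal plaintexts stay distinguishable.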
47. Analysis of Exponential Stability for Neutral Stochastic Cohen-Grossberg Neural Networks with Mixed Delays.
- Author
-
Yang, Tianqing, Xiong, Zuoliang, and Yang, Cuiping
- Subjects
- *
EXPONENTIAL stability , *STOCHASTIC analysis , *ARTIFICIAL neural networks , *COMPUTER simulation , *COMPUTER software correctness - Abstract
This paper is concerned with the mean-square exponential input-to-state stability problem for a class of stochastic Cohen-Grossberg neural networks. Unlike prior works, our system includes neutral terms and mixed delays. By employing the Lyapunov-Krasovskii functional method, the Itô formula, the Dynkin formula, and stochastic analysis theory, we obtain some novel sufficient conditions ensuring that the addressed system is mean-square exponentially input-to-state stable. Moreover, two numerical examples and their simulations are given to illustrate the correctness of the theoretical results. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
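Mean-square exponential stability can be probed numerically with Euler-Maruyama simulation. The toy scalar SDE below, dx = −a·x dt + σ·x dW, stands in for the paper's far richer neutral Cohen-Grossberg network with mixed delays; for this scalar case, mean-square stability holds when 2a > σ²:

```python
import math
import random

def mean_square(a=1.0, sigma=0.5, x0=1.0, T=5.0,
                steps=250, paths=1000, seed=1):
    """Euler-Maruyama estimate of E[x(T)^2] for the toy scalar SDE
    dx = -a*x dt + sigma*x dW. The true second moment decays like
    exp((-2a + sigma^2) t), so it vanishes when 2a > sigma^2. A toy
    stand-in for the paper's neutral network with mixed delays."""
    rng = random.Random(seed)
    dt = T / steps
    sqdt = math.sqrt(dt)
    acc = 0.0
    for _ in range(paths):
        x = x0
        for _ in range(steps):
            x += -a * x * dt + sigma * x * rng.gauss(0.0, sqdt)
        acc += x * x
    return acc / paths
```

With the defaults (2a = 2 > σ² = 0.25), the estimated second moment at T = 5 is tiny, illustrating the kind of simulation the paper uses to corroborate its sufficient conditions.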
48. Quality assessment of Major Trauma Registry of Navarra: completeness and correctness.
- Author
-
Ali Ali, Bismil, Lefering, Rolf, and Belzunegui Otano, Tomas
- Subjects
- *
TRAUMA registries , *COMPUTER software correctness , *SCIENTIFIC community , *DATA quality - Abstract
This study assessed the completeness of the Major Trauma Registry of Navarra (MTR-N) data and their concordance with the patients' medical files. It retrospectively reviewed all MTR-N cases documented in June and July of 2014 and 2015. For each case, the values of 42 parameters were taken from the MTR-N. To assess concordance between the MTR-N and the medical files, the same variables' values were re-recorded. Data completeness was calculated for all cases and data correctness for those documented in the MTR-N, separately for each variable. The overall average completeness rate across all variables was 92.8%. The percentage of completely missing data ranged from 0% (29 variables) to 76.8% (base excess). The overall average rate of correctness was 98.0%. Exact concordance ranged from 93.0% (7 variables) to 100% (22 variables). This study demonstrates the reliability and validity of the MTR-N data and its effectiveness for quality improvement and research in our community. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
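The study's two quality metrics reduce to simple rates: completeness (share of cases where a variable is documented) and correctness (share of documented values exactly matching the value re-recorded from the medical file). A sketch; the field name `base_excess` is only an illustrative key (the abstract names base excess as the most-missing variable):

```python
def completeness(records, field):
    """Percentage of records in which `field` is documented (non-None)."""
    vals = [r.get(field) for r in records]
    return 100.0 * sum(v is not None for v in vals) / len(vals)

def correctness(registry, medical_files, field):
    """Percentage of documented registry values that exactly match the
    value re-recorded from the corresponding medical file."""
    pairs = [(r.get(field), m.get(field))
             for r, m in zip(registry, medical_files)
             if r.get(field) is not None]
    return 100.0 * sum(a == b for a, b in pairs) / len(pairs)
```

Computed per variable and averaged, these yield the study's headline figures (92.8% completeness, 98.0% correctness).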
49. Stepsize domain confirmation and optimum of ZeaD formula for future optimization.
- Author
-
Zhang, Yunong, Qi, Zhiyuan, Li, Jian, Qiu, Binbin, and Yang, Min
- Subjects
- *
COMPUTER software correctness , *PARTICLE swarm optimization - Abstract
Future optimization, also known as the discrete-time time-variant optimization problem, is an important issue in scientific fields. Recently, Guo et al. proposed an effective three-step discrete-time zeroing dynamics (DTZD) model (Guo et al. Numer. Algorithms 77(1), 23–36, 2018) for solving future optimization problems, discretized from a continuous-time zeroing dynamics (CTZD) model via a Zhang et al. discretization (ZeaD) formula whose coefficients are proportional to 6, 3, 2, and 1 (termed ZeaD formula 6321). In this paper, we focus on the stability of this DTZD model, which hinges on an important parameter called the stepsize. Through theoretical study, we obtain the exact stepsize domain in which the DTZD model is stable; the result, stepsize h ∈ (0, 0.8), confirms Guo et al.'s previous investigation. Furthermore, the optimum stepsize, which makes the DTZD model converge fastest to steady state in terms of residual error and keeps it farthest from instability, is derived and investigated theoretically as well. Finally, numerical experiments are carried out to confirm the correctness of the stepsize domain and its optimum in the DTZD model for future optimization. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
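Stability of a linear multistep discretization such as a ZeaD formula is governed by the root condition on its characteristic polynomial: all roots in the closed unit disk, with any root on the unit circle simple. A generic checker; the coefficients in the examples are placeholders, not the actual ZeaD 6321 coefficients from the paper:

```python
import numpy as np

def satisfies_root_condition(alpha):
    """Check the root condition for a characteristic polynomial
    alpha[0]*z^k + ... + alpha[k]: every root must lie in the closed
    unit disk, and roots on the unit circle must be simple. The
    coefficients used below are placeholders, not ZeaD 6321."""
    roots = np.roots(alpha)
    for r in roots:
        m = np.abs(r)
        if m > 1 + 1e-9:
            return False                       # root outside the unit disk
        if m > 1 - 1e-9:                       # on the circle: must be simple
            if np.sum(np.abs(roots - r) < 1e-7) > 1:
                return False
    return True
```

The stepsize enters because h scales the polynomial's coefficients through the problem dynamics; scanning h and re-checking the roots is one numerical route to a stability interval like (0, 0.8).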
50. SURVEY QUESTIONNAIRE FOR THE STUDY ON OCCUPATIONAL SAFETY CULTURE IN A PRODUCTION PLANT. CONSTRUCTION AND STATISTICAL VERIFICATION OF DATA CORRECTNESS.
- Author
-
Krupa, Patryk, Gabryelewicz, Izabela, Edl, Milan, Pantya, Peter, and Patalas-Maliszewska, Justyna
- Subjects
- *
INDUSTRIAL safety , *STATISTICS , *COMPUTER software correctness , *SURVEYS , *SAFETY factor in engineering - Abstract
The article addresses the proper preparation of a data sheet for analysis, the verification of the correctness and reliability of input information, and proper data encoding. Improper input or coding of data can significantly affect the correctness of the analyses or extend their duration. This stage of analysis is illustrated with the authors' own questionnaire for the study of occupational safety culture in a manufacturing plant, using the Statistica software for the analyses. Real data were used, obtained during research on occupational safety and the factors with the greatest influence on its state. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
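One standard statistical check on questionnaire reliability of the kind such studies perform is Cronbach's alpha, α = k/(k−1) · (1 − Σσᵢ²/σ²_total). The article itself uses the Statistica package, so the following is only an illustrative hand computation:

```python
def cronbach_alpha(items):
    """Cronbach's alpha for internal-consistency reliability:
    k/(k-1) * (1 - sum of item variances / variance of total scores).
    `items` is one list per questionnaire item, aligned across
    respondents; sample variance (n-1 denominator) is used."""
    k = len(items)
    n = len(items[0])

    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    totals = [sum(col[i] for col in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(var(col) for col in items) / var(totals))
```

Perfectly consistent items drive alpha toward 1, while uncorrelated items drive it toward 0 — a quick flag for miscoded or unreliable questionnaire data before deeper analysis.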