28,314 results for "TEST methods"
Search Results
2. A new integrated strategy for optimising the maintenance cost of complex systems using reliability importance measures.
- Author
-
Rebaiaia, Mohamed-Larbi and Ait-Kadi, Daoud
- Subjects
RELIABILITY in engineering ,MAINTENANCE costs ,CONDITION-based maintenance ,FAILURE mode & effects analysis ,TEST methods - Abstract
With the aging of production systems, failure modes become more common, resulting in additional maintenance costs. To reduce these excess costs, flexible and smart maintenance strategies should be considered. This article proposes an integrated condition-based maintenance method combining opportunistic maintenance and importance measure concepts (IMC). The objective of an IMC-based model is to determine the contribution of the system's components, according to their criticality degree, to reliability improvement and maintenance planning. Identifying the best maintenance plan consists of determining the expected minimal cost that guarantees the repair of a group of critical components in one shot. However, determining IMC values is not easy for complex systems: it requires knowing their operational structure, determining the reliability value of each configuration, and finally calculating each component's IMC degree and ranking the components for prioritisation. To test the proposed method, an industrial case study was used, concerning a complex system whose components fail at random times. The system undergoes minimal repairs if one or more components fail accidentally or are repaired by decision after inspection. The numerical results show that the developed approach incurs minimal maintenance costs and can be integrated as a decision-aid solution for manufacturers. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
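The abstract above does not give the authors' formulas. As a generic illustration of reliability importance measures, the following sketch computes the classical Birnbaum importance of each component in a toy series-parallel system; the structure, reliability values, and function names are all hypothetical, not taken from the article:

```python
# Toy system: components 0 and 1 in parallel, in series with component 2.
# R_sys = (1 - (1 - p0)(1 - p1)) * p2

def system_reliability(p):
    p0, p1, p2 = p
    return (1 - (1 - p0) * (1 - p1)) * p2

def birnbaum_importance(p, i):
    """Birnbaum importance: R(component i works) - R(component i fails)."""
    up = list(p); up[i] = 1.0
    down = list(p); down[i] = 0.0
    return system_reliability(up) - system_reliability(down)

p = [0.9, 0.8, 0.95]  # assumed component reliabilities
# Rank components by criticality (most critical first)
ranking = sorted(range(3), key=lambda i: -birnbaum_importance(p, i))
```

The series component dominates the ranking here, which matches the intuition that a single-point-of-failure component is the most critical.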
3. Variational umbrella seeding for calculating nucleation barriers.
- Author
-
Gispen, Willem, Espinosa, Jorge R., Sanz, Eduardo, Vega, Carlos, and Dijkstra, Marjolein
- Subjects
- *
NUCLEATION , *HOMOGENEOUS nucleation , *UMBRELLAS , *SEEDS , *TEST methods - Abstract
In this work, we introduce variational umbrella seeding, a novel technique for computing nucleation barriers. This new method, a refinement of the original seeding approach, is far less sensitive to the choice of order parameter for measuring the size of a nucleus. Consequently, it surpasses seeding in accuracy and umbrella sampling in computational speed. We test the method extensively and demonstrate excellent accuracy for crystal nucleation of nearly hard spheres and two distinct models of water: mW and TIP4P/ICE. This method can easily be extended to calculate nucleation barriers for homogeneous melting, condensation, and cavitation. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
4. A practical comparative investigation of Bruceton and 3POD2.0 statistical testing methods on friction and impact sensitivity testing.
- Author
-
Turpeinen, Teijo, Ahponen, Jenni, and Vesterinen, Aleksi
- Subjects
- *
TEST methods , *COMPUTER software testing , *DESELECTION of library materials , *TESTING equipment , *FRICTION - Abstract
There are multiple different equipment and testing methods to perform sensitivity tests and to determine probabilities and stimulus values for the sensitivity of energetic materials. In this study, the Bruceton and Three-Phase Optimal Design (3POD2.0, implemented in the gonogo software) statistical testing methods were compared in practice by using them to determine the impact and friction sensitivity of selected energetic materials. With the Bruceton method, 50% of the impact sensitivity and 27% of the friction sensitivity test series had to be discarded due to a too high or too low s/d ratio. With 3POD2.0, 0% of impact and 18% of friction test series had to be discarded, due to an error produced by the gonogo software. The number of individual tests that had to be discarded was much lower with 3POD2.0, since the gonogo software terminates the test process at an early phase, while with Bruceton the test series must usually be finished before the validity of the results can be assessed. Based on these results and literature reviews, it can be concluded that 3POD2.0 is a suitable replacement for Bruceton for determining the 50% initiation probability, while additionally yielding the entire sensitivity curve, resulting in fewer futile tests and less wasted resources. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
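For readers unfamiliar with the Bruceton procedure, a minimal simulation of its up-and-down logic follows. The ladder of stimulus levels, the response curve, and the trial count are invented for illustration; this is neither the gonogo software nor the 3POD2.0 algorithm:

```python
import random

def bruceton_series(levels, start_idx, n_trials, p_init, rng):
    """Simulate an up-and-down (Bruceton) series on a fixed ladder of
    stimulus levels; p_init(level) is the assumed initiation probability."""
    idx, results = start_idx, []
    for _ in range(n_trials):
        level = levels[idx]
        go = rng.random() < p_init(level)
        results.append((level, go))
        # step down after an initiation ("go"), up after a non-initiation
        idx = max(idx - 1, 0) if go else min(idx + 1, len(levels) - 1)
    return results

rng = random.Random(0)
levels = [10, 15, 20, 25, 30, 35]                    # e.g. drop heights, cm
p = lambda h: min(1.0, max(0.0, (h - 10) / 20))      # toy response curve
series = bruceton_series(levels, 3, 30, p, rng)
mean_level = sum(l for l, _ in series) / len(series) # crude 50%-point estimate
```

The real Bruceton analysis additionally checks the s/d ratio mentioned in the abstract before accepting the series as valid.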
5. Epoxy Value Measurement of E44 Epoxy Resin by Various Chemical Titration Method.
- Author
-
Li, Xianhua, Zou, Xiujuan, Zhang, Xiangyu, Yin, Xiaoyan, and Sui, Xianhang
- Subjects
- *
PHYSICAL constants , *CONFIDENCE intervals , *VOLUMETRIC analysis , *TEST methods , *BROMIDES , *EPOXY resins - Abstract
During the production of epoxy resins, it is crucial to control or quantify the epoxy group content. The epoxy value is the most important analytical index used to characterize the number of epoxy groups in an epoxy resin. In this paper, two methods for testing the epoxy value of E44 epoxy resin are developed based on chemical titration: the hydrochloric acid-acetone method and the perchloric acid-tetraethylammonium bromide method. The features of the different methods and their influence on the measured epoxy value are studied systematically. At the significance level of 0.05, the confidence interval for the mean is (0.454 ± 0.003) eq/100 g for the hydrochloric acid-acetone method and (0.453 ± 0.003) eq/100 g for the perchloric acid-tetraethylammonium bromide method, showing that the test results of the two methods are essentially the same. The advantages and disadvantages of the two methods are discussed, and the testing mechanism of each is illustrated in detail. By combining the strengths of the two methods, a novel method for rapid and accurate epoxy value testing can be envisioned. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
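The reported intervals are standard t-based confidence intervals for a mean. A minimal sketch, assuming hypothetical replicate titration values (not the paper's data) and a hardcoded critical value:

```python
from statistics import mean, stdev
from math import sqrt

def t_confidence_interval(x, t_crit):
    """Two-sided CI for the mean: x-bar +/- t * s / sqrt(n)."""
    m, s, n = mean(x), stdev(x), len(x)
    half = t_crit * s / sqrt(n)
    return m - half, m + half

# hypothetical replicate epoxy values (eq/100 g), NOT the paper's data
reps = [0.452, 0.455, 0.454, 0.456, 0.453, 0.454]
lo, hi = t_confidence_interval(reps, t_crit=2.571)  # t(0.975, df=5)
```

With scipy available, `t_crit` could instead be computed as `scipy.stats.t.ppf(0.975, df=len(reps) - 1)`.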
6. Optimal decay for one-dimensional damped wave equations with potentials via a variant of Nash inequality.
- Author
-
Sobajima, Motohiro
- Subjects
- *
INVARIANT measures , *HEAT equation , *TEST methods - Abstract
The optimality of decay properties of the one-dimensional damped wave equations with potentials belonging to a certain class is discussed. The typical ingredient is a variant of Nash inequality which involves an invariant measure for the corresponding Schrödinger semigroup. This enables us to find a sharp decay estimate from above. Moreover, the use of a test function method with the Nash-type inequality provides the decay estimate from below. The diffusion phenomena for the damped wave equations with potentials are also considered. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
7. Characterising equivalent droplet indicators of sprinkler irrigation from a kinetic energy perspective.
- Author
-
Zhang, Rui, Liu, Yichuan, Zhu, Delan, Wu, Pute, Zheng, Changjuan, Zhang, Xiaomin, Khudayberdi, Nazarov, and Liu, Changxin
- Subjects
- *
SPRINKLER irrigation , *OPTICAL instruments , *KINETIC energy , *SPRINKLERS , *TEST methods - Abstract
Equivalent droplet velocity and diameter are important parameters for measuring the effectiveness of sprinkler spraying; however, non-optical test methods (paper stain, flour pellet, and oil immersion methods) can only obtain the droplet number and diameter. With the widespread use of optical instruments in sprinkler testing, droplet velocity can also be measured; it has therefore become possible to calculate average droplet characteristics from an energy perspective. This paper proposes an energy-weighted method for calculating droplet equivalence indicators. Statistical analyses were performed on five types of sprinkler irrigation droplet distribution data to compare the characteristics of, and differences between, the energy-weighted method and the other calculation methods. The results showed that 1) the velocity outcomes of the energy-weighted droplet equivalent method, empirical formula I, and empirical formula II increase and decrease consistently with one another; 2) the equivalent droplet diameter based on the energy-weighted method is the largest, followed by that of the volume-based equivalent method, with the quantity-based equivalent method giving the smallest; and 3) the equivalent droplet velocity and diameter calculated by the energy-weighted method can characterise droplets with a high energy contribution. The energy-weighted equivalent droplet velocity and diameter indicators derived in this study provide new ideas for characterising droplet averaging. • An energy-weighted method to characterise droplet equivalent indicators is proposed. • The differences between the various equivalent methods are compared. • The equivalent droplet method can characterise droplets with high energy contribution. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
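The abstract does not state the authors' exact weighting formula. A plausible minimal sketch of an energy-weighted equivalent indicator, weighting each droplet by its kinetic energy (the density constant and sample data are assumptions for illustration):

```python
from math import pi

RHO = 1000.0  # assumed water density, kg/m^3

def kinetic_energy(d, v):
    """KE of a spherical droplet of diameter d (m) moving at v (m/s)."""
    mass = RHO * pi * d**3 / 6
    return 0.5 * mass * v**2

def energy_weighted(values, diameters, velocities):
    """Energy-weighted mean of `values` over a droplet population."""
    weights = [kinetic_energy(d, v) for d, v in zip(diameters, velocities)]
    return sum(w * x for w, x in zip(weights, values)) / sum(weights)

d = [0.5e-3, 1.0e-3, 2.0e-3]   # droplet diameters, m
v = [2.0, 4.0, 6.0]            # droplet velocities, m/s
d_eq = energy_weighted(d, d, v)  # energy-weighted equivalent diameter
v_eq = energy_weighted(v, d, v)  # energy-weighted equivalent velocity
```

Because kinetic energy scales with d^3 v^2, the few large, fast droplets dominate the weights, which is consistent with the abstract's finding that the energy-weighted equivalent diameter exceeds the volume- and quantity-based ones.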
8. Quantification of Total Sulfites in Shrimp by BIOFISH 300/3000 SUL Method, Collaborative Study: Final Action 2021.09.
- Author
-
Garate, Jone, Zarate, Itziar Ortiz de, Gonzalez, Roberto, Añorga, Larraitz, and Salleres, Sandra
- Subjects
- *
BUFFER solutions , *ELECTRIC currents , *SHRIMPS , *STANDARD deviations , *TEST methods - Abstract
Background In December 2021, the BIOFISH 300 SUL method for the determination of total sulfites in shrimp was adopted as a First Action Official Method of Analysis℠ by AOAC INTERNATIONAL. Objective A collaborative study was conducted in February 2023 to test the reproducibility of the method. Methods The method is based on a benchtop biosensor device that relates the concentration of sulfite to a quantifiable electric current signal. The sensing element, the Biotest, harbors an enzyme that specifically oxidizes sulfite, and the reaction products are electrochemically detected by the device in less than 3 min. The sulfite is extracted from the solid using an aqueous buffer solution, which ensures that all sulfite is present as the free anion. Results Eleven collaborators participated in the study of nine different shrimp samples. Values of the repeatability and reproducibility relative standard deviations (RSDr and RSDR) obtained from the statistical analysis of valid data ranged from 2.1 to 8.1% and from 7.5 to 14.3%, respectively, for shrimp samples above the quantification limit of the method, set at 7 mg/kg. Conclusion These results showed good repeatability and reproducibility of the method, even at concentrations below the legal threshold for sulfite in food, where the reference optimised Monier-Williams (OMW) method shows relatively high imprecision. Highlights On the basis of these results, the enzymatic amperometric biosensor method developed by BIOLAN Microbiosensores was adopted as a Final Action Official Method in September 2023. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
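RSDr and RSDR are conventionally derived from a one-way ANOVA across laboratories (ISO 5725-style). A simplified sketch for the balanced case of duplicate results per laboratory, using invented data rather than the study's:

```python
from statistics import mean

def rsd_r_and_R(lab_results):
    """Repeatability (RSDr) and reproducibility (RSDR) relative standard
    deviations from duplicate results per lab (balanced one-way ANOVA)."""
    p = len(lab_results)   # number of labs
    n = 2                  # duplicates per lab
    grand = mean(x for lab in lab_results for x in lab)
    lab_means = [mean(lab) for lab in lab_results]
    # within-lab (repeatability) variance, df = p*(n-1) = p for duplicates
    s_r2 = sum((x - mean(lab))**2 for lab in lab_results for x in lab) / p
    # between-lab variance component (truncated at zero)
    s_L2 = max(0.0, sum((m - grand)**2 for m in lab_means) / (p - 1) - s_r2 / n)
    s_R2 = s_r2 + s_L2     # reproducibility variance
    return s_r2**0.5 / grand * 100, s_R2**0.5 / grand * 100

# hypothetical duplicate sulfite results (mg/kg) from 5 labs
labs = [(98, 102), (95, 97), (104, 106), (99, 103), (92, 96)]
rsd_r, rsd_R = rsd_r_and_R(labs)
```

By construction RSDR ≥ RSDr, matching the ordering of the ranges reported in the abstract.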
9. Bringing the state back in: Populism and economic nationalism in Europe.
- Author
-
Ganga, Paula D.
- Subjects
- *
ECONOMIC policy , *POWER (Social sciences) , *GOVERNMENT ownership , *ECONOMIC globalization , *TEST methods - Abstract
Objective: Economic nationalism has been on the rise for the past two decades. Scholars have also noted the shift away from globalization and deregulation toward a more prominent role of the state in the economy. I explore the role played by populist governments in the increased adoption of economic nationalism and in this return of the state. I argue that the populist worldview lends itself naturally to a consolidation of power—not just political, but also economic. This consolidation of economic power results in a more prominent state even in countries where the populist regime is a right‐leaning one. Methods: I test this argument quantitatively by analyzing governments in 30 European countries since 1990, levels of state ownership in the economy, and a battery of economic and political controls. Results: The election of a populist government is associated with a strengthening of state ownership in the economy. Conclusion: I conclude with a discussion of the prospects for the future study of this populist economic agenda both domestically and internationally. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
10. Transient mode-III problem of the elastic matrix with a line inclusion.
- Author
-
Wang, YS, Wang, BL, and Wang, KF
- Subjects
- *
SINGULAR integrals , *STRESS concentration , *COMPOSITE materials , *TEST methods , *ELASTICITY - Abstract
Pull-out tests are used to identify the mechanical performance of hybrid and fiber-reinforced composite materials. This paper investigates the elastic phase preceding the pull-out of a rigid line inclusion from a polymer matrix with fixed top and bottom surfaces. The mode-III problem is investigated, such that the pull-out force is applied in the out-of-plane direction and can be either transient or static. By applying the singular integral equation technique, semi-analytical expressions for the elastic field are obtained. Under a static pull-out force, the stress intensity factor (SIF) near the inclusion tip increases monotonically as the length and height of the matrix increase, whereas for a transient pull-out force, the SIF displays an initial increase followed by a decline. The maximum SIF is obtained when (1) the matrix length is 2 to 2.5 times the inclusion length, and (2) the matrix height is 1 to 2 times the inclusion length. Moreover, this paper provides a solution approach that incorporates the elasticity of the inclusion, showing that there is an optimal shear stiffness that minimizes the stress singularity of the system. The conclusions of this study hold significance for the design and performance evaluation of fiber-reinforced composite materials. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
11. Kernel-based Sensitivity Analysis for (Excursion) Sets.
- Author
-
Fellmann, N., Blanchet-Scalliet, C., Helbert, C., Spagnol, A., and Sinoquet, D.
- Subjects
- *
RANDOM sets , *ROBUST optimization , *SENSITIVITY analysis , *TEST methods , *POLLUTANTS - Abstract
In this article, we aim to perform sensitivity analysis of set-valued models and, in particular, to quantify the impact of uncertain inputs on feasible sets, which are key elements in solving a robust optimization problem under constraints. While most sensitivity analysis methods deal with scalar outputs, this article introduces a novel approach to perform sensitivity analysis with set-valued outputs. Our innovative methodology is designed for excursion sets, but is versatile enough to be applied to set-valued simulators, including those found in viability fields, or when working with maps like pollutant concentration maps or flood zone maps. We propose to use the Hilbert-Schmidt Independence Criterion (HSIC) with a kernel designed for set-valued outputs. After setting a probabilistic framework for random sets, a first contribution is the proof that this kernel is characteristic, an essential property in a kernel-based sensitivity analysis context. To measure the contribution of each input, we then propose to use HSIC-ANOVA indices. With these indices, we can identify which inputs should be neglected (screening) and we can rank the others according to their influence (ranking). The estimation of these indices is also adapted to the set-valued outputs. Finally, we test the proposed method on three test cases of excursion sets. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
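The article's contribution is a kernel for set-valued outputs; the sketch below only illustrates the standard scalar-output HSIC V-statistic it builds on, with an arbitrary RBF kernel, bandwidth, and synthetic data:

```python
import numpy as np

def rbf_gram(x, sigma):
    """Gaussian (RBF) Gram matrix for a 1-D sample."""
    d2 = (x[:, None] - x[None, :]) ** 2
    return np.exp(-d2 / (2 * sigma**2))

def hsic(K, L):
    """Biased V-statistic estimator: HSIC = trace(K H L H) / n^2,
    where H is the centering matrix."""
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    return np.trace(K @ H @ L @ H) / n**2

rng = np.random.default_rng(0)
x = rng.normal(size=200)
y_dep = x + 0.1 * rng.normal(size=200)   # strongly dependent on x
y_ind = rng.normal(size=200)             # independent of x
K = rbf_gram(x, 1.0)
h_dep = hsic(K, rbf_gram(y_dep, 1.0))
h_ind = hsic(K, rbf_gram(y_ind, 1.0))
```

A large HSIC flags an influential input (for ranking); a near-zero HSIC suggests the input can be neglected (screening). The article replaces the output Gram matrix with one built from its characteristic kernel on sets.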
12. Research progress on protective performance evaluation and lifetime prediction of organic coatings.
- Author
-
Hao, Pan, Dun, Yuchao, Gong, Jiyun, Li, Shenghui, Zhao, Xuhui, Tang, Yuming, and Zuo, Yu
- Subjects
- *
ORGANIC coatings , *PROTECTIVE coatings , *SERVICE life , *TECHNOLOGICAL progress , *TEST methods - Abstract
Purpose: Organic coatings are widely used for protecting metal equipment and structures from corrosion. Accurate detection and evaluation of the protective performance and service life of coatings are of great importance. This paper aims to review the research progress on performance evaluation and lifetime prediction of organic coatings. Design/methodology/approach: First, the failure forms and aging testing methods of organic coatings are briefly introduced. Then, the technical status and the progress in the detection and evaluation of coating protective performance and the prediction of service life are mainly reviewed. Findings: There are some key challenges and difficulties in this field, which are described in the end. Originality/value: The progress is summarized from a variety of technical perspectives. Performance evaluation and lifetime prediction include both single-parameter and multi-parameter methods. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
13. Phage susceptibility testing methods or 'phagograms': where do we stand and where should we go?
- Author
-
Kolenda, Camille, Jourdan, Julie, Roussel-Gaillard, Tiphaine, Medina, Mathieu, and Laurent, Frédéric
- Subjects
- *
DRUG resistance in microorganisms , *TEST methods , *AUTOMATION , *COLLECTIONS - Abstract
Phage therapy is a highly promising approach to address the challenge presented by the global burden of antimicrobial resistance. Given the natural specificity of phages, phage susceptibility testing (PST) is a prerequisite for successful personalized therapy, allowing the selection of active phages from large and diverse collections. However, the lack of an easy-to-use, standardized technique remains an issue. In this review, we describe the principles, advantages and drawbacks of two routinely used PST techniques: plaque and growth kinetic assays. These are labour-intensive and time-consuming methods that require automation of one or more steps, including preparation of test panels, incubation, and the reading and analysis of results. In addition to automation, there is an urgent need to establish a reference method to enable efficient selection of therapeutic phages by PST techniques. We discuss knowledge gaps and parameters that need to be investigated to work towards this goal. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
14. Distribution Patterns of the Dominant Mining Fracture Fields in Pressure Relief Methane Migration.
- Author
-
Meng, Xiangqi, Hu, Shengyong, Liu, Fengyun, Feng, Guorui, and Guo, Shuyun
- Subjects
- *
MINING law , *TEST methods , *METHANE , *SPECKLE interference , *CAVING - Abstract
The evolution and distribution laws of fracture fields directly affect the efficiency of mining methane extraction. In this study, a physical model test method and a DIC speckle analysis system were used to examine the distribution laws of the dominant mining fracture fields of pressure relief methane migration during working face advancement. The results revealed that during the advancement of working faces, the fracture angles on both sides of the goafs and the top separation fractures form ladder-type overburden fracture structures. A large number of penetrating cracks were generated in the caving zone, transition zone and vertical fracture zone of the ladder structure, providing dominant channels for methane migration. In the fracture-filled area, the average extraction purity was 12 m3/min and the average concentration of methane extracted in surface wells was above 90%; methane thus had an excellent migration channel and storage space. As the fractures began to close, the methane extraction efficiency decreased, until the methane extraction concentration fell to a minimum of 8.1% and the methane extraction purity to a minimum of 0.03 m3/min. The field methane extraction data were consistent with the experimental theory. The research results can provide references for improving the efficiency of methane extraction and optimizing the layout of drilling wells and their horizons. Highlights: The fracture law of mining overburden was studied, and the displacement characteristics of mining overburden were analyzed by the DIC system. The distribution characteristics of the abutment stress of the overlying strata from open-off cut to stop line were studied. The number and angle of cracks in the fracture zone were studied. The extraction parameters of surface wells in a mine were measured on site, and the test conclusions were verified. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
15. A combined superiority and non-inferiority procedure for comparing predictive values of two diagnostic tests.
- Author
-
Takahashi, Kanae, Yamamoto, Kouji, and Shintani, Ayumi
- Subjects
- *
PREDICTIVE tests , *TEST methods , *DIAGNOSIS methods - Abstract
Positive and negative predictive values are useful to quantify the performance of medical tests, and both are often used simultaneously. Although there are several methods to test the equality of these predictive values between two medical tests, these approaches separately compare positive and negative predictive values. Therefore, we propose a testing procedure that combines the approximate likelihood ratio test defined by Tang et al. with the non-inferiority test for predictive values. The procedure can confirm that compared to an existing test, a new medical test is non-inferior in terms of both positive and negative predictive values, as well as superior regarding at least one of these values. It can make a comprehensive judgment of the performance of the new test based on both measures. A simulation study showed that the performance of the proposed testing procedure is appropriate, and the procedure is considered useful for evaluating the performance of predictive values of medical tests. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
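The predictive values being compared are the usual 2×2-table quantities. A minimal sketch with hypothetical counts (the combined superiority/non-inferiority testing procedure itself is not reproduced here):

```python
def predictive_values(tp, fp, fn, tn):
    """Positive and negative predictive values from a 2x2 table of a
    test's results against a gold standard."""
    ppv = tp / (tp + fp)  # P(disease | test positive)
    npv = tn / (tn + fn)  # P(no disease | test negative)
    return ppv, npv

# hypothetical counts for a new medical test versus a gold standard
ppv, npv = predictive_values(tp=90, fp=10, fn=5, tn=95)
```

The proposed procedure would then declare the new test acceptable only if both PPV and NPV are non-inferior to the existing test's values (within a pre-specified margin) and at least one is superior.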
16. MIsgivings about measurement invariance.
- Author
-
Funder, David C and Gardiner, Gwendolyn
- Subjects
- *
ETHNOPSYCHOLOGY , *RESEARCH personnel , *CROSS-cultural differences , *TEST methods , *ACQUISITION of data - Abstract
This paper critically evaluates the conventional insistence on establishing measurement invariance (MI) in cross-cultural psychology. We argue that complex and seemingly arbitrary benchmarks for assessing MI can be unrealistic and effectively prohibit meaningful research. The widespread use of various MI criteria creates unnecessary and often unattainable hurdles for cross-cultural researchers who have made the effort to collect data in multiple cultural contexts. Additionally, the prohibitionist tone of discussions surrounding MI is unhelpful, unscientific, and discouraging. We argue that emerging findings that cultural differences might not be as widespread or profound as once assumed imply that significant cross-cultural differences in measurement should not be the default assumption. Additionally, we advocate a shift towards external validity as a more useful metric of measurement quality. Our overall message is that researchers who go to the considerable trouble of gathering data in more than one country should not be disadvantaged compared to researchers who avoid cross-cultural complications by gathering data only at their home campus. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
17. Transmittance estimation of translucent polymeric pellets.
- Author
-
Wu, Jiaying, Luo, Zhenhua, Dougherty, Joseph, and Shamey, Renzo
- Subjects
- *
REFLECTANCE measurement , *MANUFACTURING processes , *QUALITY control , *TEST methods , *PLASTICS - Abstract
Plastics are commonly produced and sold in pellet form due to their superior handling characteristics. However, due to their small size, it is often impractical, if not infeasible, to determine the transmittance of a single pellet instrumentally; moreover, such measurements may be highly variable. Therefore, translucent films of a certain thickness, known as plaques, are commonly molded to enable instrumental determination of transmittance. These plaques, however, are not needed beyond the quality control process, yet they add a costly step to production. In this study, we test a method, based on layer theory, that enables the estimation of the transmittance spectra of nearly transparent plastic plaques from reflectance measurements of their pellet counterparts. The comparison of the estimated transmittance spectra of pellets with the measured transmittance of plaques shows RMSE values ranging from 0.37% to 1.80%, with a color difference, CIEDE2000(1:1:1), of 0.07–0.48, thus validating the applicability of the method. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
18. A novel T-bar test method ensuring full-flow mechanism in stiffer clay.
- Author
-
Han, Yunrui, Yu, Long, Wang, Zhongtao, Yang, Qing, and Hu, Yuxia
- Subjects
- *
SURFACE pressure , *DRILL core analysis , *SHEAR strength , *TEST methods , *KAOLIN - Abstract
The T-bar penetrometer is currently used to characterise seabed sediments, mostly based on the full-flow mechanism around the probe in soft clays. However, open and trapped cavities may form above the T-bar during its penetration in stiffer clays, which can make the traditional interpretation method inaccurate and requires complex corrections. This paper proposes a novel T-bar test method by adding sufficient surface pressure on the soil surface to ensure the full-flow mechanism and no cavity. In this way, the current formulas based on the full-flow soil mechanism can be directly used to interpret soil strength. To verify the effectiveness of the proposed method, both laboratory tests on kaolin clay and Guangzhou offshore clay and large-deformation finite-element analyses with various surface pressures and soil strengths were conducted. The results show that, if the surface pressure is sufficient, no open cavity or trapped cavity was formed during a monotonic T-bar penetration and the first cycle of cyclic T-bar penetration tests. Without surface pressure, however, an open cavity or trapped cavity was always formed. The cavity formation contributed to a T-bar resistance 7·9–18·6% lower relative to that of a T-bar with sufficient surface pressure to maintain a full-flow mechanism. In the cyclic tests, with an additional number of T-bar loading cycles, the cavity effect diminished and the surface pressure showed minimal effect on the T-bar resistance. A critical surface pressure that ensures the full-flow mechanism was suggested for T-bar monotonic and cyclic tests in the box core sample. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
19. Constructing tests for skill assessment with competence‐based test development.
- Author
-
Anselmi, Pasquale, Heller, Jürgen, Stefanutti, Luca, and Robusto, Egidio
- Subjects
- *
TEST design , *TEST methods , *COLLECTIONS - Abstract
Competence‐based test development is a recent and innovative method for the construction of tests that are as informative as possible about the competence state (the set of skills an individual has available) underlying the observed item responses. It finds application in different contexts, including the development of tests from scratch, and the improvement or shortening of existing tests. Given a fixed collection of competence states existing in a population of individuals and a fixed collection of competencies (each of which being the subset of skills that allow for solving an item), the competency deletion procedure results in tests that differ from each other in the competencies but are all equally informative about individuals' competence states. This work introduces a streamlined version of the competency deletion procedure that considers information necessary for test construction only, illustrates a straightforward way to incorporate test developer preferences about competencies into the test construction process, and evaluates the performance of the resulting tests in uncovering the competence states from the observed item responses. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
20. Optimization design method for multi-stress accelerated degradation test based on Tweedie exponential dispersion process.
- Author
-
Tian, Runcao, Zhang, Fan, Du, Hongguang, and Wang, Peng
- Subjects
- *
ADAPTIVE testing , *PARAMETER estimation , *STATISTICAL correlation , *TEST design , *TEST methods , *ACCELERATED life testing - Abstract
• Established a multi-stress acceleration model that accounts for the generalized coupling among stresses. • Introduced the MSGC-TED process to characterize the relationship between stress and degradation behavior. • Employed both V-optimization and D-optimization for a multi-decision optimized design of experiments. • Incorporated the relationship between energy and instrument wear costs and the levels of accelerated stress into the test cost constraints. In the operational environment of electronic products, simultaneous multiple stresses contribute to product degradation, with their interactions often being intricate. To effectively analyze and optimize multi-stress accelerated degradation tests, we introduce a Multi-Stress Generalized Coupling accelerated degradation model based on the Tweedie exponential dispersion process, coupled with a multi-decision optimization approach for test design. The model integrates stress interactions elucidated through fuzzy correlation analysis, reflecting accelerated degradation behavior consistent with the Tweedie exponential dispersion process. Addressing the complexities of multi-parameter estimation, we propose a parameter estimation method underpinned by the Sine Cosine Algorithm for multi-objective optimization within the proposed framework. Considering budgetary constraints, D-optimization and V-optimization strategies are deployed to fine-tune test scheme allocations, optimizing resource utilization. Validation through a three-stress accelerated degradation test on LED chips confirms the model's efficacy in reliability assessment. Our findings indicate the superior descriptive power of the Tweedie exponential dispersion model for complex degradation phenomena and the proposed model's closer approximation to stress conditions in real-world settings, with a methodological error maintained below 2%. Furthermore, the test optimization method improves the D-optimization objective value by 21.20% and reduces the V-optimization objective value by 4.08%, marking a substantial improvement over traditional schemes and endorsing the holistic advancement of multi-stress accelerated degradation testing methodologies. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
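D-optimization, as used in the abstract, maximizes the determinant of the information matrix of the test plan. A toy sketch for a linear acceleration model with an intercept (a deliberate simplification, not the paper's Tweedie-based model) shows why designs with spread-out stress levels are preferred:

```python
import numpy as np

def d_criterion(stress_levels):
    """log-determinant of the information matrix X'X for a linear model
    with intercept; larger values mean a more informative design."""
    X = np.column_stack([np.ones(len(stress_levels)), stress_levels])
    sign, logdet = np.linalg.slogdet(X.T @ X)
    return logdet

# normalized stress levels for two candidate 4-run test plans
spread = d_criterion([0.0, 0.0, 1.0, 1.0])   # runs at the extremes
middle = d_criterion([0.4, 0.5, 0.5, 0.6])   # runs clustered mid-range
```

The extreme-level plan yields the larger criterion, so D-optimization allocates test units toward well-separated stress levels (subject, in the paper, to cost constraints).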
21. Comparison of baseline correction algorithms for in vivo 1H‐MRS.
- Author
-
Pasmiño, Diego, Slotboom, Johannes, Schweisthal, Brigitte, Guevara, Pamela, Valenzuela, Waldo, and Pino, Esteban J.
- Subjects
TEST methods ,IN vivo studies ,ALGORITHMS ,RESONANCE ,METABOLITES - Abstract
Proton MRS is used clinically to collect localized, quantitative metabolic data from living tissues. However, the presence of baselines in the spectra complicates accurate MRS data quantification. The occurrence of baselines is not specific to short‐echo‐time MRS data. In short‐echo‐time MRS, the baseline consists typically of a dominating macromolecular (MM) part, and can, depending on B0 shimming, poor voxel placement, and/or localization sequences, also contain broad water and lipid resonance components, indicated by broad components (BCs). In long‐echo‐time MRS, the MM part is usually much smaller, but BCs may still be present. The sum of MM and BCs is denoted by the baseline. Many algorithms have been proposed over the years to tackle these artefacts. A first approach is to identify the baseline itself in a preprocessing step, and a second approach is to model the baseline in the quantification of the MRS data themselves. This paper gives an overview of baseline handling algorithms and also proposes a new algorithm for baseline correction. A subset of suitable baseline removal algorithms were tested on in vivo MRSI data (semi‐LASER at TE = 40 ms) and compared with the new algorithm. The baselines in all datasets were removed using the different methods and subsequently fitted using spectrIm‐QMRS with a TDFDFit fitting model that contained only a metabolite basis set and lacked a baseline model. The same spectra were also fitted using a spectrIm‐QMRS model that explicitly models the metabolites and the baseline of the spectrum. The quantification results of the latter quantification were regarded as ground truth. The fit quality number (FQN) was used to assess baseline removal effectiveness, and correlations between metabolite peak areas and ground truth models were also examined. The results show a competitive performance of our new proposed algorithm, underscoring its automatic approach and efficiency. 
Nevertheless, none of the tested baseline correction methods achieved FQNs as good as the ground truth model. All separately applied baseline correction methods introduce a bias in the observed metabolite peak areas. We conclude that all baseline correction methods tested, when applied as a separate preprocessing step, yield poorer FQNs and biased quantification results. While they may enhance visual display, they are not advisable for use before spectral fitting. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
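The baseline-removal preprocessing discussed in the abstract above can be illustrated with a deliberately simple sketch. This is not the paper's algorithm (which models MM and BC components within spectrIm-QMRS); it is a generic iterative polynomial baseline fit, on fully synthetic data, shown only to make the idea of "remove the baseline, then quantify the peaks" concrete:

```python
import numpy as np

def remove_polynomial_baseline(spectrum, order=3, n_iter=20):
    """Iteratively fit a low-order polynomial and subtract it as the baseline.

    After each fit, points above the fit are clipped down so the polynomial
    settles under the peaks. This is a common generic heuristic for smooth
    baselines, not the MM/BC modelling used in dedicated MRS software.
    """
    x = np.arange(len(spectrum), dtype=float)
    y = np.asarray(spectrum, dtype=float).copy()
    for _ in range(n_iter):
        fit = np.polyval(np.polyfit(x, y, order), x)
        y = np.minimum(y, fit)  # clip peaks so the fit tracks the baseline
    return spectrum - fit, fit

# Synthetic example: one Gaussian "metabolite peak" on a smooth baseline.
t = np.linspace(0.0, 1.0, 512)
true_baseline = 2.0 + 1.5 * t - 1.0 * t**2
peak = 5.0 * np.exp(-((t - 0.5) ** 2) / (2 * 0.01**2))
corrected, est_baseline = remove_polynomial_baseline(true_baseline + peak)
```

The paper's key caveat applies to exactly this kind of separate preprocessing step: the subtraction biases the apparent peak area, which is why joint modelling of baseline and metabolites fared better.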
22. Optimization of multiple responses in the laser cladding process parameters.
- Author
-
Liu, Zenghua, Wu, Sha, Liu, Wen, Wu, Yufeng, and Liang, Xiubing
- Subjects
TUNGSTEN carbide ,TENSILE strength ,ANALYSIS of variance ,TEST methods ,LASERS - Abstract
An orthogonal test method was used to optimize parameters for Ni60A-25% tungsten carbide (WC) laser cladding layers. Meanwhile, the tensile properties of the optimized laser cladding layer are explored. A multi-factor analysis of variance (ANOVA) was conducted to examine the effect of factors (laser power-A, scanning speed-B, powdering rate-C, and spot diameter-D) and interactions on the response variable. The results showed that the interaction of scanning speed and powdering rate with spot diameter has a more pronounced impact on the dilution rate compared to a single factor. Based on the ANOVA of the dilution rate, the optimal combination of process parameters was A3C2B1D3. Furthermore, the optimal combination of process parameters was C3B2D3A3 for the effective area. The results showed that the yield strength of the specimens was 512.3 MPa and 509.7 MPa, and the tensile strength was 646.3 MPa and 605.7 MPa for process parameters I and II, respectively. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
23. Model‐based frequency‐and‐phase correction of 1H MRS data with 2D linear‐combination modeling.
- Author
-
Simicic, Dunja, Zöllner, Helge J., Davies‐Jenkins, Christopher W., Hupfeld, Kathleen E., Edden, Richard A. E., and Oeltzschner, Georg
- Subjects
NUCLEAR magnetic resonance spectroscopy ,AMPLITUDE estimation ,TEST methods ,RECORDING & registration - Abstract
Purpose: Retrospective frequency‐and‐phase correction (FPC) methods attempt to remove frequency‐and‐phase variations between transients to improve the quality of the averaged MR spectrum. However, traditional FPC methods like spectral registration struggle at low SNR. Here, we propose a method that directly integrates FPC into a 2D linear‐combination model (2D‐LCM) of individual transients ("model‐based FPC"). We investigated how model‐based FPC performs compared to the traditional approach, i.e., spectral registration followed by 1D‐LCM, in estimating frequency‐and‐phase drifts and, consequently, metabolite level estimates. Methods: We created synthetic in‐vivo‐like 64‐transient short‐TE sLASER datasets with 100 noise realizations at 5 SNR levels and added randomly sampled frequency and phase variations. We then used this synthetic dataset to compare the performance of 2D‐LCM with the traditional approach (spectral registration, averaging, then 1D‐LCM). Outcome measures were the frequency/phase/amplitude errors, the SD of those ground‐truth errors, and amplitude Cramér Rao lower bounds (CRLBs). We further tested the proposed method on publicly available in‐vivo short‐TE PRESS data. Results: 2D‐LCM estimates (and accounts for) frequency‐and‐phase variations directly from uncorrected data with equivalent or better fidelity than the conventional approach. Furthermore, 2D‐LCM metabolite amplitude estimates were at least as accurate, precise, and certain as the conventionally derived estimates. 2D‐LCM estimation of FPC and amplitudes performed substantially better at low‐to‐very‐low SNR. Conclusion: Model‐based FPC with 2D linear‐combination modeling is feasible and has great potential to improve metabolite level estimation for conventional and dynamic MRS data, especially for low‐SNR conditions, for example, long TEs or strong diffusion weighting. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
24. Cutting Tool Monitoring Technology Using Wireless Acoustic Emission Sensor.
- Author
-
Uematsu, Mizuki, Kato, Kazuya, Watanabe, Kota, Watanobe, Tomoya, and Natsu, Wataru
- Subjects
ACOUSTIC emission ,MEASURING instruments ,MILLING-machines ,DETECTORS ,TEST methods - Abstract
Acoustic emission (AE) waves in milling are measured by mounting an AE sensor on the workpiece owing to the need for a signal line for the AE sensor. However, if the distance between the AE sensor and the machining point is excessive, attenuation may hinder the measurement of AE waves. Therefore, AE waves in milling should preferably be measured at the tool side. This study developed a device for monitoring cutting tools using a wireless AE sensor. The proposed device measures the amount of chipping at the cutting edge of the end mill during milling. This study specifically focused on the milling of narrow grooves with a small-diameter end mill using a machining center. With the developed device, the AE waves generated from the chipping of the cutting edge are measured at the tool, and the measured data are transmitted wirelessly. Three different methods were tested for attaching the AE sensor to the tool holder, confirming that the AE waves could be measured. Then, an end mill was placed in contact with a diamond, and the AE waves generated by chipping of the cutting edge were measured. Narrow grooves were milled with an end mill to demonstrate that the device could measure the AE waves generated from the cutting edge when chipping occurred. Observations suggest that the developed device can monitor the size of chipping of cutting edges. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
25. Boosting Multimode Ruling in DHR Architecture With Metamorphic Relations.
- Author
-
Li, Ruosi, Kong, Xianglong, Guo, Wei, Guo, Jingdong, Li, Hongfa, and Zhang, Fan
- Subjects
PLURALITY voting ,TEST methods ,CYBERSPACE ,SQL ,TEST design - Abstract
The DHR architecture provides a revolutionary security defense structure for cyberspace. The multimode ruling in DHR is expected to alleviate the oracle problem, which still suffers from the existence of common model vulnerability. In this work, we design a test segmentation method to transform multimode ruling to a metamorphic testing problem. The text test input that causes inconsistency of heterogeneous executors is converted to a condition set, and we extract subsets of conditions based on its syntax tree. The original test can exploit a specific vulnerability, and the follow‐up tests are composed by different subsets of conditions within the original test. We collect the execution matrix for the follow‐up tests to analyse the impact of each subset of conditions on the ruling decision. Metamorphic relations are extracted based on the localization of independent conditions, that is, the subsets of conditions that can impact the ruling decision independently. The executors in an inconsistent ruling should be examined with metamorphic testing methods, rather than the traditional majority voting mechanism. The proposed test segmentation and improved multimode ruling methods are evaluated on two DHR‐based cases, SQL injection in a cyber‐range system and deserialization attack in the super‐xray project. The experimental results show that our test segmentation can help to locate malicious expressions and the metamorphic testing‐based multimode ruling can generate more correct results than the majority voting mechanism with an average 15.8% performance loss. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
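The condition-subset idea in the abstract above (convert a flagged input into clauses, re-test subsets, and keep the clauses that independently affect the ruling) can be sketched in miniature. The executors below are hypothetical toy string filters, not the paper's DHR executors, and the decomposition is a plain clause list rather than a syntax-tree extraction:

```python
def rule(payload, executors):
    """Collect one verdict per heterogeneous executor for a given payload."""
    return [ex(payload) for ex in executors]

def independent_conditions(conditions, executors):
    """Toy follow-up testing: re-run each clause of a flagged input on its
    own and keep the clauses that still make the executors disagree."""
    return [c for c in conditions if len(set(rule(c, executors))) > 1]

# Hypothetical executors: two naive SQL-injection filters with different
# rules, plus one vulnerable executor that never flags anything.
ex_a = lambda s: "--" in s                 # flags comment markers only
ex_b = lambda s: "--" in s or "1=1" in s   # also flags tautologies
ex_c = lambda s: False                     # detects nothing

conditions = ["id=5", "1=1", "x'--"]       # clause split of one malicious input
suspects = independent_conditions(conditions, [ex_a, ex_b, ex_c])
```

Here `suspects` retains only the clauses on which the executors disagree, which is the signal the paper uses to build metamorphic relations instead of trusting a plain majority vote.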
26. Study of hydrogen embrittlement in steels using modified pressurized disks.
- Author
-
Santana, L.M., Pinto, D. Lopes, Osipov, N., Furtado, J., Bourguignon, F., Marchais, P.-J., Madi, Y., and Besson, J.
- Subjects
- *
HYDROGEN embrittlement of metals , *MATERIALS testing , *BINDING energy , *ENERGY density , *TEST methods - Abstract
The integration of hydrogen into natural gas pipelines presents challenges due to Hydrogen Embrittlement (HE), requiring critical material selection and testing methods. One such test is the Disk Pressure Test (DPT), which consists of pressurizing a clamped disk to failure. However, failure often occurs at the clamping zone, making analysis difficult. This study aims to develop new disk geometries to control failure location while maintaining the test setup. Using two steel grades (a vintage X52 pipeline steel and a modern E355 modified steel with potential for pipeline use), new disk geometries were tested under helium and hydrogen at various pressure rise rates. Results show successful displacement of failure away from the clamping zones, demonstrating the effectiveness of the new geometries. Hydrogen embrittlement is demonstrated by comparing failure pressures under helium and hydrogen at various pressure rise rates. Using both material testing and simulation, this study provides insights into hydrogen embrittlement. • New disk geometries: Specimens redesigned to shift failure from clamping zones. • Hydrogen embrittlement: Pressure tests showed similar rupture pressures and depths. • Simulation: Models linking hydrogen trap density to plastic strain matched results. • Trap binding energy and trap density evolution matching embrittlement depth. • Modified geometries enable distinct stress states for hydrogen embrittlement studies. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
27. Impact Absorption Power of Polyolefin Fused Filament Fabrication 3D‐Printed Sports Mouthguards: In Vitro Study.
- Author
-
Nassani, Leonardo Mohamad, Storts, Samuel, Novopoltseva, Irina, Place, Lauren Ann, Fogarty, Matthew, and Schupska, Pete
- Subjects
- *
IMPACT testing , *MATERIALS testing , *ETHYLENE-vinyl acetate , *IMPACT strength , *TEST methods - Abstract
ABSTRACT Background/Aim: This study aims to evaluate and compare the impact absorption capacities of thermoformed ethylene vinyl acetate (EVA) mouthguards and 3D‐printed polyolefin mouthguards used in sports dentistry applications. The objective is to determine whether 3D‐printed polyolefin mouthguards offer superior impact toughness compared to traditional EVA mouthguards commonly used in sports settings. Materials and Methods: Six material samples were assessed: five pressure‐formed EVA mouthguards (PolyShok, Buffalo Dental, Erkoflex, Proform, and Drufosoft) and one 3D‐printed synthetic polymer (polyolefin). The materials were evaluated using a modified American Society for Testing and Materials (ASTM) D256 Test Method A for Izod pendulum impact resistance of plastics. Polyolefin samples were 3D‐printed using fused filament fabrication (FFF) technology. Notably, the FFF process included samples printed with notches placed either parallel or perpendicular to the build direction. This orientation served as a study factor, allowing for comparison of material behavior under different printing conditions. Impact testing was conducted using an Izod impact tester to assess the materials' performance under controlled impact conditions. Results: The study achieved a high power (1.0) in power analysis, indicating strong sensitivity to detect significant differences. Among molded materials, PolyShok showed significantly lower impact toughness compared to others (p = 0.06). The mean impact absorption of EVA materials was 5.4 ± 0.3 kJ/m², significantly lower than polyolefin materials, which demonstrated 12.9 ± 0.7 kJ/m² and superior performance (p = 0.0). Horizontal‐notched polyolefin samples exhibited higher impact strength compared to vertical‐notched samples (p = 0.009). Conclusions: 3D‐printed polyolefin mouthguards exhibited significantly higher impact toughness than thermoformed EVA mouthguards. While EVA materials demonstrated structural robustness, their lower impact resistance and observed tearing in other test specimens suggest the need for alternative testing standards to better reflect real‐world conditions. 3D‐printed mouthguards fabricated with build orientations perpendicular to the direction of impact demonstrate significantly enhanced impact absorption. Further research into manufacturing methods and testing protocols is recommended to optimize mouthguard performance under impact scenarios. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
28. Methods of behavioral testing in dogs: a scoping review and analysis of test stimuli.
- Author
-
Moser, Ariella Y., Welch, Mitchell, Brown, Wendy Y., McGreevy, Paul, and Bennett, Pauleen C.
- Subjects
PSYCHOLOGICAL tests ,BEHAVIORAL assessment ,SCIENTIFIC literature ,ONLINE databases ,TEST methods - Abstract
Background: Behavioral testing is widely used to measure individual differences in behavior and cognition among dogs and predict underlying psychological traits. However, the diverse applications, methodological variability, and lack of standardization in canine behavioral testing have posed challenges for researchers and practitioners seeking to use these tests. To address these complexities, this review sought to synthesize and describe behavioral testing methods by creating a framework that uses a "dog-centric" perspective to categorize the test stimuli used to elicit responses from dogs. Methods: A scoping review was conducted to identify scientific literature that has reported behavioral testing to assess psychological traits in dogs. Five online databases were systematically searched. Following this, an inductive content analysis was conducted to evaluate and summarize the behavioral testing methods in the literature. Results: A total of 392 publications met the selection criteria and were included in the analysis, collectively reporting 2,362 behavioral tests. These tests were individually evaluated and categorized. Our content analysis distinguished 29 subcategories of behavioral testing stimuli that have been used, grouped into three major categories: human-oriented stimuli; environmental stimuli; and motivator-oriented stimuli. Conclusion: Despite the methodological heterogeneity observed across behavioral testing methods, our study identified commonalities in many of the stimuli used in test protocols. The resulting framework provides a practical overview of published behavioral tests and their applications, which may assist researchers in selecting and designing appropriate tests for their purposes. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
29. LLpowershap: logistic loss-based automated Shapley values feature selection method.
- Author
-
Madakkatel, Iqbal and Hyppönen, Elina
- Subjects
- *
MACHINE learning , *FEATURE selection , *SIMPLE machines , *DEBUGGING , *TEST methods - Abstract
Background: Shapley values have been used extensively in machine learning, not only to explain black box machine learning models, but among other tasks, also to conduct model debugging, sensitivity and fairness analyses and to select important features for robust modelling and for further follow-up analyses. Shapley values satisfy certain axioms that promote fairness in distributing contributions of features toward prediction or reducing error, after accounting for non-linear relationships and interactions when complex machine learning models are employed. Recently, feature selection methods using predictive Shapley values and p-values have been introduced, including powershap. Methods: We present a novel feature selection method, LLpowershap, that takes forward these recent advances by employing loss-based Shapley values to identify informative features with minimal noise among the selected sets of features. We also enhance the calculation of p-values and power to identify informative features and to estimate the number of iterations of model development and testing. Results: Our simulation results show that LLpowershap not only identifies a higher number of informative features but outputs fewer noise features compared to other state-of-the-art feature selection methods. Benchmarking results on four real-world datasets demonstrate higher or comparable predictive performance of LLpowershap compared to other Shapley-based wrapper methods, or filter methods. LLpowershap is also ranked the best in mean ranking among the seven feature selection methods tested on the benchmark datasets. Conclusion: Our results demonstrate that LLpowershap is a viable wrapper feature selection method that can be used for feature selection in large biomedical datasets and other settings. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
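The core mechanism behind powershap-style methods, as described in the abstract above, is to inject a random noise feature and keep only the real features that consistently beat it. The sketch below illustrates that loop but, to stay dependency-free, swaps Shapley values for a plain absolute-correlation score; it is not LLpowershap itself, and the data and threshold are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

def select_above_noise(X, y, n_rounds=50):
    """Keep features whose |correlation with y| beats an injected random
    noise feature in (almost) every round.

    Shapley values are replaced here by a simple correlation score; the
    powershap-style logic (repeatedly compare real features against a
    fresh random feature) is the point of the sketch.
    """
    n, d = X.shape
    wins = np.zeros(d)
    scores = np.abs([np.corrcoef(X[:, j], y)[0, 1] for j in range(d)])
    for _ in range(n_rounds):
        noise = rng.standard_normal(n)                 # injected noise feature
        noise_score = abs(np.corrcoef(noise, y)[0, 1])
        wins += scores > noise_score
    return np.where(wins / n_rounds > 0.95)[0]

# Two informative features, three pure-noise features.
n = 500
X = rng.standard_normal((n, 5))
y = 2.0 * X[:, 0] - 1.5 * X[:, 1] + 0.1 * rng.standard_normal(n)
selected = select_above_noise(X, y)
```

With informative features correlating strongly with `y` and noise features hovering near zero, the informative columns win essentially every round while the noise columns do not.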
30. Using H‐Convergence to Calculate the Numerical Errors for 1D Unsaturated Seepage Under Steady‐State Conditions.
- Author
-
Chapuis, Robert P., Taveau, Coline, Duhaime, François, Weber, Simon, Marefat, Vahid, Zhang, Lu, Blessent, Daniela, Bouaanani, Najib, and Pelletier, Dominique
- Subjects
- *
PARTIAL differential equations , *NONLINEAR differential equations , *NUMERICAL analysis , *MATHEMATICAL domains , *TEST methods - Abstract
ABSTRACT Unsaturated zones are important for geotechnical design, geochemical reactions, and microbial reactions. The numerical analysis of unsaturated seepage is complex because it involves highly nonlinear partial differential equations. The permeability can vary by orders of magnitude over short vertical distances. This article defines and uses H‐convergence tests to quantify numerical errors made by uniform meshes with element size (ES) for 1D steady‐state conditions. The quantitative H‐convergence should not be confused with a qualitative mesh sensitivity study. The difference between numerical and mathematical convergences is stated. A detailed affordable method for an H‐convergence test is presented. The true but unknown solution is defined as the asymptote of the numerical solutions for all solution components as ES decreases to zero. The numerical errors versus ES are then assessed with respect to the true solution using a log–log plot, which indicates whether a code is correct or incorrect. If a code is correct, its results follow the rules of mathematical convergence in a mathematical convergence domain (MCD), which is smaller than the numerical convergence domain (NCD). If a code is incorrect, it has an NCD but no MCD. Incorrect algorithms of incorrect codes need to be modified and repaired. Existing codes are shown to converge numerically within large NCDs but generate large errors, up to 500%, in the NCDs, a dangerous situation for designers. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
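An H-convergence test of the kind described in the abstract above can be demonstrated on a toy problem: solve on successively finer uniform meshes, measure the error against a known solution, and read the observed convergence order off a log-log fit of error versus element size. The sketch below uses a standard second-order finite-difference scheme for a manufactured 1D Poisson problem, not the article's nonlinear unsaturated-seepage equations:

```python
import numpy as np

def solve_poisson_1d(n):
    """Central differences for u'' = -pi^2 sin(pi x), u(0) = u(1) = 0,
    whose exact solution is u = sin(pi x). Returns (element size, max error)."""
    h = 1.0 / (n + 1)
    x = np.linspace(h, 1.0 - h, n)                  # interior nodes
    A = (np.diag(-2.0 * np.ones(n))
         + np.diag(np.ones(n - 1), 1)
         + np.diag(np.ones(n - 1), -1)) / h**2      # discrete d^2/dx^2
    f = -np.pi**2 * np.sin(np.pi * x)
    u = np.linalg.solve(A, f)
    return h, np.max(np.abs(u - np.sin(np.pi * x)))

# Refine the mesh and fit the slope of log(error) vs log(element size).
sizes, errors = zip(*(solve_poisson_1d(n) for n in (8, 16, 32, 64, 128)))
order = np.polyfit(np.log(sizes), np.log(errors), 1)[0]  # observed rate
```

For a correct second-order code the fitted slope sits near 2; a slope that stalls or wanders as the mesh refines is exactly the symptom the article's mathematical-convergence-domain check is designed to expose.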
31. Effects of TG and phosphate-free water-retaining agents on the quality characteristics of restructured horse meat.
- Author
-
王梓棚, 张斌, 蒋洪安, 顾雪敏, 梅洁, 马慧, and 孔令明
- Subjects
HORSEMEAT ,TEST methods ,RAW materials ,GLUTAMINE ,WATER quality ,SORBITOL - Abstract
Copyright of Food Research & Development is the property of Food Research & Development Editorial Department and its content may not be copied or emailed to multiple sites or posted to a listserv without the copyright holder's express written permission. However, users may print, download, or email articles for individual use. This abstract may be abridged. No warranty is given about the accuracy of the copy. Users should refer to the original published version of the material for the full abstract. (Copyright applies to all Abstracts.)
- Published
- 2024
- Full Text
- View/download PDF
32. Organoid intelligence for developmental neurotoxicity testing.
- Author
-
El Din, Dowlette-Mary Alam, Shin, Jeongwon, Lysinger, Alexandra, Roos, Matthew J., Johnson, Erik C., Shafer, Timothy J., Hartung, Thomas, and Smirnova, Lena
- Subjects
ARTIFICIAL intelligence ,MACHINE learning ,NEUROTOXICOLOGY ,TEST methods ,XENOBIOTICS - Abstract
The increasing prevalence of neurodevelopmental disorders has highlighted the need for improved testing methods to determine developmental neurotoxicity (DNT) hazard for thousands of chemicals. This paper proposes integrating organoid intelligence (OI), which leverages brain organoids to study neuroplasticity in vitro, into the DNT testing paradigm. OI brings a new approach to measure the impacts of xenobiotics on plasticity mechanisms, a critical biological process that is not adequately covered in current DNT in vitro assays. Finally, the integration of artificial intelligence (AI) techniques will further facilitate the analysis of complex brain organoid data to study these plasticity mechanisms. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
33. Issues with Cefiderocol Testing: Comparing Commercial Methods to Broth Microdilution in Iron-Depleted Medium—Analyses of the Performances, ATU, and Trailing Effect According to EUCAST Initial and Revised Interpretation Criteria.
- Author
-
Stracquadanio, Stefano, Nicolosi, Alice, Marino, Andrea, Calvo, Maddalena, and Stefani, Stefania
- Subjects
- *
ACINETOBACTER baumannii , *KLEBSIELLA pneumoniae , *GRAM-negative bacteria , *PSEUDOMONAS aeruginosa , *TEST methods - Abstract
Background: The rise of multi-drug-resistant Gram-negative bacteria necessitates the development of new antimicrobial agents. Cefiderocol shows promising activity by exploiting bacterial iron transport systems to penetrate the outer membranes of resistant pathogens. Objectives: This study evaluates the efficacy of cefiderocol testing methods and trailing effect impact using a ComASP® Cefiderocol panel, disk diffusion (DD), and MIC test strips (MTS) compared to iron-depleted broth microdilution (ID-BMD). Methods: A total of 131 Gram-negative strains from clinical samples was tested by commercial methods and the gold standard. Results were interpreted as per 2024 and 2023 EUCAST guidelines. Results: ID-BMD revealed high cefiderocol susceptibility among Enterobacterales and Pseudomonas aeruginosa, with one Klebsiella pneumoniae isolate being resistant. Acinetobacter baumannii exhibited higher MIC values, particularly considering trailing effects that complicated MIC readings. ComASP® showed 97% categorical agreement (CA) and 66% essential agreement (EA) with ID-BMD for Enterobacterales but failed to detect the resistant K. pneumoniae. DD tests demonstrated variable CA (72% or 93%), with 38% or 34% of strains falling within the ATU according to EUCAST Breakpoint Tables v13.0 and v14.0, respectively, and major errors only. MTS for P. aeruginosa had 100% CA but 44% EA, and often underestimated MIC values. Conclusions: The study emphasizes the need for standardized criteria to address trailing effects and the ATU and highlights the discrepancies between testing methods. While cefiderocol resistance remains rare, accurate susceptibility testing is crucial for its effective clinical use. The findings suggest that current commercial tests have limitations, necessitating careful interpretation and potential supplementary testing to guide appropriate antibiotic therapy. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
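The categorical agreement (CA) and essential agreement (EA) figures quoted above are straightforward to compute once paired results are tabulated: CA is the fraction of isolates assigned the same S/I/R category by both methods, and EA is the fraction of test MICs within one twofold dilution of the reference MIC. A minimal sketch, with hypothetical paired readings rather than the study's data:

```python
import math

def categorical_agreement(test_cats, ref_cats):
    """Fraction of isolates given the same S/I/R category by both methods."""
    return sum(t == r for t, r in zip(test_cats, ref_cats)) / len(ref_cats)

def essential_agreement(test_mics, ref_mics):
    """Fraction of test MICs within +/- one twofold dilution of the reference."""
    hits = sum(abs(math.log2(t) - math.log2(r)) <= 1.0 + 1e-9
               for t, r in zip(test_mics, ref_mics))
    return hits / len(ref_mics)

# Hypothetical paired MICs (mg/L) and S/R calls for five isolates.
ref_mics  = [0.25, 0.5, 1.0, 2.0, 8.0]
test_mics = [0.25, 1.0, 4.0, 2.0, 8.0]
ea = essential_agreement(test_mics, ref_mics)   # one reading is 2 dilutions off
ca = categorical_agreement("SSSRR", "SSSSR")    # one discordant category
```

Low EA with high CA, as reported for the MTS method, means the MIC values drift within the same interpretive category, which the abstract's call for supplementary testing reflects.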
34. Caries Detection and Classification in Photographs Using an Artificial Intelligence-Based Model—An External Validation Study.
- Author
-
Frenkel, Elisabeth, Neumayr, Julia, Schwarzmaier, Julia, Kessler, Andreas, Ammar, Nour, Schwendicke, Falk, Kühnisch, Jan, and Dujic, Helena
- Subjects
- *
ARTIFICIAL intelligence , *DEEP learning , *DENTAL caries , *TEST methods , *PHOTOGRAPHS - Abstract
Objective: This ex vivo diagnostic study aimed to externally validate a freely accessible AI-based model for caries detection, classification, localisation and segmentation using an independent image dataset. It was hypothesised that there would be no difference in diagnostic performance compared to previously published internal validation data. Methods: For the independent dataset, 718 dental images representing different stages of carious (n = 535) and noncarious teeth (n = 183) were retrieved from the internet. All photographs were evaluated by the dental team (reference standard) and the AI-based model (test method). Diagnostic performance was statistically determined using cross-tabulations to calculate accuracy (ACC), sensitivity (SE), specificity (SP) and area under the curve (AUC). Results: An overall ACC of 92.0% was achieved for caries detection, with an ACC of 85.5–95.6%, SE of 42.9–93.3%, SP of 82.1–99.4% and AUC of 0.702–0.909 for the classification of caries. Furthermore, 97.0% of the cases were accurately localised. Fully and partially correct segmentation was achieved in 52.9% and 44.1% of the cases, respectively. Conclusions: The validated AI-based model showed promising diagnostic performance in detecting and classifying caries using an independent image dataset. Future studies are needed to investigate the validity, reliability and practicability of AI-based models using dental photographs from different image sources and/or patient groups. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
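The performance figures in the abstract above (ACC, SE, SP) derive from a 2x2 cross-tabulation of the AI model's calls against the dental team's reference standard. A minimal sketch; the cell counts below are hypothetical and chosen only to respect the dataset's 535 carious / 183 noncarious split:

```python
def diagnostic_metrics(tp, fp, tn, fn):
    """Accuracy, sensitivity, and specificity from a 2x2 cross-tabulation."""
    acc = (tp + tn) / (tp + fp + tn + fn)
    se = tp / (tp + fn)   # recall on carious teeth
    sp = tn / (tn + fp)   # recall on noncarious teeth
    return acc, se, sp

# Hypothetical cell counts; marginals match 535 carious and 183 noncarious.
acc, se, sp = diagnostic_metrics(tp=500, fp=15, tn=168, fn=35)
```

With a 3:1 class imbalance like this, overall accuracy is dominated by the carious class, which is why the study reports SE, SP, and AUC per caries class rather than accuracy alone.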
35. Spatial Resolution Enhancement Framework Using Convolutional Attention-Based Token Mixer.
- Author
-
Peng, Mingyuan, Li, Canhai, Li, Guoyuan, and Zhou, Xiaoqing
- Subjects
- *
SPATIAL resolution , *MULTISENSOR data fusion , *REMOTE sensing , *REMOTE-sensing images , *TEST methods - Abstract
Spatial resolution enhancement in remote sensing data aims to augment the level of detail and accuracy in images captured by satellite sensors. We proposed a novel spatial resolution enhancement framework using the convolutional attention-based token mixer method. This approach leveraged spatial context and semantic information to improve the spatial resolution of images. This method used the multi-head convolutional attention block and sub-pixel convolution to extract spatial and spectral information and fused them using the same technique. The multi-head convolutional attention block can effectively utilize the local information of spatial and spectral dimensions. The method was tested on two data types: a visual-thermal dataset and a visual-hyperspectral dataset. Our method was also compared with state-of-the-art methods, including traditional methods and deep learning methods. The experiment results showed that the method was effective and outperformed state-of-the-art methods in overall, spatial, and spectral accuracies. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
36. Cloud Detection Using a UNet3+ Model with a Hybrid Swin Transformer and EfficientNet (UNet3+STE) for Very-High-Resolution Satellite Imagery.
- Author
-
Choi, Jaewan, Seo, Doochun, Jung, Jinha, Han, Youkyung, Oh, Jaehong, and Lee, Changno
- Subjects
- *
TRANSFORMER models , *CONVOLUTIONAL neural networks , *REMOTE-sensing images , *COMMERCIAL product testing , *TEST methods , *DEEP learning - Abstract
It is necessary to extract and recognize the cloud regions presented in imagery to generate satellite imagery as analysis-ready data (ARD). In this manuscript, we proposed a new deep learning model to detect cloud areas in very-high-resolution (VHR) satellite imagery by fusing two deep learning architectures. The proposed UNet3+ model with a hybrid Swin Transformer and EfficientNet (UNet3+STE) was based on the structure of UNet3+, with the encoder sequentially combining EfficientNet based on mobile inverted bottleneck convolution (MBConv) and the Swin Transformer. By sequentially utilizing convolutional neural networks (CNNs) and transformer layers, the proposed algorithm aimed to extract the local and global information of cloud regions effectively. In addition, the decoder used MBConv to restore the spatial information of the feature map extracted by the encoder and adopted the deep supervision strategy of UNet3+ to enhance the model's performance. The proposed model was trained using the open dataset derived from KOMPSAT-3 and 3A satellite imagery, and a comparative evaluation with state-of-the-art (SOTA) methods was conducted on fourteen test datasets at the product level. The experimental results confirmed that the proposed UNet3+STE model outperformed the SOTA methods and demonstrated the most stable precision, recall, and F1 score values with fewer parameters and lower complexity. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
37. Hybrid Refractive and Diffractive Testing Method for Free-Form Convex Mirror in High-Resolution Remote-Sensing Cameras.
- Author
-
Deng, Nan, Li, Yanjie, Ma, He, and Zhang, Feifei
- Subjects
- *
REMOTE-sensing images , *CONVEX surfaces , *OPTICAL aberrations , *HOLOGRAPHY , *TEST methods - Abstract
The development of high-resolution and large field of view remote-sensing cameras is inextricably linked to the application of free-form mirrors. The free-form mirror offers greater design freedom and is more effective at correcting aberrations in optical systems. The surface shape error of a free-form mirror directly affects the imaging quality of remote-sensing cameras. Consequently, a high-precision free-form mirror detection method is of paramount importance. For the convex free-form surface mirror with a large aperture, a hybrid refractive and diffractive testing method combining computer-generated holography (CGH) and spherical mirrors for high-precision null testing is proposed in this paper. When comparing the effect of error and the detection sensitivity of different designs, the results showed that the influence of the system error is reduced by about 42% and the sensitivity is increased by more than 2.6 times. The proposed method can achieve higher testing accuracy and represents an effective and feasible approach to surface shape detection. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
38. Robust biomimetic strain sensor based on butterfly wing-derived skeleton structure.
- Author
-
Teng, Fu-Rui, Tan, Si-Chen, Fang, Jia-Bin, Zi, Tao-Qing, Wu, Di, and Li, Ai-Dong
- Subjects
- *
STRAIN sensors , *SENSOR arrays , *FINITE element method , *FACIAL expression , *TEST methods , *CARBON nanotubes - Abstract
A biomimetic strain sensor was designed and constructed based on Ir nanoparticles-modified multi-wall carbon nanotubes (Ir NPs@MWCNTs) and a parallel Pt layer/dragon skin with carbonized butterfly wing patterns. This sensor exhibits a high gauge factor (∼515.4), an extensive tensile range (0%–96%), and a swift response (∼300 ms), with remarkable stability for up to 60,000 cycles. The working mechanism has been proposed based on experimental tests and the finite-element method. Some important applications such as human motion and micro-expression recognition have been confirmed using a 3 × 3 flexible biomimetic sensor array. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
39. Review on bonding strength testing methods for polymer-based microfluidics.
- Author
-
Lu, Yuhan, Ma, Liang, Chen, Lida, Wan, Penghui, and Fan, Yiqiang
- Subjects
- *
TENSILE tests , *MICROFLUIDICS , *TEST methods , *BOND strengths , *ADHESIVES , *MICROFLUIDIC devices - Abstract
Bonding is the key step in the fabrication of microfluidic devices. In the conventional approaches for the fabrication of polymer-based microfluidics, the substrate and cover plate are fabricated and then bonded to enclose the microchannels. Various methods have been invented for bonding polymer-based microfluidics, e.g., adhesive bonding, solvent bonding, and thermal fusion; substrates of the same or of different materials can be bonded. The bonding quality of polymer-based microfluidics is critical during use; leakage or even detachment is not allowed. The bonding quality of polymer-based microfluidic devices has been evaluated in different ways; some of the typical methods include the tensile test, shear test, and burst opening test, and each method has its own strengths and weaknesses. A standard procedure for bonding strength testing hasn't been established yet. In this study, different evaluation methods for bonding quality are discussed and compared in detail, and the application scenarios for each method are also discussed. An outlook on the future standardization of bonding test methods for microfluidic devices is also provided in this study. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
40. An automated commercial open access assay for detection of Mycoplasma genitalium macrolide resistance.
- Author
-
Lindroth, Ylva, Hansson, Lucia, and Forslund, Ola
- Subjects
- *
DETECTION limit , *MYCOPLASMA , *GENETIC mutation , *TEST methods , *AZITHROMYCIN - Abstract
Azithromycin, a macrolide antibiotic, is the first‐line treatment for Mycoplasma genitalium (MG), but resistant MG is an increasing problem. Macrolide resistance‐mediated mutations (MRM) have been linked to point mutations in region V of the MG 23S rRNA gene. We have evaluated the ability of an open access analyzer (Panther Fusion, Hologic Inc) to detect MRM (mutations A2071G and A2072G) and MG wild type (WT) in clinical samples. The agreement of the Panther Fusion assay results with a corresponding established in‐house MRM‐WT PCR (ABI 7500) was also calculated. Leftover material from 55 clinical samples positive for MG by the Aptima test (Hologic), based on transcription‐mediated amplification (TMA), collected from January to February 2023 in Region Skåne, Sweden, was analyzed. Specific amplification curves were generated for positive controls of MG mutations (A2071G and A2072G) and WT by the Panther Fusion assay. The limit of detection (LOD) was 5.3 copies/mL for WT, 8.1 copies/mL for mutation A2071G, and 81 copies/mL for mutation A2072G. The overall concordance was 91% between the Panther Fusion and the in‐house PCR (Kappa 0.621, 95% CI; 0.327–0.914) for detection of WT or MRM in MG‐positive clinical samples. The Panther Fusion detected MRM in 20% (11/55) and WT in 62% (34/55) of the samples. The corresponding in‐house PCR results were 25% (14/55) and 65% (36/55). In summary, the Panther Fusion assay demonstrated detection of low copy numbers of MRM and WT of MG. Among clinical samples, substantial agreement between the Panther Fusion and in‐house PCR results was observed. Integrating MG analysis (TMA) and the MRM‐WT assay on the Panther platform could make MRM testing more readily available. However, the Panther Fusion had a lower success rate (82% vs 90%) for macrolide susceptibility testing; hence, testing with a complementary method should be considered for samples where neither WT nor MRM MG is detectable. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
41. Defining the segmental tension generated in a vertebral body tethering system for scoliosis.
- Author
-
Upasani, Vidyadhar V., Farnsworth, Christine L., Caffrey, Jason P., Olmert, Tony, Brink, Ian, Cain, Phoebe, and Mannen, Erin
- Subjects
- *
ORTHOPEDISTS , *INTRACLASS correlation , *SPRING , *SCOLIOSIS , *TEST methods - Abstract
Vertebral body tethering (VBT) uses a flexible tether affixed across the curve convexity, with tension applied at each segment, to treat scoliosis. Intraoperative tether tension may be achieved directly with a counter-tensioner or with an extension spring tube. The purpose of this study was to quantify the force generated with and without the extension spring tube using current FDA-approved VBT instrumentation, to understand the variation between surgeons using the same instrumentation, and to define the force range generated intra-operatively. Using a benchtop mechanical testing setup to simulate a spinal segment, we affixed the tether and applied tension using a tensioner and counter-tensioner alone (method T1) or by adding an extension spring tube (method T2). Eight orthopedic surgeons used T1 and T2 at six tensioner settings, and one surgeon completed three trials. A two-way ANOVA with a Tukey's HSD post hoc test (p < 0.05) compared the tensioner methods and testing levels. Inter- and intra-rater reliabilities were calculated using intraclass correlation coefficients (ICCs). Methods T1 and T2 exhibited linear tension-setting relationships with high coefficients of determination (R² > 0.93). T2 consistently produced higher forces (increase of 62.1 N/setting) than T1 (increase of 50.6 N/setting, p < 0.05). Inter-rater reliability exhibited excellent agreement (ICC = 0.951 and 0.943 for T1 and T2, respectively), as did intra-rater reliability (ICC = 0.971). [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
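The inter- and intra-rater reliability analysis in the abstract above can be illustrated with a short intraclass correlation computation. This is a minimal sketch of the standard ICC(2,1) formula (two-way random effects, absolute agreement, single rater), not the authors' analysis code, and the example force values are invented:

```python
import numpy as np

def icc_2_1(ratings):
    """ICC(2,1): two-way random-effects, absolute-agreement, single-rater
    intraclass correlation for an n_subjects x k_raters score matrix."""
    y = np.asarray(ratings, dtype=float)
    n, k = y.shape
    grand = y.mean()
    ss_rows = k * np.sum((y.mean(axis=1) - grand) ** 2)    # between subjects
    ss_cols = n * np.sum((y.mean(axis=0) - grand) ** 2)    # between raters
    ss_err = np.sum((y - grand) ** 2) - ss_rows - ss_cols  # residual
    msr = ss_rows / (n - 1)
    msc = ss_cols / (k - 1)
    mse = ss_err / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Hypothetical tether forces (N) scored by three raters on five trials
forces = [[101.0, 103.0, 102.0],
          [151.0, 150.0, 153.0],
          [202.0, 205.0, 203.0],
          [252.0, 250.0, 251.0],
          [301.0, 304.0, 302.0]]
icc = icc_2_1(forces)  # close to 1: raters agree almost perfectly
```

Values near 1 correspond to the "excellent agreement" range reported in the study (ICC ≈ 0.94–0.97).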
42. New classes of goodness‐of‐fit tests for the one‐sided Lévy distribution.
- Author
-
Kumari, Aditi, Bhati, Deepesh, and Batsidis, Apostolos
- Subjects
- *
FALSE positive error , *ERROR rates , *TEST methods , *CONFORMANCE testing , *PERFORMANCE theory - Abstract
New estimators for the scale parameter of the one‐sided Lévy distribution are proposed, which are used for constructing new classes of goodness‐of‐fit (gof) tests, based on the ratio of two estimators of the scale parameter. A Monte Carlo study examines the performance of the new gof tests in controlling type I error rate and evaluates their power performance. The performance of the new classes of gof test is also compared with existing tests. Finally, the applicability of the new gof tests is illustrated using three real data sets. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
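A ratio-of-estimators goodness-of-fit statistic like the one described above can be sketched as follows. The two scale estimators used here (the maximum-likelihood estimator and a median-based one) are illustrative choices, not necessarily the estimator classes proposed in the paper; under the one-sided Lévy model with scale c, both are consistent, so their ratio concentrates near 1 when the model holds:

```python
import numpy as np
from scipy import special, stats

def scale_mle(x):
    """ML estimator of the one-sided Lévy scale: c_hat = n / sum(1/x_i)."""
    x = np.asarray(x, dtype=float)
    return x.size / np.sum(1.0 / x)

def scale_median(x):
    """Median-based estimator, from F(m) = erfc(sqrt(c/(2m))) = 1/2."""
    return 2.0 * np.median(x) * special.erfcinv(0.5) ** 2

def ratio_statistic(x):
    """Ratio of the two scale estimators; near 1 under the Lévy model."""
    return scale_mle(x) / scale_median(x)

rng = np.random.default_rng(0)
x = stats.levy.rvs(scale=2.0, size=5000, random_state=rng)  # H0 sample
r = ratio_statistic(x)
```

In a Monte Carlo study one would simulate the null distribution of r to obtain critical values, mirroring the type I error and power analysis in the abstract.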
43. Sliding window based adaptative fuzzy measure for edge detection.
- Author
-
Marco‐Detchart, Cedric, Lucca, Giancarlo, Santos Silva, Miquéias Amorim, Rincon, Jaime A., Julian, Vicente, and Dimuro, Graçaliz
- Subjects
- *
FEATURE extraction , *INTEGRAL functions , *INFORMATION measurement , *TEST methods , *GENERALIZATION , *FUZZY measure theory - Abstract
In this work, we explore the impact of adaptive fuzzy measures on edge detection, aiming to enhance how computers interpret images by identifying edges more accurately. Traditional methods rely on analysing changes in image brightness to locate edges, but they often use fixed rules that do not account for the unique characteristics of each image. Our approach differs by adjusting fuzzy measures based on the information within specific areas of an image under a sliding window approach, utilizing a variety of fusion functions and generalizations of the Choquet integral to analyse and combine pixel data. The proposed method is flexible, allowing for the adaptation of measures in response to the image's local features. We put our method to the test against the well‐established Canny edge detector to evaluate its effectiveness. Our experimental results suggest that by adapting fuzzy measures for each image section, we can improve edge detection results. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
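The sliding-window Choquet aggregation described above can be sketched in a toy form. The symmetric power measure mu(A) = (|A|/n)^q and the variance-based adaptation rule below are illustrative stand-ins for the paper's fusion functions, Choquet-integral generalizations, and adaptive measures:

```python
import numpy as np

def choquet_power(values, q):
    """Discrete Choquet integral of `values` w.r.t. the symmetric power
    measure mu(A) = (|A|/n)**q (for a symmetric measure this is an OWA)."""
    x = np.sort(np.asarray(values, dtype=float))  # ascending order
    n = x.size
    sizes = np.arange(n, 0, -1)                   # |A_(i)| for each sorted value
    mu = (sizes / n) ** q
    mu_next = np.append(mu[1:], 0.0)
    return float(np.sum(x * (mu - mu_next)))

def edge_strength(img, q_low=0.5, q_high=2.0):
    """Aggregate absolute differences to the 8 neighbours of each pixel with
    a Choquet integral whose exponent adapts to the local window variance."""
    img = np.asarray(img, dtype=float)
    h, w = img.shape
    out = np.zeros((h, w))
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            win = img[i - 1:i + 2, j - 1:j + 2]
            diffs = np.abs(win - img[i, j]).ravel()
            diffs = np.delete(diffs, 4)  # drop the centre pixel itself
            # Illustrative adaptation: flat windows use a conservative
            # exponent, busy windows a sensitive one.
            q = q_high if win.var() < 1e-3 else q_low
            out[i, j] = choquet_power(diffs, q)
    return out
```

With q = 1 the Choquet integral reduces to the arithmetic mean of the neighbour differences, which makes the role of the measure easy to check.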
44. Factors Influencing Measurement of Dynamic Elastic Modulus from Disk-Shaped Concrete Specimen.
- Author
-
Kim, Min Suk, Son, Jeong Jin, Chung, Chul-Woo, and Lee, Chang Joon
- Subjects
ELASTIC modulus ,REFERENCE values ,TEST methods ,DYNAMIC testing ,RESONANCE - Abstract
The decrease in dynamic elastic modulus is a primary indicator of quantitative damage in concrete. To quantitatively assess depth-by-depth damage within a concrete structure, cylindrical specimens obtained through coring can be cut into disk specimens to measure the dynamic elastic modulus of concrete at each depth. To minimize external damage during coring, it is essential to extract cylinders with the smallest possible diameter. In addition, for higher resolution in depth-based damage assessment, creating disk specimens with the smallest possible thickness is necessary. However, there is no information available in the literature on the experimental limits of the smallest feasible diameter and thickness for dynamic elastic modulus measurements on disk-shaped specimens. This study evaluated whether the dynamic modulus measured from various sizes of concrete disk specimens provided sufficient reliability compared to reference values obtained from cylinders. Moreover, the study examined how the presence of coarse aggregate and variation in the water–cement ratio significantly influenced the dynamic modulus measurement. In addition, test results from the impulse excitation technique (IET) and impact resonance (IR) were compared to find the more reliable test method for the dynamic elastic modulus of disk specimens. The experimental findings revealed that as the thickness-to-radius ratio of the disk specimens decreased, the variation in measured data increased. Mortar specimens without coarse aggregates showed less variability compared to concrete specimens, and the variation in dynamic modulus measured by IR was lower than that measured by IET. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
45. Efficiency of Parametric and Non Parametric Indices as the Indicators of Grain Yield Stability of Bread Wheat (Triticum aestivum L.) Genotypes under Rainfall Conditions.
- Author
-
Bendada, H., Mehanni, O., Louahdi, A. N., Selloum, S., Guemaz, S., Frih, B., and Guendouz, A.
- Subjects
- *
AGRICULTURE , *GENOTYPES , *TEST methods , *BREAD , *SEASONS , *GRAIN yields - Abstract
Background: Under rainfall conditions, stability of grain yield across diverse environments has been one of the most important objectives of breeding programs. Stability analysis is the best method to test the relative performance of genotypes over environments. Thus, the aim of this study was to select adapted and stable bread wheat genotypes based on several parametric and non-parametric indices. Methods: The experiment was conducted over four consecutive agricultural seasons (2016-17, 2017-18, 2018-19 and 2019-20) at the experimental station of Setif (ITGC). Eight genotypes of bread wheat were tested, using the Stability software program to calculate the parametric and non-parametric indices. Result: The association between Wricke's ecovalence (Wi²), the mean variance component (θi) and the stability variance (ω²i) indices with respect to grain yield revealed that G1, G2, Hidhab, Arz, Wifak and Ain Abid are suitable genotypes for growing under variable environmental conditions. In addition, selection based on the non-parametric indices, and combined selection based on highest grain yield together with the parametric and non-parametric indices, showed that the genotypes G1, G2 and Wifak are the most stable and adapted genotypes under semi-arid conditions. Further, in terms of the static and dynamic concepts, the parametric indices bi and CVi are related to the dynamic concept, while the other indices are associated with the static stability concept. Overall, the results of this study confirmed that parametric and non-parametric methods are suitable tools to identify the most stable bread wheat genotypes under various environmental conditions. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
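Wricke's ecovalence, one of the parametric stability indices named above, has a simple closed form: Wi² is the sum of squared genotype-by-environment interaction effects for genotype i, with smaller values indicating greater (static) stability. A minimal sketch with invented yield data:

```python
import numpy as np

def wricke_ecovalence(yields):
    """Wricke's ecovalence W_i^2 for a genotype-by-environment yield matrix.

    yields: 2-D array, rows = genotypes, columns = environments.
    Returns one W_i^2 per genotype; smaller means more stable.
    """
    y = np.asarray(yields, dtype=float)
    g_mean = y.mean(axis=1, keepdims=True)      # genotype means
    e_mean = y.mean(axis=0, keepdims=True)      # environment means
    grand = y.mean()
    interaction = y - g_mean - e_mean + grand   # GxE interaction effects
    return (interaction ** 2).sum(axis=1)

# Toy example: 3 genotypes x 4 environments (invented yields, t/ha)
y = np.array([[3.0, 4.0, 5.0, 6.0],
              [4.0, 4.1, 3.9, 4.0],
              [2.0, 5.0, 3.0, 7.0]])
w2 = wricke_ecovalence(y)  # third genotype interacts most with environment
```

A purely additive genotype (yield = genotype effect + environment effect) has Wi² = 0, which is a convenient sanity check for the implementation.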
46. Determination of In-use Properties of Paper Towels.
- Author
-
Soon Wan Kweon, Young Chan Ko, Yong Ju Lee, Ji Eun Cha, Byoung Geun Moon, and Hyoung Jin Kim
- Subjects
- *
TENSILE tests , *PAPER towels , *PAPER products , *TENSILE strength , *TEST methods - Abstract
For hygiene papers such as tissue and towel, strength, softness, and absorbency are known as the attributes that a user looks for. It is proposed here that purchasing decisions are likely to be influenced by in-use experiences, which may be quite different from the physical properties measured with current standardized tests. There have been continuous efforts to develop physical test methods to replace subjective in-use tests, because the benefits of the former are too significant to be overlooked. This paper considered some in-use test methods for paper towel products that can be carried out quickly by panel members in the course of sensory panel testing. In addition, laboratory tests were developed in an attempt to quantify such input. The sensory panel testing showed that (wet) strength and absorbency were the key contributors to the performance of paper towels; softness did not show any significant contribution. Wet strength showed a high correlation with absorbency. The (wet) ball burst strength had the highest correlation with the in-use strength. Although both the tensile strength and the ball burst strength had a high correlation with preference, the ball burst tester is preferred because it is more reproducible and simpler to operate. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
47. Research on characterization methods for the anti-sagging performance of polyethylene.
- Author
-
Wang, Jingfan, Jin, Yucheng, Zhuang, Liqi, He, Ruiheng, and Zhao, Shicheng
- Subjects
- *
MATERIALS testing , *MOLECULAR weights , *TEST methods , *RHEOLOGY , *VISCOSITY - Abstract
The quality of anti-sagging performance directly influences the application scope of bimodal polyethylene. Although research exists on characterization methods for the anti-sagging performance of bimodal polyethylene, few studies have indicated which testing methods can most accurately characterize this property. This study compares different testing indicators, including melt flow index, melt strength, rheological behavior, molecular weight, and entanglement of molecular chains, to identify the most precise ones for characterizing the anti-sagging performance of polyethylene. A comprehensive analysis combining the properties of the test materials and the test results indicates that zero shear viscosity and relaxation time can effectively characterize the anti-sagging resistance of polyethylene. In contrast, melt flow index, melt strength, and entanglement of molecular chains are inadequate for accurately characterizing anti-sagging performance. This study provides an effective and accurate method for characterizing the modification of the anti-sagging performance of polyethylene. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
48. A data-driven approach to analyze bubble deformation in turbulence.
- Author
-
Calado, Andre, Capuano, Francesco, and Balaras, Elias
- Subjects
- *
TURBULENCE , *EDDIES , *COMPUTER simulation , *TEST methods , *SPHERES - Abstract
Bubble deformation and breakup from turbulence are present in many engineering applications and in nature, yet the physical mechanisms still remain poorly understood. Depending on the local turbulence intensity or Weber number, a bubble may deform without breakup, suffer a violent breakup, or exhibit a resonant behavior, where the turbulent eddies excite the bubble's natural frequencies. Recent studies have used spherical harmonic decomposition to analyze bubble interaction with turbulence, quantifying the deformation energy of each eigenmode. However, this approach is only applicable for small levels of deformation (the linear regime), while the bubble shape remains close to a sphere. In the present work, we present a novel data-driven approach combining large deformation diffeomorphic metric mapping and proper orthogonal decomposition, which is more robust for large deformations. The method is tested on a set of validation cases and applied to turbulent bubble deformation cases obtained from direct numerical simulation data. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
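The proper orthogonal decomposition half of the approach described above can be sketched via the SVD of a mean-centred snapshot matrix. This is a generic POD sketch, not the authors' pipeline: the LDDMM shape-matching step is omitted, and the synthetic signals below merely stand in for deformation snapshots:

```python
import numpy as np

def pod(snapshots, n_modes):
    """POD of a snapshot matrix (features x samples) via the SVD.
    Returns spatial modes, singular values, modal coefficients, and mean."""
    X = np.asarray(snapshots, dtype=float)
    X_mean = X.mean(axis=1, keepdims=True)            # centre the snapshots
    U, s, Vh = np.linalg.svd(X - X_mean, full_matrices=False)
    modes = U[:, :n_modes]                            # orthonormal spatial modes
    coeffs = s[:n_modes, None] * Vh[:n_modes]         # coefficients per snapshot
    return modes, s[:n_modes], coeffs, X_mean

# Synthetic "deformation" data: two coherent patterns plus weak noise
rng = np.random.default_rng(1)
grid = np.linspace(0, 2 * np.pi, 200)
t = np.linspace(0, 1, 60)
X = (np.outer(np.sin(grid), np.cos(6 * t))
     + 0.3 * np.outer(np.sin(2 * grid), np.sin(12 * t))
     + 0.01 * rng.standard_normal((200, 60)))
modes, s, coeffs, X_mean = pod(X, n_modes=2)
recon = X_mean + modes @ coeffs  # rank-2 reconstruction of the snapshots
```

Because the synthetic data are rank two up to noise, two modes reconstruct the snapshots almost exactly, which mirrors how POD isolates dominant deformation patterns.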
49. Entropic Regression Dynamic Mode Decomposition (ERDMD) Discovers Informative Sparse and Nonuniformly Time Delayed Models.
- Author
-
Curtis, Christopher W., Bollt, Erik, and Alford-Lago, Daniel Jay
- Subjects
- *
INFORMATION theory , *BUILDING repair , *MODEL theory , *TEST methods , *ENTROPY - Abstract
In this work, we present a method that determines optimal multistep Dynamic Mode Decomposition (DMD) models via Entropic Regression (ER), which is a nonlinear information flow detection algorithm. Motivated by the Higher-Order DMD (HODMD) method of [Le Clainche & Vega, 2017], and the ER technique for network detection and model construction found in [Sun et al., 2015; AlMomani et al., 2020], we develop a method that we call ERDMD, which produces high fidelity time-delay DMD models that allow for nonuniformity in the delays. This optimal choice of delays is discovered by maximizing informativity as measured through ER. These models are shown to be highly efficient and robust. We test our method over several data sets generated by chaotic attractors and show that we are able to build excellent reconstructions using relatively minimal models. We likewise are able to better identify multiscale features via our models which enhances the utility of DMD. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
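The nonuniform time-delay embedding at the heart of such models can be sketched as follows. ERDMD itself selects the delays by entropic regression; the sketch below assumes a delay set is already given and simply feeds the delay-embedded data to standard exact DMD:

```python
import numpy as np

def delay_embed(X, delays):
    """Stack delayed copies of the snapshot matrix X (states x time).
    `delays` is a list of (possibly nonuniform) integer lags, e.g. [0, 1, 4]."""
    d_max = max(delays)
    cols = X.shape[1] - d_max
    return np.vstack([X[:, d:d + cols] for d in delays])

def dmd(X, rank):
    """Exact DMD: eigenvalues/modes of the best-fit linear map X2 ~ A @ X1."""
    X1, X2 = X[:, :-1], X[:, 1:]
    U, s, Vh = np.linalg.svd(X1, full_matrices=False)
    U, s, Vh = U[:, :rank], s[:rank], Vh[:rank]
    A_tilde = U.conj().T @ X2 @ Vh.conj().T / s   # projected operator
    eigvals, W = np.linalg.eig(A_tilde)
    modes = X2 @ Vh.conj().T / s @ W              # exact DMD modes
    return eigvals, modes

# A damped oscillation, recovered from a delay-embedded scalar signal
t = np.linspace(0, 10, 500)
x = (np.exp(-0.1 * t) * np.cos(2 * t)).reshape(1, -1)
H = delay_embed(x, delays=[0, 1, 4])  # nonuniform lags, in ERDMD's spirit
eigvals, _ = dmd(H, rank=2)
```

Taking log(eigvals)/dt recovers the continuous-time eigenvalues -0.1 ± 2i of the underlying signal, showing that the nonuniform embedding preserves the dynamics.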
50. Calculation of Geomagnetic Cutoff Rigidity Using Tracing Based on the Buneman–Boris Method.
- Author
-
Kruchinin, P. A., Malakhov, V. V., Golubkov, V. S., and Mayorov, A. G.
- Subjects
- *
GEOMAGNETISM , *TEST methods , *ALTITUDES - Abstract
The article develops a method for determining the geomagnetic cutoff rigidity based on tracing of charged particles in Earth's magnetic field using the particle-in-cell method implemented in the Buneman–Boris scheme. To test the method, the geomagnetic cutoff rigidity is calculated in the field of an ideal dipole and in the field given by the IGRF model. In the first case, the obtained data are compared with analytical values; the calculation accuracy is 3 MV. In the second case, the penumbra pattern is reproduced in different geographical locations and for different periods, and the stability of the method to small perturbations of the initial parameters is investigated. As the main result, the article constructs and analyzes geomagnetic cutoff rigidity maps at low-orbit satellite altitudes for different directions in space, as well as their variations from 1900 to 2015. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
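The core of the tracing described above is the Boris particle pusher. The sketch below is the standard non-relativistic Boris step in a uniform field, not the article's implementation (which is relativistic and traces particles in the IGRF field); it illustrates the half-kick / rotation / half-kick structure and the exact energy conservation that makes the scheme attractive for long integrations:

```python
import numpy as np

def boris_step(x, v, q_over_m, E, B, dt):
    """One Boris step: half electric kick, magnetic rotation, half kick.
    Non-relativistic; E and B are field vectors at the particle position."""
    v_minus = v + 0.5 * q_over_m * E * dt          # first half electric kick
    t_vec = 0.5 * q_over_m * B * dt                # rotation vector
    s_vec = 2.0 * t_vec / (1.0 + np.dot(t_vec, t_vec))
    v_prime = v_minus + np.cross(v_minus, t_vec)   # magnetic rotation,
    v_plus = v_minus + np.cross(v_prime, s_vec)    # norm-preserving
    v_new = v_plus + 0.5 * q_over_m * E * dt       # second half electric kick
    return x + v_new * dt, v_new

# Gyration test: uniform B along z, no E -> circular orbit, |v| conserved
x = np.array([1.0, 0.0, 0.0])
v = np.array([0.0, 1.0, 0.0])
B = np.array([0.0, 0.0, 1.0])
E = np.zeros(3)
for _ in range(1000):
    x, v = boris_step(x, v, q_over_m=1.0, E=E, B=B, dt=0.01)
```

With these units the gyroradius is 1 and the orbit circles the point (2, 0, 0); the speed stays constant to machine precision because the magnetic rotation does no work.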