1,802 results for "Uncertainty propagation"
Search Results
2. Hybrid uncertainty propagation for mechanical dynamics problems via polynomial chaos expansion and Legendre interval inclusion function
- Author
- Wang, Liqun, Guo, Chengyuan, Xu, Fengjie, and Xiao, Hui
- Published
- 2025
- Full Text
- View/download PDF
3. Quantum annealing algorithm for interval uncertainty propagation analysis in structural free vibration
- Author
- Zhu, Jiazheng, Wang, Xiaojun, Wang, Zhenghuan, and Xu, Yusheng
- Published
- 2025
- Full Text
- View/download PDF
4. Multi-scale impact of geometric uncertainty on the interface bonding reliability of metal/polymer-based composites hybrid (MPH) structures
- Author
- Pan, Wenfeng, Sun, Lingyu, Yang, Xudong, Zhang, Yiben, Sun, Jiaxing, Shang, Jiachen, Yang, Zhengqing, and Xu, ChengDong
- Published
- 2025
- Full Text
- View/download PDF
5. Data-driven corrosion failure probabilistic assessment model for reinforced concrete structures
- Author
- Wu, Ren-jie, Min, Wan-lin, Liu, Qing-feng, Hein, Khant Swe, and Xia, Jin
- Published
- 2024
- Full Text
- View/download PDF
6. Multi-scale correlation reveals the evolution of socio-natural contributions to tropospheric HCHO over China from 2005 to 2022
- Author
- Xia, Hui, Wang, Dakang, Abad, Gonzalo González, Yang, Xiankun, Zhu, Lei, Pu, Dongchuan, Feng, Xu, Zhang, Aoxing, Song, Zhaolong, Mo, Yongru, and Wang, Jinnian
- Published
- 2024
- Full Text
- View/download PDF
7. Reliability-based composite pressure vessel design optimization with cure-induced stresses and spatial material variability
- Author
- Van Bavel, B., Shishkina, O., Vandepitte, D., and Moens, D.
- Published
- 2024
- Full Text
- View/download PDF
8. Robust optimal powered descent guidance via model predictive convex programming
- Author
- Xiao, Yizheng, Gong, Youmin, Mei, Jie, Ma, Guangfu, and Wu, Weiren
- Published
- 2025
- Full Text
- View/download PDF
9. Uncertainty propagation for microwave scattering parameter measurements subject to time-domain and time-gating transformations
- Author
- Skinner, James, Gruber, Maximilian, Eichstädt, Sascha, Appleby, Roger, and Ridler, Nick M.
- Published
- 2024
- Full Text
- View/download PDF
10. Modelling and mapping maize yields and making fertilizer recommendations with uncertain soil information.
- Author
- Takoutsing, Bertin, Heuvelink, Gerard B. M., Aynekulu, Ermias, and Shepherd, Keith D.
- Abstract
Crop models can improve our understanding of crop responses to environmental conditions and farming practices. However, uncertainties in model inputs can notably impact the quality of the outputs. This study aimed to quantify the uncertainty in soil information and analyse how it propagates through the Quantitative Evaluation of Fertility of Tropical Soils (QUEFTS) model to affect yield and fertilizer recommendation rates, using Monte Carlo simulation. Additional objectives were to analyse the uncertainty contributions of the individual soil inputs to model output uncertainty and to discuss strategies for communicating uncertainty to end-users. The results showed that the impact of soil input uncertainty on model output uncertainty was significant and varied spatially. Comparison of a deterministic model run with the mean of the Monte Carlo simulation runs showed systematic differences in yield predictions, with Monte Carlo simulations on average predicting a yield that was 0.62 tonnes ha−1 lower than the deterministic run. Similar systematic differences were observed for fertilizer recommendations, with Monte Carlo simulations recommending up to 59, 42, and 20 kg ha−1 lower nitrogen (N), phosphorus (P), and potassium (K) fertilizer applications, respectively. Stochastic sensitivity analysis showed that pH was the main source of uncertainty for K fertilizer (81.6%) and that soil organic carbon contributed most to the uncertainty of N fertilizer application (97%). Uncertainty in P fertilizer application mostly came from uncertainty in extractable phosphorus (55%) and exchangeable potassium (20%). A threshold probability map designed using statistical predictions served as a visual aid that could enable farmers to swiftly make informed decisions about fertilizer application locations.
The study highlights the importance of refining the accuracy of soil maps as well as incorporating uncertainty in input data, which improves QUEFTS model predictions and offers valuable insights into the relationship between soil information accuracy and reliable crop modeling for sustainable agricultural decisions. [ABSTRACT FROM AUTHOR]
- Published
- 2025
- Full Text
- View/download PDF
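The Monte Carlo propagation workflow this abstract describes (sample uncertain soil inputs, run the model once per draw, and compare the ensemble mean against a single deterministic run at the mean inputs) can be sketched in a few lines. The yield function and input distribution below are hypothetical stand-ins, not the QUEFTS equations:

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000

# Hypothetical concave yield response to soil organic carbon (g/kg);
# illustrative only, not the QUEFTS model equations.
def yield_t_ha(soc):
    return 6.0 * (1.0 - np.exp(-0.15 * soc))

soc_mean, soc_sd = 12.0, 4.0                  # assumed soil-input uncertainty
soc = rng.normal(soc_mean, soc_sd, N)         # Monte Carlo draws of the input

deterministic = yield_t_ha(soc_mean)          # single run at the mean input
mc_mean = yield_t_ha(soc).mean()              # mean over the Monte Carlo runs

# For a concave response, Jensen's inequality makes the Monte Carlo mean
# systematically lower than the deterministic run, the same direction of
# bias as the 0.62 t/ha gap reported in the abstract.
```

The systematic difference arises purely from pushing input uncertainty through a nonlinear model, which is why uncertainty-aware runs can change the recommendation, not just widen its error bars.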
11. On the Measurement of Laser Lines in 3D Space with Uncertainty Estimation.
- Author
- De Boi, Ivan, Ghaderi, Nasser, Vanlanduit, Steve, and Penne, Rudi
- Subjects
- GAUSSIAN mixture models, LASER measurement, CAMERA calibration, OPTICAL scanners, GAUSSIAN processes
- Abstract
Laser-based systems, essential in diverse applications, demand accurate geometric calibration to ensure precise performance. The calibration process of the system requires establishing a reliable relationship between input parameters and the corresponding 3D description of the outgoing laser beams. The quality of the calibration depends on the quality of the dataset of measured laser lines. To address this challenge, we present a stochastic method for measuring the coordinates of these lines, considering both the camera calibration uncertainties and measurement noise inherent in laser dot detection on a detection board. Our approach to composing an accurate dataset of lines utilises a standard webcam and a checkerboard, avoiding the need for specialised hardware. By modelling the uncertainties involved, we provide a probabilistic description of the fitted laser line, enabling quality assessment of the measurement and integration into subsequent algorithms. We also offer insights into the optimal number of board positions and the number of repeated laser dot measurements, which are both the main time-consuming factors in practice. In summary, our proposed method represents a significant advancement in the field of laser-based system calibration, offering a robust and efficient solution. [ABSTRACT FROM AUTHOR]
- Published
- 2025
- Full Text
- View/download PDF
12. State uncertainty propagation and sensitivity analysis of the post-impact binary asteroid system.
- Author
- Lu, Jucheng, Shang, Haibin, Dong, Yue, and Zhang, Xuefen
- Subjects
- DOUBLE Asteroid Redirection Test (U.S.), MONTE Carlo method, TWO-body problem (Physics), ASTEROID orbits, QUASIMOLECULES, ASTEROIDS, POLYNOMIAL chaos
- Abstract
The Double Asteroid Redirection Test (DART) mission demonstrated the feasibility of altering an asteroid's orbit through kinetic impact. However, uncertainties associated with the collision and the complex dynamics of the binary asteroid system often result in rough and inefficient predictions of the system's post-impact evolution. This paper proposes the use of arbitrary polynomial chaos expansion (aPCE) to efficiently evaluate the state uncertainty of a post-impact binary asteroid system without requiring complete information on the uncertainty sources. First, a perturbed full two-body problem model is developed to assess the momentum transfer during the collision and the system's subsequent evolution. The irregular shapes of the components and the momentum enhancement from the ejecta are considered to achieve reasonable evaluations. Next, aPCE is employed to construct a surrogate model capable of efficiently quantifying uncertainties. Global sensitivity analysis is then conducted to identify the main sources of uncertainty affecting the system's evolution. Benchmarking tests show that the aPCE model produces results comparable to Monte Carlo simulations, offering a good balance between accuracy and efficiency. The data-driven nature of aPCE is further demonstrated by comparing its performance to generalized polynomial chaos expansion. Under the framework of the DART mission, the aPCE solution yields results consistent with observed data. Additionally, global sensitivity analysis reveals that the shape and density of the primary, as well as the collision target's strength and porosity, contribute most to the system uncertainty. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
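The polynomial chaos machinery behind this abstract can be illustrated with its simplest relative: a generalized polynomial chaos (gPC) expansion of a scalar function of one standard Gaussian input, using probabilists' Hermite polynomials. This is a minimal sketch under my own choices of quadrature order and test function, not the paper's data-driven aPCE construction; the point is only that the output mean and variance drop out of the expansion coefficients (mean = c0, variance = sum over k >= 1 of c_k^2 * k!):

```python
import numpy as np
from math import factorial, pi, sqrt
from numpy.polynomial import hermite_e

def gpc_coeffs(f, order, nquad=20):
    """Project f(X), X ~ N(0,1), onto probabilists' Hermite polynomials He_k."""
    x, w = hermite_e.hermegauss(nquad)        # nodes/weights for weight exp(-x^2/2)
    fx = f(x)
    c = []
    for k in range(order + 1):
        hek = hermite_e.hermeval(x, [0] * k + [1])   # He_k at the quadrature nodes
        # c_k = E[f(X) He_k(X)] / E[He_k(X)^2], where E[He_k^2] = k!
        c.append(np.sum(w * fx * hek) / (sqrt(2 * pi) * factorial(k)))
    return np.array(c)

# Toy model: f(x) = x^2 + x = 1*He_0 + 1*He_1 + 1*He_2
c = gpc_coeffs(lambda x: x**2 + x, order=4)
mean = c[0]                                                   # E[f(X)] = c_0
var = sum(c[k]**2 * factorial(k) for k in range(1, len(c)))   # Var from coefficients
```

Once the coefficients are in hand, the surrogate replaces the expensive model for sampling, which is the efficiency gain the abstract claims over plain Monte Carlo.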
13. Uncertainty Propagation Performance in Proximity Operations Around Small Bodies.
- Author
- Michelotti, Niccolò, Rizza, Antonio, Giordano, Carmine, and Topputo, Francesco
- Abstract
Close-proximity exploration of small celestial bodies is crucial for the comprehensive and accurate characterization of their properties. However, the complex and uncertain dynamical environment around them contributes to a rapid dispersion of uncertainty and the emergence of non-Gaussian distributions. Therefore, to ensure safe operations, a precise understanding of uncertainty propagation becomes imperative. In this work, the dynamical environment is analyzed around two asteroids, Apophis, which will perform a close flyby to Earth in 2029, and Eros, which has been already explored by past missions. The performance of different uncertainty propagation methods (linear covariance propagation, unscented transformation, and polynomial chaos expansion) are compared in various scenarios of close-proximity operations around the two asteroids. Findings are discussed in terms of propagation accuracy and computational efficiency depending on the dynamical environment. By exploring these methodologies, this work contributes to the broader goal of ensuring the safety and effectiveness of spacecraft operations during close-proximity exploration of small celestial bodies. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
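Of the three propagation methods this abstract compares, the unscented transformation is the easiest to sketch: a handful of deterministically chosen sigma points are pushed through the nonlinearity and reweighted to estimate output moments. A minimal one-dimensional version with the standard (alpha, beta, kappa) weighting and a toy quadratic nonlinearity of my own choosing might look like this:

```python
import numpy as np

def unscented_transform(m, P, g, alpha=1.0, beta=2.0, kappa=0.0):
    """Propagate a 1-D Gaussian N(m, P) through a nonlinearity g via sigma points."""
    n = 1
    lam = alpha**2 * (n + kappa) - n
    s = np.sqrt((n + lam) * P)
    sigma = np.array([m, m + s, m - s])               # the 2n + 1 sigma points
    wm = np.array([lam / (n + lam), 0.5 / (n + lam), 0.5 / (n + lam)])
    wc = wm.copy()
    wc[0] += 1.0 - alpha**2 + beta                    # covariance weight correction
    y = g(sigma)
    mean = float(wm @ y)                              # weighted output mean
    var = float(wc @ (y - mean)**2)                   # weighted output variance
    return mean, var

# Toy nonlinearity: y = x^2 with x ~ N(0, 1); true mean 1, true variance 2.
mean, var = unscented_transform(0.0, 1.0, lambda x: x**2)
```

Three model evaluations versus thousands of Monte Carlo draws is the accuracy/cost trade-off the paper benchmarks, with the caveat that sigma points cannot capture the non-Gaussian tails that develop around small bodies.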
14. Robust Design Optimization of Viscoelastic Damped Composite Structures Integrating Model Order Reduction and Generalized Stochastic Collocation.
- Author
- Wang, Tianyu, Xu, Chao, and Li, Teng
- Subjects
- COLLOCATION methods, COMPOSITE structures, PARETO optimum, NONLINEAR equations, SAMPLE size (Statistics)
- Abstract
This study presents a novel approach that integrates model order reduction (MOR) and generalized stochastic collocation (gSC) to enhance robust design optimization (RDO) of viscoelastic damped composite structures under material and geometric uncertainties. The proposed methodology systematically reduces computational burden while maintaining the required accuracy. A projection-based MOR is chosen to alleviate the substantial computational costs associated with nonlinear eigenvalue problems. To minimize the sampling size for uncertainty propagation (UP) while effectively addressing diverse probability density distributions, a gSC method incorporating statistical moment computation techniques is developed. Pareto optimal solutions are determined by combining the proposed MOR and gSC approaches with a well-established Non-dominated Sorting Genetic Algorithm II (NSGA-II) algorithm, which accounts for robustness in handling design variables, objectives, and constraints. The results of the four examples illustrate the efficacy of the proposed MOR and gSC methods, as well as the overall RDO framework. Notably, the findings demonstrate the feasibility of this approach for practical applications, driven by a significant reduction in computational costs. This establishes a solid foundation for addressing complex optimization challenges in real-world scenarios characterized by various uncertainties. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
15. Analysis of material parameter uncertainty propagation in preoperative flap suture simulation.
- Author
- Ji, Xiaogang, Li, Huabin, Gong, Hao, Wen, Guangquan, and Sun, Rong
- Subjects
- FINITE element method, CURVED surfaces, SKIN grafting, FIBER orientation, SUTURING, GEOMETRIC modeling
- Abstract
Skin flap transplantation is the most commonly used method to repair tissue defects and cover wounds. In the clinic, the finite element method is often used to design the pre-operative scheme of flap suturing. However, the material parameters of a skin flap are uncertain due to experimental errors and differences between body parts, and how to account for the influence of material parameter uncertainty on the mechanical response of flap suturing in finite element modeling remains an urgent open problem. Therefore, the propagation of material parameter uncertainty in skin flap suture simulation was studied. First, a geometric model of a clinical patient's hand wound was constructed using reverse modeling technology, and the patient's three-dimensional wound was unfolded into a flat surface with a curved-surface expansion method, yielding a preliminary design contour for the patient's transplant flap. Based on the acquired wound geometry model, finite element models of flap suturing with different fiber orientations and sizes were constructed in Abaqus, and an uncertainty propagation analysis method based on Monte Carlo simulation combined with surrogate modeling was used to analyze the stress response of the sutured flap under uncertain material parameters. Results showed that the overall stress was relatively lower when the average fiber orientation was 45°, which could be used as the optimal direction for flap excision. When the preliminary design contour of the flap was scaled down within 90% of its original size, the stress after flap suturing remained within a safe range. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
16. Optimizing Microgrid Performance: Integrating Unscented Transformation and Enhanced Cheetah Optimization for Renewable Energy Management.
- Author
- Alghamdi, Ali S.
- Subjects
- RENEWABLE energy sources, OPTIMIZATION algorithms, MICROGRIDS, OPERATING costs, WIND turbines, CHEETAH
- Abstract
The increased integration of renewable energy sources (RESs), such as photovoltaic and wind turbine systems, in microgrids poses significant challenges due to fluctuating weather conditions and load demands. To address these challenges, this study introduces an innovative approach that combines Unscented Transformation (UT) with the Enhanced Cheetah Optimization Algorithm (ECOA) for optimal microgrid management. UT, a robust statistical technique, models nonlinear uncertainties effectively by leveraging sigma points, facilitating accurate decision-making despite variable renewable generation and load conditions. The ECOA, inspired by the adaptive hunting behaviors of cheetahs, is enhanced with stochastic leaps, adaptive chase mechanisms, and cooperative strategies to prevent premature convergence, enabling improved exploration and optimization for unbalanced three-phase distribution networks. This integrated UT-ECOA approach enables simultaneous optimization of continuous and discrete decision variables in the microgrid, efficiently handling uncertainty within RESs and load demands. Results demonstrate that the proposed model significantly improves microgrid performance, achieving a 10% reduction in voltage deviation, a 10.63% decrease in power losses, and an 83.32% reduction in operational costs, especially when demand response (DR) is implemented. These findings validate the model's efficacy in enhancing microgrid reliability and efficiency, positioning it as a viable solution for optimized performance under uncertain renewable inputs. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
17. Uncertainty propagation from probe spacing to Fourier 3-probe straightness measurement.
- Author
- Huang, Pu, Xie, Jin, Haitjema, Han, Lu, Kuo, and Shi, Shengyu
- Subjects
- SPACE probes, MONTE Carlo method, SEPARATION of variables
- Abstract
Reliable and precise straightness profile measurements are crucial for manufacturing ultra-precision components and are capable of further enhancing their accuracy. The Fourier three-probe (F3P) straightness measurement allows for precise assessment of the workpiece profile on the machine by eliminating the harmful influence of the error motion of the sliding table. However, the probe spacing uncertainty deteriorates the measurement accuracy markedly, and the mechanism behind this phenomenon has not yet been studied in detail. In this context, this paper thoroughly investigates the propagation of the probe spacing uncertainty in the F3P measurement. First, the influence of the probe spacing deviation is analyzed. Next, by calculating the partial differential of the Laplace transform of the workpiece profile, we algebraically deduce the probe spacing uncertainty propagation law, especially in the harmonic domain. Subsequently, Monte Carlo simulations are carried out to confirm the derived propagation law. To reduce uncertainty propagation, a hybrid approach is presented: (I) F3P measurements are carried out under changing probe spacings to produce several sets of Fourier coefficients; (II) optimal harmonic estimates are selected individually according to the harmonic uncertainty. Finally, simulations and experimental measurements are performed for verification.
Highlights:
- The spacing uncertainty propagation law in F3P straightness measurement is deduced.
- The Monte Carlo method is adopted to confirm the derived propagation law.
- Hybridization in the harmonic domain is proposed to reduce uncertainty propagation.
- The hybrid method enables reduction of both random and systematic uncertainties.
[ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
18. A Koopman Reachability Approach for Uncertainty Analysis in Ground Vehicle Systems †.
- Author
- Kumar, Alok, Umathe, Bhagyashree, and Kelkar, Atul
- Subjects
- DYNAMICAL systems, EIGENFUNCTIONS, VEHICLE models, SAFETY
- Abstract
Recent progress in autonomous vehicle technology has led to the development of accurate and efficient tools for ensuring safety, which is crucial for verifying the reliability and security of vehicles. These vehicles operate under diverse conditions, necessitating the analysis of varying initial conditions and parameter values. Ensuring the safe operation of the vehicle under all these varying conditions is essential. Reachability analysis is an important tool to certify the safety and stability of the vehicle dynamics. We propose a reachability analysis approach for evaluating the response of the vehicle dynamics, specifically addressing uncertainties in the initial states and model parameters. Reachable sets illustrate all the possible states of a dynamical system that can be obtained from a given set of uncertain initial conditions. The analysis is crucial for understanding how variations in initial conditions or system parameters can lead to outcomes such as vehicle collisions or deviations from desired paths. By mapping out these reachable states, it is possible to design systems that maintain safety and reliability despite uncertainties. These insights help to ensure the stability and reliability of the vehicles, even in unpredictable conditions, by reducing accidents and optimizing performance. The nonlinearity of the model complicates the computation of reachable sets in vehicle dynamics. This paper proposes a Koopman theory-based approach that utilizes the Koopman principal eigenfunctions and the Koopman spectrum. By leveraging the Koopman principal eigenfunction, our method simplifies the computational process and offers a formal approximation for backward and forward reachable sets. First, our method effectively computes backward and forward reachable sets for a nonlinear quarter-car model with fixed parameter values. Furthermore, we applied our approach to analyze the uncertainty response for cases with uncertain parameters of the vehicle model. 
When compared to time-domain simulations, our proposed Koopman approach provided accurate results and also reduced the computational time by half in most cases. This demonstrates the efficiency and reliability of our proposed approach in dynamic systems uncertainty analysis using the reachable sets. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
19. Uncertainty modeling and propagation for groundwater flow: a comparative study of surrogates.
- Author
- Ernst, Oliver G., Sprungk, Björn, and Zhang, Chao
- Abstract
We compare sparse grid stochastic collocation and Gaussian process emulation as surrogates for the parameter-to-observation map of a groundwater flow problem related to the Waste Isolation Pilot Plant in Carlsbad, NM. The goal is the computation of the probability distribution of a contaminant particle travel time resulting from uncertain knowledge about the transmissivity field. The latter is modelled as a lognormal random field which is fitted by restricted maximum likelihood estimation and universal kriging to observational data as well as geological information including site-specific trend regression functions obtained from technical documentation. The resulting random transmissivity field leads to a random groundwater flow and particle transport problem which is solved realization-wise using a mixed finite element discretization. Computational surrogates, once constructed, allow sampling the quantities of interest in the uncertainty analysis at substantially reduced computational cost. Special emphasis is placed on explaining the differences between the two surrogates in terms of computational realization and interpretation of the results. Numerical experiments are given for illustration. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
20. Uncertainty Assessment of the Screw Removal System for Robotic Disassembly of Hard Disk Drives During the Recycling Process.
- Author
- SZEWCZYK, R., SZAŁATKIEWICZ, J., NOWICKI, M., GAZDA, P., OSTASZEWSKA-LIŻEWSKA, A., NOWAK, P., CHARUBIN, T., ROGALSKI, W., WIKTOROWICZ, M., PATAPENKA, I., SIEMIĄTKOWSKI, A., and ZIELIŃSKI, J.
- Subjects
- INDUSTRIAL robots, HARD disks, MONTE Carlo method, ROBOT control systems, WASTE recycling, SCREWS
- Abstract
Robotic disassembly of hard disk drives during their recycling process is a promising technology with significant ecological importance and high economic profitability. However, the efficiency of robotic screw disassembly in a cost-efficient process using a robotic system in the selective compliance assembly robot arm (SCARA) configuration is highly dependent on positioning accuracy. In such a case, the robot works from known screw positions without a visual control loop. The paper presents a generalised method of analysing the key factors influencing the process, from visual geometry analysis to mechanical setup accuracy. A formalised metrological analysis was performed on the basis of the Monte Carlo method to identify the key factors influencing screw positioning accuracy. It was found that robot control uncertainty plays a crucial role in the total uncertainty of the system. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
21. Sensitivity analysis of the Cercignani - Lampis accommodation coefficients in prototype rarefied gas flow and heat transfer problems via the Monte Carlo method.
- Author
- Basdanis, Thanasis, Tatsios, Giorgos, and Valougeorgis, Dimitris
- Subjects
- RAREFIED gas dynamics, MONTE Carlo method, GAS flow, STOKES flow, POISEUILLE flow, COUETTE flow
- Abstract
In rarefied gas dynamics, the Cercignani-Lampis (CL) scattering kernel, containing two accommodation coefficients (ACs), namely the tangential momentum and normal energy ones, is widely employed to characterize gas-surface interaction, particularly in non-isothermal setups, where both momentum and energy may simultaneously be exchanged. Here, a formal and detailed sensitivity analysis of the effect of the CL ACs on the main output quantities of several prototype problems, namely the cylindrical Poiseuille, thermal creep and thermomolecular pressure difference (TPD) flows, as well as the plane Couette flow and heat transfer (Fourier flow), is performed. In each problem, some uncertainties are randomly introduced in the ACs (input parameters) and via a Monte Carlo propagation analysis, the deduced uncertainty of the corresponding main output quantity is computed. The output uncertainties are compared to each other to determine the flow configuration and the gas rarefaction range, where a high sensitivity of the output quantities with respect to the CL ACs is observed. The flow setups and rarefaction regimes with high sensitivities are the most suitable ones for the estimations of the ACs, since larger modeling and experimental errors may be acceptable. In the Poiseuille and Couette flows, the uncertainties of the flow rate and shear stress respectively are several times larger than the input uncertainty in the tangential momentum AC and much smaller than the uncertainty in the normal energy AC in a wide range of gas rarefaction. In the thermal creep flow, the uncertainty of the flow rate depends on the input ones of both ACs, but, in general, it remains smaller than the input uncertainties. A similar behavior with the thermal creep flow is obtained in the TPD flow. On the contrary, in the Fourier flow, the uncertainty of the heat flux may be about the same or even larger than the input ones of both ACs in a wide range of gas rarefaction. 
It is deduced that in order to characterize the gas-surface interaction via the CL ACs by matching computations with measurements, it is more suitable to combine the Poiseuille (or Couette) and Fourier configurations, rather than, as it is commonly done, the Poiseuille and thermal creep ones. For example, in order to estimate the normal energy AC within an accuracy of 10 %, experimental uncertainties should be less than 4 % in the thermal creep or TPD flows, while may be about 10 % in the Fourier flow. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
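The sensitivity measure running through this abstract (the ratio of the relative output uncertainty to the relative input uncertainty of an accommodation coefficient, estimated by Monte Carlo propagation) can be sketched generically. The response function below is a hypothetical monotone placeholder, not an actual rarefied-flow solution:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200_000

# Hypothetical smooth dependence of an output quantity (e.g. a reduced flow
# rate) on one accommodation coefficient a; a placeholder, not a kinetic model.
def output(a):
    return 1.0 / (0.5 + a)

a_nom = 0.9
rel_in = 0.02                                  # 2 % relative input uncertainty
a = rng.normal(a_nom, rel_in * a_nom, N)       # randomly perturbed coefficient

q = output(a)
rel_out = q.std() / q.mean()                   # relative output uncertainty
sensitivity = rel_out / rel_in                 # output-to-input uncertainty ratio
# A ratio below 1 means the output barely "sees" the coefficient, so this
# configuration would be a poor one for estimating it, which is exactly the
# selection criterion the abstract applies across flow setups.
```

Repeating this for each flow configuration and rarefaction regime gives the comparison the paper uses to decide which experiments best constrain the accommodation coefficients.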
22. Orbit determination for space situational awareness: A survey.
- Author
- Kazemi, Sajjad, Azad, Nasser L., Scott, K. Andrea, Oqab, Haroon B., and Dietrich, George B.
- Subjects
- ORBIT determination, GEOSYNCHRONOUS orbits, SPACE surveillance, ORBITS (Astronomy), ORBIT method
- Abstract
The rapidly growing number of objects encircling our planet is an increasing concern. Collisions between these objects have already occurred and pose a potential threat in the future, resulting in the creation of countless debris fragments. In particular, the Low Earth Orbit (LEO) region is densely populated and highly contested, which underscores the critical importance of space surveillance in this area. Moreover, the utilization of Medium Earth Orbit (MEO) and Geosynchronous Earth Orbit (GEO) is also on the rise. To ensure the safety and functionality of operational satellites, it is paramount to accurately determine and continuously monitor the orbits of space objects, mitigating the risk of collisions. Precise and timely predictions of future trajectories are essential for this purpose. In response to these challenges, this survey paper provides a comprehensive review of various methods proposed in the literature for Orbit Determination (OD). It also identifies research gaps and suggests potential directions for future studies, emphasizing the pressing need for adequate Space Situational Awareness (SSA).
Highlights:
- The Admissible Region is a robust, reliable cornerstone method for orbit determination.
- Machine learning transforms orbit determination, boosting accuracy and efficiency.
- Explainable AI is crucial for transparency and trust in orbit determination.
- The recent focus on the cislunar regime demands novel orbit determination methods.
- Undervalued, space-based orbit determination is crucial for accurate cislunar missions.
[ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
23. Probabilistic Performance-based Fire Design of Structures: A Hazard-Centred and Consequence-Oriented Perspective.
- Author
- Franchini, Andrea, Galasso, Carmine, and Torero, Jose L.
- Subjects
- PERFORMANCE-based design, FIRE risk assessment, FIRE protection engineering, FIRE prevention, STRUCTURAL engineers, HAZARD mitigation
- Abstract
Risk-based design and assessment methods are gaining popularity in performance-based structural fire engineering. These methods usually start by defining a set of hazard scenarios to use as analysis inputs. This approach, proven highly effective for other hazard types such as earthquakes, may not be optimal for fire safety design. Indeed, the strong coupling between the fire phenomenon and structural features enables an ad-hoc design variable selection (and/or optimisation) to reduce fire intensity, making fire scenarios additional design outputs. In addition, such a coupling effect implies that fire scenarios maximising consequences are structure specific. Building on these considerations, this paper discusses the limitations that arise at different analysis steps (i.e., fire-scenario and intensity treatment, identifying fire intensity measures, probabilistic fire hazard analysis, developing fire fragility models, and risk calculation) when using conventional risk-based approaches for design purposes. Furthermore, it compares such approaches with a fire safety design methodology (the Consequence-oriented Fire intensity Optimisation, CFO, approach) that addresses the identified limitations. The potential benefits of integrating the two approaches are also discussed. Finally, the fire design of a simplified steel-girder bridge is introduced as an illustrative example, comparing the consequence metrics and design updating strategies resulting from the two approaches. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
24. Enhancing Assessments of Coastal Wetland Migration Potential with Sea-level Rise: Accounting for Uncertainty in Elevation Data, Tidal Data, and Future Water Levels.
- Author
- Enwright, Nicholas M., Osland, Michael J., Thurman, Hana R., McHenry, Claire E., Vervaeke, William C., Patton, Brett A., Passeri, Davina L., Stoker, Jason M., Day, Richard H., and Simons, Bethanie M.
- Subjects
- COASTAL wetlands, ABSOLUTE sea level change, WATER levels, ALTITUDES, MONTE Carlo method, DIGITAL elevation models
- Abstract
Sea-level rise rates are predicted to surpass rates of wetland vertical adjustment in the coming decades in many areas, increasing the potential for wetland submergence. Information on where wetland migration is possible can help natural resource managers for planning land acquisition or enhancing habitat connectivity to bolster adaptation of coastal wetlands to rising seas. Elevation-based models of wetland migration are often hampered with uncertainty associated with ground surface elevation, current water levels (i.e., tides and extreme water levels), and future water levels from sea-level rise. Here, we developed an approach that involved digital elevation model error reduction and the use of Monte Carlo simulations that utilize uncertainty assumptions regarding elevation error, contemporary water levels, and future sea levels to identify potential wetland migration areas. Our analyses were developed for Duvall and Nassau Counties in northeastern Florida (USA). We focus on the migration of regularly oceanic-flooded wetlands (i.e., flooded by oceanic water daily) and irregularly oceanic-flooded wetlands (i.e., flooded by oceanic water less frequently than daily). For two relative sea-level rise scenarios based on the 0.5 m and the 1.5 m global mean sea-level rise scenarios, we quantified migration by wetland flooding frequency class and identified land cover and land use types that are vulnerable to future exposure to oceanic waters. The variability in total coverage and relative coverage of wetland migration from our results highlights how topography and accelerated sea-level rise interact. Our wetland migration results communicate uncertainty by showing flooding frequency class as probabilistic outputs. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
25. On the Use of Alternative Measurement Methods in the Estimation of Wear Rates in Rotary-Pin-on-Disk Tribometry.
- Author
-
Solasa, Krishna Chaitanya, Venkataraman, N. V., Choudhury, Palash Roy, Schueller, John K., and Bhattacharyya, Abhijit
- Abstract
Do two different and independent methods of estimating the wear rate of a test sample yield the same numerical result? Numerical values of specific wear rates estimated on the basis of alternative methods using a set of dry sliding rotary-pin-on-disk experiments are presented. Wear rates of brass and aluminium alloy pins were estimated using gravimetric and wear scar area methods. Gravimetric and linear displacement methods were used to assess wear rates of ABS plastic and machinable wax pins. Scepticism about the estimated nominal values of wear rates is reduced when alternative assessment methods result in comparable numerical values, or values having the same order of magnitude. This is particularly useful when ranking competing materials for wear rates, when the differences in these rates are small. Uncertainties in individual test sample wear rates, and dispersion in the nominal values of wear rates are also computed to support the aforementioned observations. [ABSTRACT FROM AUTHOR]
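As a minimal illustration of cross-checking two independent estimates, the sketch below computes a specific wear rate k = ΔV/(F·s) from a gravimetric mass loss and from a linear-displacement measurement, then combines assumed relative uncertainties in quadrature. Every numeric value is hypothetical, not data from these experiments.

```python
import math

# Hypothetical pin-on-disk run (illustrative values only).
dm = 2.4e-6        # mass loss [kg]
rho = 8500.0       # brass density [kg/m^3]
F = 10.0           # normal load [N]
s = 500.0          # sliding distance [m]

# Gravimetric specific wear rate: k = dV / (F * s), with dV = dm / rho.
dV = dm / rho
k_grav = dV / (F * s)

# Alternative linear-displacement estimate: pin cross-section A times
# measured height loss dh.
A = 1.96e-5        # cross-sectional area of a 5 mm diameter pin [m^2]
dh = 1.45e-5       # measured height loss [m]
k_disp = A * dh / (F * s)

# First-order (GUM-style) relative uncertainty of the gravimetric value,
# assuming 2% on mass loss, 1% on density, 0.5% on load and distance.
u_rel = math.sqrt(0.02**2 + 0.01**2 + 0.005**2 + 0.005**2)
print(k_grav, k_disp, u_rel)
```

Here the two estimates land within a few percent of each other (same order of magnitude), which is the kind of agreement the abstract argues reduces scepticism about the nominal values.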
- Published
- 2024
- Full Text
- View/download PDF
26. A novel uncertainty propagation and probability assessment method for the frequency response function involving correlated uncertainties.
- Author
-
Liao, Baopeng
- Subjects
- *
CHEBYSHEV polynomials , *PROBABILITY theory , *POLYNOMIAL chaos , *STRUCTURAL engineering , *STRUCTURAL optimization , *DISCRETE systems - Abstract
Uncertainty propagation of the frequency response function is crucial for vibration problems such as model calibration, and probability assessment is another indispensable element in structural optimization. Considering only one of them will inevitably lead to a lack of uncertain knowledge of practical engineering structures. However, it is still challenging to evaluate the system response bounds and probability characteristics simultaneously while maximally reducing the computational cost. This paper focuses on the frequency response function involving correlated uncertainties and proposes a novel uncertainty propagation and probability assessment method. First, a convex model was established to quantify the correlated uncertainty parameters; then, a Chebyshev polynomial function was developed as the surrogate model to efficiently quantify the uncertainty propagation from the uncertain parameters to the system response. Subsequently, a novel normalized coordinate transformation was combined with the uncertainty propagation method, making the uncertain system response easy to assess. Note that the response value at the interpolation point can be employed as the input of the probability assessment. One can also estimate the probability characteristics at different frequency positions during the construction of the surrogate model. Finally, two numerical examples, a discrete system and a continuous system, were presented to demonstrate the method's effectiveness and lower computational cost. Results indicate that the proposed method can be conveniently and accurately applied to assess the bounds and probability characteristics of frequency response functions involving correlated uncertainties. [ABSTRACT FROM AUTHOR]
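A one-dimensional sketch of the surrogate idea (not the paper's convex-model formulation): fit a Chebyshev polynomial to a cheap stand-in response at Chebyshev-Gauss nodes over the parameter interval, then bound the response by densely sampling the surrogate instead of the full model. The damped-oscillator "response" and all numbers are invented.

```python
import numpy as np
from numpy.polynomial import chebyshev as C

# Illustrative scalar "frequency response": a damped SDOF amplitude as a
# function of one uncertain stiffness k in the interval [0.9, 1.1].
def response(k, omega=0.8, zeta=0.02):
    return 1.0 / np.sqrt((k - omega**2) ** 2 + (2.0 * zeta * omega) ** 2)

# Chebyshev surrogate fitted at Chebyshev-Gauss nodes mapped onto the interval.
deg = 12
t_nodes = np.cos((2 * np.arange(deg + 1) + 1) * np.pi / (2 * (deg + 1)))
coef = C.chebfit(t_nodes, response(1.0 + 0.1 * t_nodes), deg)

# Interval bounds of the response come from densely sampling the cheap surrogate.
t = np.linspace(-1.0, 1.0, 2001)
vals = C.chebval(t, coef)
lower, upper = float(vals.min()), float(vals.max())
print(lower, upper)
```

Only deg + 1 = 13 evaluations of the (notionally expensive) model are needed; all subsequent bound and probability queries hit the polynomial.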
- Published
- 2024
- Full Text
- View/download PDF
27. Multiobjective robust optimization framework based on first and second order Taylor expansion applied to a vehicle suspension design.
- Author
-
Grotti, Ewerton, Santana, Pedro Bührer, Filho, José Gilberto Picoral, and Gomes, Herbert M.
- Abstract
Accounting for uncertainties is a major challenge in the engineering design field. The optimization process used to design high-quality final products seldom takes these uncertainties into consideration due to the high computational cost associated with this type of analysis, especially in the multiobjective area. The present paper describes a procedure that incorporates uncertainties in the multiobjective optimization process by means of first- and second-moment approximations (mean and variance) of the objective functions. These gradient-based approximations are developed using two levels of Taylor expansion each, which are tested against the classic Monte Carlo method in three different numerical vehicle suspension examples. The framework is then used to solve a suspension optimization problem for ride comfort and maneuverability simultaneously. The optimization is performed using MOQPSO, MOPSO, and the well-established NSGA-II. The results show that the framework is efficient in solving multiobjective robust optimization (RO) problems, coping with the uncertainty propagation problem at low computational cost. The final error associated with the method has been shown to be around 1%, which is negligible considering the performance gains. A simple optimization is carried out using the same problem in order to clarify the real gains and costs of adding uncertainties to the optimization. That being said, the focus of the present paper is not only to establish a useful RO multiobjective framework, but also to compare different levels of Taylor approximation for the first and second central moments in engineering problems with different degrees of complexity, which is not present in the literature. Equations to estimate the number of function calls per approximation are given, so readers can predict the viability of the approach for different engineering problems beforehand.
Finally, the optimization framework is further enhanced taking advantage of both Taylor approximations simultaneously. [ABSTRACT FROM AUTHOR]
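The first-order version of such a moment approximation can be sketched as follows: the objective's mean is approximated by its value at the parameter means, and its variance by the squared gradient weighted by the input variances, then checked against plain Monte Carlo. The two-variable "comfort metric" and all numbers are invented placeholders, not the paper's suspension model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Placeholder objective of two uncertain design variables (stiffness k,
# damping c); the function and numbers are illustrative only.
def f(k, c):
    return k**0.5 / c + 0.1 * k * c

mu = np.array([40000.0, 1500.0])   # means of (k, c)
sig = np.array([2000.0, 100.0])    # standard deviations of (k, c)

# First-order Taylor: mean ~ f(mu), var ~ sum((df/dx_i)^2 * sig_i^2),
# with central finite-difference gradients.
h = 1e-4 * mu
g = np.array([
    (f(mu[0] + h[0], mu[1]) - f(mu[0] - h[0], mu[1])) / (2 * h[0]),
    (f(mu[0], mu[1] + h[1]) - f(mu[0], mu[1] - h[1])) / (2 * h[1]),
])
mean_t = f(mu[0], mu[1])
std_t = float(np.sqrt(np.sum(g**2 * sig**2)))

# Monte Carlo reference using many function calls instead of three.
samples = f(rng.normal(mu[0], sig[0], 200_000),
            rng.normal(mu[1], sig[1], 200_000))
print(mean_t, std_t, samples.mean(), samples.std())
```

The Taylor estimate needs only a handful of function calls (here five) versus hundreds of thousands for Monte Carlo, which is the cost argument the abstract makes; the second-order variant adds curvature terms at the price of more calls.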
- Published
- 2024
- Full Text
- View/download PDF
28. Uncertainty Quantification Using Polynomial-Chaos Expansion
- Author
-
Syed, Wajih U. and Elfadel, Ibrahim (Abe) M.
- Published
- 2024
- Full Text
- View/download PDF
29. The Role of Uncertainty Propagation for Digital Twins
- Author
-
Haslbeck, Matthias, Braml, Thomas, di Prisco, Marco, Series Editor, Chen, Sheng-Hong, Series Editor, Vayas, Ioannis, Series Editor, Kumar Shukla, Sanjay, Series Editor, Sharma, Anuj, Series Editor, Kumar, Nagesh, Series Editor, Wang, Chien Ming, Series Editor, Cui, Zhen-Dong, Series Editor, Matos, José C., editor, Lourenço, Paulo B., editor, Oliveira, Daniel V., editor, Branco, Jorge, editor, Proske, Dirk, editor, Silva, Rui A., editor, and Sousa, Hélder S., editor
- Published
- 2024
- Full Text
- View/download PDF
30. Uncertainty Propagation Analysis of TBM Performance Based on Sparse Polynomial Chaos Expansion Combined with Kernel Density Estimation and Bayesian Model Average
- Author
-
Li, Yue, Miao, Jiazhi, Liu, Hao, Zhou, Gongbo, Ceccarelli, Marco, Series Editor, Corves, Burkhard, Advisory Editor, Glazunov, Victor, Advisory Editor, Hernández, Alfonso, Advisory Editor, Huang, Tian, Advisory Editor, Jauregui Correa, Juan Carlos, Advisory Editor, Takeda, Yukio, Advisory Editor, Agrawal, Sunil K., Advisory Editor, Tan, Jianrong, editor, Liu, Yu, editor, Huang, Hong-Zhong, editor, Yu, Jingjun, editor, and Wang, Zequn, editor
- Published
- 2024
- Full Text
- View/download PDF
31. Propagation of the Uncertainty in the Dynamic Behavior of OPGW Cables Under Stochastic Wind Load
- Author
-
Campos, Damián, Ajras, Andrés, Piovan, Marcelo, di Prisco, Marco, Series Editor, Chen, Sheng-Hong, Series Editor, Vayas, Ioannis, Series Editor, Kumar Shukla, Sanjay, Series Editor, Sharma, Anuj, Series Editor, Kumar, Nagesh, Series Editor, Wang, Chien Ming, Series Editor, Cui, Zhen-Dong, Series Editor, Gattulli, Vincenzo, editor, Lepidi, Marco, editor, and Martinelli, Luca, editor
- Published
- 2024
- Full Text
- View/download PDF
32. Widespread analytical pitfalls in empirical coexistence studies and a checklist for improving their statistical robustness
- Author
-
J. Christopher D. Terry and David W. Armitage
- Subjects
competition ,experiments ,model selection ,modern coexistence theory ,uncertainty propagation ,Ecology ,QH540-549.5 ,Evolution ,QH359-425 - Abstract
Abstract Modern coexistence theory (MCT) offers a conceptually straightforward approach for connecting empirical observations with an elegant theoretical framework, gaining popularity rapidly over the past decade. However, beneath this surface‐level simplicity lie various assumptions and subjective choices made during data analysis. These can lead researchers to draw qualitatively different conclusions from the same set of experiments. As the predictions of MCT studies are often treated as outcomes, and many readers and reviewers may not be familiar with the framework's assumptions, there is a particular risk of ‘researcher degrees of freedom’ inflating the confidence in results, thereby affecting reproducibility and predictive power. To tackle these concerns, we introduce a checklist consisting of statistical best practices to promote more robust empirical applications of MCT. Our recommendations are organised into four categories: presentation and sharing of raw data, testing model assumptions and fits, managing uncertainty associated with model coefficients and incorporating this uncertainty into coexistence predictions. We surveyed empirical MCT studies published over the past 15 years and discovered a high degree of variation in the level of statistical rigour and adherence to best practices. We present case studies to illustrate the dependence of results on seemingly innocuous choices among competition model structure and error distributions, which in some cases reversed the predicted coexistence outcomes. These results demonstrate how different analytical approaches can profoundly alter the interpretation of experimental results, underscoring the importance of carefully considering and thoroughly justifying each step taken in the analysis pathway. Our checklist serves as a resource for authors and reviewers alike, providing guidance to strengthen the empirical foundation of empirical coexistence analyses. 
As the field of empirical MCT shifts from a descriptive, trailblazing phase to a stage of consolidation, we emphasise the need for caution when building upon the findings of earlier studies. To ensure that progress made in the field of ecological coexistence is based on robust and reliable evidence, it is crucial to subject our predictions, conclusions and generalisability to a more rigorous assessment than is currently the trend.
- Published
- 2024
- Full Text
- View/download PDF
33. Robust Design Optimization of Viscoelastic Damped Composite Structures Integrating Model Order Reduction and Generalized Stochastic Collocation
- Author
-
Tianyu Wang, Chao Xu, and Teng Li
- Subjects
robust design optimization ,viscoelastic damped structures ,model order reduction ,uncertainty propagation ,stochastic collocation method ,Motor vehicles. Aeronautics. Astronautics ,TL1-4050 - Abstract
This study presents a novel approach that integrates model order reduction (MOR) and generalized stochastic collocation (gSC) to enhance robust design optimization (RDO) of viscoelastic damped composite structures under material and geometric uncertainties. The proposed methodology systematically reduces computational burden while maintaining the required accuracy. A projection-based MOR is chosen to alleviate the substantial computational costs associated with nonlinear eigenvalue problems. To minimize the sampling size for uncertainty propagation (UP) while effectively addressing diverse probability density distributions, a gSC method incorporating statistical moment computation techniques is developed. Pareto optimal solutions are determined by combining the proposed MOR and gSC approaches with a well-established Non-dominated Sorting Genetic Algorithm II (NSGA-II) algorithm, which accounts for robustness in handling design variables, objectives, and constraints. The results of the four examples illustrate the efficacy of the proposed MOR and gSC methods, as well as the overall RDO framework. Notably, the findings demonstrate the feasibility of this approach for practical applications, driven by a significant reduction in computational costs. This establishes a solid foundation for addressing complex optimization challenges in real-world scenarios characterized by various uncertainties.
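The gSC idea of recovering statistical moments from a handful of model runs can be illustrated with a single Gaussian parameter and Gauss-Hermite collocation; the "loss factor" model and its parameters below are stand-ins, not the paper's viscoelastic structures.

```python
import numpy as np

# Toy damping "loss factor" with one uncertain (Gaussian) parameter g;
# a smooth stand-in for an expensive eigenvalue solve.
def loss_factor(g):
    return g / (1.0 + g**2)

mu, sigma = 0.6, 0.1   # assumed mean / std of the uncertain parameter

# Collocation: evaluate the model only at probabilists' Gauss-Hermite
# nodes and recover moments from the weighted samples.
nodes, weights = np.polynomial.hermite_e.hermegauss(7)
vals = loss_factor(mu + sigma * nodes)
w = weights / np.sqrt(2.0 * np.pi)     # normalize weights to sum to 1
mean_sc = float(w @ vals)
var_sc = float(w @ (vals - mean_sc) ** 2)

# Brute-force Monte Carlo check
rng = np.random.default_rng(1)
mc = loss_factor(rng.normal(mu, sigma, 400_000))
print(mean_sc, var_sc, mc.mean(), mc.var())
```

Seven model evaluations reproduce the Monte Carlo moments to within sampling noise, which is the sampling-size reduction the abstract targets before handing the moments to NSGA-II.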
- Published
- 2024
- Full Text
- View/download PDF
34. Taillefer—A Tool for Sensitivity Analysis and Uncertainty Propagation Studies for Steady-State Thermal-Hydraulic Simulations of Involute Fuel Element Research Reactors.
- Author
-
Schönecker, Ronja, Bianchini, Paolo, Thomas, Frederic, Calzavara, Yoann, Petry, Winfried, and Reiter, Christian
- Abstract
Taillefer is a versatile Python tool for carrying out Sensitivity Analysis (SA) and uncertainty propagation (UP) studies based on Monte Carlo sampling. Developed with the primary goal of investigating sensitivities and uncertainties of steady-state thermal-hydraulic (SSTH) safety parameters of the high-performance research reactors Forschungs-Neutronenquelle Heinz Maier-Leibnitz (FRM II) in Garching, Germany, and the Réacteur à Haut Flux (RHF) in Grenoble, France, it can also be used for a large variety of other modeling problems. The work presented here aims to explain the underlying mathematical background of SA and UP studies with Taillefer and to show some steps to verify these routines. Furthermore, a real-life application example is provided that demonstrates Taillefer's use in SSTH analysis of the RHF. For this purpose, Taillefer is coupled to the external thermal-hydraulic software PLTEMP/ANL, which is one of the codes used at FRM II and RHF to assess SSTH performance and safety parameters. Determining these crucial quantities is part of identifying possible low-enriched uranium (LEU) core designs that are suitable to replace the currently used highly enriched uranium fuels of the two reactors, supporting global nonproliferation efforts. Taillefer is a powerful tool in these conversion studies, as it increases the reliability of the LEU safety parameters by providing information about sensitivities and uncertainties in addition to the nominal values predicted by the thermal-hydraulic software. [ABSTRACT FROM AUTHOR]
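The Monte Carlo sampling workflow such tools automate can be sketched as: draw the uncertain inputs, evaluate a (here deliberately trivial) thermal model, and report output statistics plus input-output correlations as a simple sensitivity measure. The model and every number below are illustrative only, unrelated to Taillefer or PLTEMP/ANL.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 50_000

# Stand-in steady-state thermal model: a peak temperature as a cheap
# analytic function of three uncertain inputs (power, flow, inlet temp).
power = rng.normal(20.0, 0.6, n)     # [MW]
flow = rng.normal(300.0, 9.0, n)     # [kg/s]
t_in = rng.normal(38.0, 1.0, n)      # [deg C]

t_peak = t_in + 4.2 * power / (flow / 300.0)

# Uncertainty propagation: output mean and standard deviation.
print(t_peak.mean(), t_peak.std())

# Sensitivity analysis: Pearson correlation of each input with the output.
for name, x in [("power", power), ("flow", flow), ("t_in", t_in)]:
    print(name, round(np.corrcoef(x, t_peak)[0, 1], 2))
```

The correlation signs and magnitudes immediately rank which input uncertainties dominate the output uncertainty, the kind of information the abstract says supplements the nominal safety-parameter values.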
- Published
- 2024
- Full Text
- View/download PDF
35. Sensitivity analysis for a thermohydrodynamic model: Uncertainty analysis and parameter estimation.
- Author
-
Fiorini, Camilla, Puscas, Maria Adela, and Després, Bruno
- Subjects
- *
PARAMETER estimation , *SENSITIVITY analysis , *HEAT capacity , *THERMAL diffusivity , *NAVIER-Stokes equations , *NATURAL heat convection , *THERMAL conductivity , *RAYLEIGH number - Abstract
This paper proposes an efficient computational strategy to deal with uncertainty propagation problems for the Navier–Stokes equations coupled with a temperature equation, based on a sensitivity analysis technique. Sensitivity analysis allows one to investigate how the model response at a point is affected by a change in the boundary conditions on the temperature, the heat capacity, the thermal conductivity, and the thermal diffusivity under the hypothesis of a small variance of the input parameters. The variance can be estimated with just one simulation of the state and as many simulations of the sensitivity as there are uncertain parameters. We focus on the benchmark problem of natural convection in a square cavity, also known as the de Vahl Davis problem. Various areas of the domain are analysed to determine the relative influence of each parameter on temperature and velocity. We use the open-source code TrioCFD to simulate the state equations, and a specific module is developed for the sensitivity equations. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
36. Seismic Risk Analysis of Subway Station Structures Combining the Epistemic Uncertainties from Both Seismic Hazard and Numerical Simulation.
- Author
-
Xu, Minze, Cui, Chunyi, Xu, Chengshun, Zhang, Peng, and Zhao, Jingtong
- Subjects
- *
EARTHQUAKE hazard analysis , *SUBWAY stations , *EPISTEMIC uncertainty , *MONTE Carlo method , *COMPUTER simulation , *HAZARD mitigation - Abstract
To consider the influence of epistemic uncertainties in both seismic hazard and numerical simulation on the seismic risk of subway station structures, these uncertainties are uniformly characterized in this paper as the epistemic uncertainty of the seismic demand of subway station structures, from the perspective of uncertainty propagation. On this basis, the analytical formulations of seismic risk considering simultaneous aleatory and epistemic uncertainties are derived. The validity of the derived analytical formulations is verified by Monte Carlo simulation, and the influences of epistemic uncertainty on the seismic risk of subway station structures are further discussed. [ABSTRACT FROM AUTHOR]
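In the same spirit (though not this paper's derivation), the classical closed form for the risk integral with a power-law hazard curve and a lognormal fragility shows how epistemic dispersion inflates the mean failure rate; all parameter values below are invented for scale.

```python
import math

# Risk-integral sketch: annual failure rate from a power-law hazard curve
# H(im) = k0 * im**(-k) convolved with a lognormal fragility, using the
# classical closed form lam = H(theta) * exp(0.5 * k**2 * beta**2).
# Epistemic uncertainty enters by inflating the aleatory dispersion.
k0, k = 1e-4, 2.5                 # hazard-curve fit coefficients
theta = 0.8                       # fragility median [g]
beta_a, beta_e = 0.35, 0.25       # aleatory / epistemic dispersions

H_theta = k0 * theta ** (-k)
lam_aleatory = H_theta * math.exp(0.5 * k**2 * beta_a**2)

beta_t = math.sqrt(beta_a**2 + beta_e**2)   # combined dispersion
lam_total = H_theta * math.exp(0.5 * k**2 * beta_t**2)
print(lam_aleatory, lam_total)    # epistemic uncertainty raises the rate
```

The comparison makes the qualitative point of the abstract concrete: adding epistemic dispersion to the demand model measurably increases the computed annual risk.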
- Published
- 2024
- Full Text
- View/download PDF
37. Reinforced concrete wall buildings with force‐limiting connections: Modeling effects and uncertainty propagation.
- Author
-
Mayorga, C. Franco and Tsampras, Georgios
- Subjects
CONCRETE walls ,REINFORCED concrete buildings ,SEISMIC response ,PROBABILITY density function ,MONTE Carlo method ,WALLS ,STEEL walls - Abstract
This paper assesses the effects of (1) the gravity load‐resisting system (GLRS) modeling approach, (2) the seismic force‐resisting system (SFRS) modeling approach, and (3) the uncertainty of the model parameters of the constitutive law of the longitudinal reinforcing steel of the SFRS on the seismic responses of a 12‐story reinforced concrete wall building with force‐limiting connections. This is achieved by conducting nonlinear numerical earthquake simulations. The seismic responses of the building models with force‐limiting connections using two GLRS modeling approaches, (1) a moment frame system and (2) a pin‐base lean‐on‐column system, are compared. The seismic responses of the building models with conventional connections and force‐limiting connections, respectively, using two SFRS modeling approaches, (1) a distributed‐plasticity modeling approach and (2) a lumped‐plasticity modeling approach, are compared. A joint probability density function for the ASTM‐A615 Grade 60 steel available in the literature is used to conduct an uncertainty propagation analysis through Monte Carlo simulation. The uncertainty in the steel model parameters is propagated to the seismic responses of the building models with conventional connections and force‐limiting connections, respectively. The distributions of the mean values of the peak structural responses of the building models are studied. The effects of the GLRS modeling approach on the seismic responses are not significant in the context of seismic performance‐based design and assessment of buildings with force‐limiting connections. The effects of the SFRS modeling approach and the uncertainty in the steel model parameters on the floor total acceleration and force responses are reduced by including force‐limiting connections. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
38. Widespread analytical pitfalls in empirical coexistence studies and a checklist for improving their statistical robustness.
- Author
-
Terry, J. Christopher D. and Armitage, David W.
- Subjects
EMPIRICAL research ,RESEARCH personnel ,DEGREES of freedom ,BEST practices ,DATA analysis - Abstract
Modern coexistence theory (MCT) offers a conceptually straightforward approach for connecting empirical observations with an elegant theoretical framework, gaining popularity rapidly over the past decade. However, beneath this surface‐level simplicity lie various assumptions and subjective choices made during data analysis. These can lead researchers to draw qualitatively different conclusions from the same set of experiments. As the predictions of MCT studies are often treated as outcomes, and many readers and reviewers may not be familiar with the framework's assumptions, there is a particular risk of 'researcher degrees of freedom' inflating the confidence in results, thereby affecting reproducibility and predictive power. To tackle these concerns, we introduce a checklist consisting of statistical best practices to promote more robust empirical applications of MCT. Our recommendations are organised into four categories: presentation and sharing of raw data, testing model assumptions and fits, managing uncertainty associated with model coefficients and incorporating this uncertainty into coexistence predictions. We surveyed empirical MCT studies published over the past 15 years and discovered a high degree of variation in the level of statistical rigour and adherence to best practices. We present case studies to illustrate the dependence of results on seemingly innocuous choices among competition model structure and error distributions, which in some cases reversed the predicted coexistence outcomes. These results demonstrate how different analytical approaches can profoundly alter the interpretation of experimental results, underscoring the importance of carefully considering and thoroughly justifying each step taken in the analysis pathway. Our checklist serves as a resource for authors and reviewers alike, providing guidance to strengthen the empirical foundation of empirical coexistence analyses. 
As the field of empirical MCT shifts from a descriptive, trailblazing phase to a stage of consolidation, we emphasise the need for caution when building upon the findings of earlier studies. To ensure that progress made in the field of ecological coexistence is based on robust and reliable evidence, it is crucial to subject our predictions, conclusions and generalisability to a more rigorous assessment than is currently the trend. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
39. Propagation of Uncertainties due to Nuclear Data for the LCT086 Reactor System.
- Author
-
Nyalunga, G. P. and Naicker, V. V.
- Abstract
An Organisation for Economic Co-operation and Development/Nuclear Energy Agency (OECD/NEA) benchmark has been established over recent years to satisfy an increasing demand from the nuclear community for best-estimate predictions accompanied by uncertainty and sensitivity analyses. The main objectives of the OECD/NEA benchmark activity are to determine uncertainties in modeling for reactor systems using best-estimate methods under steady-state and transient conditions and to quantify the impact of these uncertainties for each type of calculation in multiphysics analyses. In terms of light water reactor analyses, an international uncertainty analysis, "Benchmarks for Uncertainty Analysis in Modelling (UAM) for the Design, Operation and Safety Analysis of LWRs" is currently in progress, being coordinated by the OECD/NEA. In the neutronic phases of the benchmark, the uncertainty due to nuclear data is being studied for various LWR types. The LCT086 benchmark, which is a VVER-type reactor criticality benchmark experiment, has been identified to form part of the validation matrix for the uncertainty methodology development. Resulting from this, the main focus of the present work is to propagate the uncertainty due to the nuclear data for two cases (LCT086/Case1 and LCT086/Case3) presented in the LCT086 benchmark. Both fuel assembly and core models were used for the analysis. The code package used to perform the calculations was SCALE 6.2.1. In particular, the function modules KENO-VI, SAMPLER, and TSUNAMI-3D of SCALE 6.2.1 were used. MCNP 6.1 was also used for continuous-energy criticality calculations. In addition to propagating the uncertainty due to the nuclear data, the uncertainty due to selected input parameters as bounded by the manufacturing tolerances were also propagated so that the modeling methods employed could be verified against those reported in the LCT086 benchmark. 
The results obtained for the nuclear data uncertainty were further compared with nuclear data uncertainty propagation results for the OECD/NEA UAM Kozloduy-6 reactor system. The uncertainty in the multiplication factor $k_{\mathrm{eff}}$ is reported in pcm, together with the main contributors to the uncertainty from the nuclear data, reported in $\%\,\Delta k/k$. [ABSTRACT FROM AUTHOR]
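Nuclear-data uncertainty propagation of this kind typically rests on the first-order "sandwich rule", var(k) = SᵀCS, with S the vector of k-eff sensitivities and C the nuclear-data covariance matrix. The sketch below uses made-up numbers purely to show the mechanics, not LCT086 data.

```python
import numpy as np

# Sandwich rule: var(k) = S^T C S.
S = np.array([0.30, -0.12, 0.05])          # relative sensitivities (%dk/k per %dp/p)
C = np.array([[4.0, 1.0, 0.0],
              [1.0, 9.0, 0.5],
              [0.0, 0.5, 1.0]]) * 1e-4     # relative covariance matrix (made up)

var_k = S @ C @ S                          # relative variance of k-eff
u_pcm = np.sqrt(var_k) * 1e5               # relative std-dev expressed in pcm
print(round(u_pcm, 1))
```

Note the off-diagonal covariances matter: the anticorrelated pair of sensitivities here partially cancels, reducing the combined uncertainty below the uncorrelated sum.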
- Published
- 2024
- Full Text
- View/download PDF
40. Statistical landslide susceptibility assessment using Bayesian logistic regression and Markov Chain Monte Carlo (MCMC) simulation with consideration of model class selection.
- Author
-
Zhao, Tengyuan, Peng, Hongzhen, Xu, Ling, and Sun, Pingping
- Subjects
MARKOV chain Monte Carlo ,LANDSLIDE hazard analysis ,DEEP learning ,LOGISTIC regression analysis - Abstract
Landslide susceptibility mapping (LSM) plays an essential role in landslide management and helps decision-makers and planners formulate landslide prevention policies. It is often carried out by first predicting the possibility of landslide occurrence from numerous landslide conditioning factors (LCFs), followed by partitioning areas into different landslide susceptibility levels. Numerous methods have been proposed for this purpose, such as logistic regression (LR) and deep learning methods. Among these, LR is the most widely used in the literature, which may be attributed to its good performance and ease of use. However, few studies explore the uncertainty and reliability of LR in LSM. Furthermore, not all LCFs, such as elevation or distance to roads, contribute significantly to landslide occurrence. How to objectively determine the most relevant LCFs is another issue that remains unsolved. This study proposes a Bayesian LR method for landslide susceptibility assessment (LSA), together with Markov Chain Monte Carlo (MCMC) simulation for parameter estimation. MCMC samples are used to determine the optimal model and to quantify the uncertainty associated with the LSM. Real-life data from Shaanxi Province are used for illustration. Results show that the proposed method works reasonably well in determining the optimal model and in quantifying uncertainty in LSM. [ABSTRACT FROM AUTHOR]
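A stripped-down version of Bayesian LR with MCMC can be sketched with one conditioning factor and a random-walk Metropolis sampler; the synthetic data, priors, and tuning below are illustrative assumptions, not the Shaanxi dataset or the authors' exact model.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic inventory: one conditioning factor (scaled slope) and a known
# "true" logistic relationship used to generate landslide labels.
n = 400
slope = rng.normal(0.0, 1.0, n)
p_true = 1.0 / (1.0 + np.exp(-(-0.5 + 1.5 * slope)))
y = rng.random(n) < p_true

def log_post(beta):
    """Log-posterior: Bernoulli likelihood + weak N(0, 10^2) priors."""
    eta = beta[0] + beta[1] * slope
    loglik = np.sum(y * eta - np.log1p(np.exp(eta)))
    return loglik - np.sum(beta**2) / (2 * 10.0**2)

# Random-walk Metropolis sampler
beta, lp = np.zeros(2), log_post(np.zeros(2))
chain = []
for _ in range(20_000):
    prop = beta + rng.normal(0.0, 0.15, 2)
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:        # accept/reject step
        beta, lp = prop, lp_prop
    chain.append(beta)
chain = np.array(chain)[5_000:]                    # discard burn-in

# Posterior mean and 95% credible interval for the slope coefficient:
# the credible interval is the uncertainty carried into the susceptibility map.
b1 = chain[:, 1]
print(b1.mean(), np.percentile(b1, [2.5, 97.5]))
```

Propagating the whole chain (rather than a point estimate) through the logistic map yields a susceptibility distribution per cell, which is how MCMC samples quantify the uncertainty of the resulting LSM.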
- Published
- 2024
- Full Text
- View/download PDF
41. Non-Probabilistic Uncertainty and Correlation Propagation Analysis Methods Based on Multidimensional Parallelepiped Model.
- Author
-
Lü, Hui, Li, Zhencong, Huang, Xiaoting, and Shangguan, Wen-Bin
- Subjects
STATISTICAL correlation ,MONTE Carlo method ,POLYNOMIAL chaos - Abstract
In engineering practice, uncertainty and correlation may coexist in the input parameters as well as in the output responses. To address such cases, several methods are developed for non-probabilistic uncertainty and correlation propagation analysis in this study. In the proposed methods, the multidimensional parallelepiped model (MPM) is introduced to quantify the uncertainty and correlation of the input parameters. In the uncertainty propagation analysis, three methods are presented to calculate the interval bounds of the output responses. Among them, the Monte Carlo uncertainty analysis method (MCUAM) is first presented as a reference method; the first-order perturbation method (FOPM) is then employed to improve computational efficiency; and the sub-parallelepiped perturbation method (SPPM) is further developed to handle correlated parameters with large uncertainty. In the correlation propagation analysis, the Monte Carlo correlation analysis method (MCCAM) is proposed based on the MPM and Monte Carlo simulation, which aims to compute the correlation among different output responses. The uncertainty domains between any two responses can also be constructed by the MCCAM. The effectiveness of the proposed methods in dealing with uncertainty and correlation propagation problems is demonstrated by three numerical examples. [ABSTRACT FROM AUTHOR]
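The sampling idea behind a Monte Carlo reference method of this kind can be illustrated with a simplified two-parameter parallelepiped: map the unit square through a shear that encodes the correlation, push every sample through the response, and take the extremes as interval bounds. Note the simple shear below is a stand-in that does not preserve the marginal intervals the way the full MPM construction does, and all values are invented.

```python
import numpy as np

rng = np.random.default_rng(5)

# Two uncertain parameters: midpoints, radii, and a correlation
# coefficient that shears the box into a parallelepiped (simplified).
mid = np.array([10.0, 2.0])
rad = np.array([1.0, 0.4])
rho = 0.6                                  # assumed correlation of x1, x2

# Shear matrix mapping the unit square [-1, 1]^2 onto the parallelepiped.
A = np.array([[1.0, 0.0],
              [rho, np.sqrt(1.0 - rho**2)]]) * rad[:, None]

u = rng.uniform(-1.0, 1.0, (200_000, 2))
x = mid + u @ A.T                          # correlated interval samples

# Monte Carlo uncertainty propagation: interval bounds of a response.
resp = x[:, 0] * x[:, 1] + x[:, 0] ** 2
print(resp.min(), resp.max())
```

Because the response here is monotone in both parameters, the sampled extremes approach the true corner values of the parallelepiped, which is what a perturbation method would locate analytically.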
- Published
- 2024
- Full Text
- View/download PDF
42. Representing natural climate variability in an event attribution context: Indo-Pakistani heatwave of 2022
- Author
-
Shruti Nath, Mathias Hauser, Dominik L. Schumacher, Quentin Lejeune, Lukas Gudmundsson, Yann Quilcaille, Pierre Candela, Fahad Saeed, Sonia I. Seneviratne, and Carl-Friedrich Schleussner
- Subjects
Attribution ,Extreme events ,Natural climate variability ,Emulators ,Uncertainty propagation ,Parametric uncertainty ,Meteorology. Climatology ,QC851-999 - Abstract
Attribution of extreme climate events to global climate change as a result of anthropogenic greenhouse gas emissions has become increasingly important. Extreme climate events arise at the intersection of natural climate variability and a forced response of the Earth system to anthropogenic greenhouse gas emissions, which may alter the frequency and severity of such events. Accounting for the effects of both natural climate variability and the forced response to anthropogenic climate change is thus central for the attribution. Here, we investigate the reproducibility of probabilistic extreme event attribution results under more explicit representations of natural climate variability. We employ well-established methodologies deployed in statistical Earth System Model emulators to represent natural climate variability as informed from its spatio-temporal covariance structures. Two approaches towards representing natural climate variability are investigated: (1) where natural climate variability is treated as a single component; and (2) where natural climate variability is disentangled into its annual and seasonal components. We showcase our approaches by attributing the 2022 Indo-Pakistani heatwave to human-induced climate change. We find that explicit representation of annual and seasonal natural climate variability increases the overall uncertainty in attribution results considerably compared to established approaches such as the World Weather Attribution Initiative. The increase in likelihood of such an event occurring as a result of global warming differs slightly between the approaches, mainly due to different assessments of the pre-industrial return periods. Our approach that explicitly resolves annual and seasonal natural climate variability indicates a median increase in likelihood by a factor of 41 (95% range: 6-603). We find a robust signal of increased likelihood and intensification of the event with increasing global warming levels across all approaches. 
Compared to its present likelihood, under 1.5 °C (2 °C) of global near-surface air temperature increase relative to pre-industrial temperatures, the likelihood of the event would be between 2.2 to 2.5 times (8 to 9 times) higher. We note that regardless of the different statistical approaches to represent natural variability, the outcomes on the conducted event attribution are similar, with minor differences mainly in the uncertainty ranges. Possible reasons for differences are evaluated, including limitations of the proposed approach for this type of application, as well as the specific aspects in which it can provide complementary information to established approaches.
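The headline quantity in such attribution studies is the probability ratio between factual and counterfactual climates. It can be sketched with invented Gumbel-distributed annual maxima (not the study's emulator, data, or fitted parameters):

```python
import numpy as np

rng = np.random.default_rng(11)

# Toy attribution: annual maximum temperatures in a counterfactual
# (pre-industrial) climate and a factual one shifted warmer by 1.2 C.
pre = rng.gumbel(30.0, 1.5, 100_000)
now = rng.gumbel(31.2, 1.5, 100_000)

threshold = 35.0                   # observed event magnitude

p0 = np.mean(pre > threshold)      # counterfactual exceedance probability
p1 = np.mean(now > threshold)      # factual exceedance probability
pr = p1 / p0                       # probability ratio ("x times more likely")
print(p0, p1, round(pr, 1))
```

The reciprocal 1/p0 is the pre-industrial return period; the abstract's point is that how natural variability is represented changes the assessment of p0, and hence the reported ratio, more than it changes the qualitative conclusion.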
- Published
- 2024
- Full Text
- View/download PDF
43. Determining Uncertainties Associated With Retrieved AOD Based on the Dark Object Method: A Case Study With GF-1/WFV Satellite Data
- Author
-
Ruoxi Yang, Lingling Ma, Yongguang Zhao, Xin Lu, Renfei Wang, Wan Li, Beibei Zhang, Qijin Han, Qiongqiong Lan, Qingchuan Zheng, Xiaoxin Hou, Jianghong Zhao, and Xiang Zhou
- Subjects
Aerosol optical depth (AOD) ,dark object (DO) method ,GF-1/WFV ,metrology ,traceability ,uncertainty propagation ,Ocean engineering ,TC1501-1800 ,Geophysics. Cosmic physics ,QC801-809 - Abstract
The retrieval of aerosol optical depth (AOD) from remote sensing data using the dark object (DO) method, which utilizes the measured radiance in areas with dark underlying scenes, such as dark vegetation or shadowed areas, is widely applied in atmospheric science. While many studies have aimed to improve its accuracy and validate its effectiveness, comprehensive uncertainty analysis associated with AOD retrieved based on the DO method remains limited. Therefore, this study focuses on addressing the uncertainty issues in AOD retrieval, using shadowed regions as a case study based on the DO method. We developed a method suitable for AOD retrieval from China's GF-1/WFV satellite data, which lacks shortwave infrared band information, and analyzed the retrieval chain to identify the sources and propagation of uncertainty in the AOD retrieval process. Following metrological principles, we performed thorough calculations and analysis of the uncertainties associated with key influencing factors throughout the retrieval process, including the retrieval algorithm, input data, and related aspects. Furthermore, by considering the interrelationships among influencing factors and synthesizing uncertainties, we traced the propagation of uncertainty from GF-1/WFV Level 1 products to the final AOD products. The AOD retrieval results indicate strong consistency with AERONET ground observations. We characterized the uncertainties associated with AOD retrieval using an uncertainty tree diagram, quantifying specific uncertainty results. This provides insights into satellite-based AOD retrieval methods and investigates the traceability of China's land satellite AOD products.
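Once individual contributors are quantified, an uncertainty tree is typically collapsed by combining independent relative standard uncertainties in quadrature (root-sum-square), following GUM practice. The contributor names and values below are placeholders, not the GF-1/WFV budget.

```python
import math

# Quadrature (RSS) combination of independent relative standard
# uncertainties, as in a metrological uncertainty tree (values invented).
contributors = {
    "sensor radiometric calibration": 0.030,
    "surface reflectance assumption": 0.050,
    "aerosol model selection": 0.040,
    "radiative-transfer interpolation": 0.010,
}
u_combined = math.sqrt(sum(u**2 for u in contributors.values()))
print(round(u_combined, 4))
```

Because contributions add in quadrature, the largest branch of the tree dominates; halving a small contributor barely moves the combined uncertainty, which is why tracing the dominant source matters for traceability work.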
- Published
- 2024
- Full Text
- View/download PDF
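The metrological synthesis of uncertainty components described in the abstract above follows the standard (GUM-style) root-sum-of-squares combination of independent standard uncertainties. A minimal sketch, assuming unit sensitivity coefficients and illustrative component values that are not taken from the paper:

```python
import math

def combined_standard_uncertainty(components):
    """Root-sum-of-squares combination of independent standard
    uncertainty components (GUM law of propagation, assuming unit
    sensitivity coefficients)."""
    return math.sqrt(sum(u ** 2 for u in components))

# Hypothetical uncertainty components of an AOD retrieval (illustrative
# values only): sensor calibration, surface reflectance estimate, and
# aerosol model assumption.
u_components = {"calibration": 0.02, "surface": 0.03, "aerosol_model": 0.015}
u_c = combined_standard_uncertainty(u_components.values())
print(f"combined standard uncertainty: {u_c:.4f}")
```

Correlated components would instead require covariance terms in the sum, which is where the abstract's "interrelationships among influencing factors" enter.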
44. A Koopman Reachability Approach for Uncertainty Analysis in Ground Vehicle Systems
- Author
-
Alok Kumar, Bhagyashree Umathe, and Atul Kelkar
- Subjects
Koopman spectrum ,vehicle dynamics ,principal eigenfunction ,uncertainty propagation ,Mechanical engineering and machinery ,TJ1-1570 - Abstract
Recent progress in autonomous vehicle technology has led to the development of accurate and efficient tools for ensuring safety, which is crucial for verifying the reliability and security of vehicles. These vehicles operate under diverse conditions, necessitating the analysis of varying initial conditions and parameter values. Ensuring the safe operation of the vehicle under all these varying conditions is essential. Reachability analysis is an important tool to certify the safety and stability of the vehicle dynamics. We propose a reachability analysis approach for evaluating the response of the vehicle dynamics, specifically addressing uncertainties in the initial states and model parameters. Reachable sets illustrate all the possible states of a dynamical system that can be obtained from a given set of uncertain initial conditions. The analysis is crucial for understanding how variations in initial conditions or system parameters can lead to outcomes such as vehicle collisions or deviations from desired paths. By mapping out these reachable states, it is possible to design systems that maintain safety and reliability despite uncertainties. These insights help to ensure the stability and reliability of the vehicles, even in unpredictable conditions, by reducing accidents and optimizing performance. The nonlinearity of the model complicates the computation of reachable sets in vehicle dynamics. This paper proposes a Koopman theory-based approach that utilizes the Koopman principal eigenfunctions and the Koopman spectrum. By leveraging the Koopman principal eigenfunction, our method simplifies the computational process and offers a formal approximation for backward and forward reachable sets. First, our method effectively computes backward and forward reachable sets for a nonlinear quarter-car model with fixed parameter values. Furthermore, we applied our approach to analyze the uncertainty response for cases with uncertain parameters of the vehicle model. 
When compared to time-domain simulations, our proposed Koopman approach provided accurate results and also reduced the computational time by half in most cases. This demonstrates the efficiency and reliability of our proposed approach in dynamic systems uncertainty analysis using the reachable sets.
- Published
- 2024
- Full Text
- View/download PDF
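The Koopman eigenfunction machinery in the abstract above is beyond a short sketch, but the baseline it is compared against, sampling uncertain initial conditions and bounding the propagated states, can be illustrated. The following is a plain Monte Carlo approximation of a forward reachable set for a linear quarter-car model (sprung mass only; parameter values are assumptions for illustration, not from the paper):

```python
import random

def quarter_car_step(x, v, dt, k=8000.0, c=1200.0, m=300.0):
    # Linear quarter-car (sprung mass only): m x'' = -k x - c x'
    a = (-k * x - c * v) / m
    return x + dt * v, v + dt * a

def mc_forward_reachable(n=2000, t_end=0.5, dt=1e-3, seed=0):
    """Monte Carlo sketch of a forward reachable set: sample uncertain
    initial displacements/velocities, integrate forward, and report the
    axis-aligned bounding box of the final states."""
    rng = random.Random(seed)
    finals = []
    for _ in range(n):
        x = rng.uniform(-0.05, 0.05)   # uncertain initial displacement [m]
        v = rng.uniform(-0.5, 0.5)     # uncertain initial velocity [m/s]
        for _ in range(int(t_end / dt)):
            x, v = quarter_car_step(x, v, dt)
        finals.append((x, v))
    xs = [f[0] for f in finals]
    vs = [f[1] for f in finals]
    return (min(xs), max(xs)), (min(vs), max(vs))

(x_lo, x_hi), (v_lo, v_hi) = mc_forward_reachable()
print(f"x in [{x_lo:.4f}, {x_hi:.4f}], v in [{v_lo:.4f}, {v_hi:.4f}]")
```

Unlike this sampling baseline, the paper's Koopman-spectrum approach yields a formal approximation of both backward and forward reachable sets without re-simulating each initial condition.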
45. Quantitative evaluation of the impact of hydrological forecasting uncertainty on reservoir real-time optimal operation.
- Author
-
Zhu, Feilin, Wang, Yaqin, Liu, Bojun, Cao, Qing, Han, Mingyu, Zeng, Yurou, Lin, Meiyan, Zhao, Lingqi, Wang, Xinrong, Wan, Zhiqi, and Zhong, Ping-an
- Subjects
- *
HYDROLOGICAL forecasting , *DIFFERENTIAL evolution , *FLOOD control , *ROBUST optimization , *OPERATIONAL risk , *DECISION making , *STOCHASTIC models - Abstract
The substantial challenge posed by inherent hydrological forecasting uncertainty has critical implications for the optimization of real-time reservoir operations. In response, this study introduces a stochastic framework explicitly devised to comprehensively quantify the ramifications of hydrological forecasting uncertainty, notably its temporal correlations, on the outcomes of real-time reservoir optimization and risk assessment. Furthermore, this framework seeks to delineate the pivotal influence of incorporating or neglecting these temporal dynamics on the eventual results, while concurrently elucidating the underlying mechanisms governing these discernible influences. The framework adopts a comprehensive approach to simulating hydrological forecast uncertainty through ensemble forecasts and scenario trees, employing three methods (two Monte Carlo sampling-based methods and one Gaussian copula method) to generate inflow forecast ensembles. To improve the adaptability to uncertainties in inflow forecasts, the framework incorporates a transformation of the generated ensembles into scenario trees, serving as input for a stochastic optimization model that derives the final optimal decision based on optimizing the expected value of the objective function for all scenarios. Additionally, a parallel differential evolution algorithm is proposed to solve the stochastic optimization model efficiently. Risk assessment is performed to capture the uncertainty and corresponding risk associated with the reservoir optimal decision. The proposed framework is demonstrated in a flood control reservoir system in China, where several numerical experiments are conducted to explore the effect of forecast uncertainty level and temporal correlation on real-time reservoir optimal operation. 
Results show that the temporal correlation of inflows must be considered in both inflow stochastic simulation and reservoir stochastic optimization to avoid overestimating or underestimating operational risk, potentially leading to operation failures. By examining the risk simulation surface, reservoir operators can evaluate the robustness of operational decisions and make more reliable final decisions. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
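The core of the stochastic optimization step in the abstract above, choosing one decision that minimizes the expected objective over inflow scenarios, can be sketched with a toy model. Scenario probabilities, storage figures, and penalty weights below are assumptions for illustration, and a simple grid search stands in for the paper's parallel differential evolution algorithm:

```python
def expected_cost(release, scenarios, s0=50.0, cap=100.0, demand=30.0):
    """Expected cost of a single release decision over inflow scenarios:
    penalize end-of-period storage above flood capacity and release
    below demand (illustrative penalties, not the paper's objective)."""
    cost = 0.0
    for prob, inflow in scenarios:
        storage = s0 + inflow - release
        flood = max(0.0, storage - cap)
        shortfall = max(0.0, demand - release)
        cost += prob * (10.0 * flood + 1.0 * shortfall)
    return cost

# Three inflow scenarios (probability, volume) standing in for a
# scenario tree built from an ensemble forecast.
scenarios = [(0.25, 20.0), (0.5, 40.0), (0.25, 80.0)]

# Grid search over the release decision; the paper solves this with a
# parallel differential evolution algorithm instead.
best_cost, best_release = min(
    (expected_cost(r, scenarios), r) for r in range(0, 101)
)
print(f"best release: {best_release}, expected cost: {best_cost:.2f}")
```

A real scenario tree additionally branches over time stages with temporally correlated inflows, which is exactly the correlation structure the abstract argues must not be neglected.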
46. Application of the Polynomial Chaos Expansion to the Uncertainty Propagation in Fault Transients in Nuclear Fusion Reactors: DTT TF Fast Current Discharge.
- Author
-
De Bastiani, Marco, Aimetta, Alex, Bonifetto, Roberto, and Dulla, Sandra
- Subjects
POLYNOMIAL chaos ,FUSION reactors ,MONTE Carlo method ,FUSION reactor divertors ,SUPERCONDUCTING magnets ,NUCLEAR fusion ,ORTHOGONAL polynomials - Abstract
Nuclear fusion reactors are composed of several complex components whose behavior may not be certain a priori. This uncertainty may have a significant impact on the evolution of fault transients in the machine, causing unexpected damage to its components. For this reason, a suitable method for the uncertainty propagation during those transients is required. The Monte Carlo method would be the reference option, but it is, in most cases, not applicable due to the large number of required, repeated simulations. In this context, the Polynomial Chaos Expansion has been considered as a valuable alternative. It allows us to create a surrogate model of the original one in terms of orthogonal polynomials. Then, the uncertainty quantification is performed repeatedly, relying on this much simpler and faster model. Using the fast current discharge in the Divertor Tokamak Test Toroidal Field (DTT TF) coils as a reference scenario, the following method has been applied: the uncertainty on the parameters of the Fast Discharge Unit (FDU) varistor disks is propagated to the simulated electrical and electromagnetic relevant effects. Eventually, two worst-case scenarios are analyzed from a thermal–hydraulic point of view with the 4C code, simulating a fast current discharge as a consequence of a coil quench. It has been demonstrated that the uncertainty on the inputs (varistor parameters) strongly propagates, leading to a wide range of possible scenarios in the case of accidental transients. This result underlines the necessity of taking into account and propagating all possible uncertainties in the design of a fusion reactor according to the Best Estimate Plus Uncertainty approach. The uncertainty propagation from input data to electrical, electromagnetic, and thermal–hydraulic results, using surrogate models, is the first of its kind in the field of the modeling of superconducting magnets for nuclear fusion applications. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
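The surrogate-model idea in the abstract above, expanding a model output in orthogonal polynomials of a random input so that moments follow directly from the coefficients, can be shown in one dimension. A minimal regression-based PCE sketch with probabilists' Hermite polynomials and a cheap stand-in function (the real use case replaces it with an expensive electromagnetic or thermal–hydraulic solver):

```python
import math
import numpy as np

def hermite_design(xi, degree=3):
    """Probabilists' Hermite polynomials He_0..He_degree evaluated at xi,
    via the recurrence He_k = xi*He_{k-1} - (k-1)*He_{k-2}."""
    cols = [np.ones_like(xi), xi]
    for k in range(2, degree + 1):
        cols.append(xi * cols[-1] - (k - 1) * cols[-2])
    return np.column_stack(cols[: degree + 1])

rng = np.random.default_rng(0)
xi = rng.standard_normal(2000)   # standard normal germ
y = np.exp(0.3 * xi)             # stand-in for an expensive solver

# Least-squares (regression) fit of the PCE coefficients.
Phi = hermite_design(xi)
coef, *_ = np.linalg.lstsq(Phi, y, rcond=None)

# Moments follow directly from the coefficients:
# E[y] = c_0 and Var[y] = sum_{k>=1} k! * c_k^2.
pce_mean = coef[0]
pce_var = sum(math.factorial(k) * coef[k] ** 2 for k in range(1, len(coef)))
print(f"PCE mean ~= {pce_mean:.4f}, PCE variance ~= {pce_var:.4f}")
```

Once the coefficients are fitted from a modest number of solver runs, sampling the surrogate is essentially free, which is what makes the PCE a practical replacement for brute-force Monte Carlo in the abstract's setting.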
47. Seismic risk and resilience analysis of networked industrial facilities.
- Author
-
Tabandeh, Armin, Sharma, Neetesh, and Gardoni, Paolo
- Subjects
- *
INFRASTRUCTURE (Economics) , *EARTHQUAKE intensity , *EARTHQUAKE hazard analysis , *NATURAL disaster warning systems , *HAZARDOUS substances , *ENVIRONMENTAL infrastructure , *HAZARD mitigation , *DIFFERENTIAL equations - Abstract
Industrial facilities, as an essential part of socio-economic systems, are susceptible to disruptions caused by earthquakes. Such disruptions may result from direct structural damage to facilities or their loss of functionality due to impacts on their support facilities and infrastructure systems. Decisions to improve the seismic performance of industrial facilities should ideally be informed by risk (and resilience) analysis, taking into account their loss of functionality and the following recovery under the influence of various sources of uncertainty. Rather than targeting specific individual facilities like a hazardous chemical plant, our objective is to quantify the resilience of interacting industrial facilities (i.e., networked industrial facilities) in the face of uncertain seismic events while accounting for their functional dependencies on infrastructure systems. A specific facility, such as a hazardous chemical plant, can be a compound node in the network representation, interacting with other facilities and their supporting infrastructure components. In this context, a compound node is a complex system in its own right. To this end, this paper proposes a formulation to model the functionality of interacting industrial facilities and infrastructure using a system of coupled differential equations, representing dynamic processes on interdependent networked systems. The equations are subject to uncertain initial conditions and have uncertain coefficients, capturing the effects of uncertainties in earthquake intensity measures, structural damage, and post-disaster recovery process. The paper presents a computationally tractable approach to quantify and propagate various sources of uncertainty through the formulated equations. It also discusses the recovery of damaged industrial facilities and infrastructure components under resource and implementation constraints. 
The effects of changes in structural properties and networks' connectivity are incorporated into the governing equations to model networks' functionality recovery and quantify their resilience. The paper illustrates the proposed approach for the seismic resilience analysis of a hypothetical but realistic shipping company in the city of Memphis in Tennessee, United States. The example models the effects of dependent water and power infrastructure systems on the functionality disruption and recovery of networked industrial facilities subject to seismic hazards. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
48. Data inaccuracy quantification and uncertainty propagation for bibliometric indicators.
- Author
-
Donner, Paul
- Subjects
- *
CITATION analysis , *BIBLIOGRAPHICAL citations , *REGRESSION analysis , *BIBLIOMETRICS , *DATA quality - Abstract
This study introduces an approach to estimate the uncertainty in bibliometric indicator values that is caused by data errors. This approach utilizes Bayesian regression models, estimated from empirical data samples, which are used to predict error-free data. Through direct Monte Carlo simulation—drawing many replicates of predicted data from the estimated regression models for the same input data—probability distributions for indicator values can be obtained which provide the information on their uncertainty due to data errors. It is demonstrated how uncertainty in base quantities, such as the number of publications of certain document types of a unit of analysis and the number of citations of a publication, can be propagated along a measurement model into final indicator values. Synthetic examples are used to illustrate the method and real bibliometric research evaluation data is used to show its application in practice. Though in this contribution we just use two out of a larger number of known bibliometric error categories and therefore can account for only some part of the total uncertainty due to inaccuracies, the latter example reveals that average values of citation impact scores of publications of research groups need to be used very cautiously as they often have large margins of error resulting from data inaccuracies. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
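The direct Monte Carlo simulation described in the abstract above, drawing many replicates of error-corrected data and recomputing the indicator each time, can be sketched with a deliberately simplified error model. The per-citation miss rate and the citation counts below are assumptions for illustration; the paper estimates its error model from empirical samples via Bayesian regression:

```python
import random
import statistics

def indicator_replicates(observed, n_rep=5000, miss_rate=0.05, seed=1):
    """Monte Carlo sketch: assume each recorded citation independently
    missed a link with probability miss_rate (an assumed, not empirical,
    error model); replicate corrected counts and recompute the mean
    citation score to obtain its uncertainty distribution."""
    rng = random.Random(seed)
    means = []
    for _ in range(n_rep):
        corrected = []
        for c in observed:
            # Add back citations assumed lost to linking errors.
            missed = sum(1 for _ in range(c) if rng.random() < miss_rate)
            corrected.append(c + missed)
        means.append(statistics.fmean(corrected))
    return means

observed = [3, 0, 12, 7, 1, 25, 4, 0, 9, 2]   # hypothetical research group
means = sorted(indicator_replicates(observed))
lo, hi = means[int(0.025 * len(means))], means[int(0.975 * len(means))]
print(f"mean citation score: {statistics.fmean(means):.2f} "
      f"(approx. 95% interval [{lo:.2f}, {hi:.2f}])")
```

The width of the resulting interval is the point of the exercise: a group's average citation score carries a margin of error from data inaccuracy alone, before any statistical sampling argument is made.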
49. A brief guide to measurement uncertainty (IUPAC Technical Report).
- Author
-
Possolo, Antonio, Hibbert, David Brynn, Stohner, Jürgen, Bodnar, Olha, and Meija, Juris
- Subjects
- *
TECHNICAL reports , *MEASUREMENT uncertainty (Statistics) , *MONTE Carlo method , *RANDOM variables , *ANALYTICAL chemistry - Abstract
This Brief Guide reintroduces readers to the main concepts and technical tools used for the evaluation and expression of measurement uncertainty, including both classical and Bayesian statistical methods. The general approach is the same as that adopted by the Guide to the Expression of Uncertainty in Measurement (GUM): quantities whose values are surrounded by uncertainty are modeled as random variables, which enables the application of a wide range of techniques from probability and statistics to the evaluation of measurement uncertainty. All the methods presented are illustrated with examples involving real measurement results from a wide range of fields of chemistry and related sciences, ranging from classical analytical chemistry as practiced at the beginning of the 20th century, to contemporary studies of isotopic compositions of the elements and clinical trials. The supplementary material offers profusely annotated computer codes that allow the readers to reproduce all the calculations underlying the results presented in the examples. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
50. A Bayesian framework for incorporating exposure uncertainty into health analyses with application to air pollution and stillbirth.
- Author
-
Comess, Saskia, Chang, Howard H, and Warren, Joshua L
- Subjects
- *
AIR pollution , *PROBABILITY density function , *STILLBIRTH , *HEALTH outcome assessment , *AIR analysis - Abstract
Studies of the relationships between environmental exposures and adverse health outcomes often rely on a two-stage statistical modeling approach, where exposure is modeled/predicted in the first stage and used as input to a separately fit health outcome analysis in the second stage. Uncertainty in these predictions is frequently ignored, or accounted for in an overly simplistic manner when estimating the associations of interest. Working in the Bayesian setting, we propose a flexible kernel density estimation (KDE) approach for fully utilizing posterior output from the first stage modeling/prediction to make accurate inference on the association between exposure and health in the second stage, derive the full conditional distributions needed for efficient model fitting, detail its connections with existing approaches, and compare its performance through simulation. Our KDE approach is shown to generally have improved performance across several settings and model comparison metrics. Using competing approaches, we investigate the association between lagged daily ambient fine particulate matter levels and stillbirth counts in New Jersey (2011–2015), observing an increase in risk with elevated exposure 3 days prior to delivery. The newly developed methods are available in the R package KDExp. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
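The two-stage structure in the abstract above, first-stage posterior samples of exposure feeding a second-stage health model, can be illustrated in one dimension. The sketch below builds a simple Gaussian KDE over hypothetical first-stage posterior exposure samples and averages a log-linear rate over KDE draws instead of plugging in a point estimate; all sample values, coefficients, and the bandwidth are assumptions for illustration, not the paper's fitted model:

```python
import math
import random

def gaussian_kde(samples, bandwidth=0.2):
    """Return a density function and a sampler for a 1-D Gaussian KDE
    over first-stage posterior samples."""
    def pdf(x):
        return sum(
            math.exp(-0.5 * ((x - s) / bandwidth) ** 2)
            / (bandwidth * math.sqrt(2 * math.pi))
            for s in samples
        ) / len(samples)
    def draw(rng):
        # Drawing from a KDE = pick a sample, then jitter by the kernel.
        return rng.gauss(rng.choice(samples), bandwidth)
    return pdf, draw

# Hypothetical first-stage posterior samples of a PM2.5 exposure (ug/m3).
rng = random.Random(2)
posterior_exposure = [rng.gauss(10.0, 1.5) for _ in range(500)]
pdf, draw = gaussian_kde(posterior_exposure)

# Second stage: propagate exposure uncertainty into a log-linear risk
# model, rate = exp(b0 + b1 * exposure), by averaging over KDE draws
# rather than plugging in a single point estimate.
b0, b1 = -3.0, 0.05
rates = [math.exp(b0 + b1 * draw(rng)) for _ in range(2000)]
plug_in = math.exp(b0 + b1 * statistics_mean(posterior_exposure)) if False else \
    math.exp(b0 + b1 * (sum(posterior_exposure) / len(posterior_exposure)))
mc_mean = sum(rates) / len(rates)
print(f"plug-in rate: {plug_in:.4f}, exposure-averaged rate: {mc_mean:.4f}")
```

The paper goes further by deriving full conditional distributions so the KDE-represented exposure uncertainty enters the Bayesian health model exactly, rather than through the crude averaging used here.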