111 results
Search Results
2. A probabilistic approach for the levelized cost of energy of floating offshore wind farms.
- Author
-
Amlashi, Hadi and Baniotopoulos, Charalampos
- Subjects
WIND power ,ENERGY industries ,WIND power plants ,WIND turbines ,OFFSHORE wind power plants ,RANDOM variables - Abstract
This paper aims to analyze the levelized cost of energy (LCOE) of floating offshore wind farms from a probabilistic point of view. Understanding and addressing the uncertainty associated with the main parameters that influence the wind energy generated by a wind farm during its lifetime is crucial for the economic evaluation of offshore wind energy in the broader energy landscape. The methodology for probabilistic assessment of LCOE is introduced, and the uncertainty in the input parameters is discussed. In a base case study, an assumed Floating Offshore Wind Farm (FOWF) consisting of 250 5-MW wind turbines is considered. The use of bias and randomness in key random variables is discussed and studied in detail. Results indicate that LCOE estimates of 15 EURc/kWh for offshore wind turbines are achievable with reasonable confidence, while estimates of 5 EURc/kWh require careful consideration of uncertainty in the wind farm's parameters. The feasibility analysis showed that the techno-economic parameters are influenced more by wind characteristics and efficient use of wind turbines than by the cost of the wind farm. This paper provides general guidance on how to carry out early-stage techno-economic analysis of FOWFs. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
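The probabilistic LCOE analysis described above can be sketched as a small Monte Carlo experiment: sample the uncertain cost and energy inputs, discount both streams, and take the ratio. All parameter values below (CAPEX, annual OPEX, annual energy, discount rate) are hypothetical placeholders, not the paper's calibrated inputs:

```python
import numpy as np

rng = np.random.default_rng(0)

def lcoe_samples(n=100_000, lifetime=25, rate=0.06):
    """Monte Carlo samples of LCOE (EUR/kWh) for a hypothetical floating wind farm.

    CAPEX, annual OPEX, and annual energy are illustrative random variables,
    not the paper's calibrated distributions.
    """
    capex = rng.normal(3.0e9, 0.3e9, n)               # EUR, paid in year 0
    opex = rng.normal(8.0e7, 1.0e7, (n, lifetime))    # EUR per year
    energy = rng.normal(4.0e9, 6.0e8, (n, lifetime))  # kWh per year
    disc = (1 + rate) ** -np.arange(1, lifetime + 1)  # discount factors
    cost = capex + (opex * disc).sum(axis=1)          # discounted lifetime cost
    kwh = (energy * disc).sum(axis=1)                 # discounted lifetime energy
    return cost / kwh

s = lcoe_samples()
# report a central estimate and a high-confidence quantile
print(np.median(s), np.quantile(s, 0.95))
```

A point LCOE hides the spread; reporting quantiles of the sampled ratio is what makes statements like "15 EURc/kWh with reasonable confidence" precise.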
3. Explicit constants in the nonuniform local limit theorem for Poisson binomial random variables.
- Author
-
Auld, Graeme and Neammanee, Kritsana
- Subjects
BINOMIAL theorem ,LIMIT theorems ,RANDOM variables ,POISSON'S equation ,PROBABILITY theory - Abstract
In a recent paper the authors proved a nonuniform local limit theorem concerning normal approximation of the point probabilities P(S = k) when S = X_1 + X_2 + ... + X_n and X_1, X_2, ..., X_n are independent Bernoulli random variables that may have different success probabilities. However, their main result contained an undetermined constant, somewhat limiting its applicability. In this paper we give a nonuniform bound in the same setting but with explicit constants. Our proof uses Stein's method and, in particular, the K-function and concentration inequality approaches. We also prove a new uniform local limit theorem for Poisson binomial random variables that is used to help simplify the proof in the nonuniform case. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
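The point probabilities P(S = k) studied above can be computed exactly by convolving the individual Bernoulli laws, which makes the normal approximation easy to check numerically. The success probabilities below are illustrative, and the local approximation shown is the standard normal density one, not the paper's refined bound:

```python
import math
import numpy as np

def poisson_binomial_pmf(ps):
    """Exact pmf of S = X_1 + ... + X_n for independent X_i ~ Bernoulli(p_i),
    computed by dynamic-programming convolution."""
    pmf = np.array([1.0])
    for p in ps:
        pmf = np.convolve(pmf, [1 - p, p])
    return pmf

ps = [0.1, 0.3, 0.5, 0.7, 0.2]   # illustrative, unequal success probabilities
pmf = poisson_binomial_pmf(ps)
mu = sum(ps)                      # mean of S
var = sum(p * (1 - p) for p in ps)  # variance of S

# local (point-probability) normal approximation of P(S = k)
k = 2
approx = math.exp(-(k - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)
print(pmf[k], approx)
```

For these inputs the exact value P(S = 2) and its normal approximation differ by under 0.02; the paper's results bound such errors explicitly and nonuniformly in k.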
4. Mean convergence theorems for arrays of dependent random variables with applications to dependent bootstrap and non-homogeneous Markov chains.
- Author
-
Vǎn Thành, Lê
- Subjects
MARKOV processes ,RANDOM variables ,DEPENDENT variables ,LAW of large numbers ,MATHEMATICS - Abstract
This paper provides sets of sufficient conditions for mean convergence theorems for arrays of dependent random variables. We expand and improve a number of particular cases in the literature including Theorem 2.1 in Sung (Appl Math Lett 26(1):18–24, 2013), Theorems 3.1–3.3 in Wu and Guan (J Math Anal Appl 377(2):613–623, 2011), and Theorem 3 in Lita da Silva (Results Math 74(1):1–11, 2019), among others. The proof is different from those in the aforementioned papers and the main results can be applied to obtain mean convergence results for arrays of functions of non-homogeneous Markov chains and dependent bootstrap. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
5. Inference and expected total test time for step-stress life test in the presence of complementary risks and incomplete data.
- Author
-
Tian, Yajie and Gui, Wenhao
- Subjects
ACCELERATED life testing ,MISSING data (Statistics) ,CENSORING (Statistics) ,RANDOM variables ,ENGINEERING mathematics ,STATISTICAL models - Abstract
Complementary risks are common and important in engineering. However, they have received relatively little research attention because their derivation is more complex than that of the competing risk model. In this paper, we concentrate on inference for the step-stress partially accelerated life test in the presence of complementary risks under a progressive type-II censoring scheme. The Weibull distribution is chosen as the baseline lifetime of the model. The tampered random variable model is adopted as the statistical acceleration model in the accelerated test. We apply both classical and Bayesian methods to obtain estimates of the lifetime parameters and acceleration factors. The reliability and reversed hazard rate are estimated based on the parametric estimates. Computational formulae for the expected total test time are derived under the step-stress and censored setting. The theoretical calculations are compared with simulated values to verify the derivation. Also, numerical studies, including a simulation study and a real-data analysis with an engineering background, are conducted to compare and illustrate the performance of the approaches proposed in the paper. Some conclusions and suggestions for actual production are given at the end of the paper. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
6. Refined probabilistic response and seismic reliability evaluation of high-rise reinforced concrete structures via physically driven dimension-reduced probability density evolution equation.
- Author
-
Lyu, Meng-Ze, Chen, Jian-Bing, and Shen, Jia-Xu
- Subjects
SEISMIC response ,REINFORCED concrete ,EVOLUTION equations ,MONTE Carlo method ,GROUND motion ,RANDOM variables ,PROBABILISTIC number theory - Abstract
Dynamic reliability evaluation of large-scale reinforced concrete (RC) structures is one of the most challenging problems in engineering practice. Although extensive endeavors have been devoted to the mechanical analysis of concrete structures in the past decades, it has been recognized that the randomness in both structural parameters and excitations has significant effects on the dynamic behaviors of structures with complex nonlinearity, damage, energy dissipation, and plasticity. Thus, great difficulty exists in evaluating the nonlinear stochastic responses and dynamic reliability of real-world complex structures with large numbers of degrees of freedom. In the present paper, a physically driven method for refined probabilistic response and seismic reliability evaluation of real-world RC structures is proposed via a synthesis of refined mechanical analysis and physically based uncertainty propagation. In this method, the material parameters can be treated as probabilistically dependent random variables characterized by vine copulas, and the ground motion is modeled by a non-stationary Clough–Penzien spectrum. The uncertainty propagation of arbitrary response quantities of interest is governed by the dimension-reduced probability density evolution equation (DR-PDEE). The intrinsic drift coefficients in the DR-PDEE act as the physical driving force for the uncertainty propagation and can be identified from data generated by representative dynamic analyses of the structure. The time-variant reliability of the structures can be captured by solving the physically driven DR-PDEE, which cannot be achieved by general Monte Carlo simulation (MCS) due to the prohibitively large computational cost. Finally, a practical engineering application is shown in this paper for the probabilistic response and seismic reliability evaluation of a 24-story RC shear wall structure with nearly 280,000 degrees of freedom (DOFs). [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
7. On the asymptotic behaviour of the joint distribution of the maxima and minima of observations, when the sample size is a random variable.
- Author
-
Vasudeva, R.
- Subjects
RANDOM variables ,SAMPLE size (Statistics) ,STATISTICAL sampling - Abstract
In this paper, we obtain the asymptotic form of the joint distribution of the maxima and minima of independent observations, when the sample size is a random variable. We also discuss the asymptotic distribution of the range. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
8. Connection between higher order measures of risk and stochastic dominance.
- Author
-
Pichler, Alois
- Subjects
STOCHASTIC orders ,RANDOM variables - Abstract
Higher order risk measures are stochastic optimization problems by design, and for this reason they enjoy valuable properties in optimization under uncertainty. They integrate nicely with stochastic optimization problems, as has been observed in the intriguing concept of the risk quadrangle, for example. Stochastic dominance is a binary relation on random variables used to compare random outcomes. It is demonstrated that the concepts of higher order risk measures and stochastic dominance are equivalent: each can be employed to characterize the other. The paper explores these relations and connects stochastic orders, higher order risk measures and the risk quadrangle. Expectiles are employed to exemplify the relations obtained. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
9. Complete convergence for weighted sums of random variables satisfying generalized Rosenthal-type inequalities*.
- Author
-
Liu, Chang and Miao, Yu
- Subjects
REGRESSION analysis ,RANDOM variables - Abstract
In the paper, we establish the complete convergence for weighted sums of random variables satisfying generalized Rosenthal-type inequalities. Our results partially extend some known results and weaken their conditions. As statistical applications, we study the nonparametric regression model and obtain the complete consistency of the weighted regression estimator for the unknown regression functions. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
10. Z4Z4Z4-additive cyclic codes are asymptotically good.
- Author
-
Dinh, Hai Q., Yadav, Bhanu Pratap, Pathak, Sachin, Prasad, Abhyendra, Upadhyay, Ashish Kumar, and Yamaka, Woraphon
- Subjects
CYCLIC codes ,REAL numbers ,RANDOM variables - Abstract
In this paper, we construct a class of Z4Z4Z4-additive cyclic codes generated by 3-tuples of polynomials. We discuss their algebraic structure and show that generator matrices can be constructed for all codes in this class. We study the asymptotic properties of this class of codes by using a Bernoulli random variable. Moreover, for a real number 0 < δ < 1 such that the entropy satisfies h_4((k + l + t)δ/6) < 1/4, we show that the relative minimum distance converges to δ and the rate of the random codes converges to 1/(k + l + t), where k, l, and t are pairwise co-prime positive odd integers. Finally, we conclude that Z4Z4Z4-additive cyclic codes are asymptotically good. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
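The entropy condition h_4((k + l + t)δ/6) < 1/4 quoted above uses the q-ary entropy function, which is straightforward to evaluate. The values of k, l, t, and δ below are hypothetical and serve only to show the condition being checked:

```python
import math

def h_q(x, q=4):
    """q-ary entropy function: h_q(x) = x log_q(q-1) - x log_q x - (1-x) log_q(1-x)."""
    if x == 0:
        return 0.0
    if x == 1:
        return math.log(q - 1, q)
    return (x * math.log(q - 1, q)
            - x * math.log(x, q)
            - (1 - x) * math.log(1 - x, q))

# hypothetical parameters: pairwise co-prime odd k, l, t and a candidate delta
k, l, t = 3, 5, 7
delta = 0.01
ok = h_q((k + l + t) * delta / 6) < 0.25
print(ok)
```

Note that h_q attains its maximum value 1 at x = (q-1)/q, so the condition h_4(·) < 1/4 restricts δ to a fairly small range for given k, l, t.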
11. Importance Weighting in Hybrid Iterative Ensemble Smoothers for Data Assimilation.
- Author
-
Ba, Yuming and Oliver, Dean S.
- Subjects
KALMAN filtering ,TWO-phase flow ,POROUS materials ,COMPLEX variables ,RANDOM variables ,PERMEABILITY - Abstract
Because it is generally impossible to completely characterize the uncertainty in complex model variables after assimilation of data, it is common to approximate the uncertainty by sampling from approximations of the posterior distribution for model variables. When minimization methods are used for the sampling, the weights on each of the samples depend on the magnitude of the data mismatch at the critical points and on the Jacobian of the transformation from the prior density to the sample proposal density. For standard iterative ensemble smoothers, the Jacobian is identical for all samples, and the weights depend only on the data mismatch. In this paper, a hybrid data assimilation method is proposed which makes it possible for each ensemble member to have a distinct Jacobian and for the approximation to the posterior density to be multimodal. For the proposed hybrid iterative ensemble smoother, it is necessary that a part of the mapping from the prior Gaussian random variable to the data be analytic. Examples might include analytic transformation from a latent Gaussian random variable to permeability followed by a black-box transformation from permeability to state variables in porous media flow, or a Gaussian hierarchical model for variables followed by a similar black-box transformation from permeability to state variables. In this paper, the application of weighting to both hybrid and standard iterative ensemble smoothers is investigated using a two-dimensional, two-phase flow problem in porous media with various degrees of nonlinearity. As expected, the weights in a standard iterative ensemble smoother become degenerate for problems with large amounts of data. In the examples, however, the weights for the hybrid iterative ensemble smoother were useful for improving forecast reliability. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
12. Enhancing ecological uncertainty predictions in pollution control games through dynamic Bayesian updating.
- Author
-
Zhou, Jiangjing, Petrosian, Ovanes, and Gao, Hongwei
- Subjects
NATURAL disasters ,BIOINDICATORS ,RANDOM variables ,FORECASTING ,GAMES - Abstract
This study presents a dynamic Bayesian game model designed to improve predictions of the ecological uncertainties leading to natural disasters. It incorporates historical signal data on ecological indicators. Participants, acting as decision-makers, receive signals about an unknown parameter: observations of a random variable's realized values before a specific time, offering insights into ecological uncertainties. The essence of the model lies in its dynamic Bayesian updating, where beliefs about unknown parameters are refined with each new signal, enhancing predictive accuracy. The main focus of our paper is to validate this approach theoretically, by presenting a number of theorems that prove its precision and efficiency in improving uncertainty estimates. Simulation results validate the model's effectiveness in various scenarios, highlighting its role in refining natural disaster forecasts. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
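The dynamic Bayesian updating described above can be illustrated with the simplest conjugate case: a Beta belief about an unknown Bernoulli signal probability, refined one signal at a time. This is a minimal sketch of belief refinement, not the paper's game-theoretic model:

```python
def update_beta(alpha, beta, signal):
    """One Bayesian update of a Beta(alpha, beta) belief about an unknown
    Bernoulli parameter after observing a 0/1 signal."""
    return (alpha + signal, beta + 1 - signal)

# belief about e.g. an annual disaster probability, refined by historical signals
alpha, beta = 1.0, 1.0          # uniform prior
signals = [0, 0, 1, 0, 0, 0, 1, 0]  # invented historical signal data
for s in signals:
    alpha, beta = update_beta(alpha, beta, s)

posterior_mean = alpha / (alpha + beta)
print(posterior_mean)  # (1 + 2) / (2 + 8) = 0.3
```

Each signal shifts the posterior; as more signals arrive, the belief concentrates around the true parameter, which is the mechanism behind the improved predictive accuracy claimed above.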
13. Sampling large hyperplane-truncated multivariate normal distributions.
- Author
-
Maatouk, Hassan, Rullière, Didier, and Bay, Xavier
- Subjects
GAUSSIAN distribution ,RANDOM variables ,GAUSSIAN processes ,RANDOM matrices ,STATIONARY processes ,BLOCK codes - Abstract
Generating multivariate normal distributions is widely used in various fields, including engineering, statistics, finance and machine learning. In this paper, simulating large multivariate normal distributions truncated on the intersection of a set of hyperplanes is investigated. Specifically, the proposed methodology focuses on cases where the prior multivariate normal is extracted from a stationary Gaussian process (GP). It is based on combining Karhunen–Loève expansions (KLE) and Matheron's update rules (MUR). The KLE requires computing the decomposition of the covariance matrix of the random variables, which can become expensive when the random vector is large. To address this issue, the input domain is split into smaller subdomains on which the eigendecomposition can be computed. Due to the stationarity property, only the eigendecomposition of the first subdomain is required. Through this strategy, the computational complexity is drastically reduced. The mean-square truncation and block errors have been calculated. The efficiency of the proposed approach has been demonstrated through both synthetic and real data studies. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
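Matheron's update rule, one of the two ingredients above, can be shown in a few lines: draw unconditional Gaussian samples and correct them onto the hyperplane A y = b. The covariance, constraint, and dimensions below are illustrative, and the sketch omits the paper's KLE and domain-splitting machinery:

```python
import numpy as np

rng = np.random.default_rng(1)

def sample_hyperplane_truncated(C, A, b, n):
    """Matheron-style update: draw y ~ N(0, C), then correct it so that A y = b.

    Small illustrative dimensions; the paper's method adds Karhunen-Loeve
    truncation and subdomain splitting to scale this up.
    """
    d = C.shape[0]
    L = np.linalg.cholesky(C)
    Y = L @ rng.standard_normal((d, n))            # unconditional samples
    K = C @ A.T @ np.linalg.inv(A @ C @ A.T)       # update gain
    return Y + K @ (b[:, None] - A @ Y)            # conditioned samples

d = 5
# squared-exponential covariance on a regular grid (stationary GP prior)
C = np.exp(-0.5 * (np.subtract.outer(np.arange(d), np.arange(d)) / 2.0) ** 2)
A = np.ones((1, d))       # constraint: components sum to a fixed value
b = np.array([2.0])
Y = sample_hyperplane_truncated(C, A, b, 1000)
print(np.allclose(A @ Y, 2.0))  # every sample lies on the hyperplane
```

The correction is exact by construction: A(y + K(b - Ay)) = Ay + (b - Ay) = b, so every returned sample satisfies the hyperplane constraint while retaining the correct conditional covariance.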
14. An ensemble algorithm based on adaptive chaotic quantum-behaved particle swarm optimization with weibull distribution and hunger games search and its financial application in parameter identification.
- Author
-
Ye, Hanqiu and Dong, Jianping
- Subjects
PARTICLE swarm optimization ,METAHEURISTIC algorithms ,PARAMETER identification ,WEIBULL distribution ,RANDOM numbers ,IMAGE encryption ,RANDOM variables - Abstract
Quantum-behaved Particle Swarm Optimization (QPSO) is a meta-heuristic optimization algorithm that is widely used in many research fields and practical problems due to its flexibility and low computational cost. However, existing QPSO algorithms and their variants still suffer from insufficient search capability, lack of adaptivity, and proneness to stagnation. This paper proposes a novel ensemble algorithm, ACQPSOW-HGS, based on QPSO and Hunger Games Search (HGS). By combining three improvements and introducing three hybrid strategies, the algorithm achieves comprehensive gains, effectively improving stability and solution accuracy on a large number of test functions and in a parameter identification application, outperforming many existing algorithms. First, we design a Weibull-distribution random number generation operator, a distance-guided adaptive control technique, and a chaotic update mechanism to address the weak randomness, insufficient adaptability, and susceptibility to stagnation of the original QPSO algorithm, respectively. Integrating these improvements, ACQPSOW is proposed as an improved variant of QPSO. Second, the proposed ensemble algorithm ACQPSOW-HGS is built on ACQPSOW and HGS, combined with specific hybrid strategies that add population diversity and improve search efficiency: the Selection-Crossover-Mutation mechanism, the elite local search mechanism, and the information exchange mechanism. Finally, experiments on 23 benchmark functions and the IEEE CEC 2017 test suite demonstrate, through non-parametric statistical tests, that ACQPSOW-HGS outperforms comparison algorithms in terms of convergence speed and solution accuracy. Moreover, ACQPSOW-HGS was applied to parameter identification for a fractional-order hyper-chaotic financial system to illustrate its applicability and robustness in solving real-world problems. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
15. Anticipative information in a Brownian−Poisson market.
- Author
-
D'Auria, Bernardo and Salmeron, Jose A.
- Subjects
WIENER processes ,MALLIAVIN calculus ,RANDOM variables ,EXPECTED utility - Abstract
Anticipative information refers to information about future events that may be disclosed in advance. This information may concern, for example, financial assets and their future trends. In our paper, we assume the existence of some anticipative information in a market whose risky asset dynamics evolve according to a Brownian motion and a Poisson process. Using Malliavin calculus and filtration enlargement techniques, we derive the information drift of the mentioned processes and, both in the pure jump case and in the mixed one, we compute the additional expected logarithmic utility. Many examples are shown where the anticipative information is related to conditions that the constituent processes or their running maximum may satisfy; in particular, we show new examples involving Bernoulli random variables. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
16. The Consistency of LSE Estimators in Partial Linear Regression Models under Mixing Random Errors.
- Author
-
Yao, Yun Bao, Lü, Yu Tan, Lu, Chao, Wang, Wei, and Wang, Xue Jun
- Subjects
REGRESSION analysis ,LEAST squares ,RANDOM variables ,DATA analysis ,COMPUTER simulation - Abstract
In this paper, we consider the partial linear regression model y_i = x_i β* + g(t_i) + ε_i, i = 1, 2, ..., n, where (x_i, t_i) are known fixed design points, g(·) is an unknown function, and β* is an unknown parameter to be estimated; the random errors ε_i are (α, β)-mixing random variables. The p-th (p > 1) mean consistency, strong consistency and complete consistency for least squares estimators of β* and g(·) are investigated under some mild conditions. In addition, a numerical simulation is carried out to study the finite sample performance of the theoretical results. Finally, a real data analysis is provided to further verify the effect of the model. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
17. The estimated causal effect on the variance based on the front-door criterion in Gaussian linear structural equation models: an unbiased estimator with the exact variance.
- Author
-
Kuroki, Manabu and Tezuka, Taiki
- Subjects
STRUCTURAL equation modeling ,LINEAR equations ,RANDOM variables ,RANDOM sets ,UNBIASED estimation (Statistics) - Abstract
In this paper, we assume that cause–effect relationships between random variables can be represented by a Gaussian linear structural equation model and the corresponding directed acyclic graph. Then, we consider a situation where a set of random variables that satisfies the front-door criterion is observed to estimate a total effect. In this situation, when the ordinary least squares method is utilized to estimate the total effect, we formulate the unbiased estimator of the causal effect on the variance of the outcome variable. In addition, we provide the exact variance formula of the proposed unbiased estimator. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
18. On approximation and estimation of distribution function of sum of independent random variables.
- Author
-
Midhu, N. N., Dewan, Isha, Sudheesh, K. K., and Sreedevi, E. P.
- Subjects
DISTRIBUTION (Probability theory) ,RANDOM variables ,INDEPENDENT variables ,MONTE Carlo method ,APPROXIMATION error - Abstract
In this paper, we obtain an approximation for the distribution function of the sum of two independent random variables using a quantile-based representation. The error of approximation is shown to be negligible under some mild conditions. We then use the approximation to obtain a non-parametric estimator for the distribution function of the sum of two independent random variables. The exact distribution of the proposed estimator is derived. The estimator is shown to be strongly consistent and asymptotically normally distributed. Extensive Monte Carlo simulation studies are carried out to evaluate the bias and mean squared error of the estimator and also to assess the approximation error. We also compare the performance of the proposed estimator with other estimators available in the literature. Finally, we illustrate the use of the proposed estimator for estimating the reliability function of a standby redundant system. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
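A classical nonparametric estimator of the distribution function of a sum convolves the two empirical distributions; the paper's quantile-based estimator is a refinement of this idea. A minimal sketch, using exponential samples whose sum has a known Gamma law for comparison:

```python
import numpy as np

rng = np.random.default_rng(2)

def cdf_sum_estimate(x, y, t):
    """Estimate F_{X+Y}(t) for independent X, Y by convolving the two
    empirical distributions: the mean of 1{x_i + y_j <= t} over all pairs.
    (A classical estimator; the paper builds a quantile-based variant.)"""
    return float(np.mean(x[:, None] + y[None, :] <= t))

x = rng.exponential(1.0, 500)
y = rng.exponential(1.0, 500)
# X + Y ~ Gamma(2, 1), so F(2) = 1 - 3*exp(-2) ~ 0.594
est = cdf_sum_estimate(x, y, 2.0)
print(est)
```

Averaging over all n·m pairs, rather than pairing samples one-to-one, uses the independence of X and Y to reduce the variance of the estimate.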
19. On the Baum–Katz theorem for randomly weighted sums of negatively associated random variables with general normalizing sequences and applications in some random design regression models.
- Author
-
Ta Cong, Son, Tran Manh, Cuong, Bui Khanh, Hang, and Le Van, Dung
- Subjects
RANDOM variables ,REGRESSION analysis - Abstract
In this paper, we develop Jajte's technique, which is used in the proof of strong laws of large numbers, to prove complete convergence for randomly weighted sums of negatively associated random variables. Based on a general normalizing function that satisfies some specific conditions, we give some general results on complete convergence for randomly weighted sums of random variables. The Baum–Katz theorem for randomly weighted sums with general normalizing sequences is also presented. Our results have an interesting connection with the theory of regularly varying functions. These results are applied to simple linear regression models as well as nonparametric regression models with random design. Furthermore, simulations to study the numerical performance of the consistency for nearest neighbor weight function estimators in nonparametric regression and least-squares estimators in a simple linear regression with random design are given. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
20. Optimal configuration method of demand-side flexible resources for enhancing renewable energy integration.
- Author
-
Fu, Yu, Bai, Hao, Cai, Yongxiang, Yang, Weichen, and Li, Yue
- Subjects
RENEWABLE energy sources ,ENERGY consumption ,COPULA functions ,RANDOM variables ,MICROGRIDS ,ENERGY storage ,WIND power ,ELECTRIC vehicles - Abstract
Demand-side flexible load resources, such as Electric Vehicles (EVs) and Air Conditioners (ACs), offer significant potential for enhancing flexibility in the power system, thereby promoting the full integration of renewable energy. To this end, this paper proposes an optimal allocation method for demand-side flexible resources to enhance renewable energy consumption. Firstly, the adjustable flexibility of these resources is modeled based on the generalized energy storage model. Secondly, we generate random scenarios for wind, solar, and load, considering variable correlations based on non-parametric probability predictions of random variables combined with Copula function sampling. Next, we establish the optimal allocation model for demand-side flexible resources, considering the simulated operation of these random scenarios. Finally, we optimize the demand-side resource transformation plan year by year based on the growth trend forecast results of renewable energy installed capacity in Jiangsu Province from 2025 to 2031. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
21. Ordering results for the smallest (largest) and the second smallest (second largest) order statistics of dependent and heterogeneous random variables.
- Author
-
Shojaee, Omid, Mohammadi, Seyed Morteza, and Momeni, Reza
- Subjects
ORDER statistics ,RANDOM variables ,STOCHASTIC orders ,ENGINEERING reliability theory ,INDEPENDENT sets - Abstract
In reliability theory, the kth order statistic represents the lifetime of an (n - k + 1)-out-of-n system. In particular, the smallest (largest) order statistic is the lifetime of a series (parallel) system consisting of n components, and the second smallest (second largest) order statistic is the lifetime of an (n - 1)-out-of-n (2-out-of-n) system. In this paper, sufficient conditions are provided for comparing the smallest and the second smallest (largest and second largest) order statistics of dependent and heterogeneous random variables having the additive hazard model with an Archimedean copula, in the sense of the usual stochastic order and the hazard rate order. Further, we compare the smallest order statistics of two sets of independent and heterogeneous random variables having the additive hazard model in the sense of the dispersive order. Finally, our theoretical findings are evaluated through some numerical examples and counterexamples. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
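The correspondence between order statistics and k-out-of-n system lifetimes stated above can be simulated directly. The sketch below uses independent heterogeneous exponential components for simplicity, whereas the paper treats dependence through Archimedean copulas:

```python
import numpy as np

rng = np.random.default_rng(3)

def system_lifetime(component_lifetimes, k):
    """Lifetime of an (n-k+1)-out-of-n system = k-th smallest component lifetime.
    k=1 gives a series system; k=n gives a parallel system."""
    return np.sort(component_lifetimes, axis=-1)[..., k - 1]

# heterogeneous exponential components (illustrative rates, independent here)
rates = np.array([0.5, 1.0, 1.5, 2.0])
T = rng.exponential(1.0 / rates, size=(100_000, 4))
series = system_lifetime(T, 1)     # fails at the first component failure
parallel = system_lifetime(T, 4)   # fails at the last component failure
print(series.mean(), parallel.mean())
```

For independent exponentials the series lifetime is itself exponential with rate equal to the sum of the component rates (here 5, so mean 0.2), which gives a quick sanity check on the simulation.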
22. Stochastic virtual element methods for uncertainty propagation of stochastic linear elasticity.
- Author
-
Zheng, Zhibao and Nackenhorst, Udo
- Subjects
POLYNOMIAL chaos ,ALGEBRAIC equations ,RANDOM variables ,POLYNOMIAL approximation ,ELASTICITY ,RANDOM sets - Abstract
This paper presents stochastic virtual element methods for propagating uncertainty in linear elastic stochastic problems. We first derive stochastic virtual element equations for 2D and 3D linear elastic problems that may involve uncertainties in material properties, external forces, boundary conditions, etc. A stochastic virtual element space that couples the deterministic virtual element space and the stochastic space is constructed for this purpose and used to approximate the unknown stochastic solution. Two numerical frameworks are then developed to solve the derived stochastic virtual element equations, including a Polynomial Chaos approximation based approach and a weakly intrusive approximation based approach. In the Polynomial Chaos based framework, the stochastic solution is approximated using the Polynomial Chaos basis and solved via an augmented deterministic virtual element equation that is generated by applying the stochastic Galerkin procedure to the original stochastic virtual element equation. In the weakly intrusive approximation based framework, the stochastic solution is approximated by a summation of a set of products of random variables and deterministic vectors, where the deterministic vectors are solved via converting the original stochastic problem to deterministic virtual element equations by the stochastic Galerkin approach, and the random variables are solved via converting the original stochastic problem to one-dimensional stochastic algebraic equations by the classical Galerkin procedure. This method successfully avoids the curse of dimensionality in high-dimensional stochastic problems, since all random inputs are embedded into one-dimensional stochastic algebraic equations whose computational effort depends only weakly on the stochastic dimension. Numerical results on 2D and 3D problems with low- and high-dimensional random inputs demonstrate the good performance of the proposed methods. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
23. Comparison of extreme order statistics from two sets of heterogeneous dependent random variables under random shocks.
- Author
-
Amini-Seresht, Ebrahim, Nasiroleslami, Ebrahim, and Balakrishnan, Narayanaswamy
- Subjects
RANDOM variables ,DEPENDENT variables ,STOCHASTIC orders ,ORDER statistics - Abstract
In this paper, we consider two k-out-of-n systems comprising heterogeneous dependent components under random shocks, with an Archimedean copula. We then provide sufficient conditions on the distributions of components' lifetimes and the generator of the Archimedean copula and on the random shocks for comparing the lifetimes of two systems with respect to the usual stochastic order. Finally, we present some examples to illustrate the established results. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
24. On the number of failed components in a series–parallel system upon system failure when the lifetimes are DNID discrete random variables.
- Author
-
Jasiński, Krzysztof
- Subjects
SYSTEM failures ,ENGINEERING reliability theory ,CONDITIONAL probability ,ORDER statistics ,INFORMATION storage & retrieval systems ,RANDOM variables - Abstract
In this paper, we study properties of a series–parallel system. The component lifetimes may be dependent and non-identically distributed (DNID) discrete random variables. We consider the number of failed components upon system failure. We derive the probability mass function and the expected value of this quantity. In addition, we find the conditional probabilities corresponding to this variate given some partial information about the system failure. We also provide a numerical example to demonstrate the theoretical results. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
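The quantity studied above, the number of failed components upon system failure, can be estimated by simulation for a small series–parallel layout. The structure (a series connection of two parallel pairs) and the i.i.d. geometric lifetimes below are illustrative; the paper covers dependent, non-identically distributed discrete lifetimes and derives the distribution analytically:

```python
import numpy as np

rng = np.random.default_rng(4)

def failed_at_system_failure(lifetimes):
    """For a series connection of two parallel pairs (components (0,1) and (2,3)),
    the system fails as soon as either pair has both components failed.  Returns
    the number of components failed at (or before) that moment."""
    pair1 = np.max(lifetimes[:2])   # failure time of the first parallel block
    pair2 = np.max(lifetimes[2:])   # failure time of the second parallel block
    t_sys = min(pair1, pair2)       # series of the two blocks
    return int(np.sum(lifetimes <= t_sys))

# i.i.d. geometric (discrete) lifetimes for illustration only
counts = [failed_at_system_failure(rng.geometric(0.3, 4)) for _ in range(20_000)]
print(np.bincount(counts, minlength=5)[2:] / len(counts))  # P(N=2), P(N=3), P(N=4)
```

At least two components are always down at system failure (the pair that brings the system down), so the support of N here is {2, 3, 4}; with discrete lifetimes, ties make the larger values noticeably likely.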
25. Probabilistic and Fuzzy Nonlinear Discontinuous Aeroelastic Analysis of In-plane FG Panels in Supersonic Flow with Mechanical and Thermal In-plane Loadings.
- Author
-
Hussein, Omar S.
- Subjects
SUPERSONIC flow ,POLYNOMIAL chaos ,PROBABILITY density function ,RANDOM variables ,DISTRIBUTION (Probability theory) ,RANDOM fields - Abstract
This paper is concerned with the uncertain discontinuous nonlinear aeroelastic behavior of in-plane bi-directional functionally graded (FG) metal nanocomposite panels. The panels are subjected to supersonic flow and in-plane mechanical and thermal loadings. This type of FG structure is manufactured using additive manufacturing technologies, which might lead to uncertain properties of the manufactured parts due to manufacturing uncertainties, modeling uncertainties in the mathematical and physical formulations used to predict their properties, or uncertainties in the constituent material properties themselves. These sources of uncertainty might be known with defined probability density functions or defined with uncertain intervals only (fuzzy). Therefore, the mechanical and thermal properties of the nanocomposite material are modeled as uncertain random variables or random fields with a known probability distribution function (pdf), or as uncertain fuzzy variables or fields with given intervals. The random fields are modeled using the Karhunen–Loève expansion (KLE), and the uncertain output variables are modeled using the Hermite polynomial chaos expansion method (HPCE). The effects of the type of material property uncertainty (fuzzy vs. probabilistic), the cross-correlation between the thermal and mechanical properties, and the random field properties (correlation length, stationary vs. non-stationary, etc.) on the dynamic stability thresholds and the nonlinear limit cycle oscillations are studied. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
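The Karhunen–Loève expansion mentioned in the abstract above can be illustrated in a few lines. This is a minimal sketch under stated assumptions, not the paper's model: the squared-exponential covariance kernel, grid size, correlation length and truncation order are all illustrative choices.

```python
import numpy as np

# Hypothetical 1-D Karhunen-Loeve sketch: discretise a squared-exponential
# covariance kernel on a grid, eigendecompose it, and synthesise a random-field
# realisation from a truncated expansion. All parameter values are illustrative.
n, corr_len, n_terms = 200, 0.2, 10
x = np.linspace(0.0, 1.0, n)
C = np.exp(-((x[:, None] - x[None, :]) ** 2) / (2 * corr_len**2))

# Symmetric eigendecomposition; sort eigenpairs by decreasing eigenvalue.
vals, vecs = np.linalg.eigh(C)
order = np.argsort(vals)[::-1]
vals, vecs = vals[order], vecs[:, order]

# Truncated KLE: field(x) = sum_k sqrt(lambda_k) * xi_k * phi_k(x),
# with xi_k independent standard normal coefficients.
rng = np.random.default_rng(0)
xi = rng.standard_normal(n_terms)
field = vecs[:, :n_terms] @ (np.sqrt(vals[:n_terms]) * xi)
print(field.shape)  # one realisation on the grid
```

The leading eigenvalues of a smooth kernel decay rapidly, which is why a short truncation already captures most of the field's variance.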
26. Uncertainty quantification for random domains using periodic random variables.
- Author
-
Hakula, Harri, Harbrecht, Helmut, Kaarnioja, Vesa, Kuo, Frances Y., and Sloan, Ian H.
- Subjects
RANDOM variables ,RANDOM numbers ,RANDOM fields ,FINITE fields ,PERIODIC functions ,ERROR analysis in mathematics ,PREDICATE calculus - Abstract
We consider uncertainty quantification for the Poisson problem subject to domain uncertainty. For the stochastic parameterization of the random domain, we use the model recently introduced by Kaarnioja et al. (SIAM J. Numer. Anal., 2020) in which a countably infinite number of independent random variables enter the random field as periodic functions. We develop lattice quasi-Monte Carlo (QMC) cubature rules for computing the expected value of the solution to the Poisson problem subject to domain uncertainty. These QMC rules can be shown to exhibit higher order cubature convergence rates permitted by the periodic setting independently of the stochastic dimension of the problem. In addition, we present a complete error analysis for the problem by taking into account the approximation errors incurred by truncating the input random field to a finite number of terms and discretizing the spatial domain using finite elements. The paper concludes with numerical experiments demonstrating the theoretical error estimates. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
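The randomly shifted rank-1 lattice rules described in the abstract above have a very compact generic form. This is a hedged sketch of such a rule on a smooth periodic test integrand, not the paper's tailored construction: the generating vector, dimension and point count below are illustrative assumptions.

```python
import numpy as np

# Minimal randomly shifted rank-1 lattice QMC sketch: estimate the expectation
# of a periodic integrand over [0,1)^s from lattice points frac(i*z/n + shift).
def shifted_lattice_mean(f, z, n, shift):
    i = np.arange(n)[:, None]
    pts = np.mod(i * z[None, :] / n + shift[None, :], 1.0)  # lattice points
    return f(pts).mean()

s, n = 3, 1021                 # dimension and (prime) number of points
z = np.array([1, 306, 388])    # hypothetical generating vector
rng = np.random.default_rng(1)
shift = rng.random(s)

# Periodic test integrand with known mean 1 over the unit cube.
f = lambda y: np.prod(1.0 + np.sin(2 * np.pi * y), axis=1)
est = shifted_lattice_mean(f, z, n, shift)
print(est)  # essentially exact here, since f is a low-order trig polynomial
```

For genuinely smooth periodic integrands the error decays at the higher-order rates the abstract refers to; here the integrand is a finite trigonometric polynomial, so the rule integrates it exactly up to rounding.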
27. Generalised score distribution: underdispersed continuation of the beta-binomial distribution.
- Author
-
Ćmiel, Bogdan, Nawała, Jakub, Janowski, Lucjan, and Rusek, Krzysztof
- Subjects
DISTRIBUTION (Probability theory) ,RANDOM variables ,MAXIMUM likelihood statistics ,BINOMIAL distribution ,LATENT variables ,GAUSSIAN distribution - Abstract
Consider a class of discrete probability distributions with a limited support. A typical example of such support is some variant of a Likert scale, with a response mapped to either the {1, 2, ..., 5} or {-3, -2, ..., 2, 3} set. This type of data is common in Multimedia Quality Assessment but can also be found in many other research fields. For modelling such data, a latent variable approach is usually used (e.g., Ordered Probit). In many cases it is convenient or even necessary to avoid the latent variable approach (e.g., when the sample size is too small), which requires a suitable class of discrete distributions. The main idea of this paper is to propose a family of discrete probability distributions with only two parameters that play the same role as the parameters of the normal distribution. We call the new class the Generalised Score Distribution (GSD). The proposed GSD class covers the entire set of possible means and variances, for any fixed and finite support. Furthermore, the GSD class can be treated as an underdispersed continuation of a reparametrized beta-binomial distribution. The GSD class parameters are intuitive and can be easily estimated by the method of moments. We also offer a Maximum Likelihood Estimation (MLE) algorithm for the GSD class and evidence that the class properly describes response distributions coming from 24 Multimedia Quality Assessment experiments. Finally, we show that the GSD class can be represented as a sum of dichotomous zero–one random variables, which points to an interesting interpretation of the class. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
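The method-of-moments idea behind the abstract above can be sketched with the reparametrized beta-binomial it extends. This is a hedged illustration, not the GSD itself: the function name and the (mu, rho) parametrisation are assumptions, and the sketch only covers the overdispersed region (rho >= 0), whereas the GSD's contribution is precisely the underdispersed continuation.

```python
import numpy as np

# Method-of-moments fit of a reparametrised beta-binomial to 5-point Likert
# responses, in terms of a mean parameter mu and a dispersion parameter rho.
def betabinom_moments(responses, levels=5):
    k = np.asarray(responses) - 1                  # shift support to {0, ..., M}
    M = levels - 1
    m, v = k.mean(), k.var()
    mu = m / M                                     # mean parameter in (0, 1)
    rho = (v / (M * mu * (1 - mu)) - 1) / (M - 1)  # dispersion (>= 0: overdispersed)
    return mu, rho

rng = np.random.default_rng(2)
# Overdispersed synthetic sample: a mixture of low and high responses.
sample = np.concatenate([rng.integers(1, 3, 500), rng.integers(4, 6, 500)])
mu_hat, rho_hat = betabinom_moments(sample)
print(mu_hat, rho_hat)
```

A binomial sample on the same support would give rho near 0; rho < 0 (underdispersion) is where the beta-binomial fails and the GSD takes over.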
28. On the distribution of sample scale-free scatter matrices.
- Author
-
Mathai, A. M. and Provost, Serge B.
- Subjects
S-matrix theory ,RANDOM variables ,HYPERGEOMETRIC series ,HYPERGEOMETRIC functions ,GAMMA distributions ,CHI-square distribution - Abstract
This paper addresses certain distributional aspects of a scale-free scatter matrix R stemming from a matrix-variate gamma distribution having a positive definite scale parameter matrix B. Under the assumption that B is a diagonal matrix, a structural representation of the determinant of R is derived; the exact density functions of products and ratios of determinants of matrices possessing such a structure are obtained; a closed form expression is given for the density function of R. Moreover, a novel procedure is utilized to establish that certain functions of the determinant of the sample scatter matrix are asymptotically distributed as chi-square or normal random variables. Then, representations of the density function of R that respectively involve multiple integrals, multiple series and Gauss' hypergeometric function are provided for the general case of a positive definite scale parameter matrix, and an illustrative numerical example is presented. Cutting-edge mathematical techniques have been employed to derive the results. Naturally, they also apply to the conventional sample correlation matrix which is encountered in various multivariate inference contexts. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
29. A new (T-Xθ) family of distributions: properties, discretization and estimation with applications.
- Author
-
Mandouh, Rasha M., Mahmoud, Mahmoud R., and Abdelatty, Rasha E.
- Subjects
MONTE Carlo method ,ORDER statistics ,MAXIMUM likelihood statistics ,RENYI'S entropy ,GENERATING functions ,RANDOM variables - Abstract
In this paper, a new class of distributions called the T-Xθ family of distributions for bounded—(0,1)—and unbounded—(0, ∞)—supported random variables is suggested. Some special sub-models of the proposed family are utilized, and a new sub-model is selected to be studied in detail. The statistical properties of the suggested family, including the quantile function, moments, moment generating function, order statistics and Rényi entropy, are discussed. The maximum likelihood method is provided to estimate the parameters of the distribution, and a Monte Carlo simulation study is used. The discretized T-Xθ family provides many sub-families and sub-models. In addition, eight real data sets are utilized to demonstrate the flexibility of multiple sub-models of the proposed continuous and discrete families. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
30. Analytical model considering varying numbers of RA-RUs to determine how to allocate the RA-RUs in practice.
- Author
-
Zhu, Deqing and Pan, Genmei
- Subjects
NETWORK performance ,RANDOM variables - Abstract
The uplink OFDMA-based random access (UORA) is a key mechanism in the 802.11ax protocol. The advantage of UORA is that nodes can send their frames quickly without requiring resources from the AP; the disadvantage is that UORA can be heavily affected by collisions, resulting in low channel efficiency. In this paper, we focus on constructing an analytical model in which the number of random-access RUs (RA-RUs) is taken as a random variable. We investigate how two key parameters, the number of RA-RUs and the maximum allowed number of transmissions, affect the network performance. Based on this, we explore how these two parameters can be set in practice. The comparison results show that our scheme outperforms conventional 802.11ax and previous work in terms of delay and the packet delivery ratio (PDR). [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
31. A Comparison Between Bayesian and Ordinary Kriging Based on Validation Criteria: Application to Radiological Characterisation.
- Author
-
Wieskotten, Martin, Crozet, Marielle, Iooss, Bertrand, Lacaux, Céline, and Marrel, Amandine
- Subjects
KRIGING ,NUCLEAR facility decommissioning ,NUCLEAR facilities ,RANDOM variables ,TOY stores - Abstract
In decommissioning projects of nuclear facilities, radiological characterisation aims to estimate the quantity and spatial distribution of different radionuclides. To carry out the estimation, measurements are performed on site to obtain preliminary information. The usual industrial practice consists of applying spatial interpolation tools (such as the ordinary kriging method) to these data to predict the value of interest for the contamination (radionuclide concentration, radioactivity, etc.) at unobserved positions. This paper questions the ordinary kriging tool with regard to the well-known problem of overoptimistic prediction variances caused by not taking into account the uncertainty in the estimation of the kriging parameters (variance and range). To overcome this issue, the practical use of the Bayesian kriging method, where the model parameters are considered as random variables, is deepened. The usefulness of Bayesian kriging is demonstrated, whilst comparing its performance to that of ordinary kriging, in the small-data context (which is often the case in decommissioning projects). This result is obtained via several numerical tests on different toy models and using complementary validation criteria: the predictivity coefficient (Q2), the predictive variance adequacy, the α confidence interval plot (and its associated mean squared error α), and the predictive interval adequacy. The latter is a new criterion adapted to Bayesian kriging results. Finally, the same comparison is performed on a real data set coming from the decommissioning project of the CEA Marcoule G3 reactor. It illustrates the practical interest of Bayesian kriging in industrial radiological characterisation. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
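The first validation criterion named in the abstract above, the predictivity coefficient Q2, has a standard form: one minus the ratio of held-out prediction error to total variance, so Q2 close to 1 means accurate predictions. A minimal sketch (the data values are made up for illustration):

```python
import numpy as np

# Predictivity coefficient Q2 on a held-out test set:
# Q2 = 1 - sum((y - yhat)^2) / sum((y - mean(y))^2).
def q2(y_true, y_pred):
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    return 1.0 - np.sum((y_true - y_pred) ** 2) / np.sum((y_true - y_true.mean()) ** 2)

y_obs = np.array([1.0, 2.0, 3.0, 4.0, 5.0])  # illustrative observations
y_hat = np.array([1.1, 1.9, 3.2, 3.8, 5.1])  # illustrative kriging predictions
print(q2(y_obs, y_hat))  # 0.989
```

The paper's other criteria (predictive variance adequacy, α confidence interval plot, predictive interval adequacy) additionally assess the prediction *variances*, which is where Bayesian kriging is argued to improve on ordinary kriging.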
32. Multi-item stochastic inventory model for deteriorating items with power demand pattern under partial backlogging and joint replenishment.
- Author
-
Gupta, Sweety and Mishra, Vinod Kumar
- Subjects
- *
STOCHASTIC models , *RANDOM variables , *SENSITIVITY analysis , *UNITS of time , *INVENTORIES - Abstract
In this paper, we develop a multi-item inventory model with stochastic demand, and the demand is considered as a random variable following a power demand pattern. A cumulative cycle demand is first identified for each product, and then the demand is gradually released to the inventory system by the power pattern within a cycle. The replenishment schedule is predetermined and well-known. Shortages are permitted but partially backlogged, and the rest are taken as lost sales. An effective inventory policy is utilized to reduce the total expected cost per unit of time by acquiring the optimal amount of stock to be stored at the start of the inventory cycle, which may protect against excessive lost sales and also avoid overstocking. This model is well suited to online purchasing and to offline markets where the buyer is assured that the goods will be shipped within a few days rather than delivered at the time of purchase. We evaluated the system with items deteriorating instantaneously, which makes it appropriate to have a larger initial demand with a power pattern index greater than one. Numerical illustrations and a sensitivity analysis provide an efficient tool for drawing managerial insights from the theoretical solutions. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
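The power demand pattern referred to in the abstract above has a simple standard form: if a cycle of length T has total demand D, the cumulative demand by time t is D·(t/T)^(1/n), where n is the pattern index. A hedged sketch (parameter values are illustrative, not from the paper):

```python
import numpy as np

# Power demand pattern: cumulative demand D * (t/T)**(1/n) by time t of a
# cycle of length T with total demand D. A pattern index n > 1 front-loads
# demand at the start of the cycle, matching the abstract's remark.
def cumulative_demand(D, T, t, n):
    return D * (np.asarray(t) / T) ** (1.0 / n)

D_total, T = 100.0, 30.0           # illustrative cycle demand and length
t = np.linspace(0, T, 7)
for n in (0.5, 1.0, 2.0):          # back-loaded, uniform, front-loaded
    print(n, np.round(cumulative_demand(D_total, T, t, n), 1))
```

With n = 2, about 71% of the cycle demand has already occurred by mid-cycle, which is why instantaneous deterioration pairs naturally with n > 1.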
33. Influence of soil property variability on the lateral displacement of liquefiable ground reinforced by granular columns.
- Author
-
Mo, Tengfei, Wu, Qiang, Li, Dian-Qing, and Du, Wenqi
- Subjects
- *
DISTRIBUTION (Probability theory) , *GROUND motion , *RANDOM fields , *RANDOM variables , *PERFORMANCE-based design - Abstract
In this paper, three-dimensional nonlinear dynamic finite-element analyses are conducted to examine the effect of soil property variability on the lateral displacement (D) of liquefiable ground reinforced by granular columns. A suite of 20 ground motions is selected from the NGA-West2 database as input. A soil-granular column ground system containing an intermediate liquefiable layer is modeled in OpenSees. Both the random variable (RV) and random field (RF) methods are adopted to model the variability of the soil property parameters. Dynamic analyses are then conducted to estimate the earthquake-induced deformation of the soil-granular column system. It is found that modeling the variability of soil parameters with the RV method generally increases the geometric mean and standard deviation (σlnD) of D for the soil-granular column system. Enlarging the spatial correlation of soil parameters in the RF model brings a slight increase in the mean D and comparable σlnD values. Hence, incorporating spatially correlated soil property parameters may not necessarily increase the variation of D for the soil-granular column system. Specifically, the statistical distribution of D is more sensitive to the vertical scale of fluctuation than to the horizontal one. The results presented could aid in addressing the variability issue in performance-based design of granular column-reinforced liquefiable ground in engineering applications. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
34. Generalized simulated method-of-moments estimators for multivariate copulas.
- Author
-
Belalia, Mohamed and Quessy, Jean-François
- Subjects
RANDOM variables ,U-statistics ,DATA modeling ,PROBABILITY theory ,INTEGRALS - Abstract
This paper introduces a general semi-parametric method for estimating a vector of parameters in multivariate copula models. The proposed approach uses the moments of the multivariate probability integral random variable to generalize the inversion of Kendall's tau estimator. What makes the new methodology attractive is the fact that it can be performed as soon as one can simulate from the assumed parametric family of copulas. This feature is especially helpful when explicit expressions are not available for the theoretical moments. The consistency and asymptotic normality of the proposed estimators are established under mild conditions. An extensive simulation study indicates that the price to pay for the estimation of the moments is modest and that the new estimators are almost as accurate as the pseudo-maximum likelihood (PML) estimator. The usefulness of the proposed estimators is illustrated on the modelling of multivariate data with copula models where the PML estimator is hardly computable. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
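The estimator described in the abstract above can be caricatured in a few lines: choose the copula parameter whose *simulated* moment (here, Kendall's tau) matches the sample value. This is a hedged sketch, not the paper's multi-moment estimator: the bivariate Clayton copula, its Marshall-Olkin sampler, and the coarse grid search are all illustrative choices.

```python
import numpy as np
from scipy.stats import kendalltau

# Marshall-Olkin sampler for the bivariate Clayton copula:
# V ~ Gamma(1/theta), U_j = (1 + E_j/V)^(-1/theta) with E_j ~ Exp(1).
def rclayton(n, theta, rng):
    v = rng.gamma(1.0 / theta, size=n)
    e = rng.exponential(size=(n, 2))
    return (1.0 + e / v[:, None]) ** (-1.0 / theta)

def simulated_tau(theta, n_sim=20000, seed=0):
    u = rclayton(n_sim, theta, np.random.default_rng(seed))
    return kendalltau(u[:, 0], u[:, 1])[0]

# "Data": a Clayton sample with theta = 2 (its population Kendall tau is 0.5).
data = rclayton(2000, 2.0, np.random.default_rng(42))
tau_hat = kendalltau(data[:, 0], data[:, 1])[0]

# Pick the grid value whose simulated tau is closest to the sample tau.
thetas = np.linspace(0.5, 5.0, 46)
theta_hat = thetas[np.argmin([abs(simulated_tau(t) - tau_hat) for t in thetas])]
print(theta_hat)  # should land near 2
```

The appeal the abstract points to is visible here: the matching step needs only the ability to *simulate* from the family, never a closed-form expression for the moment.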
35. Fluctuating Fading Distributions.
- Author
-
Nawa, Victor and Nadarajah, Saralees
- Subjects
SIGNAL-to-noise ratio ,RANDOM variables ,GAUSSIAN distribution ,INDEPENDENT variables - Abstract
Badarneh and da Costa (IEEE Wirel Commun Lett, 2024. https://doi.org/10.1109/LWC.2024.3353620) introduced the fluctuating fading distribution by taking the signal envelope to be the ratio of two independent random variables, one having the Nakagami m distribution and the other a uniform random variable. The Nakagami m distribution corresponds to signals following the normal distribution, which may not always hold in practice. In this paper, we derive twenty other fluctuating fading distributions. For each distribution, we give explicit expressions for the average channel capacity and the average bit error rate. Their correctness is checked numerically. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
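The ratio construction described in the abstract above is easy to simulate: a Nakagami-m variate is the square root of a scaled gamma variate, divided by an independent uniform. This is a hedged sketch for intuition only; the (m, Ω) values and the unit-interval uniform are illustrative assumptions, not the reference's exact setup.

```python
import numpy as np

# Fluctuating fading envelope as the ratio of an independent Nakagami-m
# variate and a uniform variate. Parameters are illustrative.
def fluctuating_envelope(n, m=2.0, omega=1.0, rng=None):
    rng = rng or np.random.default_rng()
    nakagami = np.sqrt(rng.gamma(shape=m, scale=omega / m, size=n))
    uniform = 1.0 - rng.random(size=n)   # in (0, 1], avoids division by zero
    return nakagami / uniform

rng = np.random.default_rng(3)
env = fluctuating_envelope(100000, rng=rng)
print(np.median(env))  # the mean is not finite here, so summarise by the median
```

The uniform denominator produces a heavy tail (the envelope has no finite mean), which is exactly the "fluctuation" superimposed on the Nakagami-m fading.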
36. Multi-period descriptive sampling for scenario generation applied to the stochastic capacitated lot-sizing problem.
- Author
-
Stadtler, Hartmut and Heinrichs, Nikolai
- Subjects
- *
RANDOM variables , *RANDOM numbers , *DISTRIBUTION (Probability theory) , *LINEAR programming , *INTEGER programming - Abstract
Using scenarios to model a stochastic system's behavior poses a dilemma. While a large(r) set of scenarios usually improves the model's accuracy, it also causes drastic increases in the model's size and the computational effort required. Multi-period descriptive sampling (MPDS) is a new way to generate a small(er) set of scenarios that yield a good fit both to the periods' probability distributions and to the convoluted probability distributions of stochastic variables (e.g., period demands) over time. MPDS uses descriptive sampling to draw a sample of S representative random numbers from a period's known (demand) distribution. Now, to create a set of S representative scenarios, MPDS heuristically combines these random numbers (period demands) period by period so that a good fit is achieved to the convoluted (demand) distributions up to any period in the planning interval. A further contribution of this paper is an (accuracy) improvement heuristic, called fine-tuning, executed once the fix-and-optimize (FO) heuristic to solve a scenario-based mixed integer programming model has been completed. Fine-tuning uses linear programming (LP) with fixed binary variables (e.g., setup decisions) generated by FO and iteratively adapts production quantities so that compliance with given expected service level constraints is reached. The LP is solved with relatively little computational effort, even for large(r) sets of scenarios. We show the advancements possible with MPDS and fine-tuning by solving numerous test instances of the stochastic capacitated lot-sizing problem under a static uncertainty approach. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
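The building block of MPDS, descriptive sampling, can be sketched directly: instead of S i.i.d. draws, take the S representative quantiles F^{-1}((i - 0.5)/S) and shuffle them. The sketch below stops there; the paper's contribution, the heuristic period-by-period pairing that also fits the convoluted demand distributions, is not reproduced, and the normal demand distribution is an illustrative assumption.

```python
import numpy as np
from scipy.stats import norm

# Descriptive sampling: S stratified quantiles of the period's demand
# distribution, randomly permuted to form one value per scenario.
def descriptive_sample(ppf, size, rng):
    u = (np.arange(1, size + 1) - 0.5) / size  # stratified uniforms
    x = ppf(u)
    rng.shuffle(x)
    return x

rng = np.random.default_rng(4)
S = 100
# One demand value per scenario for each of three periods (assumed N(100, 20)).
periods = [descriptive_sample(norm(100, 20).ppf, S, rng) for _ in range(3)]
scenarios = np.column_stack(periods)   # S scenarios x 3 periods
print(scenarios.mean(axis=0))          # each period mean is 100 by construction
```

Because every scenario set contains exactly the same representative quantiles, each period's marginal distribution is matched by construction; what the random shuffles do *not* guarantee is a good fit of the multi-period demand sums, which is the gap MPDS's pairing heuristic closes.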
37. On cumulative residual extropy of coherent and mixed systems.
- Author
-
Chakraborty, Siddhartha and Pradhan, Biswabrata
- Subjects
- *
UNCERTAINTY (Information theory) , *RELIABILITY in engineering , *STOCHASTIC orders , *STOCHASTIC systems , *RANDOM variables - Abstract
In system reliability, coherent systems play a central role, since a system whose structure is not coherent is considered flawed. A system is coherent if all the components are relevant to the system, i.e., the functioning of every component affects the functioning of the system, and the structure of the system is monotone. A monotone structure means that improving any component will not deteriorate the system. A mixed system is a stochastic mixture of coherent systems, and any coherent system is a special case of a mixed system. In reliability engineering, one major problem is to compare various coherent and mixed systems so that the better system can be used to increase overall reliability. Another important problem is to measure the complexity of systems. A highly complex system will naturally have a higher running and maintenance cost associated with it, and it is desirable for a reliability engineer to understand the complexity of systems beforehand. In this paper, we address these two problems from an information-theoretic approach. Extropy is a measure of information that is the dual of the famous Shannon entropy measure. Recently, a new measure related to extropy, called cumulative residual extropy (CREx), was introduced in the literature by Jahanshahi et al. (Probab Eng Inf Sci 1–21, 2019). This measure is based on the survival function of the underlying random variable and has some advantages over the extropy measure. In this work, we analyze the CREx measure for coherent and mixed systems and develop some comparison results among systems. We also obtain some bounds on the CREx of coherent and mixed systems consisting of independent and identically distributed (iid) and dependent and identically distributed (d.i.d.) components. We propose a new divergence measure to calculate the complexity of systems having iid components. We also introduce a new discrimination measure to compare various coherent and mixed systems when pairwise comparison by the usual stochastic order is not possible. Finally, we discuss the analysis of the CREx measure for coherent systems having heterogeneous components. We also provide applications involving redundancy allocation. Various numerical examples are considered for illustrative purposes. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
38. On approximating dependence function and its derivatives.
- Author
-
Tajvidi, Nader
- Subjects
DERIVATIVES (Mathematics) ,DISTRIBUTION (Probability theory) ,EXTREME value theory ,OPTIMIZATION algorithms ,RANDOM variables ,CONVEX functions ,DIFFERENTIABLE functions - Abstract
Bivariate extreme value distributions can be used to model dependence of observations from random variables in extreme levels. There is no finite dimensional parametric family for these distributions, but they can be characterized by a certain one-dimensional function which is known as the Pickands dependence function. In many applications the general approach is to estimate the dependence function with a non-parametric method and then conduct further analysis based on the estimate. Although this approach is flexible in the sense that it does not impose any special structure on the dependence function, its main drawback is that the estimate is not available in a closed form. This paper provides some theoretical results which can be used to find a closed-form approximation for an exact or estimated twice-differentiable dependence function and its derivatives. We demonstrate the methodology by calculating approximations for symmetric and asymmetric logistic dependence functions and their second derivatives. We show that the theory can even be applied to approximating a non-parametric estimate of the dependence function using a convex optimization algorithm. Other discussed applications include a procedure for testing whether an estimate of the dependence function can be assumed to be symmetric, and estimation of the concordance measures of a bivariate extreme value distribution. Finally, an Australian annual maximum temperature dataset is used to illustrate how the theory can be used to build semi-infinite and compact prediction regions. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
39. Multivariate stochastic comparisons of sequential order statistics with non-identical components.
- Author
-
Sahoo, Tanmay, Hazra, Nil Kamal, and Balakrishnan, Narayanaswamy
- Subjects
ORDER statistics ,STOCHASTIC orders ,RANDOM variables ,SYSTEM failures - Abstract
Sequential order statistics (SOS) are useful tools for modeling the lifetimes of systems wherein the failure of a component has a significant impact on the lifetimes of the remaining surviving components. The SOS model is a general model that contains most of the existing models for ordered random variables. In this paper, we consider the SOS model with non-identical components and then discuss various univariate and multivariate stochastic comparison results in both one-and two-sample scenarios. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
40. On a preference relation between random variables related to an investment problem.
- Author
-
Răducan, A.M., Vernic, R., and Zbăganu, G.
- Subjects
- *
RANDOM variables , *STOCHASTIC orders - Abstract
Related to a stochastic investment problem which aims to determine when it is better to first invest a larger amount of money and afterwards a smaller one, in this paper we introduce a new preference relation between random variables. We investigate the link between this new relation and some well-known stochastic order relations, and present some characterization properties illustrated with numerical examples. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
41. Sharp inequalities involving multiplicative chaos sums.
- Author
-
Karagulyan, G.A.
- Subjects
- *
GENERATING functions , *RANDOM variables , *MARTINGALES (Mathematics) - Abstract
The present note is an addition to the author's recent paper [44], concerning general multiplicative systems of random variables. Using some lemmas and the methodology of [13], we obtain a general extremal inequality, with corollaries involving Rademacher chaos sums and their analogues for multiplicative systems. In particular, we prove that a system of functions generated by bounded products of a multiplicative system is a convergence system. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
42. Some practical and theoretical issues related to the quantile estimators.
- Author
-
Dudek, Dagmara and Kuczmaszewska, Anna
- Subjects
RANDOM variables ,COMPARATIVE studies ,QUANTILE regression - Abstract
The paper contains a comparative analysis of the efficiency of different quantile estimators for various distributions. Additionally, we show strong consistency of the different quantile estimators and study the Bahadur representation of each of them when the sample is taken from an NA, φ-, ρ∗-, or ρ-mixing population. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
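The kind of comparison the abstract above describes can be sketched with the standard sample-quantile estimators from the Hyndman-Fan family, as exposed by NumPy's `method` parameter. A hedged illustration (the distribution, sample size and quantile level are arbitrary choices, and these are generic estimators, not necessarily the ones the paper studies):

```python
import numpy as np

# Several standard sample-quantile estimators evaluated on the same sample.
rng = np.random.default_rng(5)
sample = rng.exponential(scale=1.0, size=200)
q = 0.95

methods = ("inverted_cdf", "linear", "median_unbiased", "normal_unbiased")
estimates = {m: np.quantile(sample, q, method=m) for m in methods}
for m, est in estimates.items():
    print(m, round(est, 3))
# For comparison, the true 0.95-quantile of Exp(1) is -ln(0.05), about 2.996.
```

Differences between the estimators are largest in the tails and for small samples, which is exactly the regime where efficiency comparisons of this kind are informative.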
43. Strong consistency of tail value-at-risk estimator and corresponding general results under widely orthant dependent samples.
- Author
-
Zhou, Jinyu, Yan, Jigao, and Cheng, Dongya
- Subjects
RANDOM variables ,LAW of large numbers ,VALUE at risk ,DEPENDENT variables - Abstract
In this paper, strong consistency of the tail value-at-risk (TVaR) estimator under widely orthant dependent (WOD) samples is established, and a numerical simulation is performed to verify the validity of the theoretical results. To reveal the essence of the result, a theoretical discussion of complete convergence and complete moment convergence corresponding to the Baum–Katz law, as well as the Marcinkiewicz–Zygmund type strong law of large numbers (MZSLLN), for maximal weighted sums and maximal product sums of WOD random variables is provided. The results obtained extend the corresponding ones for independent and some dependent random variables. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
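The empirical TVaR estimator whose consistency the abstract above studies has a simple form: VaR is a sample quantile, and TVaR averages the losses at or beyond it. A minimal sketch (the exponential loss model and confidence level are illustrative):

```python
import numpy as np

# Empirical tail value-at-risk: VaR is the sample quantile at the given level,
# TVaR is the mean of the losses at or beyond it.
def tvar(losses, level=0.95):
    losses = np.asarray(losses)
    var = np.quantile(losses, level)
    tail = losses[losses >= var]
    return var, tail.mean()

rng = np.random.default_rng(6)
losses = rng.exponential(scale=1.0, size=100000)
var_hat, tvar_hat = tvar(losses)
print(var_hat, tvar_hat)
# For Exp(1): VaR_0.95 = -ln(0.05) ~ 3.00 and, by memorylessness,
# TVaR_0.95 = VaR_0.95 + 1 ~ 4.00.
```

Strong consistency means these estimates converge almost surely to the true values as the sample grows; the paper's point is that this survives WOD dependence in the sample, not just independence.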
44. Chance-constrained programs with convex underlying functions: a bilevel convex optimization perspective.
- Author
-
Laguel, Yassine, Malick, Jérôme, and van Ackooij, Wim
- Subjects
BILEVEL programming ,CONVEX functions ,PYTHON programming language ,RANDOM variables ,CONSTRAINT satisfaction ,STOCHASTIC programming - Abstract
Chance constraints are a valuable tool for the design of safe decisions in uncertain environments; they are used to model satisfaction of a constraint with a target probability. However, because of possible non-convexity and non-smoothness, optimizing over a chance constrained set is challenging. In this paper, we consider chance constrained programs where the objective function and the constraints are convex with respect to the decision parameter. We establish an exact reformulation of such a problem as a bilevel problem with a convex lower-level. Then we leverage this bilevel formulation to propose a tractable penalty approach, in the setting of finitely supported random variables. The penalized objective is a difference-of-convex function that we minimize with a suitable bundle algorithm. We release an easy-to-use open-source Python toolbox implementing the approach, with a special emphasis on fast computational subroutines. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
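The flavour of a penalty approach for a chance constraint with finitely supported randomness can be shown on a toy one-dimensional problem. This is only a caricature under stated assumptions, not the paper's bilevel reformulation or bundle algorithm: the scenario model, penalty weight and grid search are all made up for illustration.

```python
import numpy as np

# Toy chance-constrained problem: minimise cost x subject to P(xi <= x) >= p,
# with xi given by 500 equiprobable scenarios. The probabilistic constraint is
# replaced by a penalty on the empirical probability shortfall.
rng = np.random.default_rng(7)
xi = rng.normal(10.0, 2.0, size=500)    # finitely supported randomness
p, lam = 0.9, 100.0                     # target probability, penalty weight

grid = np.linspace(xi.min(), xi.max(), 2001)
prob = (xi[None, :] <= grid[:, None]).mean(axis=1)       # empirical P(xi <= x)
penalised = grid + lam * np.maximum(0.0, p - prob)       # cost + penalty
x_star = grid[np.argmin(penalised)]
print(x_star)  # close to the empirical 0.9-quantile of xi
```

For a large enough penalty weight the minimiser sits at the boundary of the feasible set, here the empirical 0.9-quantile; the paper's machinery handles the same trade-off for convex multidimensional decisions, where grid search is hopeless.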
45. A general impossibility theorem on Pareto efficiency and Bayesian incentive compatibility.
- Author
-
Kikuchi, Kazuya and Koriyama, Yukio
- Subjects
- *
SOCIAL choice , *SOCIAL skills , *RANDOM variables , *SOCIAL classes , *SOCIAL problems - Abstract
This paper studies a general class of social choice problems in which agents' payoff functions (or types) are privately observable random variables, and monetary transfers are not available. We consider cardinal social choice functions which may respond to agents' preference intensities as well as preference rankings. We show that a social choice function is ex ante Pareto efficient and Bayesian incentive compatible if and only if it is dictatorial. The result holds for arbitrary numbers of agents and alternatives, and under a fairly weak assumption on the joint distribution of types, which allows for arbitrary correlations and asymmetries. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
46. Private measures, random walks, and synthetic data.
- Author
-
Boedihardjo, March, Strohmer, Thomas, and Vershynin, Roman
- Subjects
- *
RANDOM walks , *DATA privacy , *POLYNOMIAL time algorithms , *INFORMATION-theoretic security , *METRIC spaces , *RANDOM variables , *INDEPENDENT variables - Abstract
Differential privacy is a mathematical concept that provides an information-theoretic security guarantee. While differential privacy has emerged as a de facto standard for guaranteeing privacy in data sharing, the known mechanisms to achieve it come with some serious limitations. Utility guarantees are usually provided only for a fixed, a priori specified set of queries. Moreover, there are no utility guarantees for more complex—but very common—machine learning tasks such as clustering or classification. In this paper we overcome some of these limitations. Working with metric privacy, a powerful generalization of differential privacy, we develop a polynomial-time algorithm that creates a private measure from a data set. This private measure allows us to efficiently construct private synthetic data that are accurate for a wide range of statistical analysis tools. Moreover, we prove an asymptotically sharp min-max result for private measures and synthetic data in general compact metric spaces, for any fixed privacy budget ε bounded away from zero. A key ingredient in our construction is a new superregular random walk, whose joint distribution of steps is as regular as that of independent random variables, yet which deviates from the origin logarithmically slowly. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
47. Wilson Loop Expectations for Non-abelian Finite Gauge Fields Coupled to a Higgs Boson at Low and High Disorder.
- Author
-
Adhikari, Arka
- Subjects
- *
HIGGS bosons , *FINITE fields , *RANDOM variables , *ELECTROWEAK interactions - Abstract
We consider computations of Wilson loop expectations to leading order at large β in the case where a non-abelian finite gauge field interacts with a Higgs boson. By identifying the main order contributions from minimal vortices, we can express the Wilson loop expectations via an explicit Poisson random variable. This paper treats multiple cases of interest, including the Higgs boson at low and high disorder, and finds efficient polymer-expansion-like computations for each of these regimes. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
48. On the Transformation of a Stationary Fuzzy Random Process by a Linear Dynamic System.
- Author
-
Khatskevich, V. L.
- Subjects
- *
STOCHASTIC processes , *STATIONARY processes , *DYNAMICAL systems , *RANDOM variables , *LINEAR systems - Abstract
In this paper, stationary random processes with fuzzy states are studied. The properties of their numerical characteristics—fuzzy expectations, expectations, and covariance functions—are established. The spectral representation of the covariance function, the generalized Wiener–Khinchin theorem, is proved. The main attention is paid to the problem of transforming a stationary fuzzy random process (signal) by a linear dynamic system. Explicit-form relationships are obtained for the fuzzy expectations (and expectations) of input and output stationary fuzzy random processes. An algorithm is developed and justified to calculate the covariance function of a stationary fuzzy random process at the output of a linear dynamic system from the covariance function of a stationary input fuzzy random process. The results rest on the properties of fuzzy random variables and numerical random processes. Triangular fuzzy random processes are considered as examples. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
49. Fast parameterless prototype-based co-clustering.
- Author
-
Battaglia, Elena, Peiretti, Federico, and Pensa, Ruggero G.
- Subjects
COMPLEX matrices ,STATISTICAL association ,RANDOM variables ,DATA analysis ,RECOMMENDER systems - Abstract
Tensor co-clustering algorithms have been proven useful in many application scenarios, such as recommender systems, biological data analysis and the analysis of complex and evolving networks. However, they are significantly affected by wrong parameter configurations, since, at the very least, they require the cluster number to be set for each mode of the matrix/tensor, although they typically have other algorithm-specific hyper-parameters that need to be fine-tuned. Among the few known objective functions that can be optimized without setting these parameters, the Goodman–Kruskal τ — a statistical association measure that estimates the strength of the link between two or more discrete random variables — has proven its effectiveness in complex matrix and tensor co-clustering applications. However, its optimization in a co-clustering setting is tricky and, so far, has led to very slow and, in some specific but not infrequent cases, inaccurate algorithms, due to its normalization term. In this paper, we investigate some interesting mathematical properties of τ, and propose a new simplified objective function able to discover an arbitrary and a priori unspecified number of good-quality co-clusters. Additionally, the new objective function definition allows for a novel prototype-based optimization strategy that enables the fast execution of matrix and higher-order tensor co-clustering. We show experimentally that the new algorithm preserves or even improves the quality of the discovered co-clusters, outperforming state-of-the-art competing approaches while reducing the execution time by at least two orders of magnitude. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
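The objective in the record above is built on the Goodman–Kruskal τ association measure. As a small self-contained illustration (of the measure itself, not of the paper's simplified co-clustering objective), τ(Y|X) is the proportional reduction in prediction error for Y when X is known, computed directly from a contingency table:

```python
import numpy as np

def goodman_kruskal_tau(table):
    """Goodman-Kruskal tau(Y|X) from a contingency table
    (rows index X, columns index Y).

    tau = 0 under independence, tau = 1 for perfect association.
    """
    p = np.asarray(table, dtype=float)
    p = p / p.sum()
    row = p.sum(axis=1)   # marginal distribution of X
    col = p.sum(axis=0)   # marginal distribution of Y
    # Error predicting Y with proportional prediction, ignoring X:
    err_y = 1.0 - np.sum(col ** 2)
    # Error predicting Y given X (skip empty rows):
    nz = row > 0
    err_y_given_x = 1.0 - np.sum((p[nz] ** 2) / row[nz, None])
    return (err_y - err_y_given_x) / err_y

independent = [[10, 10], [10, 10]]   # X carries no information about Y
perfect = [[20, 0], [0, 20]]         # X determines Y exactly
```

The two boundary cases give τ = 0 and τ = 1 respectively. The normalization term `err_y` in the denominator is precisely what the paper identifies as making direct optimization in a co-clustering setting slow, motivating its simplified objective.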
50. Evolutionary support vector regression for monitoring Poisson profiles.
- Author
-
Yeganeh, Ali, Abbasi, Saddam Akber, Shongwe, Sandile Charles, Malela-Majika, Jean-Claude, and Shadman, Ali Reza
- Subjects
POISSON regression, QUALITY control charts, LIKELIHOOD ratio tests, RANDOM variables, ROBUST control, EVOLUTIONARY algorithms - Abstract
Many researchers have shown interest in profile monitoring; however, most applications in this field are developed under the assumption of a normal response variable. Little attention has been given to profile monitoring with non-normal response variables, known as generalized linear models, which consist of two main categories (i.e., logistic and Poisson profiles). This paper addresses the Poisson profile monitoring problem in Phase II and develops a new robust control chart using support vector regression, incorporating some novel input features and an evolutionary training algorithm. The new method detects out-of-control signals more quickly than conventional statistical methods. Moreover, the performance of the proposed scheme is further investigated for Poisson profiles with both fixed and random explanatory variables, as well as non-parametric profiles. The proposed monitoring scheme is shown to be superior to its counterparts, including the likelihood ratio test (LRT), multivariate exponentially weighted moving average (MEWMA), LRT-EWMA and other machine learning-based schemes. The simulation results show the superiority of the proposed method for profiles with fixed explanatory variables and non-parametric models in nearly all situations, while it is not always the best when the explanatory variables are random. A machine learning-based diagnostic method is also used to identify the changed parameters of the profile. The proposed profile diagnosis approach is shown to reach acceptable results in comparison with its competitors. A real-life example of monitoring Poisson profiles is also provided to illustrate the implementation of the proposed charting scheme. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
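The record above monitors Poisson profiles in Phase II. As a much simpler baseline than the paper's SVR-based chart (the function name, coefficients and shift size below are illustrative assumptions, not from the paper), a Pearson-type statistic compares an observed profile against the in-control Poisson regression model mu_i = exp(b0 + b1·x_i); a large value signals a shifted profile, in the spirit of the LRT-type competitors mentioned in the abstract:

```python
import math

def pearson_profile_stat(x, y, b0, b1):
    """Pearson statistic for a Poisson profile y_i ~ Poisson(mu_i)
    with in-control model mu_i = exp(b0 + b1 * x_i).

    Large values indicate departure from the in-control profile.
    """
    stat = 0.0
    for xi, yi in zip(x, y):
        mu = math.exp(b0 + b1 * xi)
        stat += (yi - mu) ** 2 / mu
    return stat

# Illustrative in-control model and a 50% upward mean shift.
x = [0.0, 0.5, 1.0, 1.5, 2.0]
b0, b1 = 1.0, 0.4
in_control = [round(math.exp(b0 + b1 * xi)) for xi in x]
shifted = [round(1.5 * math.exp(b0 + b1 * xi)) for xi in x]

stat_ic = pearson_profile_stat(x, in_control, b0, b1)
stat_oc = pearson_profile_stat(x, shifted, b0, b1)
```

Counts near the in-control means yield a small statistic, while the shifted profile yields a clearly larger one; a control limit on this statistic would then be calibrated to a target in-control average run length. The paper's contribution is replacing such a fixed-form statistic with an evolutionary-trained support vector regression monitor.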