171 results on '"Geometric probabilities -- Research"'
Search Results
2. Long-term values in Markov decision processes and repeated games, and a new distance for probability spaces
- Author
-
Renault, Jerome and Venel, Xavier
- Subjects
Game theory -- Research, Markov processes -- Research, Combinatorial probabilities -- Research, Geometric probabilities -- Research, Mathematical research, Probabilities -- Research, Business, Computers and office automation industries, Mathematics
- Abstract
Abstract. We study long-term Markov decision processes (MDPs) and gambling houses, with applications to any partial observation MDPs with finitely many states and zero-sum repeated games with an informed controller. [...]
- Published
- 2017
- Full Text
- View/download PDF
3. Polynomially computable bounds for the probability of the union of events
- Author
-
Boros, Endre, Scozzari, Andrea, Tardella, Fabio, and Veneziani, Pierangela
- Subjects
Boundary value problems -- Research, Linear programming -- Research, Combinatorial probabilities -- Research, Geometric probabilities -- Research, Mathematical research, Probabilities -- Research, Business, Computers and office automation industries, Mathematics
- Abstract
We consider the problem of finding upper and lower bounds for the probability of the union of events when the probabilities of the single events and the probabilities of the [...]
- Published
- 2014
- Full Text
- View/download PDF
4. Markov chain based predictive (BCI) speller
- Author
-
Wang, Jenny and Spoonamore, Janet
- Subjects
Computer interfaces -- Design and construction, Spelling -- Technology application, Markov processes -- Analysis -- Research, Combinatorial probabilities -- Research, Geometric probabilities -- Research, Probabilities -- Research, IEEE-488 interface, Technology application, Science and technology
- Abstract
Brain-Computer Interfaces (BCI) allow for hands-free communication by correlating visual stimulation with surface electroencephalograph (EEG) signal responses. To overcome inefficiency in the current BCI speller, we created a Markov-chain based predictive method to help enhance letter selection. Based on the observation that the 100 most commonly used English words make up about half of all written material and the 300 most commonly used words make up about 65% of all written material, we analyzed the 100 and 300 most commonly used words and calculated the first-order Markov transition probability of the next alphabetic character given the current letter. Based on the Markov transition probability, we created three optimized visual stimulation patterns for text input: 1) ordered row and column flashing of next possible letters in decreasing probability; 2) ordered row and column flashing of next possible letters after re-arranging the list to the top left corner diagonally; 3) ordered single character flashing of next possible letters in decreasing probability. Our simulation results showed a significant speed-up in text entry over current methods. Keywords: Brain-Computer Interfaces, Markov Analysis, Predictive text input, Visual Speller, INTRODUCTION Brain-Computer Interfaces (BCI) [1], which are based on the correlation of P300 evoked reaction potentials (ERPs) with visual stimuli, have enabled severely disabled people to interact with computers. The [...]
- Published
- 2014
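The transition table at the heart of the speller described above is easy to reproduce. Below is a minimal sketch, not from the article, of computing first-order letter-transition probabilities from a word list and ordering speller flashes by decreasing probability; the tiny word list is illustrative (the authors used the 100 and 300 most common English words).

```python
# Sketch: first-order letter-transition probabilities from a word list,
# the kind of table the abstract describes for ordering speller flashes.
# The tiny word list is illustrative, not the article's 100/300-word lists.
from collections import Counter, defaultdict

words = ["the", "of", "and", "to", "in", "that", "it", "is", "was", "he"]

counts = defaultdict(Counter)
for w in words:
    for cur, nxt in zip(w, w[1:]):
        counts[cur][nxt] += 1

# Normalize counts into conditional probabilities P(next letter | current).
transition = {
    cur: {nxt: c / sum(nxts.values()) for nxt, c in nxts.items()}
    for cur, nxts in counts.items()
}

# Flash order for the speller: next letters sorted by decreasing probability.
def flash_order(current_letter):
    nxts = transition.get(current_letter, {})
    return sorted(nxts, key=nxts.get, reverse=True)

print(flash_order("t"))  # ['h', 'o'] for this toy list
```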
5. Modeling probabilistic traffic breakdown on congested freeway flow
- Author
-
Xu, Tian-dong, Hao, Yuan, Peng, Zhong-ren, and Sun, Li-jun
- Subjects
Traffic estimation -- Methods, Traffic congestion -- Control, Combinatorial probabilities -- Research, Geometric probabilities -- Research, Probabilities -- Research, Engineering and manufacturing industries
- Abstract
The study aims to develop a novel evolution model of speed perturbations for quantifying the probability of traffic flow breakdown on congested freeway flow. The model is applicable to freeway operation and management. Time headway, driver behavior, and state transitions of a stochastic nature are the most important factors for the breakdown and recovery of a disturbance. Two types of drivers, 'adventurers' and 'conservatives,' and two styles of reacting behavior, 'copy' and 'shift,' are defined to characterize driving behavior. The time headway distributions for arriving and discharging traffic are distinguished because of the existence of the 'hysteresis' phenomenon. A Monte Carlo simulation is carried out to estimate the probability of breakdown (PB). The calculated PB is a monotonically increasing function of freeway flow with an 'S' shape. The simulation results indicate that the proposed model can describe the changes in spatial extent and magnitude of a disturbance, which has potential use for future work and field applications. Key words: traffic breakdown, stochastic nature, time headway, driving behavior, car-following theory. Introduction Traffic congestion caused by instability of traffic flow induces travel delays, high risks of accidents, frequent accelerations and decelerations, and an increase of emissions. Once a congestion queue appears, [...]
- Published
- 2013
- Full Text
- View/download PDF
6. Testing the reliability of paper-pencil versions of the fill-in-the-blank and multiple-choice methods of measuring probability discounting for seven different outcomes
- Author
-
Weatherly, Jeffrey N. and Derenne, Adam
- Subjects
Psychological research, Combinatorial probabilities -- Research, Geometric probabilities -- Research, Functional equations -- Research, Functions -- Research, Probabilities -- Research, Psychology and mental health
- Abstract
Probability discounting occurs when the subjective value of an outcome changes because its delivery is uncertain. The present study measured the reliability of rates of probability discounting. Participants completed a probability-discounting task that involved seven different outcomes. Some participants completed the task using the fill-in-the-blank (FITB) method and others using the multiple-choice (MC) method. Participants completed the task a second time either 4 or 12 weeks later. Data were analyzed using hyperbolic and hyperbolic-like functions, and by calculating the area under the discounting curve. The FITB method consistently produced steeper rates of discounting than the MC method. Discounting rates were generally reliable when discounting was analyzed using a hyperbolic function or area under the curve, but not when a hyperbolic-like function was used. Overall, reliability measures were somewhat lower than previously observed for delay discounting. These results suggest that rates of probability discounting are temporally reliable, but that the observed rates of discounting will depend on the type of method used to collect and analyze the discounting data. Key words: probability discounting, reliability, fill-in-the-blank method, multiple-choice method, university students, Discounting occurs when the subjective value of an outcome is altered because its delivery is delayed or uncertain (see Madden & Bickel, 2010, for a review). For instance, if you [...]
- Published
- 2013
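For readers unfamiliar with the two analyses named in the abstract above, the sketch below fits a hyperbolic discounting function and computes area under the discounting curve. The functional form V = A / (1 + h*theta), with theta = (1 - p)/p the odds against delivery, and the trapezoidal AUC follow the standard discounting literature; the data points are invented.

```python
# Sketch of the two analyses named in the abstract: fitting a hyperbolic
# probability-discounting function and computing area under the curve.
# Forms follow the standard discounting literature; the data are made up.
import numpy as np
from scipy.optimize import curve_fit
from scipy.integrate import trapezoid

A = 100.0                                    # nominal outcome amount
p = np.array([0.95, 0.75, 0.5, 0.25, 0.05])  # probability of delivery
v = np.array([92.0, 70.0, 48.0, 24.0, 6.0])  # subjective values (illustrative)

theta = (1.0 - p) / p                        # odds against delivery

def hyperbolic(theta, h):
    return A / (1.0 + h * theta)

(h_hat,), _ = curve_fit(hyperbolic, theta, v, p0=[1.0])

# Area under the discounting curve: trapezoids over normalized odds-against
# vs. normalized value; AUC = 1 means no discounting at all.
order = np.argsort(theta)
x = theta[order] / theta.max()
y = v[order] / A
auc = trapezoid(y, x)

print(f"h = {h_hat:.3f}, AUC = {auc:.3f}")
```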
7. Probability and time trade-off
- Author
-
Baucells, Manel and Heukamp, Franz H.
- Subjects
Management science -- Research, Combinatorial probabilities -- Research, Geometric probabilities -- Research, Functional equations -- Research, Functions -- Research, Probabilities -- Research, Business, general, Business
- Abstract
Probability and time are integral dimensions of virtually any decision. To treat them together, we consider the prospect of receiving outcome x with a probability p at time t. We define risk and time distance, and show that if these two distances are traded off linearly, then preferences are characterized by three functions: a value function, a probability discount rate function, and a psychological distance function. The concavity of the psychological distance function explains the common ratio and common difference effects. A decreasing probability discount rate accounts for the magnitude effect. The discount rate and the risk premium depend on the shape of these three functions. Key words: risk preferences; time preferences; probability discount rate; subendurance; magnitude effect; psychological distance History: Received July 14, 2009; accepted July 19, 2011, by Peter Wakker, decision analysis. Published online in Article in Advance December 2, 2011., 1. Introduction Time and probability are fundamental attributes of virtually any decision. Decisions involving future risky consequences can be found in domains such as investment, saving, consumption, environmental preservation, and [...]
- Published
- 2012
8. Stability criteria for complex ecosystems
- Author
-
Allesina, Stefano and Tang, Si
- Subjects
Ecosystems -- Models -- Research, Biological complexity -- Research -- Models, Combinatorial probabilities -- Research, Geometric probabilities -- Research, Probabilities -- Research, Environmental issues, Science and technology, Zoology and wildlife conservation
- Abstract
Forty years ago, May proved (1,2) that sufficiently large or complex ecological networks have a probability of persisting that is close to zero, contrary to previous expectations (3-5). May analysed large networks in which species interact at random (1,2,6). However, in natural systems pairs of species have well-defined interactions (for example predator-prey, mutualistic or competitive). Here we extend May's results to these relationships and find remarkable differences between predator-prey interactions, which are stabilizing, and mutualistic and competitive interactions, which are destabilizing. We provide analytic stability criteria for all cases. We use the criteria to prove that, counterintuitively, the probability of stability for predator-prey networks decreases when a realistic food web structure is imposed (7,8) or if there is a large preponderance of weak interactions (9,10). Similarly, stability is negatively affected by nestedness (11-14) in bipartite mutualistic networks. These results are found by separating the contribution of network structure and interaction strengths to stability. Stable predator-prey networks can be arbitrarily large and complex, provided that predator-prey pairs are tightly coupled. The stability criteria are widely applicable, because they hold for any system of differential equations., May's theorem deals with community matrices (1,2,6) M, of size S x S, where S is the number of species. M_ij describes the effect that species j has on i [...]
- Published
- 2012
- Full Text
- View/download PDF
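May's random-matrix argument, which the paper above extends, can be reproduced in a few lines: draw a random community matrix, set the diagonal to represent self-regulation, and call the web stable when every eigenvalue has negative real part. A minimal sketch with illustrative parameters follows; May's criterion predicts stability when sigma*sqrt(S*C) < 1.

```python
# Sketch of the random community-matrix experiment behind May's criterion.
# Parameter values are illustrative.
import numpy as np

rng = np.random.default_rng(0)

def is_stable(S=250, C=0.1, sigma=0.05):
    M = np.zeros((S, S))
    mask = rng.random((S, S)) < C          # each pair interacts with prob. C
    np.fill_diagonal(mask, False)
    M[mask] = rng.normal(0.0, sigma, mask.sum())
    np.fill_diagonal(M, -1.0)              # self-regulation on the diagonal
    return np.linalg.eigvals(M).real.max() < 0.0

trials = 100
p_stable = sum(is_stable() for _ in range(trials)) / trials
# May's complexity bound: stability is likely iff sigma * sqrt(S * C) < 1.
print(p_stable, 0.05 * np.sqrt(250 * 0.1))
```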
9. The expected time to attain chemical equilibrium from a thermodynamic probabilistic analysis
- Author
-
Pastore, Christopher and Garfinkle, Moishe
- Subjects
Chemical equilibrium -- Research, Thermodynamics -- Research, Combinatorial probabilities -- Research, Geometric probabilities -- Research, Chemical reactions -- Research, Chemical reaction, Rate of -- Research, Probabilities -- Research, Chemistry
- Abstract
Employing a stochastic model, both Planck and Fokker proposed almost a century ago that stoichiometric chemical reactions proceed by a chain mechanism involving discrete reaction steps. To determine whether such a chain mechanism was in fact a valid mechanism for chemical reactions was the subject of a recent study (Garfinkle, M. 2002. J. Phys. Chem. 106A: 490). Using a thermodynamic-probabilistic algorithm, the stochastic reaction paths were found to be in excellent agreement with the observed reaction paths plotted from experimental data. This study was then extended to test the conclusions of Ehrenfest and Prigogine that a chain mechanism dictates that the number of discrete reaction steps required for a chemical reaction to attain equilibrium must be finite. The stochastic and empirical reaction paths were compared using experimental data for first-, second-, and third-order reactions as well as fractional-order reactions. The empirical verification was excellent. Key words: chemical thermodynamics, irreversible thermodynamics, reaction kinetics, mass action, natural path. Introduction Classically, the progress of chemical reactions has been described by the empirical rate equations, first elucidated by Guldberg and Waage (see ref. 1) a century and a half ago. [...]
- Published
- 2012
- Full Text
- View/download PDF
10. Probabilistic version of the Robertson and Wride method for liquefaction evaluation: development and application
- Author
-
Ku, Chih-Sheng, Juang, C. Hsein, Chang, Chi-Wen, and Ching, Jianye
- Subjects
Soil liquefaction -- Measurement, Soil mechanics -- Research, Combinatorial probabilities -- Research, Geometric probabilities -- Research, Probabilities -- Research, Earth sciences
- Abstract
The Robertson and Wride method is the most widely used cone penetration test (CPT)-based method for soil liquefaction evaluation. This method is a deterministic model, which expresses liquefaction potential in terms of factor of safety. On many occasions, there is a need to express the liquefaction potential in terms of liquefaction probability. Although several probabilistic models are available in the literature, there is an advantage to having a probabilistic version of the Robertson and Wride method so that the engineer who prefers to use this method can obtain additional information on liquefaction probability with minimal extra effort. In this paper, a simple model is developed, which links the factor of safety determined by the Robertson and Wride method to the liquefaction probability. The model, referred to as the probabilistic RW model, is developed, and verified, in a mathematically rigorous manner. Simplified equations for assessing the variation of liquefaction probability caused by the uncertainty in input parameters are also developed. Example applications are presented to demonstrate the developed models. Key words: statistical analysis, probability, liquefaction, earthquakes, case histories, in situ testing, cone penetration test. Introduction Earthquake-induced liquefaction of soils may cause ground failure, such as surface settlement, lateral spreading, sand boils, and flow failures, which in turn may cause damage to infrastructure, such as [...]
- Published
- 2012
- Full Text
- View/download PDF
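The kind of mapping developed in the paper above, from factor of safety to liquefaction probability, can be illustrated with the logistic-type form used in earlier Juang-style mapping functions; the form and parameters below are illustrative stand-ins, not the paper's calibrated probabilistic RW model.

```python
# Sketch of a factor-of-safety-to-probability mapping function. The
# logistic-type form and the parameters a, b are illustrative, following
# earlier Juang-style mappings, not the paper's calibrated model.
def liquefaction_probability(fs, a=0.8, b=3.5):
    """P_L as a decreasing function of the factor of safety FS."""
    return 1.0 / (1.0 + (fs / a) ** b)

for fs in (0.8, 1.0, 1.2, 1.5):
    print(f"FS = {fs:.1f}  ->  P_L = {liquefaction_probability(fs):.2f}")
```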
11. Potential for wider application of 3P sampling in forest inventory
- Author
-
West, P.W.
- Subjects
Prediction (Logic) -- Research, Combinatorial probabilities -- Research, Geometric probabilities -- Research, Forest management -- Research, Probabilities -- Research, Earth sciences
- Abstract
Sampling with probability proportional to prediction (3P sampling) is useful where the variable of interest to a forest inventory is costly to measure and where there exists a cheaper-to-measure auxiliary variable, which correlates positively with the variable of interest. Two forms of 3P sampling, termed 'classical' and 'point' 3P sampling, have received some use in forest inventory. However, both have limitations that have restricted their use mainly to estimation of tree stem wood volume for timber sales over small forest areas in North America. A more general form of 3P sampling, termed here 'ordinary' 3P sampling, has been all but ignored to date. It has potential for use in inventory of a broad range of forest attributes, both floral and faunal and both commercial and environmental, across large or small forest areas. Using a common mathematical approach, the present work derives the estimators of the population mean for these three forms of 3P sampling. Their properties are compared with simple random sampling through Monte Carlo simulations based on two example forest populations. The work lays a basis from which 3P sampling might develop further and enjoy wider application in forest inventory than has been the case previously. Introduction Whether managing forests for wood production or for other commercial products or to maintain their biological resources, it is necessary to know how much of those resources the forest [...]
- Published
- 2011
- Full Text
- View/download PDF
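A minimal sketch of 'classical' 3P sampling as generally described in the forest-inventory literature (not the paper's 'ordinary' 3P estimator): each unit carries a cheap prediction x_i, is measured with probability proportional to x_i, and the population total of the expensive variable y is estimated from the measured ratios y_i / x_i. All numbers are illustrative.

```python
# Sketch of 'classical' 3P sampling with illustrative data.
import numpy as np

rng = np.random.default_rng(1)

N = 1000
x = rng.uniform(0.2, 2.0, N)            # cheap predictions (e.g. ocular volume)
y = x * rng.normal(1.0, 0.1, N)         # expensive true values, correlated with x

K = x.max() / 0.05                       # scales inclusion prob. to ~5% sample
selected = rng.uniform(0.0, K, N) < x    # P(select unit i) = x_i / K

# Classical 3P estimator of the population total: sum of predictions times
# the mean measured ratio y_i / x_i.
ratio = y[selected] / x[selected]
total_hat = x.sum() * ratio.mean()

print(f"estimate {total_hat:.1f} vs true {y.sum():.1f}, n = {selected.sum()}")
```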
12. A new goodness-of-fit test for event forecasting and its application to credit defaults
- Author
-
Blochlinger, Andreas and Leippold, Markus
- Subjects
Management science -- Research, Combinatorial probabilities -- Research, Geometric probabilities -- Research, Goodness-of-fit tests -- Research, Probabilities -- Research, Business, general, Business
- Abstract
We develop a new goodness-of-fit test for validating the performance of probability forecasts. Our test statistic is particularly powerful under sparseness and dependence in the observed data. To build our test statistic, we start from a formal definition of calibrated forecasts, which we operationalize by introducing two components. The first component tests the level of the estimated probabilities; the second validates the shape, measuring the differentiation between high and low probability events. After constructing test statistics for both level and shape, we provide a global goodness-of-fit statistic, which is asymptotically χ² distributed. In a simulation exercise, we find that our approach is correctly sized and more powerful than alternative statistics. In particular, our shape statistic is significantly more powerful than the Kolmogorov-Smirnov test. Under independence, our global test has significantly greater power than the popular Hosmer-Lemeshow χ² test. Moreover, even under dependence, our global test remains correctly sized and consistent. As a timely and important empirical application of our method, we study the validation of a forecasting model for credit default events. Key words: out-of-sample validation; probability calibration; Hosmer-Lemeshow statistic; Bernoulli mixture models; credit risk History: Received August 12, 2009; accepted October 22, 2010, by Wei Xiong, finance. Published online in Articles in Advance January 28, 2011., 1. Introduction In conclusion, at present no really powerful tests of adequate calibration are currently available. Due to the correlation effects that have to be respected there even seems to [...]
- Published
- 2011
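As context for the comparison in the abstract above, the sketch below implements the standard Hosmer-Lemeshow chi-square statistic that the paper benchmarks against; this is the baseline, not the authors' new level/shape test, and the simulated forecasts are illustrative.

```python
# Sketch of the standard Hosmer-Lemeshow chi-square goodness-of-fit test.
import numpy as np
from scipy.stats import chi2

def hosmer_lemeshow(p_hat, y, g=10):
    """p_hat: forecast probabilities, y: 0/1 outcomes, g: number of groups."""
    order = np.argsort(p_hat)
    groups = np.array_split(order, g)     # g groups of roughly equal size
    stat = 0.0
    for idx in groups:
        n_k = len(idx)
        pi_k = p_hat[idx].mean()          # mean forecast in the group
        o_k = y[idx].sum()                # observed events in the group
        stat += (o_k - n_k * pi_k) ** 2 / (n_k * pi_k * (1.0 - pi_k))
    return stat, chi2.sf(stat, g - 2)     # asymptotically chi-square, g-2 df

rng = np.random.default_rng(2)
p_hat = rng.uniform(0.01, 0.2, 5000)         # e.g. forecast default probabilities
y = (rng.random(5000) < p_hat).astype(int)   # outcomes drawn from the forecasts

print(hosmer_lemeshow(p_hat, y))             # calibrated: large p-value expected
```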
13. A generalised lottery paradox for infinite probability spaces
- Author
-
Smith, Martin
- Subjects
Combinatorial probabilities -- Research, Geometric probabilities -- Research, Probabilities -- Research, Philosophy and religion, Science and technology
- Published
- 2010
14. A comprehensive procedure to estimate the probability of extreme vibration levels due to mistuning
- Author
-
Chan, Y.-J. and Ewins, D.J.
- Subjects
Vibration -- Research, Monte Carlo method -- Research, Combinatorial probabilities -- Research, Geometric probabilities -- Research, Probabilities -- Research, Blades -- Mechanical properties, Blades -- Acoustic properties, Blades -- Maintenance and repair, Engineering and manufacturing industries, Science and technology
- Abstract
A new procedure is developed to find the probabilities of extremely high amplification factors in mistuned bladed disk vibration levels, typical of events which occur rarely. While a rough estimate can be made by curve-fitting the distribution function generated in a Monte Carlo simulation, the procedure presented here can determine a much more accurate upper bound and the probabilities of amplification factors near to that bound. The procedure comprises an optimization analysis based on the conjugate gradient method and a stochastic simulation using the importance sampling method. Two examples are provided to illustrate the efficiency of the procedure, which can be 2 or 3 orders of magnitude more efficient than Monte Carlo simulations. [DOI: 10.1115/1.4001065]
- Published
- 2010
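The importance-sampling step mentioned in the abstract above can be illustrated generically: estimate a small tail probability by sampling from a density shifted toward the rare region and reweighting each sample by the likelihood ratio. A toy one-dimensional Gaussian below stands in for the mistuned bladed-disk response model; it is a sketch of the technique, not the paper's procedure.

```python
# Sketch of importance sampling for a rare-event probability.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(3)
threshold = 4.5                     # "extreme" level, illustrative
n = 10_000

# Naive Monte Carlo would almost never see P(X > 4.5) ~ 3.4e-6 for X ~ N(0,1).
x = rng.normal(4.5, 1.0, n)         # proposal centred on the rare region
weights = norm.pdf(x) / norm.pdf(x, loc=4.5)   # likelihood ratio p(x) / q(x)
p_hat = np.mean((x > threshold) * weights)

print(f"IS estimate {p_hat:.3e} vs exact {norm.sf(threshold):.3e}")
```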
15. Probabilistic seismic demand models and fragility estimates for reinforced concrete highway bridges with one single-column bent
- Author
-
Huang, Qindan, Gardoni, Paolo, and Hurlebaus, Stefan
- Subjects
Seismology -- Research, Strains and stresses -- Models, Stress relaxation (Materials) -- Models, Stress relieving (Materials) -- Models, Combinatorial probabilities -- Research, Geometric probabilities -- Research, Probabilities -- Research, Bridges -- Mechanical properties, Bridges -- Materials, Bridges -- Testing, Reinforced concrete -- Mechanical properties, Structural analysis (Engineering) -- Methods, Science and technology
- Abstract
In performance-based seismic design, general and practical seismic demand models of structures are essential. This paper proposes a general methodology to construct probabilistic demand models for reinforced concrete (RC) highway bridges with one single-column bent. The developed probabilistic models consider the dependence of the seismic demands on the ground motion characteristics and the prevailing uncertainties, including uncertainties in the structural properties, statistical uncertainties, and model errors. Probabilistic models for seismic deformation, shear, and bivariate deformation-shear demands are developed by adding correction terms to deterministic demand models currently used in practice. The correction terms remove the bias and improve the accuracy of the deterministic models, complement the deterministic models with ground motion intensity measures that are critical for determining the seismic demands, and preserve the simplicity of the deterministic models to facilitate the practical application of the proposed probabilistic models. The demand data used for developing the models are obtained from 60 representative configurations of finite-element models of RC bridges with one single-column bent subjected to a large number of representative seismic ground motions. The ground motions include near-field and ordinary records, and the soil amplification due to different soil characteristics is considered. A Bayesian updating approach and an all possible subset model selection are used to assess the unknown model parameters and select the correction terms. Combined with previously developed capacity models, the proposed seismic demand models can be used to estimate the seismic fragility of RC bridges with one single-column bent. Seismic fragility is defined as the conditional probability that the demand quantity of interest attains or exceeds a specified capacity level for given values of the earthquake intensity measures. As an application, the univariate deformation and shear fragilities and the bivariate deformation-shear fragility are assessed for an example bridge. DOI: 10.1061/(ASCE)EM.1943-7889.0000186 CE Database subject headings: Bayesian analysis; Reinforced concrete; Bridges, highway; Bridges, concrete; Probability; Deformation; Shear failures; Seismic analysis; Experimentation. Author keywords: Bayesian analysis; Reinforced concrete; Bridges; Latin hypercube sampling; Probabilistic models; Deformation failure; Shear failure; Seismic analysis; Experimental design; Fragility.
- Published
- 2010
16. Random field characterization considering statistical dependence for probability analysis and design
- Author
-
Xi, Zhimin, Youn, Byeng D., and Hu, Chao
- Subjects
Combinatorial probabilities -- Research, Geometric probabilities -- Research, Probabilities -- Research, Fields, Algebraic -- Research, Engineering mathematics -- Research, Engineering and manufacturing industries, Science and technology
- Abstract
The proper orthogonal decomposition method has been employed to extract the important field signatures of random field observed in an engineering product or process. Our preliminary study found that the coefficients of the signatures are statistically uncorrelated but may be dependent. To this point, the statistical dependence of the coefficients has been ignored in the random field characterization for probability analysis and design. This paper thus proposes an effective random field characterization method that can account for the statistical dependence among the coefficients for probability analysis and design. The proposed approach has two technical contributions. The first contribution is the development of a natural approximation scheme of random field while preserving prescribed approximation accuracy. The coefficients of the signatures can be modeled as random field variables, and their statistical properties are identified using the chi-square goodness-of-fit test. Then, as the paper's second technical contribution, the Rosenblatt transformation is employed to transform the statistically dependent random field variables into statistically independent random field variables. The number of the transformation sequences exponentially increases as the number of random field variables becomes large. It was found that improper selection of a transformation sequence among many may introduce high nonlinearity into system responses, which may result in inaccuracy in probability analysis and design. Hence, this paper proposes a novel procedure of determining an optimal sequence of the Rosenblatt transformation that introduces the least degree of nonlinearity into the system response. The proposed random field characterization can be integrated with any advanced probability analysis method, such as the eigenvector dimension reduction method or polynomial chaos expansion method. Three structural examples, including a microelectromechanical system bistable mechanism, are used to demonstrate the effectiveness of the proposed approach. The results show that the statistical dependence in the random field characterization cannot be neglected during probability analysis and design. Moreover, it is shown that the proposed random field approach is very accurate and efficient. [DOI: 10.1115/1.4002293] Keywords: random field, proper orthogonal decomposition, probability analysis and design, eigenvector dimension reduction
- Published
- 2010
17. On adding a list of numbers (and other one-dependent determinantal processes)
- Author
-
Borodin, Alexei, Diaconis, Persi, and Fulman, Jason
- Subjects
Numbers -- Research, Combinatorial probabilities -- Research, Geometric probabilities -- Research, Probabilities -- Research, Mathematics
- Abstract
Adding a column of numbers produces 'carries' along the way. We show that random digits produce a pattern of carries with a neat probabilistic description: the carries form a one-dependent determinantal point process. This makes it easy to answer natural questions: How many carries are typical? Where are they located? We show that many further examples, from combinatorics, algebra and group theory, have essentially the same neat formulae, and that any one-dependent point process on the integers is determinantal. The examples give a gentle introduction to the emerging fields of one-dependent and determinantal point processes.
- Published
- 2010
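The carries process studied above is simple to simulate: add n random numbers column by column and record each carry. A sketch follows; for two summands in base 10 the long-run fraction of columns producing a carry is 1/2 (and the carries are one-dependent rather than independent).

```python
# Sketch of the carries process: add n random base-10 numbers column by
# column and record the carry at each position. For n numbers the carry
# takes values 0 .. n-1.
import numpy as np

rng = np.random.default_rng(4)

def carries(n_numbers=2, n_digits=10_000, base=10):
    digits = rng.integers(0, base, size=(n_numbers, n_digits))
    carry, out = 0, []
    for col in digits.T[::-1]:            # least-significant column first
        total = col.sum() + carry
        carry = total // base
        out.append(carry)
    return np.array(out)

c = carries()
# For two summands the long-run carry frequency is 1/2; more generally the
# mean carry when adding n numbers is (n - 1) / 2.
print(c.mean())
```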
18. Stochastic models for large interacting systems and related correlation inequalities
- Author
-
Liggett, Thomas M.
- Subjects
Learning models (Stochastic processes) -- Usage, Combinatorial probabilities -- Research, Geometric probabilities -- Research, Probabilities -- Research, Science and technology
- Abstract
A very large and active part of probability theory is concerned with the formulation and analysis of models for the evolution of large systems arising in the sciences, including physics and biology. These models have in their description randomness in the evolution rules, and interactions among various parts of the system. This article describes some of the main models in this area, as well as some of the major results about their behavior that have been obtained during the past 40 years. An important technique in this area, as well as in related parts of physics, is the use of correlation inequalities. These express positive or negative dependence between random quantities related to the model. In some types of models, the underlying dependence is positive, whereas in others it is negative. We give particular attention to these issues, and to applications of these inequalities. Among the applications are central limit theorems that give convergence to a Gaussian distribution. Keywords: contact process | exclusion process | Glauber dynamics | voter models. doi: 10.1073/pnas.1011270107
- Published
- 2010
19. Alterations in choice behavior by manipulations of world model
- Author
-
Green, C.S., Benson, C., Kersten, D., and Schrater, P.
- Subjects
Reinforcement (Psychology) -- Influence, Decision-making -- Methods, Decision-making -- Psychological aspects, Combinatorial probabilities -- Research, Geometric probabilities -- Research, Probabilities -- Research, Algorithms -- Usage, Algorithm, Science and technology
- Abstract
How to compute initially unknown reward values makes up one of the key problems in reinforcement learning theory, with two basic approaches being used. Model-free algorithms rely on the accumulation of substantial amounts of experience to compute the value of actions, whereas in model-based learning, the agent seeks to learn the generative process for outcomes from which the value of actions can be predicted. Here we show that (i) 'probability matching'--a consistent example of suboptimal choice behavior seen in humans--occurs in an optimal Bayesian model-based learner using a max decision rule that is initialized with ecologically plausible, but incorrect beliefs about the generative process for outcomes and (ii) human behavior can be strongly and predictably altered by the presence of cues suggestive of various generative processes, despite statistically identical outcome generation. These results suggest human decision making is rational and model based and not consistent with model-free learning. Keywords: decision making | probability matching | reinforcement learning. www.pnas.org/cgi/doi/10.1073/pnas.1001709107
- Published
- 2010
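The 'probability matching' behavior discussed above is easy to quantify against the max rule: on a two-option task where option A pays off with probability p = 0.7, matching earns p² + (1-p)² = 0.58 per trial versus 0.70 for always choosing A. A sketch with illustrative numbers:

```python
# Sketch contrasting probability matching with the max decision rule on a
# two-option task; option A pays off with probability 0.7, option B with 0.3.
import numpy as np

rng = np.random.default_rng(5)
p_a, trials = 0.7, 100_000
rewards_a = rng.random(trials) < p_a
rewards_b = rng.random(trials) < (1 - p_a)

match_picks_a = rng.random(trials) < p_a             # probability matching
match_reward = np.where(match_picks_a, rewards_a, rewards_b).mean()
max_reward = rewards_a.mean()                        # max rule: always pick A

# Expected: matching earns p**2 + (1-p)**2 = 0.58; maximizing earns 0.70.
print(match_reward, max_reward)
```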
20. Multi-objective optimal design of stationary flat-plate solar collectors under probabilistic uncertainty
- Author
-
Rao, Singiresu S. and Hu, Yi
- Subjects
Mathematical optimization -- Analysis, Solar collectors -- Design and construction, Engineering design -- Methods, Combinatorial probabilities -- Research, Geometric probabilities -- Research, Probabilities -- Research, Engineering and manufacturing industries, Science and technology
- Abstract
The multi-objective optimum design of stationary flat-plate solar collectors under probabilistic uncertainty is considered. The clear day solar beam radiation and diffuse radiation at the location of the solar collector are estimated. Three objectives are considered in the optimization problem formulation: maximization of the annual average incident solar energy, maximization of the lowest month incident solar energy, and minimization of the cost. The game theory methodology is used for the solution of the three objective constrained optimization problem. A parametric study is conducted with respect to changes in the standard deviation of the mean values of random variables and probability of constraint satisfaction. The present study is expected to help designers in creating optimized solar collectors based on specified requirements. [DOI: 10.1115/1.4002133]
- Published
- 2010
21. The lottery paradox generalized?
- Author
-
Chandler, Jake
- Subjects
Combinatorial probabilities -- Research, Geometric probabilities -- Research, Probabilities -- Research, Philosophy and religion, Science and technology
- Published
- 2010
22. Probabilistic approach to estimation of urban storm-water TMDLS: regulated catchment
- Author
-
Kuzin, S.A. and Adams, B.J.
- Subjects
Runoff -- Environmental aspects, Combinatorial probabilities -- Research, Geometric probabilities -- Research, Probabilities -- Research, Best practices -- Management, Watersheds -- Management, Company business management, Engineering and manufacturing industries, Science and technology
- Abstract
The present work is concerned with the development of a set of tools for the incorporation of various control measures--best management practices into an analytical probabilistic modeling approach for urban storm-water total maximum daily load (TMDL) estimation. Control measures are divided into two major groups--upstream and downstream, each requiring application of separate modeling principles elaborated in this paper. Applying Monte Carlo simulation to the developed set of expressions allows modeling the 'end-of-pipe' parameters of urban storm-water discharges (runoff volume, discharge rate, and pollutant load) on an event average basis, as well as the stream parameters downstream of a storm-water discharge outlet. Model application is illustrated for a catchment regulated with an extended detention dry pond. Representative model results are presented, and a range of potential model applications is discussed. The capability to model the behavior of an urban storm-water system with the application of various control measures is the key precondition for the design of an optimal configuration of a water-protective strategy. DOI: 10.1061/(ASCE)IR.1943-4774.0000156 CE Database subject headings: Stormwater management; Surface water; Probability; Monte Carlo method; Catchments; Urban areas. Author keywords: Storm-water management; Surface water protection; Probabilistic models; Monte Carlo simulation.
- Published
- 2010
23. Probabilistic treatment of crack nucleation and growth for gas turbine engine materials
- Author
-
Enright, M.P., McClung, R.C., Hudak, S.J., and Francis, W.L.
- Subjects
Gas-turbines -- Materials, Gas-turbines -- Maintenance and repair, Combinatorial probabilities -- Research, Geometric probabilities -- Research, Probabilities -- Research, Fatigue testing machines -- Control, Materials -- Fatigue, Materials -- Control, Engineering and manufacturing industries, Science and technology
- Abstract
The empirical models commonly used for probabilistic life prediction do not provide adequate treatment of the physical parameters that characterize fatigue damage development. For these models, probabilistic treatment is limited to statistical analysis of strain-life regression fit parameters. In this paper, a model is proposed for life prediction that is based on separate nucleation and growth phases of total fatigue life. The model was calibrated using existing smooth specimen strain-life data, and it has been validated for other geometries. Crack nucleation scatter is estimated based on the variability associated with smooth specimen and fatigue crack growth data, including the influences of correlation among crack nucleation and growth phases. The influences of crack nucleation and growth variability on life and probability of fracture are illustrated for a representative gas turbine engine disk geometry. [DOI: 10.1115/1.4000289]
- Published
- 2010
24. Brief communication: a probabilistic approach to age estimation from infracranial sequences of maturation
- Author
-
Coqueugniot, Helene, Weaver, Timothy D., and Houet, Francis
- Subjects
Combinatorial probabilities -- Research, Geometric probabilities -- Research, Probabilities -- Research, Age -- Analysis, Craniotomy -- Research, Anthropology/archeology/folklore
- Abstract
Infracranial sequences of maturation are commonly used to estimate the age at death of nonadult specimens found in archaeological, paleoanthropological, or forensic contexts. Typically, an age assessment is made by comparing the degree of long-bone epiphyseal fusion in the target specimen to the age ranges for different stages of fusion in a reference skeletal collection. While useful as a first approximation, this approach has a number of shortcomings, including the potential for 'age mimicry,' being highly dependent on the sample size of the reference sample and outliers, not using the entire fusion distribution, and lacking a straightforward quantitative way of combining age estimates from multiple sites of fusion. Here we present an alternative probabilistic approach based on data collected on 137 individuals, ranging in age from 7 to 29 years old, from a documented skeletal collection from Coimbra, Portugal. We then use cross validation to evaluate the accuracy of age estimation from epiphyseal fusion. While point estimates of age can, at least in some circumstances, be both accurate and precise based on the entire skeleton, or many sites of fusion, there will often be substantial error in these estimates when they derive from one or only a few sites. Because a probabilistic approach to age estimation from epiphyseal fusion is computationally intensive, we make available a series of spreadsheets or computer programs that implement the approach presented here. Am J Phys Anthropol 142:655-664, 2010. KEY WORDS Bayesian statistics; osteological collections; postcranium; skeletal maturation DOI 10.1002/ajpa.21312
- Published
- 2010
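The probabilistic machinery behind the approach above is ordinary Bayes: a posterior over age proportional to a prior times the product of P(stage | age) across fusion sites. The sketch below uses made-up logistic likelihoods and a flat prior; the paper estimates the likelihoods from the Coimbra reference collection.

```python
# Sketch of a Bayesian age estimate from fusion stages at two sites.
# The likelihood tables are invented; independence across sites is assumed.
import numpy as np

ages = np.arange(7, 30)                      # ages 7..29, as in the paper
prior = np.ones_like(ages, dtype=float)      # flat prior over age
prior /= prior.sum()

def p_fused(age, midpoint):
    """Toy P(site fused | age): rises smoothly around the site's midpoint."""
    return 1.0 / (1.0 + np.exp(-(age - midpoint)))

# Observed: a site with midpoint 14 is fused, a site with midpoint 18 is not.
likelihood = p_fused(ages, 14.0) * (1.0 - p_fused(ages, 18.0))

posterior = prior * likelihood
posterior /= posterior.sum()
print(ages[np.argmax(posterior)], posterior.max())
```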
25. Probabilistic quorum systems in wireless ad hoc networks
- Author
-
Friedman, Roy, Kliot, Gabriel, and Avin, Chen
- Subjects
Company business management, Ad hoc networks (Computer networks) -- Management, Quantum computing -- Research, Combinatorial probabilities -- Research, Geometric probabilities -- Research, Probabilities -- Research
- Abstract
Quorums are a basic construct in solving many fundamental distributed computing problems. One of the known ways of making quorums scalable and efficient is by weakening their intersection guarantee to being probabilistic. This article explores several access strategies for implementing probabilistic quorums in ad hoc networks. In particular, we present the first detailed study of asymmetric probabilistic biquorum systems, which allow mixing different access strategies and different quorum sizes, while guaranteeing the desired intersection probability. We show the advantages of asymmetric probabilistic biquorum systems in ad hoc networks. Such an asymmetric construction is also useful for other types of networks with nonuniform access costs (e.g., peer-to-peer networks). The article includes a formal analysis of these approaches backed up by an extensive simulation-based study. The study explores the impact of various parameters such as network size, network density, mobility speed, and churn. In particular, we show that one of the strategies that uses random walks exhibits the smallest communication overhead, thus being very attractive for ad hoc networks. Categories and Subject Descriptors: C.2.1 [Computer-Communication Networks]: Network Architecture and Design--Wireless communication; C.2.4 [Computer-Communication Networks]: Distributed Systems--Distributed applications General Terms: Algorithms, Design Additional Key Words and Phrases: Distributed middleware, location service, quorum systems, random walks, wireless ad hoc networks ACM Reference Format: Friedman, R., Kliot, G., and Avin, C. 2010. Probabilistic quorum systems in wireless ad hoc networks. ACM Trans. Comput. Syst. 28, 3, Article 7 (September 2010), 50 pages. DOI = 10.1145/1841313.1841315 http://doi.acm.org/10.1145/1841313.1841315
- Published
- 2010
26. Dual discharge approach to accessing assimilative capacity: probabilistic analysis and management application
- Author
-
Rucinski, Daniel K., Watkins, David W., Jr., Auer, Martin T., and Effler, Steven W.
- Subjects
Combinatorial probabilities -- Research, Geometric probabilities -- Research, Probabilities -- Research, Water quality -- Research, Eutrophication -- Research, Sewage -- Purification, Sewage -- Methods, Engineering and manufacturing industries, Environmental issues
- Abstract
A dual discharge strategy has been proposed for management of the effluent from the Syracuse Metropolitan Treatment Plant (Metro). The approach involves routing the discharge to the Seneca River when assimilative capacity is available there and to Onondaga Lake when it is not. Application of a deterministic modeling approach has demonstrated that the dual discharge strategy is effective in meeting water-quality standards/goals in both the river [dissolved oxygen (DO)] and the lake [total phosphorus (TP)] under summer average conditions of river flow and upstream boundary condition DO. Here, that analysis is extended to include a probabilistic treatment of the impact of natural variability in river flow and DO boundary conditions on the feasibility of this management option. Model simulations, incorporating these key sources of system variability, indicate that the dual discharge strategy will meet the lake management goal for TP ~94% of the time, with no attendant violation of river DO standards. Excursions from the lake TP goal, occurring ~6% of the time, range from 1 to 5 µg/L and are within the range of uncertainty in indicators applied in identifying trophic status. This novel management option is compared with an in-lake discharge alternative in terms of technical and economic feasibility and public acceptance of resultant water quality. Additional management actions, recommended to accompany implementation of the dual discharge strategy, are discussed. DOI: 10.1061/(ASCE)EE.1943-7870.0000209 CE Database subject headings: Water quality; Eutrophication; Wastewater management; Water treatment; Nonpoint pollution. Author keywords: Water quality; Eutrophication; Wastewater treatment; Nonpoint source pollution.
- Published
- 2010
27. Nearest neighbor probabilistic model for aluminum polycrystals
- Author
-
Grigoriu, Mircea
- Subjects
Crystals -- Properties, Aluminum -- Properties, Aluminum -- Structure, Combinatorial probabilities -- Research, Geometric probabilities -- Research, Probabilities -- Research, Crystals -- Structure, Crystals -- Research, Science and technology
- Abstract
A real-valued random field {Z_ij} with piecewise constant samples and defined on a lattice L in R^2 is developed to characterize two-dimensional metallic polycrystals. The subsets defined by constant values of {Z_ij} are virtual grains and the values of {Z_ij} give Euler angles at the nodes of L. The field {Z_ij} is completely defined by its marginal distribution and conditional probabilities associated with the nearest neighbor model. The defining probabilities of {Z_ij} need to be estimated from measurements of atomic lattice orientation. Random fields {Z_ij} calibrated to the measurements of crystallographic texture in two AA7075 aluminum plates have been used to generate virtual polycrystals. Virtual and actual polycrystals are similar. DOI: 10.1061/(ASCE)EM.1943-7889.0000163 CE Database subject headings: Monte Carlo method; Lattices; Probability; Aluminum; Measurement. Author keywords: Conditional/joint probabilities; Metallic.
- Published
- 2010
28. Estimation of probabilistic extreme wind load effects: combination of aerodynamic and wind climate data
- Author
-
Chen, Xinzhong and Huang, Guoqing
- Subjects
Combinatorial probabilities -- Research, Geometric probabilities -- Research, Probabilities -- Research, Winds -- Properties, Aerodynamics -- Research, Climate models -- Research, Science and technology
- Abstract
A refined full-order method is presented for estimating the extreme wind load effects of rigid structures with given mean recurrence intervals (MRIs) by combining the distributions of annual maximum wind speed and extreme load coefficients. This refined method is capable of dealing with any type of asymptotic extreme value distribution. With this full-order method, the predictions of wind load effects by using distributions of annual maximum wind velocity pressure and wind speed are compared that provide information on the sensitivity of predictions to the upper tail of wind speed distribution. The efficacy of the first-order method is examined. The influences of the type of distributions and the variations of annual maximum wind speed and extreme load coefficient on the predictions are quantified. Finally, the first- and full-order methods are extended to wind load effects of dynamically sensitive structures which facilitate a comprehensive probabilistic analysis as compared to the Monte Carlo simulation schemes used in literature. It is pointed out that 78% fractile extreme load coefficient can be used for defining the characteristic load effects of both rigid and dynamically sensitive structures. The wind load factor is insensitive to the variation of extreme load coefficient. It can be approximately estimated through the wind speed factor and the growth rate of extreme wind load effect with increasing wind speed. The result concerning the wind load factor justifies the advantage of specifying design wind speeds with various MRIs in reducing the uncertainties of design wind loading. DOI: 10.1061/(ASCE)EM.1943-7889.0000118 CE Database subject headings: Wind loads; Uncertainty principles; Reliability; Probability; Aerodynamics. Author keywords: Wind; Wind load; Wind load effect; Uncertainty; Reliability analysis; Probabilistic analysis; Wind load factor; Rigid structures; Dynamically sensitive structures.
- Published
- 2010
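The 'full-order' combination described above can be sketched numerically: with load effect L = c·q(v) and velocity pressure q(v) = 0.5ρv², the CDF of the annual maximum load effect is F_L(l) = ∫ F_C(l/q(v)) f_V(v) dv, which is then inverted at the target mean recurrence interval. All distribution parameters below are illustrative, not the paper's.

```python
# Sketch of combining an annual-max wind speed distribution with an extreme
# load-coefficient distribution to get a design load effect at a given MRI.
import numpy as np
from scipy.stats import gumbel_r
from scipy.optimize import brentq
from scipy.integrate import trapezoid

V = gumbel_r(loc=25.0, scale=3.0)        # annual max wind speed (m/s)
C = gumbel_r(loc=1.0, scale=0.1)         # extreme load coefficient
rho = 1.25                                # air density (kg/m^3)

v = np.linspace(5.0, 60.0, 2000)
fv = V.pdf(v)

def F_L(l):
    # P(L <= l) = integral of F_C(l / q(v)) * f_V(v) dv, q(v) = 0.5*rho*v^2
    return trapezoid(C.cdf(l / (0.5 * rho * v**2)) * fv, v)

MRI = 50.0                                # 50-year mean recurrence interval
l_design = brentq(lambda l: F_L(l) - (1.0 - 1.0 / MRI), 100.0, 10000.0)
print(f"50-year load effect: {l_design:.0f}")
```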
29. Probability density function solution of nonlinear oscillators subjected to multiplicative Poisson pulse excitation on velocity
- Author
-
Zhu, H.T., Er, G.K., Iu, V.P., and Kou, K.P.
- Subjects
Combinatorial probabilities -- Research, Geometric probabilities -- Research, Probabilities -- Research, Oscillators (Electronics) -- Mechanical properties, Oscillators (Electronics) -- Testing, Poisson processes -- Research, Speed -- Research, Dynamic testing -- Methods, Science and technology
- Abstract
The stationary probability density function (PDF) solution of the stochastic responses is derived for nonlinear oscillators subjected to both additive and multiplicative Poisson white noises. The PDF solution is governed by the generalized Fokker--Planck--Kolmogorov (FPK) equation and obtained with the exponential-polynomial closure (EPC) method, which was originally proposed for solving the FPK equation. The extended EPC solution procedure is presented for the case of Poisson pulses in this paper. In order to evaluate the effectiveness of the solution procedure, nonlinear oscillators are investigated under multiplicative Poisson white noise excitation on velocity and additive Poisson white noise excitation. Both weakly and strongly nonlinear oscillators are considered, respectively. In the numerical analysis, both the unimodal and bimodal stationary PDFs of oscillator responses are obtained with the EPC method and Monte Carlo simulation. Compared with the simulation results, good agreement is achieved with the presented solution procedure in the case of the polynomial degree being 6, especially in the tail regions of the PDFs of the system responses. [DOI: 10.1115/1.4000385] Keywords: nonlinear, oscillator, generalized FPK equation, probability density function, Poisson white noise
- Published
- 2010
30. Probabilistic analysis of soil-water characteristic curves
- Author
-
Phoon, Kok-Kwang, Santoso, Anastasia, and Quek, Ser-Tong
- Subjects
Soil moisture -- Research, Combinatorial probabilities -- Research, Geometric probabilities -- Research, Probabilities -- Research, Graphic methods -- Research, Earth sciences, Engineering and manufacturing industries, Science and technology
- Abstract
Direct measurement of the soil-water characteristic curve (SWCC) is costly and time consuming. A first-order estimate from statistical generalization of experimental data belonging to soils with similar textural and structural properties is useful. A simple approach is to fit the data with a nonlinear function and to construct an appropriate probability model of the curve-fitting parameters. This approach is illustrated using sandy clay loam, loam, loamy sand, clay, and silty clay data in Unsaturated Soil Database. This paper demonstrates that a lognormal random vector is suitable to model the curve-fitting parameters of the SWCC. Other probability models using normal, gamma, Johnson, and other distributions do not provide better fit than the proposed lognormal model. The engineering impact of adopting a probabilistic SWCC is briefly discussed by studying the uncertainty of unsaturated shear strength due to the uncertainty of SWCC. DOI: 10.1061/(ASCE)GT.1943-5606.0000222 CE Database subject headings: Unsaturated soils; Soil water; Probability; Correlation; Shear strength; Uncertainty principles. Author keywords: Unsaturated soils; Soil-water characteristic curve; Curve fitting; Probability; Translation; Lognormal; Correlation; Random vector.
- Published
- 2010
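A sketch of the probabilistic-SWCC idea described above: treat the curve-fitting parameters as a correlated lognormal random vector and sample curves from it. The van Genuchten-type curve, the log-space means, variances, and correlation below are illustrative stand-ins, not the paper's fitted values.

```python
# Sketch: sample correlated lognormal SWCC parameters and propagate the
# uncertainty to effective saturation at one suction. All numbers invented.
import numpy as np

rng = np.random.default_rng(8)

mean = np.log([0.05, 1.8])               # log-means of (alpha [1/kPa], n)
cov = np.array([[0.20**2, -0.5 * 0.20 * 0.10],
                [-0.5 * 0.20 * 0.10, 0.10**2]])  # log-space cov, rho = -0.5

alpha, n = np.exp(rng.multivariate_normal(mean, cov, 1000).T)

def swcc(suction, alpha, n):
    """van Genuchten-type effective saturation, with m = 1 - 1/n."""
    m = 1.0 - 1.0 / n
    return (1.0 + (alpha * suction) ** n) ** -m

s_e = swcc(100.0, alpha, n)              # suction of 100 kPa
print("effective saturation at 100 kPa: mean %.2f, std %.2f"
      % (s_e.mean(), s_e.std()))
```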
31. Probabilistic analysis of coupled soil consolidation
- Author
-
Huang, Jinsong, Griffiths, D.V., and Fenton, Gordon A.
- Subjects
Geometric probabilities -- Research, Finite element method -- Research, Combinatorial probabilities -- Research, Probabilities -- Research, Soil permeability -- Research, Soils -- Mechanical properties, Earth sciences, Engineering and manufacturing industries, Science and technology
- Abstract
Coupled Biot consolidation theory was combined with the random finite-element method to investigate the consolidation behavior of soil deposits with spatially variable properties in one-dimensional (1D) and two-dimensional (2D) spaces. The coefficient of volume compressibility ([m.sub.v]) and the soil permeability (k) are assumed to be lognormally distributed random variables. The random fields of [m.sub.v] and k are generated by the local average subdivision method which fully takes account of spatial correlation, local averaging, and cross correlations. The generated random variables are mapped onto a finite-element mesh and Monte Carlo finite-element simulations follow. The results of parametric studies are presented, which describe the effect of the standard deviation, spatial correlation length, and cross correlation coefficient on output statistics relating to the overall 'equivalent' coefficient of consolidation. It is shown that the average degree of consolidation defined by excess pore pressure and settlement are different in heterogeneous soils. The dimensional effect on the soil consolidation behaviors is also investigated by comparing the 1D and 2D results. DOI: 10.1061/(ASCE)GT.1943-5606.0000238 CE Database subject headings: Finite element method; Soil consolidation; Probability; Coupling. Author keywords: Finite-element method; Soil consolidation; Probabilistic methods; Coupling.
- Published
- 2010
32. Direct joint probability method for estimating extreme sea levels
- Author
-
Liu, Joan C., Lence, Barbara J., and Isaacson, Michael
- Subjects
Combinatorial probabilities -- Research, Geometric probabilities -- Research, Probabilities -- Research, Engineering design -- Methods, Engineering design -- Technology application, Sea level -- Measurement, Technology application, Engineering and manufacturing industries, Science and technology
- Abstract
A key design element in coastal structures is the crest elevation which protects against damages due to overflowing and overtopping. In order to avoid overflowing, the design crest elevation should be above the extreme flood level, which is usually composed of tides and storm surges but could also include tsunami, El Niño, and other climatologic and geologic effects. The extreme flood level may be determined with the annual maxima, simple addition, or joint probability methods (JPM). These methods have various limitations in terms of the amount of required data, the representation of factors contributing to sea level fluctuations, the ability to assess the joint probability of these factors, and the degree of data independence required. To minimize overtopping, the design crest elevation should be above the extreme sea level which is evaluated considering wave runup and the extreme flood level. Wave runup estimates are based on selected extreme flood levels and the extreme wave climate, data for which are often dependent. A modification of the JPM, the direct JPM (DJPM), is developed for estimating extreme flood and sea levels. This method may be applied to consider any number of dependent contributing factors. Data for the City of Richmond, B.C., Canada, are used to demonstrate the DJPM. The DJPM provides an estimate of the extreme flood level for Richmond that is within the same range as those obtained using traditional estimation methods. The results indicate a large difference between extreme flood and sea level estimates. The sea levels at Richmond are also increasing due to climate and tectonic effects. A hybrid direct joint probability-simple addition method is applied to consider these effects. CE Database subject headings: Estimation; Sea level; Overflow; Coastal structures; Damage; Floods. DOI: 10.1061/(ASCE)0733-950X(2010)136:1(66)
- Published
- 2010
- Full Text
- View/download PDF
33. Efficient probabilistic back-analysis of slope stability model parameters
- Author
-
Zhang, J., Tang, Wilson H., and Zhang, L.M.
- Subjects
Slopes (Physical geography) -- Mechanical properties ,Combinatorial probabilities -- Research ,Geometric probabilities -- Research ,Probabilities -- Research ,Stability -- Models ,Earth sciences ,Engineering and manufacturing industries ,Science and technology - Abstract
Back-analysis of slope failure is often performed to improve one's knowledge of the parameters of a slope stability analysis model. In a failed slope, the slip surface may pass through several layers of soil. Therefore, several sets of model parameters need to be back-analyzed. To back-analyze multiple sets of slope stability parameters simultaneously under uncertainty, the back-analysis can be implemented in a probabilistic way, in which uncertain parameters are modeled as random variables and their distributions are updated based on the observed slope failure information. In this paper, two methods are presented for probabilistic back-analysis of slope failure. For a general slope stability model, its uncertain parameters can be back-analyzed with an optimization procedure that can be implemented in a spreadsheet. When the slope stability model is approximately linear, its parameters can be back-analyzed with sensitivity analysis instead. A feature of these two methods is that they are easy to apply. Two case studies are used to illustrate the proposed methods. The case studies show that the degrees of improvement achieved by the back-analysis are different for different parameters, and that the parameter contributing most to the uncertainty in the factor of safety is updated most. DOI: 10.1061/(ASCE)GT.1943-5606.0000205 CE Database subject headings: Landslides; Slope stability; Models; Parameters; Failure investigation; Probability; Reliability; Bayesian analysis. Author keywords: Landslides; Slope stability; Failure investigation; Probability methods; Reliability; Bayesian analysis.
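A minimal sketch of probabilistic back-analysis under assumed priors, using importance weighting on an infinite-slope model rather than the paper's spreadsheet optimization or sensitivity method; it reproduces the qualitative finding that the most influential parameter is updated most:

```python
import numpy as np

# Update strength parameters of an infinite-slope model given the observation
# that the slope failed (factor of safety ~ 1). All values are assumptions.
rng = np.random.default_rng(2)

beta = np.radians(35)          # slope angle
gamma, H = 19.0, 5.0           # unit weight (kN/m^3), depth of slip surface (m)

def factor_of_safety(c, phi_deg):
    phi = np.radians(phi_deg)
    return (c + gamma*H*np.cos(beta)**2*np.tan(phi)) / (gamma*H*np.sin(beta)*np.cos(beta))

# assumed priors: cohesion ~ N(10, 3) kPa, friction angle ~ N(30, 3) deg
c_s   = rng.normal(10, 3, 200_000)
phi_s = rng.normal(30, 3, 200_000)

fs = factor_of_safety(c_s, phi_s)
sigma_e = 0.05                               # model error on the failure observation
w = np.exp(-0.5*((fs - 1.0)/sigma_e)**2)     # likelihood of "observed FS = 1"

post_c   = np.average(c_s, weights=w)
post_phi = np.average(phi_s, weights=w)
print(f"prior means: c=10.0, phi=30.0 | posterior means: c={post_c:.1f}, phi={post_phi:.1f}")
```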
- Published
- 2010
- Full Text
- View/download PDF
34. Some characteristics of k-step state transition probabilities of Markovian process
- Author
-
N, Murugesan and P, Suguna
- Subjects
Sequences (Mathematics) -- Research ,Markov processes -- Analysis -- Research ,Combinatorial probabilities -- Research ,Geometric probabilities -- Research ,Probabilities -- Research ,High technology industry ,Business, international ,Law - Abstract
The classical theory of Markov processes deals with sequence analysis; a Markov process provides a special type of dependence. A few analytic properties of the transition probabilities are analyzed for the Markov process and for the Hidden Markov Model as well. The Gaines algorithm for the Hidden Markov Model is found to improve the probability value. Keywords Markov process, Chapman-Kolmogorov equations, k-step transition probability, Hidden Markov Model, Gaines algorithm., §1. Introduction A Markov process is a probabilistic description of a series of dependent trials. The Chapman-Kolmogorov equations play an important role in the study of Markov processes. Classification of states [...]
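The k-step transition probabilities discussed here follow from the Chapman-Kolmogorov equations: the k-step matrix is the k-th power of the one-step matrix. A minimal sketch with an assumed three-state chain:

```python
import numpy as np

# Chapman-Kolmogorov: P(k) = P**k, hence P(m+n) = P(m) @ P(n).
# Transition matrix values are illustrative.
P = np.array([[0.7, 0.2, 0.1],
              [0.3, 0.5, 0.2],
              [0.2, 0.3, 0.5]])

P4 = np.linalg.matrix_power(P, 4)
# consistency check: P^4 equals P^2 @ P^2
assert np.allclose(P4, np.linalg.matrix_power(P, 2) @ np.linalg.matrix_power(P, 2))
print(P4)   # 4-step transition probabilities; each row still sums to 1
```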
- Published
- 2010
35. New representations and bounds for the generalized Marcum Q-function via a geometric approach, and an application
- Author
-
Rong Li, Pooi Yuen Kam, and Hua Fu
- Subjects
Functions, Exponential -- Research ,Combinatorial probabilities -- Research ,Geometric probabilities -- Research ,Probabilities -- Research ,Gaussian distribution -- Research - Published
- 2010
36. The framing of the fundamental probability set: a historical case study on the context of mathematical discovery
- Author
-
Campos, Daniel G.
- Subjects
Combinatorial probabilities -- Research ,Geometric probabilities -- Research ,Probabilities -- Research ,Philosophy -- Research ,Science and technology - Published
- 2009
37. Physics-based foundation for empirical mode decomposition
- Author
-
Lee, Young S., Tsakirtzis, Stylianos, Vakakis, Alexander F., Bergman, Lawrence A., and McFarland, D. Michael
- Subjects
Combinatorial probabilities -- Research ,Geometric probabilities -- Research ,Probabilities -- Research ,Time measurement -- Methods ,Timekeeping -- Methods ,Aerospace engineering -- Research ,Aerospace and defense industries ,Business - Abstract
We study the correspondence between analytical and empirical slow-flow analyses. Given a sufficiently dense set of sensors, the measured time series recorded throughout a mechanical or structural system contain all information regarding the dynamics of that system. Empirical mode decomposition is a useful tool for decomposing the measured time series in terms of intrinsic mode functions, which are oscillatory modes embedded in the data that fully reproduce the time series. The equivalence of responses of the analytical slow-flow models and the dominant intrinsic mode functions derived from empirical mode decomposition provides a physics-based theoretical foundation for empirical mode decomposition, which currently is performed formally in an ad hoc fashion. To demonstrate the correspondence between analytical and empirical slow flows, we derive appropriate mathematical expressions governing the empirical slow flows, based on analyticity conditions. Several nonlinear dynamical systems are considered to demonstrate this correspondence, and the agreement between the analytical and empirical slow dynamics proves the assertion. DOI: 10.2514/1.43207
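A minimal sketch of one empirical-mode-decomposition sifting pass (illustrative only; practical EMD adds stopping criteria and boundary treatment, and nothing here reflects the paper's slow-flow derivations):

```python
import numpy as np
from scipy.interpolate import CubicSpline
from scipy.signal import argrelextrema

def sift_once(t, x):
    """One sifting pass: subtract the mean of the extrema envelopes."""
    imax = argrelextrema(x, np.greater)[0]
    imin = argrelextrema(x, np.less)[0]
    upper = CubicSpline(t[imax], x[imax])(t)   # envelope through local maxima
    lower = CubicSpline(t[imin], x[imin])(t)   # envelope through local minima
    return x - 0.5 * (upper + lower)

t = np.linspace(0, 1, 2000)
x = np.sin(2*np.pi*5*t) + 0.5*np.sin(2*np.pi*40*t)   # two-mode test signal

c = x.copy()
for _ in range(10):            # a few sifting passes toward the first IMF
    c = sift_once(t, c)
# c now approximates the 40 Hz intrinsic mode; x - c keeps the 5 Hz mode
print(f"residue std after removing first IMF: {np.std(x - c):.3f}")
```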
- Published
- 2009
38. Probabilistic settlement analysis by stochastic and random finite-element methods
- Author
-
Griffiths, D.V. and Fenton, Gordon A.
- Subjects
Finite element method -- Usage ,Monte Carlo method -- Usage ,Elasticity -- Measurement ,Stochastic processes -- Measurement ,Combinatorial probabilities -- Research ,Geometric probabilities -- Research ,Probabilities -- Research ,Earth sciences ,Engineering and manufacturing industries ,Science and technology - Abstract
The paper discusses finite-element models for predicting the elastic settlement of a strip footing on a variable soil. It then compares results obtained in a probabilistic settlement analysis using a stochastic finite-element method based on first-order second-moment approximations with those of the random finite-element method, which is based on the generation of random fields combined with Monte Carlo simulations. The paper highlights the deficiencies of probabilistic methods that are unable to properly account for spatial correlation. DOI: 10.1061/(ASCE)GT.1943-5606.0000126 CE Database subject headings: Foundation settlement; Elasticity; Probability; Finite element method; Stochastic processes.
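A minimal sketch of the contrast the paper draws, reduced to a single lognormal random variable (the paper uses full random fields); the settlement relation w = qB/E and all numbers are assumed placeholders:

```python
import numpy as np

# First-order second-moment (FOSM) estimate versus Monte Carlo sampling for
# footing settlement w = q*B/E with a lognormal elastic modulus E.
rng = np.random.default_rng(3)

q, B = 100.0, 2.0              # bearing pressure (kPa), footing width (m)
mu_E, cov_E = 20_000.0, 0.4    # mean modulus (kPa) and coefficient of variation

# FOSM: first-order Taylor expansion of w(E) about the mean of E
w_mean_fosm = q*B/mu_E
w_std_fosm  = abs(-q*B/mu_E**2) * (cov_E*mu_E)

# Monte Carlo with lognormal E
s = np.sqrt(np.log(1 + cov_E**2)); m = np.log(mu_E) - 0.5*s**2
E = rng.lognormal(m, s, 200_000)
w = q*B/E
print(f"FOSM: mean={w_mean_fosm*1e3:.2f} mm, std={w_std_fosm*1e3:.2f} mm")
print(f"MC:   mean={w.mean()*1e3:.2f} mm, std={w.std()*1e3:.2f} mm")
```

The mismatch between the two rows is the point: the first-order expansion misses the nonlinearity of w(E), in the same spirit as the deficiencies the paper highlights.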
- Published
- 2009
39. Probability theory, not the very guide of life
- Author
-
Juslin, Peter, Nilsson, Håkan, and Winman, Anders
- Subjects
Combinatorial probabilities -- Research ,Geometric probabilities -- Research ,Probabilities -- Research ,Heuristic -- Research ,Psychology and mental health - Abstract
Probability theory has long been taken as the self-evident norm against which to evaluate inductive reasoning, and classical demonstrations of violations of this norm include the conjunction error and base-rate neglect. Many of these phenomena require multiplicative probability integration, whereas people seem more inclined to linear additive integration, in part, at least, because of well-known capacity constraints on controlled thought. In this article, the authors show with computer simulations that when based on approximate knowledge of probabilities, as is routinely the case in natural environments, linear additive integration can yield as accurate estimates, and as good average decision returns, as estimates based on probability theory. It is proposed that in natural environments people have little opportunity or incentive to induce the normative rules of probability theory and, given their cognitive constraints, linear additive integration may often offer superior bounded rationality. Keywords: probability judgment, representativeness heuristic, conjunction error, base-rate neglect, additive integration DOI: 10.1037/a0016979
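A minimal sketch of the simulation idea, assuming independent events and a least-squares linear additive rule; the noise level and weights are illustrative assumptions:

```python
import numpy as np

# When component probabilities are known only with noise, compare multiplicative
# integration against a linear additive rule for estimating P(A and B) = pA*pB.
rng = np.random.default_rng(4)

pA, pB = rng.uniform(0, 1, (2, 100_000))
truth = pA * pB                                   # true conjunction (independence)

noise = 0.15                                      # assumed judgment noise
pA_hat = np.clip(pA + rng.normal(0, noise, pA.shape), 0, 1)
pB_hat = np.clip(pB + rng.normal(0, noise, pB.shape), 0, 1)

mult = pA_hat * pB_hat                            # normative multiplication
add  = 0.5*pA_hat + 0.5*pB_hat - 0.25             # least-squares linear additive rule

print(f"RMSE multiplicative: {np.sqrt(np.mean((mult - truth)**2)):.3f}")
print(f"RMSE additive:       {np.sqrt(np.mean((add  - truth)**2)):.3f}")
```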
- Published
- 2009
40. Water treatment units residence-time distributions as probabilistic functions
- Author
-
Tzatchkov, Velitchko G., Buchberger, Steven G., and Martin-Dominguez, Alejandra
- Subjects
Persistence (Environmental chemistry) -- Measurement ,Combinatorial probabilities -- Research ,Geometric probabilities -- Research ,Probabilities -- Research ,Nuclear reactors -- Models ,Water treatment plants -- Models ,Engineering and manufacturing industries ,Environmental issues - Abstract
A probabilistic approach to obtain theoretical residence-time distribution (RTD) functions for series of reactors with possible stagnation, bypassing, and recycle is presented. It is shown that most known RTD functions can be obtained from probability arguments alone, thus avoiding abstract Laplace transform mathematical techniques and providing additional physical insight. Several new RTD models for reactors in series are derived, based exclusively on using the binomial probability distribution to describe the passage of a particle through the treatment train with bypassing and stagnation possible at each individual reactor. The proposed RTD models are validated with travel time computer simulation of a large number of particles through the series of reactors. An MS Excel-based computer procedure was programmed to obtain the nonideal flow parameters by minimizing the squared sum of the differences between tracer test data and the derived unit's RTD function. The least squares parameter estimation procedure was used to fit theoretical RTDs to tracer data collected from real water treatment units with different hydraulic behavior at two locations in Mexico. DOI: 10.1061/(ASCE)0733-9372(2009)135:10(944) CE Database subject headings: Residence time; Nonuniform flow; Water treatment plants; Reactors; Mathematical models; Probability; Probability density functions.
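A minimal sketch of the binomial RTD construction for n tanks in series with per-reactor bypassing (no stagnation or recycle); all parameter values are illustrative assumptions:

```python
import numpy as np
from math import comb
from scipy.stats import gamma

# A particle passes n CSTRs in series but bypasses each with probability b, so
# the number of reactors visited is Binomial(n, 1-b) and the RTD is a binomial
# mixture of gamma (tanks-in-series) densities.
def rtd(t, n=4, tau=10.0, b=0.2):
    dens = np.zeros_like(t, dtype=float)
    for k in range(1, n + 1):    # k reactors visited; k=0 is an atom at t=0
        w = comb(n, k) * (1 - b)**k * b**(n - k)
        dens += w * gamma.pdf(t, a=k, scale=tau)   # Erlang(k, tau) density
    return dens

t = np.linspace(0.01, 100, 2000)
E = rtd(t)
dt = t[1] - t[0]
print(f"captured probability mass: {E.sum()*dt:.3f} "
      "(the k=0 full-bypass atom at t=0 holds the remainder)")
```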
- Published
- 2009
41. Influence of spatial variability on slope reliability using 2-D random fields
- Author
-
Griffiths, D.V., Huang, Jinsong, and Fenton, Gordon A.
- Subjects
Finite element method -- Usage ,Slopes (Physical geography) -- Structure ,Stability -- Measurement ,Combinatorial probabilities -- Research ,Geometric probabilities -- Research ,Probabilities -- Research ,Earth sciences ,Engineering and manufacturing industries ,Science and technology - Abstract
The paper investigates the probability of failure of slopes using both traditional and more advanced probabilistic analysis tools. The advanced method, called the random finite-element method, uses elastoplasticity in a finite-element model combined with random field theory in a Monte-Carlo framework. The traditional method, called the first-order reliability method, computes a reliability index which is the shortest distance (in units of directional equivalent standard deviations) from the equivalent mean-value point to the limit state surface and estimates the probability of failure from the reliability index. Numerical results show that simplified probabilistic analyses in which spatial variability of soil properties is not properly accounted for, can lead to unconservative estimates of the probability of failure if the coefficient of variation of the shear strength parameters exceeds a critical value. The influences of slope inclination, factor of safety (based on mean strength values), and cross correlation between strength parameters on this critical value have been investigated by parametric studies in this paper. The results indicate when probabilistic approaches, which do not model spatial variation, may lead to unconservative estimates of slope failure probability and when more advanced probabilistic methods are warranted. DOI: 10.1061/(ASCE)GT.1943-5606.0000099 CE Database subject headings: Slope stability; Finite element method; Probability; Failures.
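A minimal sketch of the reliability-index calculation, reduced to a single lognormal factor of safety rather than the multi-variable design-point search described in the abstract; the mean and coefficient of variation are assumed:

```python
import numpy as np
from scipy.stats import norm

# For a lognormal factor of safety FS with mean fs_mean and COV cov, the
# reliability index is beta = (ln(fs_mean) - 0.5*s2)/sqrt(s2) with
# s2 = ln(1 + cov**2), and p_f = Phi(-beta). Checked against Monte Carlo.
rng = np.random.default_rng(5)

fs_mean, cov = 1.3, 0.25
s2 = np.log(1 + cov**2)
beta = (np.log(fs_mean) - 0.5*s2) / np.sqrt(s2)
pf_form = norm.cdf(-beta)

fs = rng.lognormal(np.log(fs_mean) - 0.5*s2, np.sqrt(s2), 1_000_000)
pf_mc = np.mean(fs < 1.0)
print(f"beta={beta:.2f}, p_f closed form={pf_form:.4f}, p_f MC={pf_mc:.4f}")
```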
- Published
- 2009
42. Antenna/relay selection for coded cooperative networks with AF relaying
- Author
-
Elfituri, Mohamed, Ghrayeb, Ali, and Hamouda, Walaa
- Subjects
Antennas (Electronics) -- Design and construction ,Relays -- Design and construction ,Combinatorial probabilities -- Research ,Geometric probabilities -- Research ,Probabilities -- Research - Published
- 2009
43. The bivariate generalized-K (K_G) distribution and its application to diversity receivers
- Author
-
Bithas, Petros S., Sagias, Nikos C., and Mathiopoulos, P. Takis
- Subjects
Combinatorial probabilities -- Research ,Geometric probabilities -- Research ,Probabilities -- Research ,Electromagnetic waves -- Scattering ,Electromagnetic waves -- Observations - Published
- 2009
44. Probabilistic approach for modeling and presenting error in spatial data
- Author
-
Fekpe, Edward S., Windholz, Thomas K., and Beard, Kate
- Subjects
Combinatorial probabilities -- Research ,Geometric probabilities -- Research ,Probabilities -- Research ,Geospatial data -- Analysis ,Geographic information systems -- Usage ,Artificial satellites in surveying -- Methods ,Geographic information system ,Engineering and manufacturing industries ,Science and technology - Abstract
This paper presents a probabilistic approach to describe and visualize uncertainty and error in spatial data. The probabilistic approach assigns n-dimensional probability zones to n-dimensional measured feature locations. The size of each n-dimensional zone depends on the uncertainty arising from imprecise measurements or derived inaccuracy values, and a user-selected probability threshold that the 'true' feature location is to be found within this probabilistic space. The uncertainties relate to recorded measurement precision, accuracy of a linear network, and issues of scale and resolution. The confidence intervals are based on the chi-square (χ²) distribution, where the probability that the measured point location is within the tabulated distance of the true point location can be tested. The error model computes the probability of the intersection of two features or data sources to determine whether they are compatible and whether they should be used together. The error model allows the user to assess the potential quality implications of combining data from different sources and with different qualities. The error model is encapsulated in a software program that includes a graphic user interface that facilitates visualization of results. The ability to visualize the quality of spatial data at different significance levels of confidence provides a powerful tool for communicating the impacts of the quality of spatial data on applications of interest to users. DOI: 10.1061/(ASCE)0733-9453(2009)135:3(101) CE Database subject headings: Spatial data; Errors; Uncertainty principles; Data communication; Geographic information systems; Probability; Surveys.
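A minimal sketch of the chi-square probability zone and an intersection test, assuming isotropic 2D errors; the coordinates and standard deviations are illustrative:

```python
import numpy as np
from scipy.stats import chi2

# With isotropic coordinate standard deviation sigma, the true 2D location lies
# within radius r = sigma*sqrt(chi2.ppf(p, df=2)) of the measurement with
# probability p. Two features are flagged compatible if their zones intersect.
def zone_radius(sigma, p=0.95):
    return sigma * np.sqrt(chi2.ppf(p, df=2))

def compatible(pt1, sigma1, pt2, sigma2, p=0.95):
    d = np.hypot(*(np.asarray(pt1) - np.asarray(pt2)))
    return d <= zone_radius(sigma1, p) + zone_radius(sigma2, p)

# illustrative: the same feature point recorded by two data sources
print(f"95% zone radius for sigma=2 m: {zone_radius(2.0):.2f} m")   # ~4.90 m
print(compatible((100.0, 200.0), 2.0, (104.0, 203.0), 3.0))
```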
- Published
- 2009
45. Probabilistic modelling of safety and damage blast risks for window glazing
- Author
-
Netherton, Michael D. and Stewart, Mark G.
- Subjects
Glass -- Mechanical properties ,Risk assessment -- Methods ,Glazing -- Methods ,Explosions -- Research ,Combinatorial probabilities -- Research ,Geometric probabilities -- Research ,Windows -- Mechanical properties ,Probabilities -- Research ,Engineering and manufacturing industries ,Mechanical properties ,Research ,Methods - Abstract
There are many computational techniques to model the consequences to built infrastructure when subject to explosive blast loads; however, the majority of these do not account for the uncertainties associated with system response or blast loading. This paper describes a new computational model, called 'Blast-RF' (Blast Risks for Facades), that incorporates existing (deterministic) blast-response models within an environment that considers threat and (or) vulnerability uncertainties and variability using probability and structural reliability theory. The structural reliability analysis uses stress limit states and the UK Glazing Hazard Guide's rating criteria to calculate probabilities of glazing damage and occupant safety hazards conditional on a given blast scenario. This allows the prediction of the likelihood and extent of damage and (or) casualties; such information is useful for risk mitigation considerations, emergency services' contingency and response planning, collateral damage estimation, weaponeering, and post-blast forensic analysis. Key words: probability, glass, safety, risk, blast, windows., 1. Introduction There are many instances in the recent past that indicate that terrorist threats will remain into the future and that a favoured method of terrorist attack is to [...]
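A minimal structural-reliability sketch in the spirit of the stress limit state described here (not Blast-RF itself; both distributions are assumed placeholders):

```python
import numpy as np

# Monte Carlo on a stress limit state g = R - S, where R is glazing resistance
# and S is the blast-induced stress for a given scenario.
rng = np.random.default_rng(6)

n = 1_000_000
R = rng.lognormal(mean=np.log(80.0), sigma=0.25, size=n)   # resistance (MPa)
S = rng.lognormal(mean=np.log(40.0), sigma=0.40, size=n)   # blast stress (MPa)

p_damage = np.mean(R - S < 0.0)    # probability of glazing damage | scenario
print(f"conditional probability of glazing damage: {p_damage:.4f}")
```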
- Published
- 2009
- Full Text
- View/download PDF
46. Tracking of divers using a probabilistic data association filter with a bubble model
- Author
-
Rodningsby, Anders and Bar-Shalom, Yaakov
- Subjects
Digital filters -- Usage ,Combinatorial probabilities -- Research ,Geometric probabilities -- Research ,Probabilities -- Research ,Divers -- Identification and classification ,Remote sensing -- Methods ,Aerospace and defense industries ,Business ,Computers ,Electronics ,Electronics and electrical industries - Abstract
Detection and tracking of divers have become an important factor in port protection against underwater intruders. A problem arises from divers with open breathing systems because detections of the air bubbles they produce can mislead the tracking filter and sometimes result in a lost track. In this paper a probabilistic model is developed which reflects the probability that a false measurement originates from the bubbles. The novel contribution of this paper is the integration of this model in the probabilistic data association filter (PDAF) to improve the track continuity. The bubble detections may also cause confusion in the track initiation. To prevent this problem, a clustering method is proposed based on morphological operators which allows tracks to be initialized based on two-point differencing of the cluster centroids from succeeding scans. This morphological clustering method is included in a cell averaging constant false alarm rate (CA-CFAR) detector in such a way that both the point detections and their corresponding clusters can be fed to the tracking filter. These techniques are implemented and applied to real data of two divers, one with an open breathing system and the other with a closed breathing system, operating simultaneously in a coastal area. The real data were recorded from an active 90 kHz narrowband multibeam imaging sonar.
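A minimal sketch of probabilistic-data-association weights with a nonuniform clutter density standing in for the bubble model (1D measurements for brevity; all values are assumed):

```python
import numpy as np

# PDAF association weights with measurement-dependent clutter density: raising
# the clutter density lambda(z) in the bubble region lowers the weight given to
# bubble-like returns, which is the mechanism used to keep track continuity.
def pdaf_weights(z, z_pred, S, clutter_density, PD=0.9, PG=0.997):
    lik = np.exp(-0.5*(z - z_pred)**2 / S) / np.sqrt(2*np.pi*S)
    L = PD * lik / clutter_density        # likelihood ratio per measurement
    b0 = 1.0 - PD*PG                      # "no measurement is correct" hypothesis
    w = np.concatenate(([b0], L))
    return w / w.sum()                    # beta_0, beta_1, ..., beta_m

z = np.array([10.2, 11.5])                # two validated sonar returns
lam = np.array([0.05, 0.50])              # second return sits in the bubble zone
print(pdaf_weights(z, z_pred=10.0, S=0.5, clutter_density=lam))
```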
- Published
- 2009
47. Evaluation of call performance in cellular networks with generalized cell dwell time and call-holding time distributions in the presence of channel fading
- Author
-
Pattaramalai, Suwat, Aalo, Valentine A., and Efthymoglou, George P.
- Subjects
Communications circuits -- Design and construction ,Combinatorial probabilities -- Research ,Geometric probabilities -- Research ,Probabilities -- Research ,Signals and signaling -- Methods ,Business ,Electronics ,Electronics and electrical industries ,Transportation industry - Abstract
Call-completion probability, call-dropping probability, and handoff rate are important performance measures of wireless networks. In this paper, we study the joint effect of channel fading and handover failure on these performance measures. For the case of Rayleigh and lognormal fading channels and for generalized distributions of the cell dwell time and the call-holding time, we derive simple closed-form expressions that closely approximate these performance metrics. The results are given in terms of the moment-generating function (MGF) of the distribution for the call-holding time and may be useful in the cross-layer design and the optimization of wireless networks. Index Terms--Call-completion probability, call-holding time, cell dwell time, handoff rate.
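A minimal sketch of the moment-generating-function idea for handoff failure alone (channel fading omitted): if handoffs arrive at rate eta during a call of duration T and each fails independently with probability p, then P(call completes) = E[(1-p)^N(T)] = M_T(-eta*p), the MGF of the call-holding time. The numeric values are illustrative:

```python
import numpy as np

# For exponential holding time with rate mu, M_T(-eta*p) = mu/(mu + eta*p).
rng = np.random.default_rng(7)

mu, eta, p = 1/120.0, 1/60.0, 0.02    # mean call 120 s, mean dwell 60 s
closed_form = mu / (mu + eta*p)

T = rng.exponential(1/mu, 1_000_000)  # call durations
N = rng.poisson(eta*T)                # handoffs during each call
mc = np.mean((1 - p)**N)              # all handoffs must succeed
print(f"closed form {closed_form:.4f} vs Monte Carlo {mc:.4f}")
```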
- Published
- 2009
48. Asymptotic performance of threshold-based generalized selection combining
- Author
-
Ma, Yao, Dong, Xiaodai, and Yang, Hong-Chuan
- Subjects
Combinatorial probabilities -- Research ,Geometric probabilities -- Research ,Probabilities -- Research ,Communications circuits -- Design and construction ,Business ,Electronics ,Electronics and electrical industries ,Transportation industry - Abstract
In this paper, we study the asymptotic performance of absolute-threshold- and normalized-threshold-based generalized selection combining (AT-GSC and NT-GSC) schemes over generalized fading channels for high average signal-to-noise ratios (ASNRs). By evaluating the asymptotic moment-generating function of the threshold-based GSC output SNRs, we derive the diversity and combining gains for AT- and NT-GSCs with a large class of modulation formats and versatile fading conditions, including different types of fading channels and nonidentical SNR statistics across diversity branches. Our analytical results reveal that the diversity gains of AT- and NT-GSC are equivalent to that of maximal-ratio combining, and the differences in the combining gains for different threshold values and modulation and diversity formats are derived by using the concept of modulation factors. Index Terms--Combining gain, diversity gain, error rate, outage probability, threshold-based generalized selection combining (T-GSC).
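A minimal sketch of how a diversity order is read off an asymptotic curve, using plain selection combining over Rayleigh fading as a stand-in for the threshold-based GSC schemes analyzed here:

```python
import numpy as np

# Exact outage of L-branch selection combining over i.i.d. Rayleigh fading:
# P_out = (1 - exp(-gth/asnr))**L ~ (gth/asnr)**L at high ASNR, so the
# log-log slope recovers the diversity order L.
L, gth = 3, 1.0
asnr = np.array([1e2, 1e3, 1e4])              # average SNR per branch
pout = (1 - np.exp(-gth/asnr))**L
slope = -np.diff(np.log10(pout)) / np.diff(np.log10(asnr))
print(f"log-log slopes: {slope.round(3)} (diversity order L = {L})")
```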
- Published
- 2009
49. Parameters estimation for a mixture of Inverse Weibull distributions from censored data
- Author
-
Yuzhu, Tian, Wansheng, He, and Ping, Chen
- Subjects
Functions, Inverse -- Research ,Electronic data processing -- Methods -- Technology application ,Estimation theory -- Research ,Censoring (Statistics) -- Research ,Combinatorial probabilities -- Research ,Geometric probabilities -- Research ,Probabilities -- Research ,High technology industry ,Business, international ,Law ,Technology application ,Research ,Methods - Abstract
In this paper a failure model based on mixed Inverse Weibull distributions (MIWD) is considered, and estimators for all of its unknown parameters based on type-II and type-I censored data are obtained by means of the EM algorithm. Some simulations suggest that the EM algorithm is effective for our model. Keywords mixed Inverse Weibull distribution, failure model, EM-algorithm, censored data., §1. Introduction Mixture models play an important role in many fields of application, such as medicine, psychology research, cluster analysis, life testing, and reliability analysis. Mixture distributions have [...]
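A minimal EM sketch for a two-component Inverse Weibull mixture on complete (uncensored) data; the paper's contribution is the extension to type-I/type-II censoring. The initial values and the synthetic mixture are assumptions:

```python
import numpy as np
from scipy.optimize import minimize

# Inverse Weibull pdf: f(x; b, l) = b*l*x**-(b+1) * exp(-l*x**-b),
# with CDF F(x) = exp(-l*x**-b).
rng = np.random.default_rng(8)

def iw_logpdf(x, b, l):
    return np.log(b) + np.log(l) - (b + 1)*np.log(x) - l*x**(-b)

def iw_sample(b, l, size):
    u = rng.uniform(size=size)
    return (-np.log(u)/l)**(-1.0/b)       # inversion of the CDF

# synthetic mixture: 60% IW(2, 1) and 40% IW(4, 8)
x = np.concatenate([iw_sample(2, 1, 600), iw_sample(4, 8, 400)])

w, th = np.array([0.5, 0.5]), np.array([[1.5, 0.5], [3.0, 5.0]])  # initial guesses
for _ in range(50):
    # E-step: posterior responsibility of each component for each point
    logp = np.stack([np.log(w[j]) + iw_logpdf(x, *th[j]) for j in range(2)])
    r = np.exp(logp - logp.max(0)); r /= r.sum(0)
    # M-step: mixing weights and weighted MLE per component (log-parameterized)
    w = r.mean(1)
    for j in range(2):
        nll = lambda p, rj=r[j]: -np.sum(rj * iw_logpdf(x, np.exp(p[0]), np.exp(p[1])))
        th[j] = np.exp(minimize(nll, np.log(th[j]), method="Nelder-Mead").x)

print("weights:", np.round(w, 2), "params (b, l):", np.round(th, 2))
```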
- Published
- 2009
50. Probabilistic modeling of walking excitation for building floors
- Author
-
Zivanovic, Stana and Pavic, Aleksandar
- Subjects
Combinatorial probabilities -- Research ,Geometric probabilities -- Research ,Probabilities -- Research ,Floors -- Mechanical properties ,Floors -- Models ,Vibration research -- Methods ,Maintainability (Engineering) -- Evaluation ,Engineering and manufacturing industries ,Science and technology - Abstract
Slender floor structures are becoming increasingly prone to excessive vibration due to human-induced walking excitation. To prevent discomfort of floor occupants and/or malfunctioning of sensitive equipment, it is necessary to have a reliable means of estimating floor vibration in the design phase. For accurate estimation of the floor vibration, both reliable excitation and structural models are required. This paper concentrates on the former by evaluating the performance of the existing force models and suggesting their improvement. To this end, a force model adopted in the United Kingdom by the Concrete Society was applied to four nominally identical floors using their experimentally identified modal properties. After comparison with experimental data, the drawbacks of the force model were identified, after which an improved model of the walking-induced dynamic force, based on the combination of two existing methodologies used separately for low- and high-frequency floors, is proposed. The improved model accounts for the intersubject variability in the walking force with respect to the pacing frequency, step length, and forcing magnitude. Moreover, it includes all relevant frequency components of the walking force in the analysis, removing the need for classification of floors as low or high frequency. The proposed approach should help designers and building owners to make more informed decisions when evaluating the vibration serviceability of floor structures. DOI: 10.1061/(ASCE)CF.1943-5509.0000005 CE Database subject headings: Floors; Vibration; Probability; Serviceability.
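A minimal sketch of a harmonic walking-force model with randomized parameters to mimic intersubject variability (the paper's model is more elaborate; all distributions below are assumed placeholders):

```python
import numpy as np

# Periodic vertical walking force as a Fourier series
# F(t) = G*(1 + sum_i a_i*sin(2*pi*i*f*t + phi_i)),
# with pacing rate f and dynamic load factors a_i treated as random.
rng = np.random.default_rng(9)

G = 750.0                                    # pedestrian weight (N)
f = rng.normal(2.0, 0.15)                    # pacing frequency (Hz), random
a = np.abs(rng.normal([0.40, 0.10, 0.06],    # dynamic load factors, random
                      [0.10, 0.03, 0.02]))
phi = rng.uniform(0, 2*np.pi, 3)             # harmonic phases

t = np.linspace(0, 5, 1000)
F = G * (1 + sum(a[i]*np.sin(2*np.pi*(i + 1)*f*t + phi[i]) for i in range(3)))
print(f"pacing {f:.2f} Hz, peak force {F.max():.0f} N")
```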
- Published
- 2009