157 results on '"Equiprobability"'
Search Results
2. First Principles of the Classical Mechanics and the Foundations of Statistical Mechanics on the Example of a Disordered Spin System
- Author
-
V. V. Sahakyan and A. S. Gevorkyan
- Subjects
Physics, Statistical ensemble, Hamiltonian mechanics, Partition function (statistical mechanics), Spin glass, General Physics and Astronomy, Statistical mechanics, Equiprobability, Lattice (module), Classical mechanics, Spin-½ - Abstract
We study a classical multicomponent disordered 3D spin system, taking into account the temperature of the medium within a nearest-neighbor model. The latter allows the 3D problem on a cubic lattice to be reduced to the 1D Heisenberg spin-glass problem with a random environment. Using the Hamilton equations of motion, a recurrence equation is obtained that connects three spins at successive nodes of the 1D lattice and accounts for the influence of the random environment. This equation, together with the corresponding local energy-minimum conditions at the nodes, allows stable spin chains to be constructed node by node and, accordingly, all parameters of the statistical ensemble to be calculated from the first principles of classical mechanics, without any additional assumptions, in particular without the main axiom of statistical mechanics: the equiprobability of statistical states. Using the 1D Heisenberg spin-glass model as an example, the features of the new approach are studied in detail, and the statistical mechanics of the system is constructed without the standard representation of the partition function (PF).
- Published
- 2020
3. Grammatically uniform population initialization for grammar-guided genetic programming
- Author
-
Daniel Manrique, Pablo Ramos Criado, D. Barrios Rolanía, and Emilio Serrano
- Subjects
Uniform distribution (continuous), Computer science, Population, Sampling (statistics), Initialization, Genetic programming, Computational intelligence, Evolutionary computation, Theoretical Computer Science, Equiprobability, Geometry and Topology, Algorithm, Software - Abstract
The initial population distribution is an essential issue in evolutionary computation performance. Population initialization methods for grammar-guided genetic programming have some difficulties generating a representative sample of the search space, which negatively affects the overall evolutionary process. This paper presents a grammatically uniform population initialization method to address this issue by improving the initial population uniformity: the equiprobability of obtaining any individual of the search space defined by the context-free grammar. The proposed initialization method assigns and updates probabilities dynamically to the production rules of the grammar to pursue uniformity and includes a code bloat control mechanism. We have conducted empirical experiments to compare the proposed algorithm with a standard initialization approach very often used in grammar-guided genetic programming. The results report that the proposed initialization method approximates very well a uniform distribution of the individuals in the search space. Moreover, the overall evolutionary process that takes place after the population initialization performs better in terms of convergence speed and quality of the final solutions achieved when the proposed method generates the initial population than when the usual approach does. The results also show that these performance differences are more significant when the experiments involve large search spaces.
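As a complement to the abstract above, the sketch below illustrates one way grammar-derived rule probabilities can yield uniform sampling from a context-free grammar (weighting each production by the number of derivation trees it can still generate under a depth bound). It is an assumed, minimal illustration with a toy grammar, not the authors' initialization algorithm or its bloat-control mechanism.

```python
import random
from functools import lru_cache

# Toy grammar: nonterminals map to lists of productions (tuples of symbols).
# Any symbol not listed as a nonterminal is treated as a terminal.
GRAMMAR = {
    "E": [("E", "+", "E"), ("E", "*", "E"), ("x",), ("1",)],
}

@lru_cache(maxsize=None)
def count(symbol, depth):
    """Number of derivation trees rooted at `symbol` with depth <= `depth`."""
    if symbol not in GRAMMAR:            # terminal
        return 1
    if depth <= 0:
        return 0
    return sum(prod_count(rhs, depth - 1) for rhs in GRAMMAR[symbol])

def prod_count(rhs, depth):
    total = 1
    for sym in rhs:
        total *= count(sym, depth)
    return total

def sample(symbol, depth):
    """Sample uniformly among all derivation trees of depth <= `depth`."""
    if symbol not in GRAMMAR:
        return symbol
    # Weight each production by the number of trees it can still produce.
    weights = [prod_count(rhs, depth - 1) for rhs in GRAMMAR[symbol]]
    rhs = random.choices(GRAMMAR[symbol], weights=weights)[0]
    return "".join(sample(sym, depth - 1) for sym in rhs)

if __name__ == "__main__":
    print([sample("E", 4) for _ in range(5)])
```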
- Published
- 2020
4. First-order phase statistics in Laguerre-Gauss speckles
- Author
-
Aristide Dogariu, Pedro A. Alvarez Fernandez, and Cristian Hernando Acevedo
- Subjects
Equiprobability, Physics, Speckle pattern, Random field, Scattering, Gauss, Phase (waves), Laguerre polynomials, Statistical physics, Standard deviation - Abstract
We present a statistical analysis of the phase of random fields generated by scattering of Laguerre-Gauss beams. The standard deviation of the speckle phase is studied using the formalism of equiprobability density ellipses.
- Published
- 2021
5. Complete null agent for games with externalities
- Author
-
M. G. Fiestras-Janeiro, Andrés Jiménez-Losada, José María Alonso-Meijide, Mikel Álvarez-Mozos, and Universitat de Barcelona
- Subjects
Property (philosophy), Computer science, Equiprobability, Externalities (Economics), Artificial Intelligence, Game theory, Axiom, Null (mathematics), General Engineering, Shapley value, Expert system, Computer Science Applications, Mathematical economics, Value (mathematics), Public goods - Abstract
Game theory provides valuable tools to examine expert multi-agent systems. In a cooperative game, collaboration among agents leads to better outcomes. The most important solution for such games is the Shapley value, that coincides with the expected marginal contribution assuming equiprobability. This assumption is not plausible when externalities are present in an expert system. Generalizing the concept of marginal contributions, we propose a new family of Shapley values for situations with externalities. The properties of the Shapley value offer a rationale for its application. This family of values is characterized by extensions of Shapley’s axioms: efficiency, additivity, symmetry, and the null player property. The first three axioms have widely accepted generalizations to the framework of games with externalities. However, different concepts of null players have been proposed in the literature and we contribute to this debate with a new one. The null player property that we use is weaker than the others. Finally, we present one particular value of the family, new in the literature, and characterize it by two additional properties.
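For reference, the equiprobability reading of the classical (externality-free) Shapley value mentioned above can be computed directly as the average marginal contribution over all equally likely player orderings. The sketch below does exactly that for a hypothetical three-player glove game; it is not the paper's generalized value for games with externalities.

```python
from itertools import permutations

def shapley_values(players, v):
    """Classical Shapley value: expected marginal contribution when all
    orderings of the players are assumed equiprobable.
    `v` maps a frozenset of players to the worth of that coalition."""
    totals = {p: 0.0 for p in players}
    orderings = list(permutations(players))
    for order in orderings:
        coalition = frozenset()
        for p in order:
            totals[p] += v(coalition | {p}) - v(coalition)
            coalition = coalition | {p}
    return {p: t / len(orderings) for p, t in totals.items()}

if __name__ == "__main__":
    # Hypothetical glove game: player 1 owns a left glove, players 2 and 3
    # own right gloves; a matched pair is worth 1.
    def worth(s):
        return 1.0 if 1 in s and (2 in s or 3 in s) else 0.0
    print(shapley_values([1, 2, 3], worth))   # expected: {1: 2/3, 2: 1/6, 3: 1/6}
```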
- Published
- 2019
6. Are non-accidental regularities a cosmic coincidence? Revisiting a central threat to Humean laws
- Author
-
Aldo Filomeno
- Subjects
Philosophy of science, Credence, Philosophy, Doxastic logic, General Social Sciences, Ignorance, Principle of indifference, Equiprobability, Argument, Law, Suspension of judgment
If the laws of nature are as the Humean believes, it is an unexplained cosmic coincidence that the actual Humean mosaic is as extremely regular as it is. This is a strong and well-known objection to the Humean account of laws. Yet, as reasonable as this objection may seem, it is nowadays sometimes dismissed. The reason: its unjustified implicit assignment of equiprobability to each possible Humean mosaic; that is, its assumption of the principle of indifference, which has been attacked on many grounds ever since it was first proposed. In place of equiprobability, recent formal models represent the doxastic state of total ignorance as suspension of judgment. In this paper I revisit the cosmic coincidence objection to Humean laws by assessing which doxastic state we should endorse. By focusing on specific features of our scenario I conclude that suspending judgment results in an unnecessarily weak doxastic state. First, I point out that recent literature in epistemology has provided independent justifications of the principle of indifference. Second, given that the argument is framed within a Humean metaphysics, it turns out that we are warranted to appeal to these justifications and assign a uniform and additive credence distribution among Humean mosaics. This leads us to conclude that, contrary to widespread opinion, we should not dismiss the cosmic coincidence objection to the Humean account of laws.
- Published
- 2019
7. WAYS IN WHICH HIGH-SCHOOL STUDENTS UNDERSTAND THE SAMPLING DISTRIBUTION FOR PROPORTIONS
- Author
-
Nuria Begué, María Magdalena Gea, Manfred Borovcnik, and Carmen Batanero
- Subjects
Statistics and Probability, Population, Sampling (statistics), Sample (statistics), Education, Equiprobability, Sampling distribution, Sample size determination, Concept learning, Statistics, Statistics education, Psychology
In Spain, curricular guidelines as well as the university-entrance tests for social-science high-school students (17–18 years old) include sampling distributions. To analyse the understanding of this concept we investigated a sample of 234 students. We administered a questionnaire to them and asked half of them for justifications of their answers. The questionnaire consisted of four sampling tasks in which two sample sizes (n = 100 and 10) and population proportions (equal to or different from 0.5) were systematically varied. The experiment gathered twofold data from the students simultaneously, namely about their perception of the mean and about their understanding of variation of the sampling distribution. The analysis of students’ responses indicates a good understanding of the relationship between the theoretical proportion in the population and the sample proportion. Sampling variability, however, was overestimated in bigger samples. We also observed various types of biased thinking in the students: the equiprobability and recency biases, as well as deterministic pre-conceptions. The effect of the task variables on the students’ responses is also discussed here. First published December 2020 at Statistics Education Research Journal: Archives
- Published
- 2020
8. Autonomous Brownian gyrators: A study on gyrating characteristics
- Author
-
Hsin Chang, Chi Lun Lee, Yung-Fu Chen, and Pik Yin Lai
- Subjects
Physics, Statistical Mechanics (cond-mat.stat-mech), Isotropy, Harmonic (mathematics), Stationary point, Equiprobability, Maxima and minima, Classical mechanics, Flow (mathematics), Quartic function, Brownian motion - Abstract
We study the nonequilibrium steady-state (NESS) dynamics of two-dimensional Brownian gyrators under harmonic and nonharmonic potentials via computer simulations and analyses based on the Fokker-Planck equation; our nonharmonic cases feature a double-well potential and an isotropic quartic potential. In particular, we report two simple methods that can help understand gyrating patterns. For harmonic potentials, we use the Fokker-Planck equation to survey the NESS dynamical characteristics, i.e., the NESS currents gyrate along the equiprobability contours and the stationary point of flow coincides with the potential minimum. In contrast, the NESS results for our nonharmonic potentials show that these properties are largely absent, as the gyrating patterns are quite distinct from those of the corresponding probability distributions. Furthermore, we observe a critical case of the double-well potential, where the harmonic contribution to the gyrating pattern becomes absent and the NESS currents do not circulate about the equiprobability contours near the potential minima, even at low temperatures.
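The gyrating NESS current can be illustrated with a minimal simulation of the standard two-temperature harmonic Brownian gyrator; the potential, parameters, and gyration indicator below are illustrative assumptions rather than the paper's setup.

```python
import numpy as np

# Minimal sketch (assumed parameters): overdamped Brownian gyrator with
# U(x, y) = 0.5*(x**2 + y**2) + u*x*y and bath temperatures Tx != Ty,
# integrated with an Euler-Maruyama scheme.
rng = np.random.default_rng(0)
u, Tx, Ty = 0.5, 1.0, 0.1
dt, n_steps = 1e-3, 200_000

x = y = 0.0
gyration = 0.0                    # accumulates x*dy - y*dx along the trajectory
for _ in range(n_steps):
    fx = -(x + u * y)             # -dU/dx
    fy = -(y + u * x)             # -dU/dy
    dx = fx * dt + np.sqrt(2 * Tx * dt) * rng.standard_normal()
    dy = fy * dt + np.sqrt(2 * Ty * dt) * rng.standard_normal()
    gyration += x * dy - y * dx
    x += dx
    y += dy

# A nonzero mean signals a circulating NESS current; it vanishes if Tx == Ty or u == 0.
print("mean gyration per unit time:", gyration / (n_steps * dt))
```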
- Published
- 2020
9. Pre-service Teachers’ Probabilistic Reasoning in Constructivist Classroom
- Author
-
Evans Kofi Hokor
- Subjects
Logical reasoning, Teaching method, Representativeness heuristic, Teacher education, Constructivist teaching methods, Equiprobability, Mathematics education, Belief bias, Psychology
Several studies have revealed that probability misconceptions are widespread among students, but activities for addressing those misconceptions have been lacking. This study designed activities that reflect real-life situations for addressing equiprobability bias, positive and negative recency effects, belief bias, and representativeness bias in the teaching of probability. Thirty-two pre-service teachers from one intact class were purposively sampled for the study. The instruments used for data collection were observation and a questionnaire. The study found a constructivist approach to teaching, with critical questions asked by the teacher, to be vital in addressing misconceptions. The findings suggest that teacher educators should use a constructivist approach that targets probabilistic misconceptions when training teachers.
- Published
- 2020
10. Algumas reflexões sobre a definição de probabilidade
- Author
-
Fernando Montanaro Paiva de Almeida, André Gustavo Campos Pereira, Igor Bruno Dantas Nunes, Francisco Erivan de Almeida Júnior, Gleydson Medeiros de Souza, George Luiz Coelho Cortes, Arthur Henrique da Silva, and George Homer Barbosa de Medeiros
- Subjects
Mathematics education from other disciplines, Comprehension, Outcome (probability), Test (assessment), Epistemology, Equiprobability, Textbooks, Situations, Psychology - Abstract
Every day we face situations in which decisions have to be made. Some are very simple, e.g., whether or not to have an ice cream; if you decide to have one, you must choose between a plastic bowl and a cone, and you still have to pick the flavor(s). Sometimes we have preferences among the options presented, and sometimes all options seem the same. Even in simple situations partiality is always present, so why does the teaching of probability (in middle and high school) focus on indifference (equiprobability)? In this work we observed, not only through the analysis of textbooks and master's dissertations but also through the analysis of the results of a test answered by high-school and undergraduate students, that the established definition of probability is one that forces us to accept equiprobability as the only way to deal with random events, and that very little has been done to change this picture. We also note that the way some textbooks illustrate the subject can hinder understanding even when equiprobability is considered.
- Published
- 2020
11. JAPANESE AND THAI SENIOR HIGH SCHOOL MATHEMATICS TEACHERS’ KNOWLEDGE OF VARIABILITY
- Author
-
Orlando Rafael González González, Masami Isoda, and Somchai Chitmun
- Subjects
Statistics and Probability, Estimation, Equiprobability, Research literature, Mathematics education, Survey research, Statistical literacy, Mathematics instruction, Statistics education, Education
In this article, the conceptions of variability held by samples of Japanese and Thai senior high school mathematics teachers were identified, based on the framework proposed by Shaughnessy (2007), using a comparative survey study. From contrasting the results of the two groups, relative tendencies of insufficient statistical knowledge for variability were found in both samples, such as a tendency of Japanese teachers to overgeneralize equiprobability, whereas Thai teachers tended to overgeneralize estimation. Based on these findings, the use of well-known tasks from the research literature for this comparative study seems useful to clarify the relative tendencies and insufficiencies in teacher knowledge and conceptions regarding variability held by both groups. First published November 2018 at Statistics Education Research Journal Archives
- Published
- 2018
12. Korean Preservice Elementary Teachers’ Abilities to Identify Equiprobability Bias and Teaching Strategies
- Author
-
Eun-Jung Lee and Mimi Park
- Subjects
General Mathematics, Problem context, Science education, Teacher education, Education, Equiprobability, School teachers, Mathematics education, Psychology
Equiprobability bias (EB) is one of the frequently observed misconceptions in probability education in K-12 and can be affected by a problem context. As future teachers, preservice teachers need to have a stable understanding of probability and to have the knowledge to identify EB in their students regardless of the problem context. However, there are few studies to explore how preservice teachers identify students’ EB and how they respond to students’ EB. This study investigated Korean preservice elementary school teachers’ abilities to identify students’ EB in two problem contexts, marble and baseball problems, as well as their teaching strategies for correcting students’ EB within each problem. Ninety-six preservice elementary school teachers participated in this study. They were presented with two problems with students having EB and were asked to write lesson plays. From the analysis of their lesson plays, it was found that 87% of the preservice teachers identified students’ EB in both problems, and in the baseball problem, 13% of them did not. Three teaching strategies for correcting students’ EB in each problem were found. Based on the results, implications for preservice elementary teacher education were discussed.
- Published
- 2018
13. Cross‐polarisation discrimination models assessment and improvement on earth‐space propagation paths at Ka and V‐bands
- Author
-
Flavio Jorge, Carlo Riva, and Armando Rocha
- Subjects
Physics, Cumulative distribution function, Attenuation, Spectral efficiency, Interference (wave propagation), Computational physics, Equiprobability, Communications satellite, Ka band, Electrical and Electronic Engineering, V band
The performance of the satellite communication systems employing polarisation diversity or frequency-reuse schemes to improve the spectral efficiency is degraded due to the depolarisation-induced interference originated by raindrops and ice particles present along the Earth-Space propagation path. Two models account for both rain and ice contributions. One predicts the long-term cumulative distribution function (CDF) of cross-polarisation discrimination (XPD). The other predicts the relationship between XPD and co-polar attenuation (CPA) and it was derived considering exclusively data at the V-band. In this study, the former model is improved by considering the individual ice and rain contributions and their combined effects, while the latter is validated against new measurements at the Ka band. New models are then proposed for the XPD-CPA relationships at the Ka-band taking into account both rain and ice contributions and also their combined effects. Finally, the predictions provided by the first model are usually converted to the corresponding XPD-CPA relationship using the long-term first-order statistics of rain attenuation, (incorrectly) considering that the equiprobability hypothesis applies. A new approach for the conversion of the XPD CDF into the corresponding XPD-CPA relationship is presented.
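The equiprobability conversion discussed at the end of the abstract can be sketched generically: pair the two long-term statistics at equal time percentages. The code below shows only that statistical step, with made-up placeholder data rather than the beacon measurements or the paper's improved models.

```python
import numpy as np

def equiprobability_pairing(xpd_samples, cpa_samples, probs=None):
    """Pair the co-polar attenuation (CPA) exceeded for a fraction p of the
    time with the XPD not exceeded for the same fraction p. Generic
    equiprobability step only; the paper's rain/ice models are not reproduced."""
    if probs is None:
        probs = np.logspace(-3, -1, 20)          # 0.1% ... 10% of the time
    cpa_p = np.quantile(cpa_samples, 1 - probs)  # attenuation exceeded for p
    xpd_p = np.quantile(xpd_samples, probs)      # XPD not exceeded for p
    return probs, cpa_p, xpd_p

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    cpa = rng.gamma(shape=2.0, scale=1.5, size=100_000)             # placeholder CPA, dB
    xpd = 40 - 12 * np.log10(1 + cpa) + rng.normal(0, 1, cpa.size)  # placeholder XPD, dB
    for p, c, x in zip(*equiprobability_pairing(xpd, cpa)):
        print(f"p = {p:.4f}   CPA = {c:5.2f} dB   XPD = {x:5.2f} dB")
```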
- Published
- 2018
14. The meaning of justified subjectivism and its role in the reconciliation of recent disagreements over forensic probabilism
- Author
-
Franco Taroni, Colin Aitken, Silvia Bozza, and Alex Biedermann
- Subjects
Evaluative precepts, Probability assignment, Assertion, Subjective probability, Justification, Subjective expected utility, Probabilism, Imprecise probability, Pathology and Forensic Medicine, Epistemology, Constraint (information theory), Equiprobability, Subjectivism, Meaning (existential), Psychology, Social psychology
In this paper we reply to recent comments in this Special Issue according to which subjective probability is not considered to be a concept fit for use in forensic evaluation and expert reporting. We identify the source of these criticisms to lie in a misunderstanding of subjective probability as unconstrained subjective probability; a lack of constraint that neither corresponds to the way in which we referred to subjective probability in our previous contributions, nor to the way in which probability assignment is understood by current evaluative guidelines (e.g., of ENFSI). Specifically, we explain that we understand subjective probability as a justified assertion, i.e. a conditional assessment based on task-relevant data and information, that may be thought of as a constrained subjective probability. This leads us to emphasise again the general conclusion that there is no gap between justified (or, reasonable) subjective probability and other concepts of probability in terms of its ability to provide assessments that are soundly based on whatever relevant information available. We also note that the challenges an expert faces in reporting probabilities apply equally to all interpretations of probability, not only to subjective probability.
- Published
- 2017
15. Gravity modulates behaviour control strategy
- Author
-
Elisa Raffaella Ferrè, Iqra Arshad, and Maria Gallagher
- Subjects
Adult, Male, Supine position, Posture, Affect (psychology), Choice Behavior, Equiprobability, Thinking, Young Adult, Orientation (mental), Humans, Randomness, Balance (ability), Vestibular system, General Neuroscience, Cognition, Exploratory Behavior, Female, Vestibule, Labyrinth, Psychology, Cognitive psychology, Gravitation
Human behaviour is a trade-off between exploitation of familiar resources and exploration of new ones. In a challenging environment—such as outer space—making the correct decision is vital. On Earth, gravity is always there, and is an important reference for behaviour. Thus, altered gravitational signals may affect behaviour control strategies. Here, we investigated whether changing the body’s orientation to the gravitational vector would modulate the balance between routine and novel behaviour. Participants completed a random number generation task while upright or supine. We found decreased randomness when participants were supine. In particular, the degree of equiprobability of pairs of consecutive responses was reduced in the supine orientation. Online gravitational signals may shape the balance between exploitation and exploration, in favour of more stereotyped and routine responses.
- Published
- 2018
16. The Duhem-Quine problem for equiprobable conjuncts
- Author
-
Vikram Singh Sirola and Abhishek Kashyap
- Subjects
History, Computer science, Bayesian probability, Quine, Conjunction (grammar), Epistemology, Equiprobability, History and Philosophy of Science, Prior probability, Construal level theory, Holism
In this paper, we distinguish Quine’s thesis of holism from the related Duhem-Quine problem. We discuss the construal of holism which claims that the effect of falsification is felt on a conjunction of hypotheses. The Duhem-Quine problem claims that there is no principled way of knowing how falsification affects individual conjuncts. This latter claim relies on holism and an additional commitment to the hypothetico-deductive model of theory confirmation such that it need not arise in non-deductive accounts. While existing personalist Bayesian treatments of the problem make this point by assuming values of priors for the conjuncts, we arrive at the same conclusion without invoking such assumptions. Our discussion focuses on the falsification of equiprobable conjuncts and highlights the role played by their alternatives in ascertaining their relative disconfirmation. The equiprobability of conjuncts is discussed alongside a historical case study
- Published
- 2018
17. Prospective Teachers’ Probabilistic Reasoning in the Context of Sampling
- Author
-
José Miguel Contreras, Carmen Díaz, Juan Jesús Ortiz, and Emilse Gómez-Torres
- Subjects
Equiprobability, Sample size determination, Proportional reasoning, Statistics, Population, Probabilistic logic, Primary education, Population proportion, Psychology, Representativeness heuristic
In this paper, we analyse the knowledge of sampling in 157 prospective primary school teachers in Spain. Using two different tasks, and taking into account common and horizon content knowledge (described in the model proposed by Ball et al. in J Teacher Educ 59:389–407, 2008), we assess the teachers’ understanding of the following concepts: population and sample, frequency, proportion, estimation, variability of estimates, and the effect of sample size on this variability. Our results suggest that these prospective teachers have correct intuitions when estimating the sample proportion when the population proportion is known. However, they tend to confuse samples and populations, sometimes fail to apply proportional reasoning, misinterpret unpredictability, and show the representativeness heuristic and the equiprobability bias.
- Published
- 2018
18. Using and Interpreting the Probability Calculus
- Author
-
Matthew D. Lund
- Subjects
Equiprobability, Calculus, Probability calculus, Formal system, Mathematics
We have been discussing some of the fundamental features of the classical calculus of probability. The equiprobability of rival events was seen to be a major assumption of the calculus. Moreover, it is an assumption which the pure mathematician need not bother to justify. He need only present his formal system as follows
- Published
- 2018
19. An extended TODIM method under probabilistic dual hesitant fuzzy information and its application on enterprise strategic assessment
- Author
-
Zhiliang Ren, Zeshui Xu, and Hai Wang
- Subjects
Mathematical optimization, Fuzzy set, Probabilistic logic, Score, Fuzzy logic, Equiprobability, Decision-making, Uncertainty quantification, Axiom
In this paper, an extended TODIM method for the probabilistic dual hesitant fuzzy environment is proposed, based on a revised score function and an equiprobability distance measure. The TODIM method can deal with multi-criteria decision-making problems while considering the decision makers' psychological behavior. The probabilistic dual hesitant fuzzy set (PDHFS) is a useful tool for handling uncertainty in the decision-making process because it can describe aleatory and epistemic uncertainty simultaneously within a single framework. A revised score function of the probabilistic dual hesitant fuzzy element (PDHFE) is proposed to distinguish different probabilistic dual hesitant fuzzy information. In addition, we give an axiomatic definition of the distance measure for PDHFEs and propose an equiprobability distance measure, which matches intuition more closely. Finally, we develop a new TODIM method and use a numerical case on enterprise strategic assessment to show its effectiveness and applicability.
- Published
- 2017
20. Quantifying Changes in Reconnaissance Drought Index using Equiprobability Transformation Function
- Author
-
S. Zahra Samadi, Abolfazl Mosaedi, Mohammad Ghabaei Sough, and Hamid Zare Abyaneh
- Subjects
Equiprobability, Transformation (function), Goodness of fit, Threshold limit value, Evapotranspiration, Log-normal distribution, Statistics, Econometrics, Probability distribution, Probability density function, Water Science and Technology, Civil and Structural Engineering, Mathematics
The Reconnaissance Drought Index (RDI) is obtained by fitting a lognormal probability density function (PDF) to the ratio of accumulated precipitation to potential evapotranspiration (αk) at different time scales. This paper addresses the questions of whether a probability distribution other than the lognormal may fit the αk values better, and how RDI values may change at shorter (3-month and 6-month) and longer (9-month and annual) time scales during the 1960–2010 period over various climate conditions (arid, semi-arid, and humid) in Iran. For this purpose, the RDI series were initially computed by fitting a lognormal PDF to the αk values, and the Kolmogorov–Smirnov (K-S) test was used to choose the best probability function in window sizes from 3 to 12 months. The corresponding RDI values for the best distribution were then derived based on an equiprobability transformation function. The differences between the RDI values of the lognormal distribution (RDIlog) and of the best distribution (RDIApp) were compared using the Nash-Sutcliffe efficiency (NSE) criterion. The K-S goodness-of-fit results showed that the lognormal distribution could not be rejected at the 0.01 and 0.05 significance levels; it was rejected only for a short-term (Apr.-Jun.) period at the humid station (Rasht) and for the three-month (Oct.-Dec. and Apr.-Jun.) and six-month (Apr.-Sep.) periods at the semi-arid station (Shiraz) at significance levels of 0.10 and 0.20, respectively. Furthermore, the comparison of RDIlog and RDIApp showed that RDI values may change when the best-fitting distribution is employed, which may lead to significant discrepancies and/or displacement of drought severity classes in the RDI estimation.
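Assuming the equiprobability transformation referred to here is the standard one for standardized drought indices (map the fitted CDF of αk through the inverse standard normal), a minimal sketch with synthetic data would look as follows; the distribution choice and data are placeholders, not the study's records.

```python
import numpy as np
from scipy import stats

def rdi_standardized(alpha_k, dist=stats.lognorm):
    """Standardized RDI via the equiprobability transformation: fit `dist` to
    alpha_k = sum(P)/sum(PET), then map the fitted CDF through the inverse
    standard normal so the index is approximately N(0, 1)."""
    params = dist.fit(alpha_k, floc=0)              # lognormal fit, location fixed at 0
    cdf_vals = dist.cdf(alpha_k, *params)
    cdf_vals = np.clip(cdf_vals, 1e-6, 1 - 1e-6)    # avoid +/- infinity at the tails
    return stats.norm.ppf(cdf_vals)

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    alpha = rng.lognormal(mean=-0.5, sigma=0.4, size=50)   # synthetic alpha_k series
    print(np.round(rdi_standardized(alpha), 2))
```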
- Published
- 2015
21. Minimum yield principle under incomplete prediction of financial markets
- Author
-
G. A. Agasandyan
- Subjects
Equiprobability, Microeconomics, Yield (finance), Financial market, Economics, Econometrics, Portfolio, Function (mathematics), Investment (macroeconomics), Beta distribution, Exponential function
The work investigates the properties of the solutions derived from the minimum yield principle in problems of constructing a portfolio that is optimal under the continuous VaR criterion (CC-VaR) for an investor with a partial market forecast and an individual risk-preference function. Fundamental theoretical results are presented and illustrated by examples of two-sided exponential, equiprobability, and beta distributions, both for the underlying's price and for the market forecast.
- Published
- 2017
22. Exact Algorithms for the Multinomial Extremes: Maximum, Minimum, Range and Sums
- Author
-
Anton Ogay, Marco Bonetti, and Pasquale Cirillo
- Subjects
Equiprobability, Exact algorithm, Distribution (mathematics), Multivariate random variable, Order statistic, Range (statistics), Multinomial distribution, Poisson distribution, Algorithm, Mathematics
Starting from a neglected work by Rappeport (1968), we re-propose an exact algorithm to compute the distribution of the maximum of a multinomial random vector under the hypothesis of equiprobability. We then show how to compute the exact probabilities of the sum of the J largest order statistics of the vector, following the suggestions and correcting the errors in the same article. Finally, we introduce brand new ways of computing the exact probabilities of the multinomial minimum and of the multinomial range. The exact probabilities we derive can be used in all those situations in which the multinomial distribution plays an important role, from goodness-of-fit tests to the study of Poisson processes, with applications spanning from biostatistics to finance. For all algorithms, we provide Matlab codes and ready-to-use tables of critical values.
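The paper's algorithms and Matlab codes are not reproduced here, but for the equiprobable case the probability that the maximum multinomial count does not exceed m has a compact exact expression, n!·[x^n](Σ_{j≤m} x^j/j!)^k / k^n, which the sketch below evaluates with exact rational arithmetic.

```python
from fractions import Fraction
from math import factorial

def prob_max_at_most(n, k, m):
    """Exact P(max count <= m) for n trials over k equiprobable categories,
    computed as n! * [x^n] (sum_{j<=m} x^j / j!)^k / k^n."""
    base = [Fraction(1, factorial(j)) for j in range(min(m, n) + 1)]
    poly = [Fraction(1)]                          # polynomial, index = power of x
    for _ in range(k):                            # multiply the truncated series k times
        new = [Fraction(0)] * min(len(poly) + len(base) - 1, n + 1)
        for i, a in enumerate(poly):
            for j, b in enumerate(base):
                if i + j <= n:
                    new[i + j] += a * b
        poly = new
    coeff = poly[n] if n < len(poly) else Fraction(0)
    return coeff * factorial(n) / Fraction(k) ** n

def prob_max_equals(n, k, m):
    """Exact P(max count == m) by differencing the cumulative probabilities."""
    return prob_max_at_most(n, k, m) - prob_max_at_most(n, k, m - 1)

if __name__ == "__main__":
    # Example: 10 balls into 5 equiprobable boxes.
    for m in range(2, 11):
        print(m, float(prob_max_equals(10, 5, m)))
```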
- Published
- 2017
23. TEACHING PROBABILITY WITH THE SUPPORT OF THE R STATISTICAL SOFTWARE
- Author
-
Monica Karrer, Verônica Yumi Kataoka, and Robson Dos Santos Ferreira
- Subjects
Statistics and Probability, Theoretical computer science, Computer science, Teaching method, Probabilistic logic, Context (language use), Tree diagram, Literacy, Education, Equiprobability, Task analysis, Constructionism, Mathematics education
The objective of this paper is to discuss aspects of high school students’ learning of probability in a context where they are supported by the statistical software R. We report on the application of a teaching experiment, constructed using the perspective of Gal’s probabilistic literacy and Papert’s constructionism. The results show improvement in students’ learning of basic concepts, such as: random experiment, estimation of probabilities, and calculation of probabilities using a tree diagram. The use of R allowed students to extend their reasoning beyond that developed from paper-and-pencil approaches, since it made it possible for them to work with a larger number of simulations, and go beyond the standard equiprobability assumption in coin tosses. First published November 2014 at Statistics Education Research Journal Archives
- Published
- 2014
24. The Real ‘Letter to Arbuthnot’? a Motive For Hume's Probability Theory in an Early Modern Design Argument
- Author
-
Catherine Kemp
- Subjects
Value (ethics), Game of chance, Equiprobability, Philosophy, Extension (metaphysics), Probability theory, Teleological argument, Epistemology
John Arbuthnot's celebrated but flawed paper in the Philosophical Transactions of 1711–12 is a philosophically and historically plausible target of Hume's probability theory. Arbuthnot argues for providential design rather than chance as a cause of the annual birth ratio, and the paper was championed as a successful extension of the new calculations of the value of wagers in games of chance to wagers about natural and social phenomena. Arbuthnot replaces the earlier anti-Epicurean notion of chance with the equiprobability assumption of Huygens's mathematics of games of chance, and misrepresents the birth ratio data to rule out chance in favour of design. The probability sections of Hume's Treatise taken together correct the equiprobability assumption and its extension to other kinds of phenomena in the estimation of wagers or expectations about particular events. Hume's probability theory demonstrates the flaw in this version of the design argument.
- Published
- 2014
25. The Effect of Activity-Based Teaching on Remedying the Probability-Related Misconceptions: A Cross-Age Comparison
- Author
-
Emrullah Erdem, Selçuk Fırat, and Ramazan Gürbüz
- Subjects
Equiprobability, Intervention (counseling), Education, Significant difference, Mathematics education, Experimental instructions, Representativeness heuristic, Developmental psychology
The aim of this paper is to compare the effect of activity-based teaching on remedying probability-related misconceptions of students at different grade levels. A cross-sectional (cross-age) study was conducted with a total of 74 students in grades 6-8. Experimental instruction was given to all the groups three times per week, 40 minutes per session, for 2 weeks. Students’ progress was examined by pre-test and post-test measurements. The analysis showed that, as a result of the intervention, all grades’ post-test scores for all the concepts (PC: Probability Comparison, E: Equiprobability, and R: Representativeness) increased significantly compared to their pre-test scores. This increase did not differ significantly by age for the PC concept, but for the E and R concepts 8th-grade students improved significantly more than 6th graders, while the differences between 7th and 8th graders for E and R were not significant. In summary, the intervention appears to have different effects depending on age and on the concept.
- Published
- 2014
26. A consistent set of infinite-order probabilities
- Author
-
David Atkinson, Jeanne Peijnenburg, Faculty of Philosophy, Theoretical Philosophy, and High-Energy Frontier
- Subjects
Higher-order probability, Chain rule (probability), Infinite regress, Applied Mathematics, Law of total probability, Conditional probability, Symmetric probability distribution, Tree diagram, Theoretical Computer Science, Combinatorics, Equiprobability, Regular conditional probability, Artificial Intelligence, Probability distribution, Software, Model, Mathematics
Some philosophers have claimed that it is meaningless or paradoxical to consider the probability of a probability. Others have however argued that second-order probabilities do not pose any particular problem. We side with the latter group. On condition that the relevant distinctions are taken into account, second-order probabilities can be shown to be perfectly consistent. May the same be said of an infinite hierarchy of higher-order probabilities? Is it consistent to speak of a probability of a probability, and of a probability of a probability of a probability, and so on, ad infinitum? We argue that it is, for it can be shown that there exists an infinite system of probabilities that has a model. In particular, we define a regress of higher-order probabilities that leads to a convergent series which determines an infinite-order probability value. We demonstrate the consistency of the regress by constructing a model based on coin-making machines. We show that an infinite hierarchy of probabilities of probabilities is consistent. The proof consists in a model involving coin-making machines. Weak conditions are given for the convergence of the infinite system.
- Published
- 2013
27. Contractarian ethics and Harsanyi’s two justifications of utilitarianism
- Author
-
Michael Moehler
- Subjects
Economics and Econometrics, Sociology and Political Science, Welfare economics, Rationality, Rational agent, Contractualism, Equiprobability, Philosophy, Meaning (philosophy of language), Original position, Utilitarianism, Economics, Mathematical economics, Axiom
Harsanyi defends utilitarianism by means of an axiomatic proof and by what he calls the ‘equiprobability model’. Both justifications of utilitarianism aim to show that utilitarian ethics can be derived from Bayesian rationality and some weak moral constraints on the reasoning of rational agents. I argue that, from the perspective of Bayesian agents, one of these constraints, the impersonality constraint, is not weak at all if its meaning is made precise and that generally it even contradicts individual rational agency. Without the impersonality constraint, Harsanyi’s two justifications of utilitarianism on the grounds of Bayesian rationality fail. As an alternative, I develop a contractarian framework that is compatible with individual rational agency and Harsanyi’s central assumptions, and that allows the derivation of moral conclusions on the grounds of Bayesian rationality. The developed framework offers a novel justification of contractarian ethics and may best be described as a combined version of Harsanyi’s equiprobability model and Rawls’s original position.
- Published
- 2013
28. Maximal entropy random walk in community detection
- Author
-
Zdzislaw Burda and Jeremi K. Ochab
- Subjects
Equiprobability, Heterogeneous random walk in one dimension, Maximal entropy, Computer science, Loop-erased random walk, Stochastic matrix, General Physics and Astronomy, Entropy (information theory), General Materials Science, Physical and Theoretical Chemistry, Complex network, Random walk, Algorithm
The aim of this paper is to check the feasibility of using the maximal-entropy random walk in algorithms for finding communities in complex networks. A number of such algorithms exploit an ordinary or a biased random walk for this purpose. Their key part is a (dis)similarity matrix, according to which nodes are grouped. This study encompasses the use of a stochastic matrix of a random walk, its mean first-passage time matrix, and a matrix of weighted path counts. We briefly indicate the connection between those quantities and propose substituting the maximal-entropy random walk for the previously chosen models. This unique random walk maximises the entropy of ensembles of paths of given length and endpoints, which results in equiprobability of those paths. We compare the performance of the selected algorithms on LFR benchmark graphs. The results show that the change in performance depends very strongly on the particular algorithm, and can lead to slight improvements as well as to significant deterioration.
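For concreteness, the maximal-entropy random walk itself is easy to construct from a graph's adjacency matrix via its leading eigenpair; the sketch below builds that stochastic matrix and its stationary distribution for a small example graph (the community-detection pipelines compared in the paper are not reproduced).

```python
import numpy as np

def merw_transition_matrix(A):
    """Maximal-entropy random walk on an undirected graph with adjacency A:
    S[i, j] = A[i, j] * psi[j] / (lam * psi[i]),
    where (lam, psi) is the leading eigenpair of A. Under S, all paths of a
    given length between fixed endpoints are equiprobable."""
    A = np.asarray(A, dtype=float)
    eigvals, eigvecs = np.linalg.eigh(A)          # symmetric adjacency matrix
    lam = eigvals[-1]
    psi = np.abs(eigvecs[:, -1])                  # Perron eigenvector (positive)
    S = A * psi[np.newaxis, :] / (lam * psi[:, np.newaxis])
    return S, psi**2 / np.sum(psi**2)             # transition matrix, stationary dist.

if __name__ == "__main__":
    # Small example graph: a 4-cycle with one chord.
    A = np.array([[0, 1, 0, 1],
                  [1, 0, 1, 1],
                  [0, 1, 0, 1],
                  [1, 1, 1, 0]])
    S, pi = merw_transition_matrix(A)
    print(np.round(S, 3))
    print("rows sum to 1:", np.allclose(S.sum(axis=1), 1))
    print("stationary distribution:", np.round(pi, 3))
```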
- Published
- 2013
29. Modelling Information by Probabilities
- Author
-
Carmen Batanero and Manfred Borovcnik
- Subjects
Equiprobability, Interpretation (logic), Computer science, Frequentist inference, Conditional probability, Experimental data, Mathematical economics, Frequency, Independence (probability theory), Central limit theorem
Probability embraces a cluster of ideas that help us to make predictions and judgements by modelling random situations suitably. Ideas such as experimental data, weight of uncertainty, and equiprobability contribute towards the concept of probability. The concept of independence is a basic prerequisite for the frequentist interpretation, whilst conditional probabilities are essential to adapt personal weights in view of new information.
- Published
- 2016
30. Asymptotic equidistribution of congruence classes with respect to the convolution iterates of a probability vector
- Author
-
Gilles Gnacadja
- Subjects
Statistics and Probability, Discrete mathematics, Doubly stochastic matrix, Combinatorics, Equiprobability, Integer, Iterated function, Congruence (manifolds), Statistics, Probability and Uncertainty, Circulant matrix, Probability vector, Convolution, Mathematics
Consider a positive integer d and a positive probability vector f over the numbers 0, …, l. The n-fold convolution f∗n of f is a probability vector over the numbers 0, …, nl, and these can be partitioned into congruence classes modulo d. The main result of this paper is that, asymptotically in n, these d congruence classes have equiprobability 1/d. In the motivating application, one has N containers of capacity d and repeatedly retrieves one item from each of M randomly selected containers (0 < M < N); containers are replenished to full capacity when emptied. The result implies that, over the long term, the number of containers requiring replenishment is M/d. This finding is relevant wherever one would be interested in the steady-state pace of replenishing fixed-capacity containers.
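The main result is easy to probe numerically: convolve a probability vector with itself and sum the mass falling in each residue class modulo d. The sketch below uses an arbitrary example vector and an assumed d purely for illustration.

```python
import numpy as np

def residue_class_masses(f, n, d):
    """Mass of the n-fold convolution f^{*n} in each congruence class mod d.
    f is a probability vector over 0..l; f^{*n} is supported on 0..n*l."""
    conv = np.array(f, dtype=float)
    for _ in range(n - 1):
        conv = np.convolve(conv, f)
    masses = np.zeros(d)
    for value, p in enumerate(conv):
        masses[value % d] += p
    return masses

if __name__ == "__main__":
    f = [0.5, 0.2, 0.3]          # arbitrary positive probability vector on {0, 1, 2}
    d = 4
    for n in (1, 5, 20, 80):
        print(n, np.round(residue_class_masses(f, n, d), 4))
    # The class masses approach the equiprobability 1/d = 0.25 as n grows.
```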
- Published
- 2012
31. The effect of computer-assisted teaching on remedying misconceptions: The case of the subject 'probability'
- Author
-
Ramazan Gürbüz, Osman Birgin, and Uşak Üniversitesi, Eğitim Fakültesi, Matematik ve Fen Bilimleri Eğitimi Bölümü
- Subjects
Secondary education, General Computer Science, Group study, Computer science, Interactive learning environments, Representativeness heuristic, Education, Equiprobability, Teaching/learning strategies, Intervention (counseling), Improving classroom teaching, Mathematics education, Research method
The aim of this study is to determine the effects of computer-assisted teaching (CAT) on remedying misconceptions students often have regarding some probability concepts in mathematics. Toward this aim, computer-assisted teaching materials were developed and used in the process of teaching. Within the true-experimental research method, a pre- and post-test control group study was carried out with 37 seventh-grade students: 18 in the experimental group (CAT) and 19 in the control group (traditional teaching). A 12-item instrument, made up of 4 items related to each of the concepts "Probability Comparisons (PC)," "Equiprobability (E)," and "Representativeness (R)," was developed and implemented with the participants. After the teaching intervention, the same instrument was again administered to both groups as a post-test. In light of the findings, it can be concluded that computer-assisted teaching was significantly more effective than traditional methods in terms of remedying students' misconceptions. Highlights: We try to remedy misconceptions regarding probability; we design two different sets of computer-assisted teaching (CAT) materials; we assume that using the materials together will reduce each other's disadvantages; CAT is more effective than traditional teaching in remedying misconceptions.
- Published
- 2012
32. Making heads or tails of probability: An experiment with random generators
- Author
-
Sylvie Serpell, Simon J. Handley, and Kinga Morsanyi
- Subjects
Education, Probabilistic logic, Sample (statistics), Cognition, Representativeness heuristic, Equiprobability, Developmental and Educational Psychology, Heuristics, Psychology, Social psychology, Randomness, Cognitive psychology
Background. The equiprobability bias is a tendency for individuals to think of probabilistic events as 'equiprobable' by nature, and to judge outcomes that occur with different probabilities as equally likely. The equiprobability bias has been repeatedly found to be related to formal education in statistics, and it is claimed to be based on a misunderstanding of the concept of randomness. Aims. The aim of the present study was to examine whether experimenting with random generators would decrease the equiprobability bias. Sample. The participants were 108 psychology students whose performance was measured either immediately after taking part in a training session (n = 55), or without doing any training exercises (n = 53). Method. The training session consisted of four activities. These included generating random sequences of events, and learning about the law of large numbers. Subsequently, the participants were tested on a series of equiprobability problems, and a number of other problems with similar structure and content. Results. The results indicated that the training successfully decreased the equiprobability bias. However, this effect was moderated by participants' cognitive ability (i.e., higher ability participants benefitted from the training more than participants with lower cognitive ability). Finally, the training session had the unexpected side effect of increasing students' susceptibility to the representativeness heuristic. Conclusions. Experimenting with random generators has a positive effect on students' general understanding of probability, but at the same time it might increase their susceptibility to certain biases (especially the representativeness heuristic). These findings have important implications for using training methods to improve probabilistic reasoning performance.
- Published
- 2012
33. Dissecting Perceptual Processes with a New Tri-Stable Reversible Figure
- Author
-
Gerald M. Long and Jared M. Batterman
- Subjects
Male, Volition, Optical illusions, Experimental and Cognitive Psychology, Small sample, Fixation, Ocular, Sensory Systems, Equiprobability, Ophthalmology, Discrimination, Psychological, Pattern Recognition, Visual, Artificial Intelligence, Orientation, Perception, Fixation (visual), Humans, Attention, Female, Cues, Percept, Psychology, Cognitive psychology
Five experiments are presented that examine observers' reports with a new tri-stable reversible figure using two measures of observers' experience with the figure: observers' initial percept upon figure presentation in the test period and the total number of reversals reported in the test period. Experiment 1 demonstrates the equiprobability of the three alternatives for the figure. Experiment 2 demonstrates the powerful effect of fixation location on observers' reported organization of the tri-stable figure. Experiment 3 demonstrates clear priming effects following brief presentation of particular components of the tri-stable figure. Experiment 4 demonstrates clear adaptation effects following prolonged presentation of the same components of the figure used in experiment 3 as well as the transient nature of this adaptation. Experiment 5 demonstrates observers' ability to “hold” each of the three percepts regardless of fixation location. The special sensitivity of the tri-stable figure to these manipulations even with naive subjects and small sample sizes is discussed, and the interplay of both bottom–up and top–down processes on figural reversal is emphasized.
- Published
- 2012
34. Perception of probabilities in situations of risk: A case based approach
- Author
-
Gabrielle Gayer
- Subjects
Economics and Econometrics, Law of total probability, Function (mathematics), Equiprobability, Perception, Mental process, Statistics, Similarity (psychology), Econometrics, Value (mathematics), Finance, Mathematics
This paper provides a description of a possible mental process individuals go through in their attempt to comprehend stated probabilities in simple lotteries. The evaluation of probabilities is based on the following main components: lotteries encountered in the past, the realizations of these lotteries, and the similarity between stated probabilities. A probability is evaluated based on the experienced relative frequencies of outcomes that had that stated probability, as well as outcomes of other lotteries that had similar stated probabilities. This process may result in distortion of probabilities as observed in the literature, and in particular, in overvaluing low probabilities and undervaluing high probabilities. If the decision maker uses a less permissive similarity function as the size of memory grows, she will learn the real value of the stated probabilities. If, however, the similarity function is independent of memory, biases persist even when data are accumulated.
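One way to make the described mechanism concrete is a similarity-weighted relative frequency over remembered lotteries; the Gaussian kernel and bandwidth in the sketch below are illustrative assumptions, not the paper's functional form.

```python
import numpy as np

def perceived_probability(stated_p, past_stated, past_outcomes, bandwidth=0.1):
    """Evaluate a stated probability from memory: a similarity-weighted
    relative frequency of past outcomes whose stated probabilities were
    close to `stated_p`. Kernel choice and bandwidth are illustrative."""
    past_stated = np.asarray(past_stated, dtype=float)
    past_outcomes = np.asarray(past_outcomes, dtype=float)   # 1 = event occurred
    sim = np.exp(-0.5 * ((past_stated - stated_p) / bandwidth) ** 2)
    return float(np.sum(sim * past_outcomes) / np.sum(sim))

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    stated = rng.uniform(0, 1, size=500)       # lotteries encountered in the past
    outcomes = rng.random(500) < stated        # each realized with its stated probability
    # Low stated probabilities tend to be overvalued and high ones undervalued,
    # consistent with the distortion discussed in the abstract.
    for p in (0.05, 0.5, 0.95):
        print(p, round(perceived_probability(p, stated, outcomes), 3))
```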
- Published
- 2010
35. An empirical approach to symmetry and probability
- Author
-
Jill North
- Subjects
Equiprobability, History, A priori probability, History and Philosophy of Science, Computer science, Fair coin, Probabilistic logic, General Physics and Astronomy, A priori and a posteriori, Probability and statistics, Statistical mechanics, Mathematical economics, Principle of indifference
We often rely on symmetries to infer outcomes’ probabilities, as when we infer that each side of a fair coin is equally likely to come up on a given toss. Why are these inferences successful? I argue against answering this question with an a priori indifference principle. Reasons to reject such a principle are familiar, yet instructive. They point to a new, empirical explanation for the success of our probabilistic predictions. This has implications for indifference reasoning generally. I argue that a priori symmetries need never constrain our probability attributions, even for initial credences.
- Published
- 2010
36. The effects and side-effects of statistics education: Psychology students’ (mis-)conceptions of probability
- Author
-
Caterina Primi, Kinga Morsanyi, Francesca Chiesi, and Simon J. Handley
- Subjects
Need for cognition, Equiprobability, Logical reasoning, Developmental and Educational Psychology, Contrast (statistics), Cognition, Statistics education, Psychology, Representativeness heuristic, Social psychology, Cognitive bias, Education, Cognitive psychology
In three studies we looked at two typical misconceptions of probability: the representativeness heuristic, and the equiprobability bias. The literature on statistics education predicts that some typical errors and biases (e.g., the equiprobability bias) increase with education, whereas others decrease. This is in contrast with reasoning theorists’ prediction who propose that education reduces misconceptions in general. They also predict that students with higher cognitive ability and higher need for cognition are less susceptible to biases. In Experiments 1 and 2 we found that the equiprobability bias increased with statistics education, and it was negatively correlated with students’ cognitive abilities. The representativeness heuristic was mostly unaffected by education, and it was also unrelated to cognitive abilities. In Experiment 3 we demonstrated through an instruction manipulation (by asking participants to think logically vs. rely on their intuitions) that the reason for these differences was that these biases originated in different cognitive processes.
- Published
- 2009
37. Lotteries, Justice and Probability
- Author
-
Peter Stone
- Subjects
Microeconomics, Equiprobability, Lottery, Sociology and Political Science, Economics, Allocative efficiency, Publicity, Law and economics, Intuition
Intuition suggests that a fair lottery is the appropriate way to allocate a scarce good when two or more people have equally strong claims to it. This article lays out three conditions that any conception of justice compatible with this intuition must satisfy — efficiency of outcomes, fairness of outcomes, and fairness of treatment. The third, unlike the first two, manifests itself only in the intentions of the allocative agent, not in the final allocation itself. For this reason, while justice generally requires publicity — requires, that is, that the justice of public practices be as visible as possible — for fairness of treatment publicity is indispensable. This fact has implications for defining a fair lottery. On most accounts, fair lotteries must be equiprobable. But while some theories of probability facilitate the connection between the equiprobability of fair lotteries and the contribution they can make to justice, others do not.
- Published
- 2009
38. Reconciling support theory and the book-making principle
- Author
-
Enrico Diecidue and Dolchai La-ornual
- Subjects
Economics and Econometrics, Support, Odds, Equiprobability, Consistency (negotiation), Accounting, Premise, Statistics, Set (psychology), Mathematical economics, Finance, Decision analysis, Event (probability theory), Mathematics
Support theory postulates that an individual’s probability judgment for a particular event depends on the description of that event. We analyze decisions based on such a premise and demonstrate the theory’s incompatibility with popular models of choice under uncertainty. In particular, we show how support theory’s subjective probabilities are at odds with multi-prior beliefs in addition to additive and nonadditive probabilities. We propose a behavioral relaxation of a well-known consistency argument—the book-making principle, in order to accommodate such description-dependent subjective probabilities. As a consequence, we provide a characterization of a set of decisions where the underlying probability judgments follow from support theory. This result offers a unique way for using description-dependent subjective probabilities as consistent inputs for decision analysis and can aid the design of elicitation procedures.
- Published
- 2009
39. Beliefs about what types of mechanisms produce random sequences
- Author
-
Deborah S. Blinder and Daniel M. Oppenheimer
- Subjects
Scrutiny, Sociology and Political Science, Strategy and Management, Stability (learning theory), General Decision Sciences, Outcome (probability), Equiprobability, Arts and Humanities (miscellaneous), Perception, Independence (mathematical logic), Construct (philosophy), Social psychology, Applied Psychology, Randomness, Cognitive psychology, Mathematics
Although many researchers use Wagenaar’s framework for understanding the factors that people use to determine whether a process is random, the framework has never undergone empirical scrutiny. This paper takes Wagenaar’s framework as a starting point and examines its three properties: independence of events, fixed alternatives, and equiprobability. We find strong evidence that independence of events is indeed used as a cue toward randomness. Equiprobability also affects randomness judgments, although it appears to play only a limited role. Fixedness of alternatives is a complex construct that consists of multiple sub-concepts; each of these sub-concepts influences randomness judgments, but they exert forces in different directions: stability of outcome ratios increases randomness judgments, while knowledge of outcome ratios decreases them. Future directions for the development of a functional framework for understanding perceptions of randomness are suggested. Copyright © 2008 John Wiley & Sons, Ltd.
- Published
- 2008
40. Boltzmann Kinetic Equation and Equiprobability Postulate
- Author
-
K. G. Folin
- Subjects
Equiprobability ,Physics ,Classical mechanics ,Boltzmann kinetic equation ,General Physics and Astronomy ,Boltzmann equation - Published
- 2007
41. Modeling of rain attenuation using equiprobability of rain fade and rain fall
- Author
-
Anurag Vidyarthi, B. S. Jassal, Swati Garg, and R. Gowri
- Subjects
Equiprobability ,Meteorology ,Attenuation ,Rain fall ,Rain fade ,Environmental science ,Satellite ,Time series ,Signal ,Data modeling - Abstract
The proposed work models rain attenuation from the rain-rate versus rain-attenuation characteristics obtained through equiprobability matching. Three years of signal and rain data are used: received signal strength from the IPSTAR satellite at 20.2 GHz and rainfall measurements for the corresponding period, recorded at the Delhi earth station and provided by SAC, Ahmedabad. Two years of data are used to develop the rain attenuation model and one year is used for validation. The results are useful for predicting short-term signal behavior and for rain-fade mitigation techniques that support uplink power control.
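As a rough illustration of the equiprobability (equal-exceedance-probability) matching idea described above, the sketch below pairs the rain-rate and attenuation values that are exceeded with the same probability and fits a power law to the pairs. The synthetic data, the chosen probabilities, and the power-law form A = a·R^b are assumptions made for illustration, not details taken from the paper.

```python
import numpy as np

# Equiprobability matching (sketch): pair the rain-rate and attenuation values
# that are exceeded with the same probability, then fit A = a * R**b.
# rain_rate_mm_h and attenuation_db stand in for the measured time series;
# the synthetic data below are placeholders.
rng = np.random.default_rng(0)
rain_rate_mm_h = rng.gamma(shape=0.5, scale=20.0, size=50_000)
attenuation_db = 0.1 * rain_rate_mm_h**1.1 + rng.normal(0, 0.2, size=50_000)

# Exceedance probabilities of interest (1% of the time down to 0.01%).
p_exceed = np.array([1e-2, 3e-3, 1e-3, 3e-4, 1e-4])
q = 100 * (1 - p_exceed)                      # matching percentiles
R = np.percentile(rain_rate_mm_h, q)          # rain rate exceeded with prob p
A = np.percentile(attenuation_db, q)          # attenuation exceeded with prob p

# Fit log A = log a + b log R by least squares.
b, log_a = np.polyfit(np.log(R), np.log(A), 1)
print(f"A ~ {np.exp(log_a):.3f} * R^{b:.2f}")
```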
- Published
- 2015
42. Statistical Complexity. Applications in Electronic Systems
- Author
-
Jaime Sañudo and Ricardo López-Ruiz
- Subjects
Equiprobability ,Set (abstract data type) ,Distribution (number theory) ,Computer science ,Disequilibrium ,medicine ,Applied mathematics ,Function (mathematics) ,Statistical physics ,Type (model theory) ,medicine.symptom ,Measure (mathematics) ,Quantum - Abstract
In this review, a statistical measure of complexity is introduced and its properties are discussed. This measure is based on the interplay between the Shannon information, or a function of it, and the separation of the set of accessible states of a system from the equiprobability distribution, i.e. the disequilibrium. Different applications to quantum systems are shown, from prototypical systems such as the H-atom to others such as the periodic table, metal clusters, crystalline bands, or traveling densities. In all of them, this type of statistical indicator shows interesting behavior, able to discern and highlight conformational properties of those systems.
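A minimal sketch of an LMC-type complexity of the kind described above: the product of a normalized Shannon entropy and the disequilibrium, taken here as the squared distance of the distribution from equiprobability. The discrete form and normalization below are common choices assumed for illustration and need not match the review exactly.

```python
import numpy as np

def statistical_complexity(p):
    """LMC-type complexity C = H * D for a discrete distribution p:
    H is the Shannon entropy normalized to [0, 1], and D is the
    disequilibrium, the squared distance of p from the uniform
    (equiprobability) distribution."""
    p = np.asarray(p, dtype=float)
    p = p / p.sum()
    n = p.size
    nz = p[p > 0]
    H = np.sum(nz * np.log(1.0 / nz)) / np.log(n)  # normalized Shannon entropy
    D = np.sum((p - 1.0 / n) ** 2)                 # distance from equiprobability
    return H * D

print(statistical_complexity([0.25, 0.25, 0.25, 0.25]))  # 0: H = 1 but D = 0
print(statistical_complexity([1.0, 0.0, 0.0, 0.0]))      # 0: D > 0 but H = 0
print(statistical_complexity([0.5, 0.3, 0.1, 0.1]))      # intermediate, > 0
```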
- Published
- 2015
43. Statistical mechanics of two-dimensional foams: Physical foundations of the model
- Author
-
Marc Durand
- Subjects
Models, Molecular ,Molecular Conformation ,Biophysics ,Complex system ,FOS: Physical sciences ,Condensed Matter - Soft Condensed Matter ,Curvature ,Equiprobability ,Range (statistics) ,General Materials Science ,Statistical physics ,Condensed Matter - Statistical Mechanics ,Mechanical Phenomena ,Physics ,Conservation law ,Series (mathematics) ,Statistical Mechanics (cond-mat.stat-mech) ,Temperature ,Statistical model ,Surfaces and Interfaces ,General Chemistry ,Statistical mechanics ,Disordered Systems and Neural Networks (cond-mat.dis-nn) ,Condensed Matter - Disordered Systems and Neural Networks ,Models, Theoretical ,Soft Condensed Matter (cond-mat.soft) ,Biotechnology - Abstract
In a recent series of papers [1-3], a statistical model that accounts for correlations between topological and geometrical properties of a two-dimensional shuffled foam has been proposed and compared with experimental and numerical data. Here, the various assumptions on which the model is based are presented and justified: the equiprobability hypothesis for the foam configurations is argued for, the range of correlations between bubbles is discussed, and the mean-field approximation used in the model is detailed. The two self-consistency equations associated with this mean-field description can be interpreted as conservation laws for the number of sides and for bubble curvature, respectively. Finally, the use of a "grand-canonical" description, in which the foam constitutes a reservoir of sides and curvature, is justified.
- Published
- 2015
- Full Text
- View/download PDF
44. The significant digit law: a paradigm of statistical scale symmetries
- Author
-
Alain Pocheau
- Subjects
Distribution (number theory) ,Scale (ratio) ,Context (language use) ,Covariance ,Scale invariance ,Condensed Matter Physics ,01 natural sciences ,Symmetric probability distribution ,Electronic, Optical and Magnetic Materials ,Logarithmic distribution ,Equiprobability ,010104 statistics & probability ,Law ,0103 physical sciences ,0101 mathematics ,010306 general physics ,Mathematics - Abstract
In many different topics, the most significant digits of data series display a non-uniform distribution which points to an equiprobability of logarithms. This surprisingly ubiquitous property, known as the significant digit law, is shown here to follow from two similar, albeit different, scale symmetries: scale invariance and scale-ratio invariance. After legitimizing these symmetries in the present context, the corresponding symmetric distributions are determined by implementing a covariance criterion. The logarithmic distribution is identified as the only distribution satisfying both symmetries. The attraction of other distributions to this most symmetric distribution by dilation, stretching and merging is investigated and clarified. The natures of both scale invariance and scale-ratio invariance are further analyzed by determining the structure of the sets composed of the corresponding symmetric distributions. Altogether, these results provide new insights into the meaning and the role of scale symmetries in statistics.
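To illustrate the property the abstract starts from, the sketch below compares the leading-digit frequencies of a synthetic data set spanning many decades with the logarithmic (Benford) distribution P(d) = log10(1 + 1/d), and shows that rescaling the data barely changes those frequencies, i.e. scale invariance. The log-uniform test data and the factor 3.7 are assumptions for illustration.

```python
import numpy as np

# Leading-digit frequencies of a data set versus the logarithmic (Benford)
# distribution P(d) = log10(1 + 1/d), and a check of scale invariance:
# multiplying the data by a constant leaves the frequencies nearly unchanged.
rng = np.random.default_rng(1)
data = 10 ** rng.uniform(0, 6, size=200_000)   # log-uniform, spans many decades

def leading_digit_freqs(x):
    d = (x / 10 ** np.floor(np.log10(x))).astype(int)  # first significant digit
    return np.bincount(d, minlength=10)[1:10] / x.size

benford = np.log10(1 + 1 / np.arange(1, 10))
print(np.round(benford, 3))
print(np.round(leading_digit_freqs(data), 3))          # close to Benford
print(np.round(leading_digit_freqs(3.7 * data), 3))    # rescaled: still close
```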
- Published
- 2006
45. The Diversification Theorem Restated: Risk-pooling Without Assignment of Probabilities
- Author
-
Hong Wu and Göran Skogh
- Subjects
Equiprobability ,Economics and Econometrics ,Bayes estimator ,Actuarial science ,Accounting ,Pooling ,Diversification (finance) ,Economics ,Risk pool ,Finance - Abstract
Bayesian decision theory assumes that agents making choices assign subjective probabilities to outcomes, even in cases where information on probabilities is obviously absent. Here we show that agents who presume that they face equal risks can share risks to mutual benefit, even if the probabilities of losses are unpredictable or genuinely uncertain. We also show that differing risk aversions among pool members do not preclude mutually beneficial loss sharing under uncertainty, and sharing may still make individuals better off when their losses differ in probability or in amount. Our findings are related to the theory of the insurance firm, to the management of development risks, and to the theory of justice.
- Published
- 2005
46. Sampling design for the World Health Survey in Brazil
- Author
-
Mauricio Teixeira Leite de Vasconcellos, Célia Landmann Szwarcwald, and Pedro Luis do Nascimento Silva
- Subjects
Adult ,Male ,Rural Population ,Urban Population ,Calibration (statistics) ,media_common.quotation_subject ,Population ,Sample (statistics) ,Global Health ,Sampling Studies ,Equiprobability ,Survey methodology ,Sampling design ,Humans ,education ,Selection Bias ,Aged ,media_common ,Sampling bias ,Selection bias ,education.field_of_study ,Public Health, Environmental and Occupational Health ,Middle Aged ,Health Surveys ,Survey Methods ,Geography ,Research Design ,Calibration ,Female ,Viés de Seleção ,Amostragem ,Métodos de Levantamento ,Brazil ,Demography - Abstract
This paper describes the sample design used in the Brazilian application of the World Health Survey. The sample was selected in three stages. First, the census tracts were allocated to six strata defined by the urban/rural situation and population size of the municipalities (counties), and tracts were selected with probability proportional to their number of households. In the second stage, households were selected with equiprobability, using an inverse sampling scheme to ensure 20 completed household interviews per tract. In the last stage, one adult (18 years or older) per household was selected with equiprobability to answer the majority of the questionnaire. Sample weights were based on the inverse of the inclusion probabilities. To allow regional estimates and reduce sample bias, the household weights were calibrated to population totals by macro-region, income quintile, sex, and age group.
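As a hedged sketch of how the design above translates into weights, a respondent's base weight is the inverse of the product of the inclusion probabilities at the three stages. The numbers below are hypothetical, and the calibration to population totals is not shown.

```python
# Base weight for a respondent under a three-stage design like the one
# described above (hypothetical numbers; calibration is not shown).

def base_weight(n_tracts_in_stratum, tract_households, stratum_households,
                households_interviewed, adults_in_household):
    # Stage 1: tract selected with probability proportional to its households.
    p1 = n_tracts_in_stratum * tract_households / stratum_households
    # Stage 2: households selected with equal probability within the tract.
    p2 = households_interviewed / tract_households
    # Stage 3: one adult selected with equal probability within the household.
    p3 = 1.0 / adults_in_household
    return 1.0 / (p1 * p2 * p3)

# e.g. 30 tracts drawn in a stratum of 120,000 households, a tract of 300
# households, 20 interviewed, respondent living with 2 other adults:
print(round(base_weight(30, 300, 120_000, 20, 3), 1))  # 600.0
```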
- Published
- 2005
47. Conditions for the limiting equiprobability of distributions in a linear autoregression scheme with random control on a finite group
- Author
-
Igor Aleksandrovich Kruglov
- Subjects
Equiprobability ,Discrete mathematics ,Finite group ,Autoregressive model ,Limit distribution ,Scheme (mathematics) ,Applied mathematics ,Mathematics - Published
- 2005
48. χ² and the lottery
- Author
-
Christian Genest, Michael A. Stephens, and Richard A. Lockhart
- Subjects
Statistics and Probability ,Equiprobability ,Lottery ,symbols.namesake ,Goodness of fit ,Statistics ,Pearson's chi-squared test ,symbols ,Test statistic ,Asymptotic distribution ,Random variable ,Statistic ,Mathematics - Abstract
The winners of many lotteries are determined by selecting at random some numbered balls from an urn. This paper discusses the use of Pearson’s standard goodness-of-fit statistic to test for the equiprobability of occurrence of such lottery numbers, whether taken individually, in pairs or in larger subsets. Because the numbers are drawn without replacement, Pearson’s statistic does not follow a simple χ²-distribution, even for large samples of draws. In fact, it can be shown that its asymptotic distribution is that of a weighted sum of χ² random variables. An explicit formula is given for the weights, and this result is used to check the uniformity of winning numbers in Canada’s Lotto 6/49 over a period of nearly 20 years.
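The sketch below computes Pearson's statistic for the individual-number counts of simulated 6/49 draws. Because the six numbers within a draw are taken without replacement, the usual χ² reference distribution does not apply; rather than reproduce the paper's weighted-sum-of-χ² result, the null distribution here is approximated by simulation. All parameters are illustrative.

```python
import numpy as np

# Pearson's statistic for equiprobability of individual lottery numbers in a
# 6/49-style draw without replacement, with the null distribution of the
# statistic approximated by simulation instead of the standard chi-square.
rng = np.random.default_rng(2)
K, PER_DRAW, N_DRAWS = 49, 6, 1000

def pearson_stat(draws):
    counts = np.bincount(draws.ravel(), minlength=K + 1)[1:]   # counts of 1..49
    expected = draws.shape[0] * PER_DRAW / K
    return np.sum((counts - expected) ** 2 / expected)

def simulate(n_draws):
    # Each row: 6 distinct numbers in 1..49 (first 6 of a random permutation).
    return rng.random((n_draws, K)).argsort(axis=1)[:, :PER_DRAW] + 1

observed = pearson_stat(simulate(N_DRAWS))   # stand-in for a real draw history
null = np.array([pearson_stat(simulate(N_DRAWS)) for _ in range(500)])
print("statistic:", round(observed, 1),
      "simulated p-value:", np.mean(null >= observed))
```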
- Published
- 2002
49. RO-based PRNG: FPGA implementation and stochastic analysis
- Author
-
Luciana De Micco, Hilda A. Larrondo, Eduardo Boemo, and M. Antonelli
- Subjects
Pseudorandom number generator ,Equiprobability ,Computer science ,Stochastic process ,Histogram ,Entropy (information theory) ,NIST ,Field-programmable gate array ,Information theory ,Algorithm - Abstract
This paper deals with the use of ring oscillators (ROs) as pseudo-random number generators (PRNGs). The design, implemented on an Altera Cyclone III using low-level primitives, is explained. Two relevant characteristics of a PRNG are considered to validate the design: 1) the equiprobability of all possible outcomes and 2) the statistical independence of consecutive values. In this work these properties are measured via information-theory quantifiers. A dual entropy plane is used to represent the time series and easily visualize the results obtained with different configurations; the quality is also compared with other available PRNGs by means of this plane. Our method constitutes an effective, reduced alternative to the complete analysis performed by test suites such as DIEHARD or NIST.
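The sketch below computes two rough information-theoretic quantifiers of the kind just described for a candidate byte stream: the normalized Shannon entropy of the value histogram (equiprobability of outcomes) and of consecutive pairs (a crude proxy for independence of successive values). These are illustrative choices and need not coincide with the quantifiers or the entropy plane used in the paper.

```python
import numpy as np

# Two rough information-theoretic checks for a byte stream from a candidate
# PRNG: (1) normalized Shannon entropy of the value histogram and
# (2) normalized entropy of consecutive pairs. Illustrative only.

def normalized_entropy(counts):
    p = counts[counts > 0] / counts.sum()
    return float(-np.sum(p * np.log2(p)) / np.log2(counts.size))

def prng_quantifiers(stream, levels=256):
    single = np.bincount(stream, minlength=levels)
    pairs = np.bincount(stream[:-1] * levels + stream[1:], minlength=levels**2)
    return normalized_entropy(single), normalized_entropy(pairs)

rng = np.random.default_rng(3)
good = rng.integers(0, 256, size=200_000)            # stand-in for a good PRNG
biased = np.minimum(good, rng.integers(0, 256, size=200_000))  # skewed stream
print(prng_quantifiers(good))    # both close to 1.0
print(prng_quantifiers(biased))  # noticeably lower on both axes
```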
- Published
- 2014
50. The equiprobability bias from a mathematical and psychological perspective
- Author
-
Nicolas Gauvrit and Kinga Morsanyi
- Subjects
Computer science ,Fair distribution ,Stochastic process ,General Neuroscience ,Perspective (graphical) ,randomness ,Experimental and Cognitive Psychology ,uniformity ,Outcome (probability) ,Equiprobability ,Mathematical theory ,Psychiatry and Mental health ,Clinical Psychology ,subjective probability ,Econometrics ,Psychology (miscellaneous) ,complexity ,Social psychology ,equiprobability bias ,Applied Psychology ,Randomness ,Research Article - Abstract
The equiprobability bias (EB) is a tendency to believe that every process in which randomness is involved corresponds to a fair distribution, with equal probabilities for any possible outcome. The EB is known to affect both children and adults, and to increase with probability education. Because it results in probability errors resistant to pedagogical interventions, it has been described as a deep misconception about randomness: the erroneous belief that randomness implies uniformity. In the present paper, we show that the EB is actually not the result of a conceptual error about the definition of randomness. On the contrary, the mathematical theory of randomness does imply uniformity. However, the EB is still a bias, because people tend to assume uniformity even in the case of events that are not random. The pervasiveness of the EB reveals a paradox: The combination of random processes is not necessarily random. The link between the EB and this paradox is discussed, and suggestions are made regarding educational design to overcome difficulties encountered by students as a consequence of the EB.
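The paradox mentioned above, that a combination of random processes is not necessarily uniform, can be seen in a miniature example: each die is uniform over 1 to 6, yet the sum of two dice is not uniform over 2 to 12, so treating every total as equally likely (a classic manifestation of the equiprobability bias) is an error. The short computation below, added here for illustration, tabulates the distribution of the sum.

```python
from itertools import product
from collections import Counter

# Each die alone is uniform over 1..6, but the sum of two dice is not uniform
# over 2..12, so assuming equiprobability of the totals is an error.
totals = Counter(a + b for a, b in product(range(1, 7), repeat=2))
for total in sorted(totals):
    print(total, totals[total] / 36)   # e.g. P(7) = 6/36, P(2) = P(12) = 1/36
```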
- Published
- 2014