192 results on '"Equiprobability"'
Search Results
2. First Principles of the Classical Mechanics and the Foundations of Statistical Mechanics on the Example of a Disordered Spin System
- Author
-
V. V. Sahakyan and A. S. Gevorkyan
- Subjects
Physics, Statistical ensemble, Hamiltonian mechanics, Partition function (statistical mechanics), Spin glass, General Physics and Astronomy, Statistical mechanics, Equiprobability, Lattice (module), Classical mechanics, Spin-½
- Abstract
We study a classical multicomponent disordered 3D spin system, taking into account the temperature of the medium, within the nearest-neighbor model. The latter allows the 3D problem on a cubic lattice to be reduced to the 1D Heisenberg spin glass problem with a random environment. Using the Hamilton equations of motion, a recurrent equation is obtained that connects three spins at successive nodes of the 1D lattice, taking into account the influence of the random environment. This equation, together with the corresponding local energy-minimum conditions at the nodes, allows stable spin chains to be constructed node by node and, accordingly, all parameters of the statistical ensemble to be calculated from the first principles of classical mechanics, without any additional assumptions, in particular without the main axiom of statistical mechanics, the equiprobability of statistical states. Using the example of the 1D Heisenberg spin glass model, the features of the new approach are studied in detail and the statistical mechanics of the system is constructed without using the standard representation of the partition function (PF).
- Published
- 2020
3. Grammatically uniform population initialization for grammar-guided genetic programming
- Author
-
Daniel Manrique, Pablo Ramos Criado, D. Barrios Rolanía, and Emilio Serrano
- Subjects
Uniform distribution (continuous), Computer science, Population, Sampling (statistics), Initialization, Genetic programming, Computational intelligence, Evolutionary computation, Theoretical Computer Science, Equiprobability, Geometry and Topology, Education, Algorithm, Software
- Abstract
The initial population distribution is an essential issue in evolutionary computation performance. Population initialization methods for grammar-guided genetic programming have some difficulties generating a representative sample of the search space, which negatively affects the overall evolutionary process. This paper presents a grammatically uniform population initialization method to address this issue by improving the initial population uniformity: the equiprobability of obtaining any individual of the search space defined by the context-free grammar. The proposed initialization method assigns and updates probabilities dynamically to the production rules of the grammar to pursue uniformity and includes a code bloat control mechanism. We have conducted empirical experiments to compare the proposed algorithm with a standard initialization approach very often used in grammar-guided genetic programming. The results report that the proposed initialization method approximates very well a uniform distribution of the individuals in the search space. Moreover, the overall evolutionary process that takes place after the population initialization performs better in terms of convergence speed and quality of the final solutions achieved when the proposed method generates the initial population than when the usual approach does. The results also show that these performance differences are more significant when the experiments involve large search spaces.
- Published
- 2020
4. Equiprobability, Entropy, Gamma Distributions and Other Geometrical Questions in Multi-Agent Systems
- Author
-
Xavier Calbet, Ricardo López-Ruiz, and Jaime Sañudo
- Subjects
multi-agent systems, equiprobability, economic models, Gamma distributions, Science, Astrophysics, Physics
- Abstract
A set of many identical interacting agents obeying a global additive constraint is considered. Under the hypothesis of equiprobability in the high-dimensional volume delimited in phase space by the constraint, the statistical behavior of a generic agent over the ensemble is worked out. The asymptotic distribution of that statistical behavior is derived from geometrical arguments. This distribution is related with the Gamma distributions found in several multi-agent economy models. The parallelism with all these systems is established. Also, as a collateral result, a formula for the volume of high-dimensional symmetrical bodies is proposed.
- Published
- 2009
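As a quick illustration of the geometric argument summarized in the abstract above (a hedged sketch under stated assumptions, not material taken from the paper): take N agents with non-negative wealths x_i obeying the additive constraint x_1 + ... + x_N = E, and assume equiprobability on the resulting simplex.

```latex
% Marginal density of a single agent under the uniform (equiprobable) measure on the
% simplex x_1 + \dots + x_N = E, x_i \ge 0: the remaining volume scales as (E - x)^{N-2}.
f_N(x) = \frac{N-1}{E}\left(1 - \frac{x}{E}\right)^{N-2}, \qquad 0 \le x \le E .
% Holding the mean \bar{x} = E/N fixed and letting N \to \infty gives the exponential law
% (a Gamma distribution of unit shape), of the kind the abstract relates to economy models:
\lim_{N\to\infty} f_N(x) = \frac{1}{\bar{x}}\, e^{-x/\bar{x}} .
```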
5. Prospective Mathematics Teachers' Understanding of Classical and Frequentist Probability
- Author
-
Rocío Álvarez-Arroyo, Silvia M. Valenzuela-Ruiz, Carmen Batanero, and Nuria Begué
- Subjects
Relating the classical and frequentist views, Frequentist probability, Teacher's knowledge, General Mathematics, Probability, Assessment, Probabilistic logic, Sampling (statistics), Sample (statistics), Equiprobability, Frequentist inference, Computer Science (miscellaneous), Mathematics education, Engineering (miscellaneous), Mathematics, Independence (probability theory), Event (probability theory)
- Abstract
Strengthening the teaching of probability requires an adequate training of prospective teachers, which should be based on the prior assessment of their knowledge. Consequently, the aim of this study was to analyse how 139 prospective Spanish mathematics teachers relate the classical and frequentist approaches to probability. To achieve this goal, content analysis was used to categorize the prospective teachers' answers to a questionnaire with open-ended tasks in which they had to estimate and justify the composition of an urn, basing their answers on the results of 1000 extractions from the urn. Most of the sample proposed an urn model consistent with the data provided; however, the percentage that adequately justified the construction was lower. Although the majority of the sample correctly calculated the probability of an event in a new extraction and chose the urn giving the highest probability, a large proportion of the sample forgot the previously constructed urn model, using only the frequency data. Difficulties, such as equiprobability bias or not perceiving independence of trials in replacement sampling, were also observed for a small part of the sample. These results should be considered in the organisation of probabilistic training for prospective teachers.
Funding: PID2019-105601GB-I00/AEI/10.13039/501100011033 (Ministerio de Ciencia e Innovación); FQM-126 (Junta de Andalucía, Spain)
- Published
- 2021
6. Age of Changed Information: Content-Aware Status Updating in the Internet of Things
- Author
-
Xijun Wang, Wenrui Lin, Xinghua Sun, Xiang Chen, and Chao Xu
- Subjects
Networking and Internet Architecture (cs.NI), Information Theory (cs.IT), Mathematical optimization, Markov chain, Computer science, Network packet, Process (computing), Markov model, Equiprobability, Discrete time and continuous time, Markov decision process, Electrical and Electronic Engineering, Average cost
- Abstract
In the Internet of Things (IoT), the freshness of status updates is crucial for mission-critical applications. In this regard, it has been suggested to quantify the freshness of updates by using the Age of Information (AoI) from the receiver's perspective. Specifically, the AoI measures freshness over time; freshness in the content, however, is neglected. In this paper, we introduce an age-based utility, named Age of Changed Information (AoCI), which captures both the passage of time and the change of information content. By modeling the underlying physical process as a discrete-time Markov chain, we investigate the AoCI in a time-slotted status update system, where a sensor samples the physical process and transmits the update packets to the destination. With the aim of minimizing the weighted sum of the AoCI and the update cost, we formulate an infinite-horizon average-cost Markov Decision Process. We show that the optimal updating policy has a special structure with respect to the AoCI and identify the condition under which this special structure exists. By exploiting the special structure, we provide a low-complexity relative policy iteration algorithm that finds the optimal updating policy. We further investigate the optimal policy for two special cases. In the first case, where the state of the physical process transits with equiprobability, we show that the optimal policy is of threshold type and derive the closed form of the optimal threshold. We then study a more general periodic Markov model of the physical process in the second case. Lastly, simulation results are laid out to exhibit the performance of the optimal updating policy and its superiority over the zero-wait baseline policy.
- Published
- 2021
7. First-order phase statistics in Laguerre-Gauss speckles
- Author
-
Aristide Dogariu, Pedro A. Alvarez Fernandez, and Cristian Hernando Acevedo
- Subjects
Equiprobability, Physics, Speckle pattern, Random field, Scattering, Gauss, Phase (waves), Laguerre polynomials, Statistical physics, Standard deviation
- Abstract
We present a statistical analysis of the phase of random fields generated by scattering of Laguerre-Gauss beams. The standard deviation of the speckle phase is studied using the formalism of equiprobability density ellipses.
- Published
- 2021
8. Complete null agent for games with externalities
- Author
-
M. G. Fiestras-Janeiro, Andrés Jiménez-Losada, José María Alonso-Meijide, and Mikel Álvarez-Mozos
- Subjects
Property (philosophy), Computer science, Equiprobability, Artificial Intelligence, Game theory, Axiom, Null (mathematics), General Engineering, Shapley value, Expert system, Computer Science Applications, Externalities (Economics), Mathematical economics, Value (mathematics), Public goods
- Abstract
Game theory provides valuable tools to examine expert multi-agent systems. In a cooperative game, collaboration among agents leads to better outcomes. The most important solution for such games is the Shapley value, which coincides with the expected marginal contribution assuming equiprobability. This assumption is not plausible when externalities are present in an expert system. Generalizing the concept of marginal contributions, we propose a new family of Shapley values for situations with externalities. The properties of the Shapley value offer a rationale for its application. This family of values is characterized by extensions of Shapley's axioms: efficiency, additivity, symmetry, and the null player property. The first three axioms have widely accepted generalizations to the framework of games with externalities. However, different concepts of null players have been proposed in the literature, and we contribute to this debate with a new one. The null player property that we use is weaker than the others. Finally, we present one particular value of the family, new in the literature, and characterize it by two additional properties.
- Published
- 2019
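The phrase "expected marginal contribution assuming equiprobability" in the abstract above can be made concrete with the classical, externality-free Shapley value, obtained by averaging each player's marginal contribution over all equally likely orderings. The sketch below shows only that textbook baseline with a made-up three-player game; it is not the generalized family of values proposed in the paper.

```python
from itertools import permutations

def shapley_value(players, v):
    """Classical Shapley value: each player's marginal contribution v(S ∪ {i}) - v(S),
    averaged over all orderings of the players, each ordering taken as equiprobable."""
    players = list(players)
    value = {p: 0.0 for p in players}
    orderings = list(permutations(players))
    for order in orderings:
        coalition = frozenset()
        for p in order:
            value[p] += v(coalition | {p}) - v(coalition)
            coalition = coalition | {p}
    return {p: value[p] / len(orderings) for p in players}

# Hypothetical three-player game, made up only for this illustration.
def v(coalition):
    worth = {frozenset(): 0, frozenset({1}): 0, frozenset({2}): 0, frozenset({3}): 0,
             frozenset({1, 2}): 90, frozenset({1, 3}): 80, frozenset({2, 3}): 70,
             frozenset({1, 2, 3}): 120}
    return worth[frozenset(coalition)]

print(shapley_value([1, 2, 3], v))  # {1: 45.0, 2: 40.0, 3: 35.0}
```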
9. Are non-accidental regularities a cosmic coincidence? Revisiting a central threat to Humean laws
- Author
-
Aldo Filomeno
- Subjects
Philosophy of science, Credence, Philosophy, Doxastic logic, General Social Sciences, Ignorance, Principle of indifference, Equiprobability, Argument, Law, Suspension of judgment
- Abstract
If the laws of nature are as the Humean believes, it is an unexplained cosmic coincidence that the actual Humean mosaic is as extremely regular as it is. This is a strong and well-known objection to the Humean account of laws. Yet, as reasonable as this objection may seem, it is nowadays sometimes dismissed. The reason: its unjustified implicit assignment of equiprobability to each possible Humean mosaic; that is, its assumption of the principle of indifference, which has been attacked on many grounds ever since it was first proposed. In place of equiprobability, recent formal models represent the doxastic state of total ignorance as suspension of judgment. In this paper I revisit the cosmic coincidence objection to Humean laws by assessing which doxastic state we should endorse. By focusing on specific features of our scenario I conclude that suspending judgment results in an unnecessarily weak doxastic state. First, I point out that recent literature in epistemology has provided independent justifications of the principle of indifference. Second, given that the argument is framed within a Humean metaphysics, it turns out that we are warranted to appeal to these justifications and assign a uniform and additive credence distribution among Humean mosaics. This leads us to conclude that, contrary to widespread opinion, we should not dismiss the cosmic coincidence objection to the Humean account of laws.
- Published
- 2019
10. The Shannon–McMillan Theorem Proves Convergence to Equiprobability of Boltzmann’s Microstates
- Author
-
Arnaldo Spalvieri
- Subjects
Science, General Physics and Astronomy, Classical Physics (physics.class-ph), Astrophysics, Information theory, Equiprobability, Convergence (routing), Statistical physics, Link (knot theory), Boltzmann's entropy formula, Shannon–McMillan theorem, Mathematics, Physics, Statistical mechanics, Boltzmann–Planck entropy formula, Boltzmann constant, equiprobability of microstates, Identical particles
- Abstract
The paper shows that, for a large number of distinguishable, non-interacting identical particles, convergence to equiprobability of the $W$ microstates in the famous Boltzmann-Planck entropy formula $S=k \log(W)$ is proved by the Shannon-McMillan theorem, a cornerstone of information theory. This result further strengthens the link between information theory and statistical mechanics.
- Published
- 2021
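For reference, the statement being invoked is the asymptotic equipartition property; the sketch below gives the generic i.i.d. textbook form and a back-of-the-envelope link to the Boltzmann-Planck formula, with the identification of n as the number of particles being an assumption of this note rather than a claim quoted from the paper.

```latex
% Shannon-McMillan (AEP), i.i.d. form: the per-symbol log-probability concentrates on the entropy,
-\tfrac{1}{n}\log_2 p(X_1,\dots,X_n) \;\longrightarrow\; H(X) \quad \text{in probability},
% so each typical sequence has probability close to 2^{-nH}: the typical set is asymptotically
% equiprobable and contains W \approx 2^{nH} members, whence
S = k \ln W \;\approx\; n\,k\,H \ln 2 .
```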
11. WAYS IN WHICH HIGH-SCHOOL STUDENTS UNDERSTAND THE SAMPLING DISTRIBUTION FOR PROPORTIONS
- Author
-
Nuria Begué, María Magdalena Gea, Manfred Borovcnik, and Carmen Batanero
- Subjects
Statistics and Probability, Population, Sampling (statistics), Sample (statistics), Education, Equiprobability, Sampling distribution, Sample size determination, Concept learning, Statistics, Statistics education, Psychology
- Abstract
In Spain, curricular guidelines as well as the university-entrance tests for social-science high-school students (17–18 years old) include sampling distributions. To analyse the understanding of this concept we investigated a sample of 234 students. We administered a questionnaire to them and asked half of them for justifications of their answers. The questionnaire consisted of four sampling tasks in which two sample sizes (n = 100 and 10) and population proportions (equal to or different from 0.5) were systematically varied. The experiment gathered twofold data from the students simultaneously, namely about their perception of the mean and about their understanding of the variation of the sampling distribution. The analysis of students' responses indicates a good understanding of the relationship between the theoretical proportion in the population and the sample proportion. Sampling variability, however, was overestimated in bigger samples. We also observed various types of biased thinking in the students: the equiprobability and recency biases, as well as deterministic preconceptions. The effect of the task variables on the students' responses is also discussed here. First published December 2020 at Statistics Education Research Journal: Archives
- Published
- 2020
12. Autonomous Brownian gyrators: A study on gyrating characteristics
- Author
-
Hsin Chang, Chi Lun Lee, Yung-Fu Chen, and Pik Yin Lai
- Subjects
Physics, Statistical Mechanics (cond-mat.stat-mech), Isotropy, Harmonic (mathematics), Stationary point, Equiprobability, Maxima and minima, Classical mechanics, Flow (mathematics), Quartic function, Brownian motion
- Abstract
We study the nonequilibrium steady-state (NESS) dynamics of two-dimensional Brownian gyrators under harmonic and nonharmonic potentials via computer simulations and analyses based on the Fokker-Planck equation; our nonharmonic cases feature a double-well potential and an isotropic quartic potential. In particular, we report two simple methods that help in understanding gyrating patterns. For harmonic potentials, we use the Fokker-Planck equation to survey the NESS dynamical characteristics, i.e., the NESS currents gyrate along the equiprobability contours and the stationary point of the flow coincides with the potential minimum. In contrast, the NESS results for our nonharmonic potentials show that these properties are largely absent, as the gyrating patterns are quite distinct from those of the corresponding probability distributions. Furthermore, we observe a critical case of the double-well potential, where the harmonic contribution to the gyrating pattern becomes absent, and the NESS currents do not circulate about the equiprobability contours near the potential minima even at low temperatures.
- Published
- 2020
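A minimal way to see such gyrating currents numerically is an overdamped, two-temperature Langevin simulation of the harmonic case. The sketch below assumes unit friction, a coupled quadratic potential and illustrative parameter values; it is not the simulation or the Fokker-Planck analysis reported in the paper.

```python
import numpy as np

# Overdamped two-temperature Brownian gyrator with the coupled harmonic potential
# U(x, y) = 0.5*kx*x**2 + 0.5*ky*y**2 + u*x*y (all parameter values are illustrative).
rng = np.random.default_rng(0)
kx, ky, u = 1.0, 1.0, 0.5        # stiffnesses and coupling (assumed)
Tx, Ty = 1.0, 0.2                # unequal bath temperatures break detailed balance
dt, n_steps = 1e-3, 200_000

x = y = 0.0
traj = np.empty((n_steps, 2))
for i in range(n_steps):
    fx = -(kx * x + u * y)       # -dU/dx
    fy = -(ky * y + u * x)       # -dU/dy
    x += fx * dt + np.sqrt(2 * Tx * dt) * rng.standard_normal()
    y += fy * dt + np.sqrt(2 * Ty * dt) * rng.standard_normal()
    traj[i] = x, y

# A crude signature of the NESS current: the mean of x*vy - y*vx estimated from
# consecutive samples; it vanishes (up to noise) when Tx == Ty.
vx = np.diff(traj[:, 0]) / dt
vy = np.diff(traj[:, 1]) / dt
print("mean circulation:", np.mean(traj[:-1, 0] * vy - traj[:-1, 1] * vx))
```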
13. Statistical mechanics of a nonequilibrium steady-state classical particle system driven by a constant external force
- Author
-
Yanting Wang and Jie Yao
- Subjects
Physics, Work (thermodynamics), Steady state, Statistical Mechanics (cond-mat.stat-mech), Physics and Astronomy (miscellaneous), Entropy (statistical thermodynamics), Ergodicity, Position and momentum space, Statistical mechanics, Equiprobability, Relaxation (physics), Statistical physics
- Abstract
A classical particle system coupled with a thermostat and driven by an external constant force reaches its steady state when the ensemble-averaged drift velocity does not vary with time. The statistical mechanics of such a system is derived merely from the equal probability and ergodicity principles, free from any conclusions drawn from equilibrium statistical mechanics or the local equilibrium hypothesis. The momentum-space distribution is determined by a random walk argument, and the position-space distribution is determined by employing the equal probability and ergodicity principles. The expressions for energy, entropy, free energy, and pressures are then deduced, and the relation among external force, drift velocity, and temperature is also established. Moreover, the relaxation towards equilibrium is found to be an exponentially decaying process obeying the minimum entropy production theorem.
- Published
- 2020
14. The Lorenz Curve: A Proper Framework to Define Satisfactory Measures of Symbol Dominance, Symbol Diversity, and Information Entropy
- Author
-
Julio A. Camargo
- Subjects
Information entropy, General Physics and Astronomy, Absolute difference, Environmental science, Equiprobability, Symbol dominance, Lorenz curve, Symbol diversity, Rényi's entropy, Statistics, Entropy (information theory), Relative species abundance, Mathematics, Shannon's entropy, Vertical distance, Probability distribution, Camargo statistics
- Abstract
Novel measures of symbol dominance (dC1 and dC2), symbol diversity (DC1 = N(1 − dC1) and DC2 = N(1 − dC2)), and information entropy (HC1 = log2 DC1 and HC2 = log2 DC2) are derived from Lorenz-consistent statistics that I had previously proposed to quantify dominance and diversity in ecology. Here, dC1 refers to the average absolute difference between the relative abundances of dominant and subordinate symbols, with its value being equivalent to the maximum vertical distance from the Lorenz curve to the 45-degree line of equiprobability; dC2 refers to the average absolute difference between all pairs of relative symbol abundances, with its value being equivalent to twice the area between the Lorenz curve and the 45-degree line of equiprobability; and N is the number of different symbols or maximum expected diversity. These Lorenz-consistent statistics are compared with statistics based on Shannon's entropy and Rényi's second-order entropy to show that the former have better mathematical behavior than the latter. The use of dC1, DC1, and HC1 is particularly recommended, as only changes in the allocation of relative abundance between dominant (pd > 1/N) and subordinate (ps < 1/N) symbols are of real relevance for probability distributions to achieve the reference distribution (pi = 1/N) or to deviate from it.
- Published
- 2020
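A small numerical reading of these measures, using only the geometric equivalences stated in the abstract above (maximum vertical distance to the 45-degree line for dC1, twice the enclosed area for dC2); the paper's own formulas may differ in detail, so treat this as a hedged sketch rather than a reference implementation.

```python
import numpy as np

def lorenz_measures(p):
    """Lorenz-curve based dominance, diversity and entropy measures, computed through
    the geometric equivalences quoted in the abstract (a sketch, not the paper's code)."""
    p = np.sort(np.asarray(p, dtype=float))      # relative abundances, ascending
    p = p / p.sum()
    N = p.size
    x = np.arange(N + 1) / N                     # cumulative share of symbols
    L = np.concatenate(([0.0], np.cumsum(p)))    # Lorenz curve (cumulative abundance)
    dC1 = np.max(x - L)                          # max vertical gap to the 45-degree line
    dC2 = 2.0 * np.trapz(x - L, x)               # twice the area between curve and line
    DC1, DC2 = N * (1 - dC1), N * (1 - dC2)      # diversity numbers
    HC1, HC2 = np.log2(DC1), np.log2(DC2)        # entropies in bits
    return dC1, dC2, DC1, DC2, HC1, HC2

# Uniform abundances give zero dominance and maximal diversity/entropy (= N and log2 N).
print(lorenz_measures([0.25, 0.25, 0.25, 0.25]))
print(lorenz_measures([0.7, 0.1, 0.1, 0.1]))
```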
15. Pre-service Teachers’ Probabilistic Reasoning in Constructivist Classroom
- Author
-
Evans Kofi Hokor
- Subjects
Class (computer programming), Logical reasoning, Teaching method, Representativeness heuristic, Teacher education, Constructivist teaching methods, Equiprobability, Mathematics education, Belief bias, Psychology
- Abstract
Several studies have revealed that probability misconceptions are widespread among students, but activities for addressing these misconceptions have been lacking. This study designed activities that reflect real-life situations for addressing equiprobability bias, positive and negative recency effects, belief bias and representativeness bias in the teaching of probability globally. Thirty-two pre-service teachers from one intact class were purposively sampled for the study. The instruments used in the collection of data were observation and a questionnaire. The study found a constructivist approach to teaching, with critical questions asked by the teacher, to be vital in addressing misconceptions. The findings suggest that teacher educators should use the constructivist approach of teaching, targeting probabilistic misconceptions, in the training of teachers.
- Published
- 2020
16. Algumas reflexões sobre a definição de probabilidade [Some reflections on the definition of probability]
- Author
-
Fernando Montanaro Paiva de Almeida, André Gustavo Campos Pereira, Igor Bruno Dantas Nunes, Francisco Erivan de Almeida Júnior, Gleydson Medeiros de Souza, George Luiz Coelho Cortes, Arthur Henrique da Silva, and George Homer Barbosa de Medeiros
- Subjects
Mathematics education from other disciplines, Subject (philosophy), Comprehension, Face (sociological concept), General Medicine, Other (probability), Cone (formal languages), Outcome (probability), Test (assessment), Epistemology, Equiprobability, Textbooks, Ice cream, Situations, Psychology, Simple (philosophy)
- Abstract
Every day we face situations in which decisions have to be made. Some of them are very simple, e.g., whether or not you want an ice cream. In case you decide to have one, you have to decide whether you want it in a plastic bowl or in an ice cream cone, and you still have to choose the flavour(s). Sometimes we have preferences among the options presented; sometimes all options seem the same. Even in simple situations partiality is always present, so why does the teaching of probability (in middle/high school) focus on indifference (equiprobability)? In this work we observed, not only through the analysis of textbooks and master's dissertations but also through the analysis of the results of a test answered by high-school and undergraduate students, that the established definition of probability is the one that forces us to accept equiprobability as the only way to deal with stochastic happenings, and that very little has been done to change this picture. We also note that the manner in which some books illustrate the subject can hinder understanding even when equiprobability is considered.
- Published
- 2020
17. JAPANESE AND THAI SENIOR HIGH SCHOOL MATHEMATICS TEACHERS’ KNOWLEDGE OF VARIABILITY
- Author
-
Orlando Rafael González González, Masami Isoda, and Somchai Chitmun
- Subjects
Statistics and Probability, Estimation, Equiprobability, Research literature, Mathematics education, Survey research, Statistical literacy, Mathematics instruction, Statistics education, Education
- Abstract
In this article, the conceptions of variability held by samples of Japanese and Thai senior high school mathematics teachers were identified, based on the framework proposed by Shaughnessy (2007), using a comparative survey study. From contrasting the results of the two groups, relative tendencies of insufficient statistical knowledge for variability were found in both samples, such as a tendency of Japanese teachers to overgeneralize equiprobability, whereas Thai teachers tended to overgeneralize estimation. Based on these findings, the use of well-known tasks from the research literature for this comparative study seems useful to clarify the relative tendencies and insufficiencies in teacher knowledge and conceptions regarding variability held by both groups. First published November 2018 at Statistics Education Research Journal Archives
- Published
- 2018
18. Korean Preservice Elementary Teachers’ Abilities to Identify Equiprobability Bias and Teaching Strategies
- Author
-
Eun-Jung Lee and Mimi Park
- Subjects
General Mathematics, Problem context, Science education, Teacher education, Education, Equiprobability, School teachers, Mathematics education, Psychology
- Abstract
Equiprobability bias (EB) is one of the frequently observed misconceptions in probability education in K-12 and can be affected by a problem context. As future teachers, preservice teachers need to have a stable understanding of probability and to have the knowledge to identify EB in their students regardless of the problem context. However, there are few studies to explore how preservice teachers identify students’ EB and how they respond to students’ EB. This study investigated Korean preservice elementary school teachers’ abilities to identify students’ EB in two problem contexts, marble and baseball problems, as well as their teaching strategies for correcting students’ EB within each problem. Ninety-six preservice elementary school teachers participated in this study. They were presented with two problems with students having EB and were asked to write lesson plays. From the analysis of their lesson plays, it was found that 87% of the preservice teachers identified students’ EB in both problems, and in the baseball problem, 13% of them did not. Three teaching strategies for correcting students’ EB in each problem were found. Based on the results, implications for preservice elementary teacher education were discussed.
- Published
- 2018
19. Cross‐polarisation discrimination models assessment and improvement on earth‐space propagation paths at Ka and V‐bands
- Author
-
Flavio Jorge, Carlo Riva, and Armando Rocha
- Subjects
Physics, Cumulative distribution function, Attenuation, Spectral efficiency, Interference (wave propagation), Computational physics, Equiprobability, Communications satellite, Ka band, Electrical and Electronic Engineering, V band
- Abstract
The performance of the satellite communication systems employing polarisation diversity or frequency-reuse schemes to improve the spectral efficiency is degraded due to the depolarisation-induced interference originated by raindrops and ice particles present along the Earth-Space propagation path. Two models account for both rain and ice contributions. One predicts the long-term cumulative distribution function (CDF) of cross-polarisation discrimination (XPD). The other predicts the relationship between XPD and co-polar attenuation (CPA) and it was derived considering exclusively data at the V-band. In this study, the former model is improved by considering the individual ice and rain contributions and their combined effects, while the latter is validated against new measurements at the Ka band. New models are then proposed for the XPD-CPA relationships at the Ka-band taking into account both rain and ice contributions and also their combined effects. Finally, the predictions provided by the first model are usually converted to the corresponding XPD-CPA relationship using the long-term first-order statistics of rain attenuation, (incorrectly) considering that the equiprobability hypothesis applies. A new approach for the conversion of the XPD CDF into the corresponding XPD-CPA relationship is presented.
- Published
- 2018
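The "equiprobability hypothesis" conversion questioned in the abstract above simply pairs the CPA value exceeded with probability p with the XPD value fallen below with the same probability. The toy illustration below makes that pairing explicit; the sample series and the crude XPD model inside it are hypothetical placeholders, not measurement data or the authors' model.

```python
import numpy as np

def equiprobability_pairing(cpa_samples, xpd_samples, probs):
    """Pair CPA and XPD values at equal exceedance probability p, i.e. the
    'equiprobability hypothesis' conversion of two long-term CDFs into an
    XPD-CPA relationship (the assumption the abstract questions)."""
    cpa_p = np.quantile(cpa_samples, 1.0 - probs)  # CPA exceeded with probability p
    xpd_p = np.quantile(xpd_samples, probs)        # XPD fallen below with probability p
    return cpa_p, xpd_p

# Hypothetical time series (dB); real use would take concurrent beacon measurements.
rng = np.random.default_rng(1)
cpa = rng.gamma(shape=2.0, scale=1.5, size=100_000)                       # co-polar attenuation
xpd = 40.0 - 12.0 * np.log10(1.0 + cpa) + rng.normal(0.0, 2.0, cpa.size)  # discrimination
probs = np.array([0.01, 0.001, 0.0001])                                   # exceedance levels
for p, a, x in zip(probs, *equiprobability_pairing(cpa, xpd, probs)):
    print(f"p = {p:.4%}:  CPA ~ {a:5.2f} dB   XPD ~ {x:5.2f} dB")
```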
20. Equiprobability, Entropy, Gamma Distributions and Other Geometrical Questions in Multi-Agent Systems.
- Author
-
López-Ruiz, Ricardo, Sañudo, Jaime, and Calbet, Xavier
- Subjects
*ESTIMATION theory, *ENTROPY (Information theory), *INFORMATION theory, *SYSTEMS theory, *DISTRIBUTION (Probability theory), *STOCHASTIC processes
- Abstract
A set of many identical interacting agents obeying a global additive constraint is considered. Under the hypothesis of equiprobability in the high-dimensional volume delimited in phase space by the constraint, the statistical behavior of a generic agent over the ensemble is worked out. The asymptotic distribution of that statistical behavior is derived from geometrical arguments. This distribution is related with the Gamma distributions found in several multi-agent economy models. The parallelism with all these systems is established. Also, as a collateral result, a formula for the volume of high-dimensional symmetrical bodies is proposed.
- Published
- 2009
21. Beliefs about what types of mechanisms produce random sequences.
- Author
-
Blinder, Deborah S. and Oppenheimer, Daniel M.
- Subjects
JUDGMENT (Psychology), THOUGHT & thinking, THEORY of knowledge, PSYCHOLOGY, DECISION making
- Abstract
Although many researchers use Wagenaar's framework for understanding the factors that people use to determine whether a process is random, the framework has never undergone empirical scrutiny. This paper uses Wagenaar's framework as a starting point and examines the three properties of his framework—independence of events, fixed alternatives, and equiprobability. We find strong evidence to suggest that independence of events is indeed used as a cue toward randomness. Equiprobability has an effect on randomness judgments. However, it appears to work only in a limited role. Fixedness of alternatives is a complex construct that consists of multiple sub-concepts. We find that each of these sub-concepts influences randomness judgments, but that they exert forces in different directions. Stability of outcome ratios increases randomness judgments, while knowledge of outcome ratios decreases randomness judgments. Future directions for development of a functional framework for understanding perceptions of randomness are suggested. Copyright © 2008 John Wiley & Sons, Ltd.
- Published
- 2008
22. Boltzmann Kinetic Equation and Equiprobability Postulate.
- Author
-
Folin, K. G.
- Subjects
*ESSAYS, *EQUATIONS, *DYNAMICS, *PROBABILITY theory, *RELAXATION (Gas dynamics)
- Abstract
It has been shown that the approaches based on the postulate of equiprobability of all acceptable microstates of a dynamical system (the Gibbs distributions and the Boltzmann method) do not satisfy criteria for a theory to be applicable (specifically, agreement with experimental data and inherent consistency), unlike the approach based on the Boltzmann equation. An approach that satisfies these criteria and that is based on the abandonment of the equiprobability postulate is offered. Concepts of spontaneous and stimulated microstates, which imply the failure of the equiprobability of microstates, are introduced. The arguments for the following assertions are suggested: probability and, consequently, one-particle and many-particle distribution functions in principle cannot depend on time; the molecular chaos hypothesis leads to the inherent inconsistency of the Boltzmann equation; the relaxation process results in some ordering of gas particle behavior; the distinguishability of identical particles is a necessary condition to describe the behavior of ensemble particles in terms of probability.
- Published
- 2007
23. The meaning of justified subjectivism and its role in the reconciliation of recent disagreements over forensic probabilism
- Author
-
Franco Taroni, Colin Aitken, Silvia Bozza, and Alex Biedermann
- Subjects
Evaluative precepts, Probability assignment, Assertion, Subjective probability, Justification, Subjective expected utility, Probabilism, Imprecise probability, Pathology and Forensic Medicine, Epistemology, Constraint (information theory), Equiprobability, Subjectivism, Meaning (existential), Psychology, Social psychology
- Abstract
In this paper we reply to recent comments in this Special Issue according to which subjective probability is not considered to be a concept fit for use in forensic evaluation and expert reporting. We identify the source of these criticisms to lie in a misunderstanding of subjective probability as unconstrained subjective probability; a lack of constraint that neither corresponds to the way in which we referred to subjective probability in our previous contributions, nor to the way in which probability assignment is understood by current evaluative guidelines (e.g., of ENFSI). Specifically, we explain that we understand subjective probability as a justified assertion, i.e. a conditional assessment based on task-relevant data and information, that may be thought of as a constrained subjective probability. This leads us to emphasise again the general conclusion that there is no gap between justified (or, reasonable) subjective probability and other concepts of probability in terms of its ability to provide assessments that are soundly based on whatever relevant information available. We also note that the challenges an expert faces in reporting probabilities apply equally to all interpretations of probability, not only to subjective probability.
- Published
- 2017
24. Computing the exact distributions of some functions of the ordered multinomial counts: maximum, minimum, range and sums of order statistics
- Author
-
Pasquale Cirillo, Marco Bonetti, and Anton Ogay
- Subjects
Multivariate random variable, Poisson distribution, Equiprobability, URNS, EXACT PROBABILITY, MULTINOMIAL COUNTS, Range (statistics), Applied mathematics, Mathematics, Statistical hypothesis testing, Multidisciplinary, Order statistic, Distribution (mathematics), Multinomial distribution
- Abstract
Starting from seminal neglected work by Rappeport (Rappeport 1968 Algorithms and computational procedures for the application of order statistics to queuing problems. PhD thesis, New York University), we revisit and expand on the exact algorithms to compute the distribution of the maximum, the minimum, the range and the sum of the J largest order statistics of a multinomial random vector under the hypothesis of equiprobability. Our exact results can be useful in all those situations in which the multinomial distribution plays an important role, from goodness-of-fit tests to the study of Poisson processes, with applications spanning from biostatistics to finance. We describe the algorithms, motivate their use in statistical testing and illustrate two applications. We also provide the codes and ready-to-use tables of critical values.
- Published
- 2019
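For the equiprobable case, one textbook route to the exact law of the maximum (independent of the specific algorithms revisited in the paper above) uses a truncated exponential generating function: P(max N_i <= m) equals n!/k^n times the coefficient of x^n in (sum_{j<=m} x^j/j!)^k. A small exact-arithmetic sketch, offered only as an illustration:

```python
from fractions import Fraction
from math import factorial

def prob_max_le(n, k, m):
    """P(max_i N_i <= m) for (N_1, ..., N_k) ~ Multinomial(n, equiprobable cells), via
    the coefficient of x^n in (sum_{j<=m} x^j/j!)^k, multiplied by n!/k^n (exact)."""
    base = [Fraction(1, factorial(j)) for j in range(min(m, n) + 1)]  # truncated exp(x)
    poly = [Fraction(1)]                          # running product, degree capped at n
    for _ in range(k):
        new = [Fraction(0)] * min(len(poly) + len(base) - 1, n + 1)
        for i, a in enumerate(poly):
            for j, b in enumerate(base):
                if i + j <= n:
                    new[i + j] += a * b
        poly = new
    coeff = poly[n] if n < len(poly) else Fraction(0)
    return coeff * factorial(n) / Fraction(k) ** n

# Example: 10 balls in 4 equiprobable cells; exact distribution of the maximum count.
pmf = {m: prob_max_le(10, 4, m) - prob_max_le(10, 4, m - 1) for m in range(1, 11)}
print({m: float(p) for m, p in pmf.items() if p > 0})
```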
25. Jeffreys-prior penalty, finiteness and shrinkage in binomial-response generalized linear models
- Author
-
Ioannis Kosmidis and David Firth
- Subjects
Statistics and Probability, Generalized linear model, Binomial (polynomial), General Mathematics, Binomial regression, Inference, Probit, Statistics Theory (math.ST), Logistic regression, Methodology (stat.ME), Equiprobability, Statistics, Mathematics, Applied Mathematics, Agricultural and Biological Sciences (miscellaneous), Statistics, Probability and Uncertainty, General Agricultural and Biological Sciences, Jeffreys prior
- Abstract
Penalization of the likelihood by Jeffreys’ invariant prior, or a positive power thereof, is shown to produce finite-valued maximum penalized likelihood estimates in a broad class of binomial generalized linear models. The class of models includes logistic regression, where the Jeffreys-prior penalty is known additionally to reduce the asymptotic bias of the maximum likelihood estimator, and models with other commonly used link functions, such as probit and log-log. Shrinkage towards equiprobability across observations, relative to the maximum likelihood estimator, is established theoretically and studied through illustrative examples. Some implications of finiteness and shrinkage for inference are discussed, particularly when inference is based on Wald-type procedures. A widely applicable procedure is developed for computation of maximum penalized likelihood estimates, by using repeated maximum likelihood fits with iteratively adjusted binomial responses and totals. These theoretical results and methods underpin the increasingly widespread use of reduced-bias and similarly penalized binomial regression models in many applied fields.
- Published
- 2018
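For the logit link specifically, the "iteratively adjusted binomial responses and totals" idea has a compact form: the Jeffreys-prior modified score equals an ordinary binomial score evaluated at pseudo-responses y + h/2 and pseudo-totals m + h, where h are the hat values. The numpy sketch below covers only that special case, with made-up data and no step-halving safeguards; it illustrates the idea and is not the authors' general procedure or a production implementation.

```python
import numpy as np

def jeffreys_penalized_logit(X, y, m, n_iter=50, tol=1e-10):
    """Maximum penalized likelihood for binomial-logit regression with a Jeffreys-prior
    penalty, via Fisher scoring on adjusted responses y + h/2 and totals m + h
    (h = hat values). Logit link only; no step-halving safeguards (sketch)."""
    X = np.asarray(X, float); y = np.asarray(y, float); m = np.asarray(m, float)
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        pi = 1.0 / (1.0 + np.exp(-(X @ beta)))
        w = m * pi * (1.0 - pi)                        # binomial working weights
        XtW = X.T * w
        info = XtW @ X                                 # Fisher information
        h = np.einsum("ij,ji->i", X @ np.linalg.inv(info), XtW)   # hat values
        score = X.T @ ((y + 0.5 * h) - (m + h) * pi)   # modified (penalized) score
        step = np.linalg.solve(info, score)
        beta = beta + step
        if np.max(np.abs(step)) < tol:
            break
    return beta

# Tiny made-up example with complete separation, where ordinary ML diverges but the
# penalized estimate stays finite.
X = np.column_stack([np.ones(6), [-2.0, -1.0, -0.5, 0.5, 1.0, 2.0]])
y = np.array([0, 0, 0, 1, 1, 1])   # successes
m = np.ones(6)                     # one Bernoulli trial per row
print(jeffreys_penalized_logit(X, y, m))
```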
26. Approximating the Moments and Distribution of the Likelihood Ratio Statistic for Multinomial Goodness of Fit.
- Author
-
Smith, Paul J., Rae, Donald S., Manderscheid, Ronald W., and Silbergeld, Sam
- Subjects
*APPROXIMATION theory, *ANALYSIS of variance, *CHARACTERISTIC functions, *ASYMPTOTIC expansions, *DISTRIBUTION (Probability theory), *GOODNESS-of-fit tests, *STOCHASTIC convergence, *NUMERICAL analysis, *MATHEMATICAL analysis
- Abstract
Approximations were derived for the mean and variance of $G^2$, the likelihood ratio statistic for testing goodness of fit in a k-cell multinomial distribution. These approximate moments, accurate to $O(N^{-3})$, may be used in fitting the distribution of $G^2$. Extensive numerical studies of the exact and approximate distributions were performed in the special case of equiprobability. The asymptotic chi-squared distribution was found to fit poorly for $k \geq 4$. Satisfactory results were obtained using a multiplicative correction to the chi-squared fit. More sophisticated procedures using the beta distribution produced increased accuracy, but at the cost of excessive computational labor.
- Published
- 1981
27. Gravity modulates behaviour control strategy
- Author
-
Elisa Raffaella Ferrè, Iqra Arshad, and Maria Gallagher
- Subjects
Adult, Male, Supine position, Posture, Affect (psychology), Choice Behavior, Equiprobability, Thinking, Young Adult, Orientation (mental), Humans, Randomness, Balance (ability), Vestibular system, General Neuroscience, Cognition, Exploratory Behavior, Female, Vestibule, Labyrinth, Psychology, Cognitive psychology, Gravitation
- Abstract
Human behaviour is a trade-off between exploitation of familiar resources and exploration of new ones. In a challenging environment—such as outer space—making the correct decision is vital. On Earth, gravity is always there, and is an important reference for behaviour. Thus, altered gravitational signals may affect behaviour control strategies. Here, we investigated whether changing the body’s orientation to the gravitational vector would modulate the balance between routine and novel behaviour. Participants completed a random number generation task while upright or supine. We found decreased randomness when participants were supine. In particular, the degree of equiprobability of pairs of consecutive responses was reduced in the supine orientation. Online gravitational signals may shape the balance between exploitation and exploration, in favour of more stereotyped and routine responses.
- Published
- 2018
28. The Duhem-Quine problem for equiprobable conjuncts
- Author
-
Vikram Singh Sirola and Abhishek Kashyap
- Subjects
History, Computer science, Bayesian probability, Quine, Conjunction (grammar), Epistemology, Equiprobability, History and Philosophy of Science, Prior probability, Construal level theory, Holism
- Abstract
In this paper, we distinguish Quine’s thesis of holism from the related Duhem-Quine problem. We discuss the construal of holism which claims that the effect of falsification is felt on a conjunction of hypotheses. The Duhem-Quine problem claims that there is no principled way of knowing how falsification affects individual conjuncts. This latter claim relies on holism and an additional commitment to the hypothetico-deductive model of theory confirmation, such that it need not arise in non-deductive accounts. While existing personalist Bayesian treatments of the problem make this point by assuming values of priors for the conjuncts, we arrive at the same conclusion without invoking such assumptions. Our discussion focuses on the falsification of equiprobable conjuncts and highlights the role played by their alternatives in ascertaining their relative disconfirmation. The equiprobability of conjuncts is discussed alongside a historical case study.
- Published
- 2018
29. On a Dynamical Approach to Some Prime Number Sequences
- Author
-
Bartolo Luque, Lucas Lacasa, Ignacio Cadavid Gómez, and Octavio Miramontes
- Subjects
Dynamical systems theory, chaos, Symbolic dynamics, General Physics and Astronomy, Dynamical Systems (math.DS), Equiprobability, Prime gap, Number Theory (math.NT), complex systems, Mathematics, Discrete mathematics, Prime number, Chaos game, nonlinearity, gap residues, Divisibility rule, Number theory, entropy, fractals
- Abstract
In this paper we show how the cross-disciplinary transfer of techniques from Dynamical Systems Theory to Number Theory can be a fruitful avenue for research. We illustrate this idea by exploring from a nonlinear and symbolic dynamics viewpoint certain patterns emerging in some residue sequences generated from the prime number sequence. We show that the sequence formed by the residues of the primes modulo $k$ is maximally chaotic and, while lacking forbidden patterns, displays a non-trivial spectrum of Renyi entropies which suggests that every block of size $m>1$, while admissible, occurs with different probability. This non-uniform distribution of blocks for $m>1$ contrasts Dirichlet's theorem that guarantees equiprobability for $m=1$. We then explore in a similar fashion the sequence of prime gap residues. This sequence is again chaotic (positivity of Kolmogorov-Sinai entropy); however, chaos is weaker, as we find forbidden patterns for every block of size $m>1$. We relate the onset of these forbidden patterns with the divisibility properties of integers, and estimate the densities of gap block residues via the Hardy-Littlewood $k$-tuple conjecture. We use this estimation to argue that the amount of admissible blocks is non-uniformly distributed, which supports the fact that the spectrum of Renyi entropies is again non-trivial in this case. We complete our analysis by applying the Chaos Game to these symbolic sequences, and comparing the IFS attractors found for the experimental sequences with appropriate null models.
- Published
- 2018
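The contrast drawn in the abstract above (equiprobable single residues by Dirichlet's theorem versus non-uniform blocks of length m > 1) is easy to reproduce empirically. The sketch below uses a plain sieve, the illustrative choice k = 4 and Shannon block entropies; it is a quick empirical check, not the paper's analysis.

```python
from collections import Counter
from math import log2

def primes_up_to(n):
    """Simple sieve of Eratosthenes."""
    sieve = bytearray([1]) * (n + 1)
    sieve[0:2] = b"\x00\x00"
    for i in range(2, int(n ** 0.5) + 1):
        if sieve[i]:
            sieve[i * i::i] = bytearray(len(sieve[i * i::i]))
    return [i for i, is_p in enumerate(sieve) if is_p]

def block_entropy(seq, m):
    """Shannon entropy (bits) of the empirical distribution of length-m blocks."""
    blocks = Counter(tuple(seq[i:i + m]) for i in range(len(seq) - m + 1))
    total = sum(blocks.values())
    return -sum(c / total * log2(c / total) for c in blocks.values()), blocks

k = 4                                         # modulus (illustrative choice)
residues = [p % k for p in primes_up_to(2_000_000) if p > k]
h1, blocks1 = block_entropy(residues, 1)
h2, blocks2 = block_entropy(residues, 2)
print("single residues:", dict(blocks1))      # roughly equiprobable (Dirichlet)
print("H1 =", round(h1, 4), "  H2 per symbol =", round(h2 / 2, 4))
print("pairs:", dict(sorted(blocks2.items())))  # visibly non-uniform block counts
```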
30. Prospective Teachers’ Probabilistic Reasoning in the Context of Sampling
- Author
-
José Miguel Contreras, Carmen Díaz, Juan Jesús Ortiz, and Emilse Gómez-Torres
- Subjects
Equiprobability, Sample size determination, Proportional reasoning, Statistics, Population, Probabilistic logic, Primary education, Population proportion, Psychology, Education, Representativeness heuristic
- Abstract
In this paper, we analyse the knowledge of sampling in 157 prospective primary school teachers in Spain. Using two different tasks, and taking into account common and horizon content knowledge (described in the model proposed by Ball et al. in J Teacher Educ 59:389–407, 2008), we assess the teachers’ understanding of the following concepts: population and sample, frequency, proportion, estimation, variability of estimates, and the effect of sample size on this variability. Our results suggest that these prospective teachers have correct intuitions when estimating the sample proportion when the population proportion is known. However, they tend to confuse samples and populations, sometimes fail to apply proportional reasoning, misinterpret unpredictability, and show the representativeness heuristic and the equiprobability bias.
- Published
- 2018
31. Using and Interpreting the Probability Calculus
- Author
-
Matthew D. Lund
- Subjects
Equiprobability, Calculus, Probability calculus, Formal system, Mathematics
- Abstract
We have been discussing some of the fundamental features of the classical calculus of probability. The equiprobability of rival events was seen to be a major assumption of the calculus. Moreover, it is an assumption which the pure mathematician need not bother to justify. He need only present his formal system as follows
- Published
- 2018
32. How do high school students solve probability problems? A mixed methods study on probabilistic reasoning
- Author
-
Patrick Onghena, Maarten Deleye, Wim Van Dooren, Mieke Heyvaert, and Lore Saenen
- Subjects
mixed methods research, Logical reasoning, Management science, Multimethodology, Probabilistic logic, Representativeness heuristic, Outcome (probability), Education, initiation, Equiprobability, expansion, Mathematics education, Cluster sampling, Set (psychology), Psychology, development, probabilistic reasoning
- Abstract
When studying a complex research phenomenon, a mixed methods design allows one to answer a broader set of research questions and to tap into different aspects of this phenomenon, compared to a monomethod design. This paper reports on how a sequential equal status design (QUAN → QUAL) was used to examine students' reasoning processes when solving probability problems. Aselect clustered sampling resulted in the inclusion of 168 high school students in a first, quantitative phase, in which a questionnaire was used to assess how they solved probability problems. This questionnaire included probability items that were based on the outcome orientation, the representativeness misconception, and the equiprobability bias. In a second, qualitative phase, 18 students who were purposefully sampled from the first research phase were interviewed in order to study their probabilistic reasoning processes in depth. In this paper we illustrate and discuss how several mixed methods research purposes were realized throughout our study: development, expansion, and initiation. Published in the International Journal of Research & Method in Education, 41(2), 184–206.
- Published
- 2018
33. An extended TODIM method under probabilistic dual hesitant fuzzy information and its application on enterprise strategic assessment
- Author
-
Zhiliang Ren, Zeshui Xu, and Hai Wang
- Subjects
Mathematical optimization, Fuzzy set, Probabilistic logic, Score, Fuzzy logic, Equiprobability, Decision-making, Uncertainty quantification, Axiom
- Abstract
In this paper, an extended TODIM method under the probabilistic dual hesitant fuzzy environment is proposed, based on a revised score function and an equiprobability distance measure. The TODIM method can deal with multi-criteria decision making problems while considering the decision makers' psychological behavior. The probabilistic dual hesitant fuzzy set (PDHFS) is a very useful tool for handling the uncertainty in the decision making process, owing to its ability to describe aleatory uncertainty and epistemic uncertainty simultaneously in a single framework. A revised score function of the probabilistic dual hesitant fuzzy element (PDHFE) is proposed to distinguish different probabilistic dual hesitant fuzzy information. In addition, we give an axiomatic definition of the distance measure for PDHFEs and propose an equiprobability distance measure, which is more consistent with intuition. Finally, we develop a new TODIM method and use a numerical case on enterprise strategic assessment to show its effectiveness and availability.
- Published
- 2017
34. Quantifying Changes in Reconnaissance Drought Index using Equiprobability Transformation Function
- Author
-
S. Zahra Samadi, Abolfazl Mosaedi, Mohammad Ghabaei Sough, and Hamid Zare Abyaneh
- Subjects
Equiprobability, Transformation (function), Goodness of fit, Threshold limit value, Evapotranspiration, Log-normal distribution, Statistics, Econometrics, Probability distribution, Probability density function, Water Science and Technology, Civil and Structural Engineering, Mathematics
- Abstract
The Reconnaissance Drought Index (RDI) is obtained by fitting a lognormal probability density function (PDF) to the ratio of accumulated precipitation to potential evapotranspiration (αk) at different time scales. This paper addresses the questions of whether another probability distribution may fit the αk values better than the lognormal distribution and how RDI values may change at shorter (i.e., 3-month and 6-month) and longer (i.e., 9-month and annual) time scales during the 1960–2010 period over various climate conditions (arid, semi-arid, and humid) in Iran. For this purpose, the series of RDI were initially computed by fitting a lognormal PDF to the αk values, and the Kolmogorov–Smirnov (K-S) test was implemented to choose the best probability function for each window size from 3 to 12 months. The corresponding RDI values for the best distribution were then derived based on an equiprobability transformation function. The differences between the RDI values for the lognormal (RDIlog) and the best (RDIApp) distributions were compared based on the Nash-Sutcliffe efficiency (NSE) criterion. The goodness-of-fit results based on the threshold value of the K-S test showed that the lognormal fit could not be rejected at the 0.01 and 0.05 significance levels, and could be rejected only for a short-term (Apr.-Jun.) period at the humid station (Rasht) and for three-month (Oct.-Dec. and Apr.-Jun.) and six-month (Apr.-Sep.) periods at the semi-arid station (Shiraz) at significance levels of 0.10 and 0.20, respectively. Furthermore, the difference between RDIlog and RDIApp showed that RDI values may change if the best-fitting distribution is employed, which may therefore lead to significant discrepancies and/or displacements of drought severity classes in the RDI estimation.
- Published
- 2015
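The equiprobability transformation referred to in the abstract above maps each fitted cumulative probability to the standard normal quantile with the same probability. A minimal scipy sketch follows; the gamma alternative and the synthetic alpha_k series are assumptions of this sketch, not choices reported in the paper.

```python
import numpy as np
from scipy import stats

def rdi_standardized(alpha, dist):
    """Equiprobability transformation: fit `dist` to the alpha_k = sum(P)/sum(PET)
    values, then send each value through the fitted CDF and the inverse standard
    normal, so the standardized RDI keeps the same cumulative probability."""
    params = dist.fit(alpha, floc=0)            # keep location at 0 for ratio data
    cdf = np.clip(dist.cdf(alpha, *params), 1e-6, 1 - 1e-6)  # guard the tails
    return stats.norm.ppf(cdf)

# Synthetic alpha_k series (annual precipitation over PET ratios), purely illustrative.
rng = np.random.default_rng(42)
alpha = rng.lognormal(mean=-0.2, sigma=0.5, size=50 * 12).reshape(50, 12).mean(axis=1)

rdi_log = rdi_standardized(alpha, stats.lognorm)   # conventional lognormal fit
rdi_gam = rdi_standardized(alpha, stats.gamma)     # an alternative candidate
print("largest |difference| between the two RDI series:",
      float(np.abs(rdi_log - rdi_gam).max()))
```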
35. El sesgo de equiprobabilidad como dificultad para comprender la incertidumbre en futuros docentes argentinos. [The equiprobability bias as a difficulty in understanding uncertainty among prospective Argentine teachers.]
- Author
-
Cardeñoso Domingo, José María, Moreno, Amable, García González, Esther, and Jiménez Fontana, Rocío
- Abstract
This paper describes the use of the equiprobability bias, which has been identified in the reasoning of 908 prospective secondary school teachers of Mathematics and Biology in the province of Mendoza, Argentina. We analyse the arguments justifying the responses given to twelve research questions in which students are asked to justify the assignment of a certain degree of probability to a given phenomenon and the events that arise. The situations were selected taking into account the content of the item and the degree of probability, so that the items used are representative of the contexts of games, daily life and the physical-natural world.
- Published
- 2017
36. Minimum yield principle under incomplete prediction of financial markets
- Author
-
G. A. Agasandyan
- Subjects
Equiprobability ,Microeconomics ,Yield (finance) ,Financial market ,Economics ,Econometrics ,Portfolio ,Function (mathematics) ,Investment (macroeconomics) ,Beta distribution ,Exponential function - Abstract
The work investigates the properties of the solutions derived from the minimum yield principle in problems of constructing a portfolio that is optimal under the continuous VaR criterion (CC-VaR) for an investor with a partial market forecast of their own and their own risk-preference function. Fundamental theoretical results are presented and illustrated by examples of two-sided exponential, equiprobability, and beta distributions, for both the underlying's price and the market forecast.
- Published
- 2017
37. High-order evaluation and modelling of cross-polarization discrimination on earth-satellite propagation paths at Ka and V-bands
- Author
-
Armando Rocha, Carlo Riva, and Flavio Jorge
- Subjects
Computer Networks and Communications ,XPD ,02 engineering and technology ,Interference (wave propagation) ,Data modeling ,Equiprobability ,modelling ,depolarization ,0202 electrical engineering, electronic engineering, information engineering ,High order ,Instrumentation ,attenuation ,Remote sensing ,Physics ,Attenuation ,Cumulative distribution function ,020208 electrical & electronic engineering ,020206 networking & telecommunications ,Spectral efficiency ,measurements ,satellite propagation ,Safety Research ,Signal Processing ,Computational physics ,Path (graph theory) - Abstract
The performance of satellite communication systems employing either polarization diversity or frequency-reuse schemes to improve spectral efficiency is degraded by the depolarization-induced interference originating from raindrops and ice particles along the Earth-satellite propagation path. Two models account for both rain and ice contributions: the first enables prediction of the long-term first-order statistics (Cumulative Distribution Function, CDF) of cross-polarization discrimination (XPD), and the second enables prediction of the relationship between XPD and co-polar attenuation (CPA). The second was developed exclusively from V-band data, so it requires not only independent validation but also extension to other frequency bands. The predictions of the first are usually converted into the corresponding XPD-CPA relationship using the long-term first-order statistics of rain attenuation, (incorrectly) assuming that the equiprobability hypothesis applies. Using 8 years of measurements, both models are tested, a new XPD-CPA relationship is proposed for the Ka-band, and a novel approach to converting the CDF of XPD into the corresponding XPD-CPA relationship is presented.
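A minimal sketch of the equiprobability conversion mentioned above (the one the authors flag as commonly but incorrectly assumed): values of XPD and CPA exceeded with the same probability are paired to form an XPD-CPA curve. Python with NumPy is assumed, and the synthetic samples and probability levels below are illustrative only.

```python
import numpy as np

def equiprobable_xpd_cpa(xpd_samples, cpa_samples, probs=None):
    """Pair XPD and CPA values exceeded with the same probability
    (the 'equiprobability' conversion from first-order statistics)."""
    if probs is None:
        probs = np.array([0.01, 0.1, 1.0, 10.0]) / 100.0   # exceedance probabilities
    # XPD degrades (gets smaller) in rain, so a low XPD level is reached rarely:
    xpd_levels = np.quantile(xpd_samples, probs)            # low quantiles of XPD
    cpa_levels = np.quantile(cpa_samples, 1.0 - probs)      # high quantiles of CPA
    return cpa_levels, xpd_levels

# Synthetic data, purely for demonstration of the pairing
rng = np.random.default_rng(0)
cpa = rng.gamma(shape=1.5, scale=2.0, size=100_000)                       # attenuation samples (dB)
xpd = 40.0 - 12.0 * np.log10(1.0 + cpa) + rng.normal(0, 2, cpa.size)      # XPD samples (dB)
cpa_levels, xpd_levels = equiprobable_xpd_cpa(xpd, cpa)
print(np.c_[cpa_levels, xpd_levels])                                      # equiprobable XPD-CPA pairs
```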
- Published
- 2017
38. Exact Algorithms for the Multinomial Extremes: Maximum, Minimum, Range and Sums
- Author
-
Anton Ogay, Marco Bonetti, and Pasquale Cirillo
- Subjects
Equiprobability ,symbols.namesake ,Exact algorithm ,Distribution (mathematics) ,Multivariate random variable ,Order statistic ,symbols ,Range (statistics) ,Multinomial distribution ,Poisson distribution ,Algorithm ,Mathematics - Abstract
Starting from a neglected work by Rappeport (1968), we re-propose an exact algorithm to compute the distribution of the maximum of a multinomial random vector under the hypothesis of equiprobability. We then show how to compute the exact probabilities of the sum of the J largest order statistics of the vector, following the suggestions and correcting the errors in the same article. Finally, we introduce brand new ways of computing the exact probabilities of the multinomial minimum and of the multinomial range. The exact probabilities we derive can be used in all those situations in which the multinomial distribution plays an important role, from goodness-of-fit tests to the study of Poisson processes, with applications spanning from biostatistics to finance. For all algorithms, we provide Matlab codes and ready-to-use tables of critical values.
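The following is not the authors' (or Rappeport's) algorithm, only a small self-contained sketch of the quantity involved: under equiprobability, P(max ≤ m) for a multinomial with n trials and k cells equals n!/k^n times the coefficient of x^n in (Σ_{j=0}^{m} x^j/j!)^k, which can be computed exactly with rational arithmetic (hypothetical Python).

```python
from fractions import Fraction
from math import factorial

def multinomial_max_cdf(n, k, m):
    """P(max cell count <= m) for Multinomial(n; 1/k, ..., 1/k), computed exactly as
    n!/k^n * [x^n] (sum_{j=0}^{m} x^j/j!)^k."""
    base = [Fraction(1, factorial(j)) for j in range(min(m, n) + 1)]
    poly = [Fraction(1)]                          # constant polynomial 1
    for _ in range(k):                            # multiply the truncated exponential k times
        new = [Fraction(0)] * min(len(poly) + len(base) - 1, n + 1)
        for i, a in enumerate(poly):
            for j, b in enumerate(base):
                if i + j <= n:
                    new[i + j] += a * b
        poly = new
    coeff = poly[n] if n < len(poly) else Fraction(0)
    return coeff * factorial(n) / Fraction(k) ** n

# Sanity check: with m >= n the maximum constraint is vacuous, so the CDF is 1.
assert multinomial_max_cdf(5, 3, 5) == 1
print(float(multinomial_max_cdf(10, 4, 4)))   # P(max <= 4) for 10 balls in 4 equiprobable cells
```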
- Published
- 2017
39. TEACHING PROBABILITY WITH THE SUPPORT OF THE R STATISTICAL SOFTWARE
- Author
-
Monica Karrer, Verônica Yumi Kataoka, and Robson Dos Santos Ferreira
- Subjects
Statistics and Probability ,Theoretical computer science ,Computer science ,Teaching method ,media_common.quotation_subject ,Probabilistic logic ,Context (language use) ,Tree diagram ,Literacy ,Education ,Equiprobability ,Task analysis ,Constructionism ,Mathematics education ,media_common - Abstract
The objective of this paper is to discuss aspects of high school students' learning of probability in a context where they are supported by the statistical software R. We report on the application of a teaching experiment constructed from the perspectives of Gal's probabilistic literacy and Papert's constructionism. The results show improvement in students' learning of basic concepts such as random experiments, estimation of probabilities, and calculation of probabilities using a tree diagram. The use of R allowed students to extend their reasoning beyond what paper-and-pencil approaches afford, since it made it possible for them to work with a larger number of simulations and to go beyond the standard equiprobability assumption in coin tosses. First published November 2014 in the Statistics Education Research Journal Archives.
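The study itself used R; purely as an illustrative sketch (in Python, to keep one language for the examples added to this listing), this is the kind of simulation the abstract alludes to: estimating the probability of heads for a biased coin from many simulated tosses rather than assuming equiprobability.

```python
import random

def estimate_heads_probability(p_heads=0.7, n_tosses=100_000, seed=1):
    """Estimate P(heads) for a (possibly biased) coin from simulated tosses,
    instead of assuming the equiprobable 0.5."""
    rng = random.Random(seed)
    heads = sum(rng.random() < p_heads for _ in range(n_tosses))
    return heads / n_tosses

print(estimate_heads_probability())   # close to 0.7, not 0.5
```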
- Published
- 2014
40. The Real ‘Letter to Arbuthnot’? a Motive For Hume's Probability Theory in an Early Modern Design Argument
- Author
-
Catherine Kemp
- Subjects
Value (ethics) ,Game of chance ,Equiprobability ,Philosophy ,Extension (metaphysics) ,Probability theory ,Teleological argument ,Natural (music) ,Epistemology - Abstract
John Arbuthnot's celebrated but flawed paper in the Philosophical Transactions of 1711–12 is a philosophically and historically plausible target of Hume's probability theory. Arbuthnot argues for providential design rather than chance as a cause of the annual birth ratio, and the paper was championed as a successful extension of the new calculations of the value of wagers in games of chance to wagers about natural and social phenomena. Arbuthnot replaces the earlier anti-Epicurean notion of chance with the equiprobability assumption of Huygens's mathematics of games of chance, and misrepresents the birth ratio data to rule out chance in favour of design. The probability sections of Hume's Treatise taken together correct the equiprobability assumption and its extension to other kinds of phenomena in the estimation of wagers or expectations about particular events. Hume's probability theory demonstrates the flaw in this version of the design argument.
- Published
- 2014
41. The Effect of Activity-Based Teaching on Remedying the Probability-Related Misconceptions: A Cross-Age Comparison
- Author
-
Emrullah Erdem, Selçuk Fırat, and Ramazan Gürbüz
- Subjects
Equiprobability ,Computer science ,Intervention (counseling) ,education ,Significant difference ,Mathematics education ,Experimental Instructions ,General Medicine ,Session (computer science) ,Representativeness heuristic ,Developmental psychology - Abstract
The aim of this paper is to compare the effect of activity-based teaching on remedying probability-related misconceptions among students at different grade levels. A cross-sectional (cross-age) study was conducted with a total of 74 students in grades 6-8. Experimental instruction was given to all groups three times per week, in 40-minute sessions, for two weeks. Students' progress was examined with pre-test and post-test measurements. The analysis showed that, as a result of the intervention, all grades' post-test scores on all the concepts (PC: Probability Comparison, E: Equiprobability, and R: Representativeness) increased significantly compared to their pre-test scores. This increase did not produce a significant age-related difference for the PC concept, but 8th-grade students showed a significantly larger gain than 6th graders on the E and R concepts. The differences between 7th and 8th graders on the E and R concepts were not significant. In summary, the implemented intervention appears to have different effects depending on age and concept.
- Published
- 2014
42. A consistent set of infinite-order probabilities
- Author
-
David Atkinson, Jeanne Peijnenburg, Faculty of Philosophy, Theoretical Philosophy, and High-Energy Frontier
- Subjects
Higher-order probability ,Chain rule (probability) ,Infinite regress ,Applied Mathematics ,Law of total probability ,Conditional probability ,Symmetric probability distribution ,Tree diagram ,Theoretical Computer Science ,Combinatorics ,Equiprobability ,Regular conditional probability ,Artificial Intelligence ,Probability distribution ,Applied mathematics ,Software ,Model ,Mathematics - Abstract
Some philosophers have claimed that it is meaningless or paradoxical to consider the probability of a probability. Others have however argued that second-order probabilities do not pose any particular problem. We side with the latter group. On condition that the relevant distinctions are taken into account, second-order probabilities can be shown to be perfectly consistent. May the same be said of an infinite hierarchy of higher-order probabilities? Is it consistent to speak of a probability of a probability, and of a probability of a probability of a probability, and so on, ad infinitum? We argue that it is, for it can be shown that there exists an infinite system of probabilities that has a model. In particular, we define a regress of higher-order probabilities that leads to a convergent series which determines an infinite-order probability value. We demonstrate the consistency of the regress by constructing a model based on coin-making machines. We show that an infinite hierarchy of probabilities of probabilities is consistent. The proof consists in a model involving coin-making machines. Weak conditions are given for the convergence of the infinite system.
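A minimal numerical sketch of how such a regress can converge (an assumed uniform regress, not the authors' coin-making-machine model): if every level satisfies P(S_n) = a·P(S_{n+1}) + b·(1 − P(S_{n+1})) with fixed 0 ≤ b < a < 1, iterating the recursion from any starting value converges to the fixed point b/(1 − a + b), so an infinite-order probability value is well defined.

```python
def infinite_order_probability(a=0.9, b=0.1, start=0.99, n_levels=200):
    """Iterate P_n = a*P_{n+1} + b*(1 - P_{n+1}) down an (assumed uniform) regress
    of higher-order probabilities; converges to b/(1 - a + b) when |a - b| < 1."""
    p = start
    for _ in range(n_levels):
        p = a * p + b * (1 - p)
    return p

print(infinite_order_probability())   # ~0.5
print(0.1 / (1 - 0.9 + 0.1))          # analytic fixed point, also 0.5
```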
- Published
- 2013
43. Contractarian ethics and Harsanyi’s two justifications of utilitarianism
- Author
-
Michael Moehler
- Subjects
Economics and Econometrics ,Sociology and Political Science ,Welfare economics ,Rationality ,Rational agent ,Contractualism ,Equiprobability ,Philosophy ,Meaning (philosophy of language) ,Original position ,Utilitarianism ,Economics ,Mathematical economics ,Axiom - Abstract
Harsanyi defends utilitarianism by means of an axiomatic proof and by what he calls the ‘equiprobability model’. Both justifications of utilitarianism aim to show that utilitarian ethics can be derived from Bayesian rationality and some weak moral constraints on the reasoning of rational agents. I argue that, from the perspective of Bayesian agents, one of these constraints, the impersonality constraint, is not weak at all if its meaning is made precise and that generally it even contradicts individual rational agency. Without the impersonality constraint, Harsanyi’s two justifications of utilitarianism on the grounds of Bayesian rationality fail. As an alternative, I develop a contractarian framework that is compatible with individual rational agency and Harsanyi’s central assumptions, and that allows the derivation of moral conclusions on the grounds of Bayesian rationality. The developed framework offers a novel justification of contractarian ethics and may best be described as a combined version of Harsanyi’s equiprobability model and Rawls’s original position.
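For readers unfamiliar with the term, the "equiprobability model" referred to here can be summarized in its standard textbook form (a generic statement, not a quotation from the paper): a social alternative x is evaluated by the expected utility of an impartial observer who has an equal probability 1/n of occupying any of the n individual positions.

```latex
% Harsanyi-style equiprobability evaluation of a social alternative x
% among n individuals with utility functions U_1, \dots, U_n (textbook form)
W(x) \;=\; \sum_{i=1}^{n} \frac{1}{n}\, U_i(x) \;=\; \frac{1}{n} \sum_{i=1}^{n} U_i(x).
```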
- Published
- 2013
44. Maximal entropy random walk in community detection
- Author
-
Zdzislaw Burda and Jeremi K. Ochab
- Subjects
Equiprobability ,Heterogeneous random walk in one dimension ,Maximal entropy ,Computer science ,Loop-erased random walk ,Stochastic matrix ,General Physics and Astronomy ,Entropy (information theory) ,General Materials Science ,Physical and Theoretical Chemistry ,Complex network ,Random walk ,Algorithm - Abstract
The aim of this paper is to check the feasibility of using the maximal-entropy random walk in algorithms for finding communities in complex networks. A number of such algorithms exploit an ordinary or a biased random walk for this purpose. Their key part is a (dis)similarity matrix, according to which nodes are grouped. This study encompasses the use of a stochastic matrix of a random walk, its mean first-passage time matrix, and a matrix of weighted path counts. We briefly indicate the connection between those quantities and propose substituting the maximal-entropy random walk for the previously chosen models. This unique random walk maximises the entropy of ensembles of paths of given length and endpoints, which results in equiprobability of those paths. We compare the performance of the selected algorithms on LFR benchmark graphs. The results show that the change in performance depends very strongly on the particular algorithm, and can lead to slight improvements as well as to significant deterioration.
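A minimal sketch of the maximal-entropy random walk itself (assuming NumPy; the example graph is made up): for an undirected graph with adjacency matrix A, leading eigenvalue λ and leading eigenvector ψ, the transition matrix S_ij = (A_ij/λ)(ψ_j/ψ_i) makes all paths of a given length and given endpoints equiprobable.

```python
import numpy as np

def merw_transition_matrix(A):
    """Maximal-entropy random walk on an undirected graph:
    S[i, j] = A[i, j] / lam * psi[j] / psi[i],
    where lam, psi are the leading eigenvalue/eigenvector of A."""
    A = np.asarray(A, dtype=float)
    eigvals, eigvecs = np.linalg.eigh(A)     # symmetric adjacency matrix
    lam = eigvals[-1]                        # largest eigenvalue
    psi = np.abs(eigvecs[:, -1])             # Perron vector, positive for connected graphs
    S = (A / lam) * np.outer(1.0 / psi, psi)
    return S

# Small example: a path graph on 4 nodes; each row of S sums to 1.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]])
S = merw_transition_matrix(A)
print(S.sum(axis=1))   # ~[1, 1, 1, 1]
```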
- Published
- 2013
45. Modelling Information by Probabilities
- Author
-
Carmen Batanero and Manfred Borovcnik
- Subjects
Equiprobability ,Interpretation (logic) ,Computer science ,Frequentist inference ,Conditional probability ,Experimental data ,Mathematical economics ,Frequency ,Independence (probability theory) ,Central limit theorem - Abstract
Probability embraces a cluster of ideas that help us to make predictions and judgements by modelling random situations suitably. Ideas such as experimental data, weight of uncertainty, and equiprobability contribute towards the concept of probability. The concept of independence is a basic prerequisite for the frequentist interpretation, whilst conditional probabilities are essential to adapt personal weights in view of new information.
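Purely as an illustrative sketch of the last point, adapting personal weights in view of new information (the numbers below are made up), a Bayes update of a prior weight using conditional probabilities:

```python
def bayes_update(prior, likelihood_if_true, likelihood_if_false):
    """Adapt a personal probability ('weight') for a hypothesis given new information."""
    numerator = prior * likelihood_if_true
    return numerator / (numerator + (1 - prior) * likelihood_if_false)

print(bayes_update(0.5, 0.9, 0.2))   # prior 0.5 updated to ~0.818
```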
- Published
- 2016
46. Asymptotic equidistribution of congruence classes with respect to the convolution iterates of a probability vector
- Author
-
Gilles Gnacadja
- Subjects
Statistics and Probability ,Discrete mathematics ,Doubly stochastic matrix ,Combinatorics ,Equiprobability ,Integer ,Iterated function ,Congruence (manifolds) ,Statistics, Probability and Uncertainty ,Circulant matrix ,Probability vector ,Convolution ,Mathematics - Abstract
Consider a positive integer d and a positive probability vector f over the numbers 0, …, l. The n-fold convolution f*n of f is a probability vector over the numbers 0, …, nl, and these can be partitioned into congruence classes modulo d. The main result of this paper is that, asymptotically in n, these d congruence classes have equiprobability 1/d. In the motivating application, one has N containers of capacity d and repeatedly retrieves one item from each of M randomly selected containers (0 < M ≤ N); containers are replenished to full capacity when emptied. The result implies that, over the long term, the average number of containers requiring replenishment per round is M/d. This finding is relevant wherever one is interested in the steady-state pace of replenishing fixed-capacity containers.
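A minimal numerical sketch of the equidistribution claim (assuming NumPy; the probability vector below is made up): build the n-fold convolution f*n by repeated convolution and sum its mass by congruence class modulo d; the class probabilities approach 1/d as n grows.

```python
import numpy as np

def congruence_class_masses(f, n, d):
    """Mass of each congruence class mod d under the n-fold convolution of f."""
    g = np.array([1.0])                      # distribution of an empty sum
    for _ in range(n):
        g = np.convolve(g, f)                # f*n built one convolution at a time
    masses = np.zeros(d)
    for value, p in enumerate(g):
        masses[value % d] += p
    return masses

f = np.array([0.2, 0.5, 0.3])               # positive probability vector on {0, 1, 2}
print(congruence_class_masses(f, 1, 4))     # far from uniform
print(congruence_class_masses(f, 50, 4))    # each class close to 1/4
```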
- Published
- 2012
47. The effect of computer-assisted teaching on remedying misconceptions: The case of the subject 'probability'
- Author
-
Ramazan Gürbüz, Osman Birgin, and Uşak Üniversitesi, Eğitim Fakültesi, Matematik ve Fen Bilimleri Eğitimi Bölümü
- Subjects
Secondary education ,General Computer Science ,Group study ,Computer science ,Interactive learning environments ,Representativeness heuristic ,Education ,Equiprobability ,Teaching/learning strategies ,Intervention (counseling) ,Improving classroom teaching ,Mathematics education ,Control (linguistics) ,Research method - Abstract
The aim of this study is to determine the effects of computer-assisted teaching (CAT) on remedying misconceptions students often have regarding some probability concepts in mathematics. Toward this aim, computer-assisted teaching materials were developed and used in the teaching process. Within a true-experimental research design, a pre- and post-test control group study was carried out with 37 seventh-grade students: 18 in the experimental group (CAT) and 19 in the control group (traditional teaching). A 12-item instrument, made up of 4 items related to each of the concepts "Probability Comparisons (PC)," "Equiprobability (E)," and "Representativeness (R)," was developed and administered to the participants. After the teaching intervention, the same instrument was again administered to both groups as a post-test. In light of the findings, it can be concluded that computer-assisted teaching was significantly more effective than traditional methods in remedying students' misconceptions. Highlights: We try to remedy misconceptions regarding probability. We design two different sets of computer-assisted teaching (CAT) materials. We assume that using the materials together will reduce each other's disadvantages. CAT is more effective than traditional teaching in remedying misconceptions.
- Published
- 2012
48. Making heads or tails of probability: An experiment with random generators
- Author
-
Sylvie Serpell, Simon J. Handley, and Kinga Morsanyi
- Subjects
education ,Probabilistic logic ,Sample (statistics) ,Cognition ,Representativeness heuristic ,Session (web analytics) ,Education ,Equiprobability ,Developmental and Educational Psychology ,Heuristics ,Psychology ,Social psychology ,Randomness ,Cognitive psychology - Abstract
Background. The equiprobability bias is a tendency for individuals to think of probabilistic events as 'equiprobable' by nature, and to judge outcomes that occur with different probabilities as equally likely. The equiprobability bias has been repeatedly found to be related to formal education in statistics, and it is claimed to be based on a misunderstanding of the concept of randomness. Aims. The aim of the present study was to examine whether experimenting with random generators would decrease the equiprobability bias. Sample. The participants were 108 psychology students whose performance was measured either immediately after taking part in a training session (n = 55) or without doing any training exercises (n = 53). Method. The training session consisted of four activities, including generating random sequences of events and learning about the law of large numbers. Subsequently, the participants were tested on a series of equiprobability problems, and on a number of other problems with similar structure and content. Results. The results indicated that the training successfully decreased the equiprobability bias. However, this effect was moderated by participants' cognitive ability (i.e., higher-ability participants benefitted from the training more than participants with lower cognitive ability). Finally, the training session had the unexpected side effect of increasing students' susceptibility to the representativeness heuristic. Conclusions. Experimenting with random generators has a positive effect on students' general understanding of probability, but at the same time it might increase their susceptibility to certain biases (especially the representativeness heuristic). These findings have important implications for using training methods to improve probabilistic reasoning performance.
- Published
- 2012
49. Dissecting Perceptual Processes with a New Tri-Stable Reversible Figure
- Author
-
Gerald M. Long and Jared M. Batterman
- Subjects
Male ,Volition ,Optical Illusions ,Optical illusion ,media_common.quotation_subject ,Experimental and Cognitive Psychology ,Small sample ,Fixation, Ocular ,Sensory Systems ,Equiprobability ,Ophthalmology ,Discrimination, Psychological ,Pattern Recognition, Visual ,Artificial Intelligence ,Orientation ,Perception ,Fixation (visual) ,Humans ,Attention ,Female ,Cues ,Percept ,Psychology ,media_common ,Cognitive psychology - Abstract
Five experiments are presented that examine observers' reports with a new tri-stable reversible figure using two measures of observers' experience with the figure: observers' initial percept upon figure presentation in the test period and the total number of reversals reported in the test period. Experiment 1 demonstrates the equiprobability of the three alternatives for the figure. Experiment 2 demonstrates the powerful effect of fixation location on observers' reported organization of the tri-stable figure. Experiment 3 demonstrates clear priming effects following brief presentation of particular components of the tri-stable figure. Experiment 4 demonstrates clear adaptation effects following prolonged presentation of the same components of the figure used in experiment 3 as well as the transient nature of this adaptation. Experiment 5 demonstrates observers' ability to “hold” each of the three percepts regardless of fixation location. The special sensitivity of the tri-stable figure to these manipulations even with naive subjects and small sample sizes is discussed, and the interplay of both bottom–up and top–down processes on figural reversal is emphasized.
- Published
- 2012
50. Perception of probabilities in situations of risk: A case based approach
- Author
-
Gabrielle Gayer
- Subjects
Economics and Econometrics ,media_common.quotation_subject ,Law of total probability ,Function (mathematics) ,Equiprobability ,Perception ,Mental process ,Statistics ,Similarity (psychology) ,Econometrics ,Value (mathematics) ,Finance ,Mathematics ,media_common ,Simple (philosophy) - Abstract
This paper provides a description of a possible mental process individuals go through in their attempt to comprehend stated probabilities in simple lotteries. The evaluation of probabilities is based on the following main components: lotteries encountered in the past, the realizations of these lotteries, and the similarity between stated probabilities. A probability is evaluated based on the experienced relative frequencies of outcomes that had that stated probability, as well as outcomes of other lotteries that had similar stated probabilities. This process may result in distortion of probabilities as observed in the literature, and in particular, in overvaluing low probabilities and undervaluing high probabilities. If the decision maker uses a less permissive similarity function as the size of memory grows, she will learn the real value of the stated probabilities. If, however, the similarity function is independent of memory, biases persist even when data are accumulated.
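A minimal sketch of the kind of similarity-weighted frequency evaluation the abstract describes (the Gaussian kernel, bandwidth, and data are assumptions for illustration, not Gayer's specification): a stated probability is evaluated by averaging past outcomes of lotteries whose stated probabilities were similar.

```python
import numpy as np

def perceived_probability(stated_p, past_probs, past_outcomes, bandwidth=0.1):
    """Similarity-weighted relative frequency of past outcomes (1 = event occurred)
    for lotteries whose stated probabilities were close to stated_p."""
    past_probs = np.asarray(past_probs, dtype=float)
    past_outcomes = np.asarray(past_outcomes, dtype=float)
    weights = np.exp(-((past_probs - stated_p) / bandwidth) ** 2)   # assumed Gaussian similarity
    return float(np.sum(weights * past_outcomes) / np.sum(weights))

# With a permissive (wide) similarity kernel, a low stated probability is overvalued.
rng = np.random.default_rng(0)
probs = rng.uniform(0, 1, 2000)
outcomes = rng.uniform(0, 1, 2000) < probs        # each lottery resolves with its stated probability
print(perceived_probability(0.05, probs, outcomes, bandwidth=0.3))  # noticeably above 0.05
```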
- Published
- 2010