389 results
Search Results
2. Diffusing Wave Microrheology in Polymeric Fluids.
- Author
-
Phillies, George David Joseph
- Subjects
POLYMER solutions, DISTRIBUTION (Probability theory), COMPLEX fluids, STATISTICAL correlation, DIFFUSION coefficients - Abstract
Recently, there has been interest in determining the viscoelastic properties of polymeric liquids and other complex fluids by means of Diffusing Wave Spectroscopy (DWS). In this technique, light-scattering spectroscopy is applied to highly turbid fluids containing optical probe particles. The DWS spectrum is used to infer the time-dependent mean-square displacement and time-dependent diffusion coefficient D of the probes. From D, values for the storage modulus G′(ω) and the loss modulus G″(ω) are obtained. This paper is primarily concerned with the inference of the mean-square displacement from a DWS spectrum. However, in much of the literature, the inference said to yield D rests on the Gaussian approximation g^(1)(t) = exp(−2q²⟨X(t)²⟩) for the field correlation function g^(1)(t) of the scattered light in terms of the mean-square displacement ⟨X(t)²⟩ of a probe particle during time t. Experiment and simulation both show that the Gaussian approximation is invalid for probes in polymeric liquids and other complex fluids. In this paper, we obtain corrections to the Gaussian approximation that will assist in interpreting DWS spectra of probes in polymeric liquids. The corrections reveal that these DWS spectra receive contributions from higher moments ⟨X(t)^(2n)⟩, n > 1, of the probe displacement distribution function. [ABSTRACT FROM AUTHOR]
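The failure of the Gaussian approximation described in this abstract can be illustrated numerically. For a non-Gaussian displacement distribution (here, a hypothetical two-component Gaussian mixture, e.g. "fast" and "slow" probe populations; the parameters are illustrative, not from the paper), the exact field correlation function ⟨exp(iqX)⟩ differs from the single-scattering Gaussian form exp(−q²⟨X²⟩/2) (prefactor conventions vary between papers), because higher moments of the displacement distribution contribute:

```python
import math
import random

random.seed(0)

# Hypothetical non-Gaussian displacement model: a mixture of two
# Gaussians ("slow" and "fast" probe populations), which has excess kurtosis.
def sample_displacement():
    if random.random() < 0.5:
        return random.gauss(0.0, 0.5)   # slow population
    return random.gauss(0.0, 2.0)       # fast population

q = 1.0
X = [sample_displacement() for _ in range(200_000)]

msd = sum(x * x for x in X) / len(X)                 # mean-square displacement
g1_exact = sum(math.cos(q * x) for x in X) / len(X)  # <exp(iqX)> = <cos(qX)> by symmetry
g1_gauss = math.exp(-q * q * msd / 2)                # Gaussian approximation

print(f"exact g1    = {g1_exact:.3f}")
print(f"Gaussian g1 = {g1_gauss:.3f}")  # differs: higher moments matter
```

For this mixture the exact value (about 0.51) sits well above the Gaussian prediction (about 0.35), which is the kind of discrepancy the paper's corrections are meant to capture.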
- Published
- 2024
- Full Text
- View/download PDF
3. Observations of SO2 and NO2 by mobile DOAS in the Guangzhou Eastern Area during the Asian Games 2010.
- Author
-
Wu, F. C., Xie, P. H., Li, A., Chan, K. L., Hartl, A., Wang, Y., Si, F. Q., Zeng, Y., Qin, M., Xu, J., Liu, J. G., Liu, W. Q., and Wenig, M.
- Subjects
SULFUR oxides, LIGHT absorption, DISTRIBUTION (Probability theory), STATISTICAL correlation, ASIAN Games - Abstract
Mobile Differential Optical Absorption Spectroscopy measurements of SO2 and NO2 were performed in the Guangzhou Eastern Area (GEA) during the Guangzhou Asian Games 2010 from November 2010 to December 2010. Spatial and temporal distributions of SO2 and NO2 in this area were obtained and emission sources were determined by using wind field data. The NO2 vertical column densities were found to agree with OMI values. The correlation coefficient (R²) was 0.88 after cloud filtering. During the Guangzhou Asian Games and Asian Paralympics (Para) Games, the SO2 and NO2 emissions in the area were quantified using averaged wind speed and wind direction. For times outside the Games the average SO2 emission was estimated to be 9.50±0.90 tons per hour and the average NO2 emission was estimated to be 3.50±1.89 tons per hour. During the phases of the Asian and Asian Para Games, the SO2 and NO2 emissions were reduced by 53.5 and 46 %, respectively, compared to the usual condition. We also investigated the influence of GEA on Guangzhou University Town, the main venue located northwest of the GEA, and found that SO2 concentrations here were about tripled by emissions from the GEA. [ABSTRACT FROM AUTHOR]
- Published
- 2013
- Full Text
- View/download PDF
4. The use of logit and probit models in strategic management research: Critical issues.
- Author
-
Hoetker, Glenn
- Subjects
MANAGEMENT science, MANAGEMENT, PROBITS, LEAST squares, MATHEMATICAL models of decision making, RESEARCH methodology, DISTRIBUTION (Probability theory), STATISTICAL correlation, PAIRED comparisons (Mathematics) - Abstract
The logit and probit models have become critical parts of the management researcher's analytical arsenal, growing rapidly from almost no use in the 1980s to appearing in 15% of all articles published in Strategic Management Journal in 2005. However, a review of three top strategy journals revealed numerous areas in their use and interpretation where current practice fell short of ideal. Failure to understand how these models differ from ordinary least squares can lead researchers to misunderstand their statistical results and draw incorrect conclusions regarding the theory they are testing. Based on a review of the methodological literature and recent empirical papers in three leading strategy journals, this paper identifies four critical issues in their use: interpreting coefficients, modeling interactions between variables, comparing coefficients between groups (e.g., foreign and domestic firms), and measures of model fit. For each issue, the paper provides a background, a review of current practice, and recommendations for best practice. A concluding section presents overall implications for the conduct of research with logit and probit models, which should assist both authors and readers of strategic management research. Copyright © 2007 John Wiley & Sons, Ltd. [ABSTRACT FROM AUTHOR]
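One of the issues this paper flags, interpreting coefficients, comes from the fact that a logit coefficient is not a marginal effect: in a logistic model the effect of x on P(y = 1) is β·p·(1−p), which varies with x, unlike an OLS slope. A minimal sketch with assumed coefficients (β₀ = −2, β₁ = 0.8, chosen for illustration only):

```python
import math

# Assumed logit coefficients, for illustration only.
b0, b1 = -2.0, 0.8

def p(x):
    """P(y = 1 | x) under the logistic model."""
    return 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))

def marginal_effect(x):
    """dP/dx = b1 * p * (1 - p): depends on x, unlike an OLS slope."""
    px = p(x)
    return b1 * px * (1.0 - px)

for x in (0.0, 2.5, 5.0):
    print(f"x={x:4.1f}  P={p(x):.3f}  dP/dx={marginal_effect(x):.3f}")
```

The marginal effect peaks where P = 0.5 (here at x = 2.5, where dP/dx = 0.2) and shrinks in the tails, which is exactly why reporting raw logit coefficients as if they were OLS slopes can mislead.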
- Published
- 2007
- Full Text
- View/download PDF
5. The Linear Skew-t Distribution and Its Properties.
- Author
-
Adcock, C. J.
- Subjects
DISTRIBUTION (Probability theory), NUMERICAL integration, PARAMETERS (Statistics), STATISTICAL correlation, STOCHASTIC convergence - Abstract
The aim of this expository paper is to present the properties of the linear skew-t distribution, which is a specific example of a symmetry-modulated distribution. The skewing function remains the distribution function of Student's t, but its argument is simpler than that used for the standard skew-t. The linear skew-t offers different insights, for example, different moments and tail behavior, and can be simpler to use for empirical work. It is shown that the distribution may be expressed as a hidden truncation model. The paper describes an extended version of the distribution that is analogous to the extended skew-t. For certain parameter values, the distribution is bimodal. The paper presents expressions for the moments of the distribution and shows that numerical integration methods are required. A multivariate version of the distribution is described. The bivariate version of the distribution may also be bimodal. The distribution is not closed under marginalization, and stochastic ordering is not satisfied. The properties of the distribution are illustrated with numerous examples of the density functions, tables of moments, and critical values. The results in this paper suggest that the linear skew-t may be useful for some applications, but that it should be used with care for methodological work. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
6. Nearest neighbour analysis as a new probe for fuzzy dark matter.
- Author
-
Kousha, Hamed Manouchehri, Ansarifard, Mohammad, and Abolhasani, Aliakbar
- Subjects
DARK matter, DISTRIBUTION (Probability theory), STATISTICAL correlation, LARGE scale structure (Astronomy) - Abstract
Fuzzy dark matter (FDM) is a promising candidate for dark matter (DM), characterized by its ultra-light mass, which gives rise to wave effects at astrophysical scales. These effects offer potential solutions to the small-scale issues encountered within the standard cold dark matter (CDM) paradigm. In this paper, we investigate the large-scale structure of the cosmic web using FDM simulations, comparing them to CDM-only simulations and a simulation incorporating baryonic effects. Our study employs the nearest neighbour (NN) analysis as a new statistical tool for examining the structure and statistics of the cosmic web in an FDM universe. This analysis could capture the information absent in the two-point correlation functions. In particular, we analyse data related to the spherical contact, nearest neighbour distances (NND), and the angle between the first and second nearest neighbours of haloes (NNA). Specifically, we utilize probability distribution functions, statistical moments, and fitting parameters, as well as G(x), F(x), and J(x) functions to analyse the above data. Remarkably, the results from the FDM simulations differ significantly from the others across these analyses, while no noticeable distinction is observed between the baryonic and CDM-only simulations. Moreover, the lower FDM mass leads to more significant deviations from the CDM simulations. These compelling results highlight the efficiency of the NN analysis, mainly through the use of the J(x) function and the s_3, l_3, and a_4 parameters, as a prominent new tool for investigating FDM on large scales and making observational predictions. [ABSTRACT FROM AUTHOR]
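The G(x), F(x), and J(x) statistics mentioned in the abstract can be estimated directly from a point set: G is the empirical CDF of nearest-neighbour distances between the points, F the CDF of distances from independent random test locations to the nearest point (spherical contact), and J(x) = (1−G(x))/(1−F(x)). A minimal 2D sketch on uniform random points (for complete spatial randomness J stays near 1; clustering or regularity shifts it), with all numbers illustrative:

```python
import math
import random

random.seed(1)

def nearest(pt, pts):
    """Distance from pt to its nearest neighbour in pts (excluding itself)."""
    return min(math.dist(pt, q) for q in pts if q is not pt)

# Uniform (Poisson-like) points in the unit square.
points = [(random.random(), random.random()) for _ in range(400)]

# G: nearest-neighbour distances between the points themselves.
nnd = sorted(nearest(p, points) for p in points)
# F: spherical contact distances from independent random test locations.
tests = [(random.random(), random.random()) for _ in range(400)]
scd = sorted(min(math.dist(t, p) for p in points) for t in tests)

def ecdf(sorted_vals, x):
    """Empirical CDF evaluated at x."""
    return sum(v <= x for v in sorted_vals) / len(sorted_vals)

x = 0.02
G, F = ecdf(nnd, x), ecdf(scd, x)
J = (1 - G) / (1 - F)
print(f"G({x})={G:.2f}  F({x})={F:.2f}  J({x})={J:.2f}")
```

For a clustered point pattern G rises faster than F (J < 1), while for a regular pattern the opposite holds (J > 1), which is what makes J a useful one-number diagnostic beyond two-point statistics.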
- Published
- 2024
- Full Text
- View/download PDF
7. Oxygen exchange and ice melt measured at the ice-water interface by eddy correlation.
- Author
-
Long, M. H., Koopmans, D., Berg, P., Rysgaard, S., Glud, R. N., and Søgaard, D. H.
- Subjects
SNOWMELT, ATMOSPHERIC turbulence, SEA ice, ATMOSPHERIC oxygen, WATER temperature, STATISTICAL correlation, DISTRIBUTION (Probability theory) - Abstract
This study uses the eddy correlation technique to examine fluxes across the ice-water interface. Temperature eddy correlation systems were used to determine rates of ice melting and freezing, and O2 eddy correlation systems were used to examine O2 exchange rates as driven by biological and physical processes. The research was conducted below 0.7 m thick sea ice in mid March 2010 in a southwest Greenland fjord and revealed low average rates of ice melt amounting to a maximum of 0.80±0.09 mm d⁻¹ (SE, n = 31). The corresponding calculated O2 flux associated with release of O2-depleted melt water was less than 13% of the average daily O2 respiration rate. Ice melt and insufficient vertical turbulent mixing due to low current velocities caused periodic stratification immediately below the ice. This prevented the determination of fluxes during certain time periods, amounting to 66% of total deployment time. The identification of these conditions was evaluated by examining the velocity and the linearity and stability of the cumulative flux. The examination of unstratified conditions through velocity and O2 spectra and their cospectra revealed characteristic fingerprints of well-developed turbulence. From the observed O2 fluxes, a photosynthesis/irradiance curve was established by least-squares fitting. This relation showed that light limitation of net photosynthesis began at 4.2 µmol photons m⁻² s⁻¹, and that the algal communities were well-adapted to low-light conditions as they were light saturated for 75% of the day during this early spring period. However, the sea ice associated microbial and algal community was net heterotrophic with a daily gross primary production of 0.69±0.02 mmol O2 m⁻² d⁻¹ (SE, n = 4) and a respiration rate of -2.13 mmol O2 m⁻² d⁻¹ (no SE, see text for details) leading to a net primary production of -1.45±0.02 mmol O2 m⁻² d⁻¹ (SE, n = 4). Modeling the observed fluxes allowed for the calculation of fluxes during time periods when no O2 fluxes were extracted.
This application of the eddy correlation technique produced high temporal resolution O2 fluxes and ice melt rates that were measured without disturbing the environmental conditions while integrating over a large area of approximately 50 m² which encompassed the highly variable activity and spatial distributions of sea ice algal communities. [ABSTRACT FROM AUTHOR]
- Published
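The core of the eddy correlation (eddy covariance) technique used in this study is that the turbulent vertical flux of a scalar equals the covariance of the fluctuating vertical velocity w′ and scalar concentration C′. A minimal sketch on synthetic data (the velocity and O2 numbers are illustrative, not from the study; C is built to be anti-correlated with w, mimicking O2 uptake at the interface):

```python
import random

random.seed(42)

n = 50_000
# Synthetic vertical velocity w (arbitrary units) and O2 concentration C.
w, C = [], []
for _ in range(n):
    wi = random.gauss(0.0, 1.0)
    w.append(wi)
    C.append(300.0 - 0.05 * wi + random.gauss(0.0, 0.5))  # uptake: C' anti-correlated with w'

mean_w = sum(w) / n
mean_C = sum(C) / n
# Flux = <w'C'>: covariance of the fluctuations about the means.
flux = sum((wi - mean_w) * (ci - mean_C) for wi, ci in zip(w, C)) / n
print(f"flux <w'C'> = {flux:.3f}")  # negative: net transport toward the interface
```

In a real deployment w and C come from a co-located acoustic velocimeter and O2 microsensor sampled at high frequency, and the averaging is done over carefully chosen burst windows, but the flux estimator is this same covariance.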
- 2011
- Full Text
- View/download PDF
8. Incremental Dynamic Analysis Considering Main Aftershock of Structures Based on the Correlation of Maximum and Residual Inter-Story Drift Ratios.
- Author
-
Qu, Jiting and Pan, Chuyun
- Subjects
EARTHQUAKE aftershocks, EARTHQUAKE intensity, DISTRIBUTION (Probability theory), EARTHQUAKE hazard analysis, STATISTICAL correlation - Abstract
Aftershocks often occur after strong earthquakes and aggravate structural damage. Commonly, incremental dynamic analysis (IDA) considering main aftershocks has used only a single index such as the maximum or the residual inter-story drift ratio. However, IDA results based on different single indices can be contradictory, for example indicating by one index that a structure has collapsed while another suggests it is still repairable, which is not realistic. Given these shortcomings, this paper proposes selecting two indices in the IDA method based on the correlation between the maximum and the residual inter-story drift ratio, considering the main aftershocks. The influence of the double-index model on structural vulnerability analysis was examined by modeling a commercial building in SAP2000 software and carrying out single-index and double-index IDA, respectively. The joint distribution probability of the two indices under fixed seismic intensity was also calculated. The difference between the single-index and double-index IDA results was compared considering both the main shock and the main aftershock. The results showed that the effect of aftershocks improves the correlation coefficient between the maximum and the residual inter-story drift ratio, and the building model has a higher probability of overrun after considering the correlation of the two indices. This paper provides a new method for IDA and vulnerability analysis using multiple indices. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
9. Analysis on the station-based and grid- based integration for dynamic-statistic combined predictions.
- Author
-
Yang, Zihan, Bai, Huimin, Tuo, Ya, Yang, Jie, Gong, Zhiqiang, Wu, Yinzhong, and Feng, Guolin
- Subjects
DISTRIBUTION (Probability theory), STATISTICAL correlation, FORECASTING, ECONOMIC development, ECONOMIC forecasting - Abstract
Summer precipitation prediction in China is important to society and economic development, but remains a challenging issue in current meteorological studies due to the uncertainty of the climate system. This paper developed a weighted integration method founded on dynamic-statistic prediction methods. The advantages and disadvantages of station-based and grid-based integration for summer precipitation prediction, along with the underlying reasons, are analyzed by calculating the anomaly correlation coefficients (ACC), prediction score (PS), and root-mean-square errors (RMSE). The main findings are: 1) the weighted integration method provides better skill in summer rainfall prediction in China than the single dynamic-statistic combined prediction method; 2) for the station-based integration, the 10-year ACC mean of summer precipitation prediction is 0.098–0.106, passing the 90% significance level, which is obviously higher than that of the grid-based integration. The station-based integration has a higher symbol consistency ratio (SCR) than the grid-based integration, and the probability distribution of anomaly percentiles of station-based integration is closer to the observation, which yields a PS score of 69.2–70.0, higher than that of the grid-based integration. The independent sample validation of 2020 and 2021 further confirmed that station-based integration had a higher ACC than grid-based integration. These results indicate that station-based integration may perform better in improving the accuracy of summer precipitation prediction in China, which deserves deeper consideration in scientific study and real prediction issues. [ABSTRACT FROM AUTHOR]
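The two verification scores at the center of this comparison, the anomaly correlation coefficient (ACC) and the root-mean-square error (RMSE), are standard and easy to compute: ACC is the Pearson correlation between forecast and observed anomalies, RMSE the root of the mean squared forecast error. A minimal sketch with made-up anomaly values (illustrative only, not the paper's data):

```python
import math

# Illustrative precipitation anomalies (forecast vs. observed), not real data.
fcst = [0.3, -0.1, 0.5, -0.4, 0.2, 0.0, -0.3, 0.6]
obs  = [0.2, -0.2, 0.4, -0.1, 0.3, 0.1, -0.4, 0.5]

n = len(fcst)
mf, mo = sum(fcst) / n, sum(obs) / n

# ACC: Pearson correlation between forecast and observed anomalies.
num = sum((f - mf) * (o - mo) for f, o in zip(fcst, obs))
den = math.sqrt(sum((f - mf) ** 2 for f in fcst) * sum((o - mo) ** 2 for o in obs))
acc = num / den

# RMSE: root-mean-square forecast error.
rmse = math.sqrt(sum((f - o) ** 2 for f, o in zip(fcst, obs)) / n)

print(f"ACC = {acc:.3f}, RMSE = {rmse:.3f}")
```

ACC rewards getting the spatial/temporal pattern of anomalies right regardless of amplitude, while RMSE penalizes amplitude errors, which is why verification studies like this one report both.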
- Published
- 2024
- Full Text
- View/download PDF
10. Momentum Distribution Functions and Pair Correlation Functions of Unpolarized Uniform Electron Gas in Warm Dense Matter Regime.
- Author
-
Larkin, Alexander, Filinov, Vladimir, and Levashov, Pavel
- Subjects
MOMENTUM distributions, STATISTICAL correlation, DISTRIBUTION (Probability theory), MONTE Carlo method, QUANTUM statistics, ELECTRON gas, MOMENTUM transfer - Abstract
In this paper we continue our research on the uniform electron gas in the warm dense matter regime, focusing on the momentum distribution functions and pair correlation functions. We use the single-momentum path integral Monte Carlo method, based on the Wigner formulation of quantum statistics, to calculate both momentum- and coordinate-dependent distributions and average values of quantum operators for many-fermion Coulomb systems. We discovered that the single-particle momentum distribution function deviates from the ideal Fermi distribution and forms so-called "quantum tails" at high momenta if the non-ideality is strong enough, in both degenerate and non-degenerate cases. This effect is always accompanied by the appearance of short-range order in the pair correlation functions and can be explained by tunneling through the effective potential wells surrounding the electrons. Furthermore, we calculated the average kinetic and potential energies over a wide range of states, expanding our previous results significantly. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
11. Statistical Characteristics of 3D MIMO Channel Model for Vehicle-to-Vehicle Communications.
- Author
-
Saleem, Asad, Xu, Yan, Khan, Rehan Ali, Rasheed, Iftikhar, Jaffri, Zain Ul Abidin, and Layek, Md Abu
- Subjects
PROBABILITY density function, COMMUNICATION models, DISTRIBUTION (Probability theory), STATISTICAL correlation, MIMO systems - Abstract
Spatial and temporal characteristics of the propagation channel have a significant influence on multiantenna method applicability for fifth-generation- (5G-) enabled Internet of Things (IoT). In this paper, the statistical characteristics of a novel three-dimensional (3D) geometric-based stochastic model for next-generation vehicle-to-vehicle (V2V) multiple-input multiple-output (MIMO) communications under the nonisotropic scattering environment are investigated. In both line-of-sight (LoS) and non-line-of-sight (NLoS) conditions, the proposed model investigates the spatial, frequency, and temporal domain statistical distribution of multipath received signals by using the time-variant transfer function for indoor environments. The probability density function (PDF) of separation distance between the transceiver antennas, angle-of-arrival (AoA), and angle-of-departure (AoD) in the azimuth and elevation planes is derived by using closed-form expressions. For the space, time, and frequency correlation function (STF-CF), a precise analytical expression is derived based on the MIMO antenna system. We further determine the effects of several model parameters on the V2V channel performance, such as tunnel width, antenna array spacing, Ricean K-factor, and moving velocity. The statistical characteristics of the MIMO channel model are validated by simulation results, confirming the flexibility and effectiveness of our proposed model in the tunnel scenario. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
12. Joint data and key distribution of simple, multiple, and multidimensional linear cryptanalysis test statistic and its impact to data complexity.
- Author
-
Blondeau, Céline and Nyberg, Kaisa
- Subjects
DISTRIBUTION (Probability theory), COMPUTATIONAL complexity, CRYPTOGRAPHY, CODING theory, STATISTICAL models, STATISTICAL correlation - Abstract
The power of a statistical attack is inversely proportional to the number of plaintexts needed to recover information on the encryption key. By analyzing the distribution of the random variables involved in the attack, cryptographers aim to provide a good estimate of the data complexity of the attack. In this paper, we analyze the hypotheses made in simple, multiple, and multidimensional linear attacks that use either non-zero or zero correlations, and provide more accurate estimates of the data complexity of these attacks. This is achieved by taking, for the first time, into consideration the key variance of the statistic for both the right and wrong keys. For the family of linear attacks considered in this paper, we differentiate between the attacks which are performed in the known-plaintext and those in the distinct-known-plaintext model. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
13. Information-Theoretic Caching: Sequential Coding for Computing.
- Author
-
Wang, Chien-Yi, Lim, Sung Hoon, and Gastpar, Michael
- Subjects
INFORMATION theory, CACHE memory, CHANNEL coding, DISTRIBUTION (Probability theory), STATISTICAL correlation - Abstract
Under the paradigm of caching, partial data are delivered before the actual requests of users are known. In this paper, this problem is modeled as a canonical distributed source coding problem with side information, where the side information represents the users’ requests. For the single-user case, a single-letter characterization of the optimal rate region is established, and for several important special cases, closed-form solutions are given, including the scenario of uniformly distributed user requests. In this case, it is shown that the optimal caching strategy is closely related to total correlation and Wyner’s common information. Using the insight gained from the single-user case, three two-user scenarios admitting single-letter characterization are considered, which draw connections to existing source coding problems in the literature: the Gray–Wyner system and distributed successive refinement. Finally, the model studied by Maddah-Ali and Niesen is rephrased to make a comparison with the considered information-theoretic model. Although the two caching models have a similar behavior for the single-user case, it is shown through a two-user example that the two caching models behave differently in general. [ABSTRACT FROM PUBLISHER]
- Published
- 2016
- Full Text
- View/download PDF
14. Crossing empirical trend analysis (CETA) at risk levels in hydro-meteorological time series.
- Author
-
Şen, Zekâi
- Subjects
DISTRIBUTION (Probability theory), STATISTICAL correlation, TREND analysis, SAMPLE size (Statistics), TIME series analysis - Abstract
Trend identification procedures are used to identify systematic monotonic trendlines in a given hydro-meteorological time series record, representing time-dependent variations as increases or decreases. Different methodologies have been proposed for such descriptions, but most of them require restrictive assumptions such as a normal (Gaussian) probability distribution function (PDF), serial independence, and long sample sizes. In particular, pre-whitening and over-whitening are recommended to meet the need for serial independence, but they cannot transform a serially dependent series into a completely independent one. In this paper, a new trend methodology is proposed based on the characteristics of crossings along any given straight line within the given time series, where the sought-after trend component is the one with the maximum number of crossings. This approach does not require any restrictive assumptions. Unlike previous trend algorithms, the proposed crossing empirical trend analysis (CETA) does not give a single trend, but a series of trends at different levels within the variation range of hydro-meteorological time series records. For the sake of brevity, only three levels are considered in this article, at the 10%, 50%, and 90% risk levels. The CETA approach is compared with the classical and frequently used Mann–Kendall (MK) trend determination procedure based on Sen's slope calculation. For cases with very small serial correlation coefficients and normal PDFs, CETA and the classical technique give almost the same trendline within a ±5% error band. The application of this methodology is presented for monthly and annual discharge records of the Danube River and annual precipitation records from seven geographical regions of Turkey. [ABSTRACT FROM AUTHOR]
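The crossing idea behind CETA can be sketched in a few lines: for candidate trendlines through the series, count how many times the series crosses each line, and keep the slope whose line is crossed most often (residuals about the correct trend behave like noise and cross the line frequently; a wrong slope leaves a drift in the residuals and suppresses crossings). This toy version, on synthetic data with an assumed true slope of 0.1, is a simplification of the paper's method, not a faithful reimplementation:

```python
import random

random.seed(7)

# Synthetic record: linear trend (slope 0.1) plus noise.
n = 100
series = [0.1 * t + random.gauss(0.0, 0.3) for t in range(n)]

def crossings(series, slope, intercept):
    """Number of sign changes of the residuals about the candidate line."""
    resid = [y - (slope * t + intercept) for t, y in enumerate(series)]
    return sum(1 for a, b in zip(resid, resid[1:]) if a * b < 0)

mean_y = sum(series) / len(series)
mid_t = (n - 1) / 2
# Candidate slopes -0.20 .. 0.20; each line is anchored through the series mean.
best = max(
    (s / 100 for s in range(-20, 21)),
    key=lambda s: crossings(series, s, mean_y - s * mid_t),
)
print(f"slope maximizing crossings: {best:.2f}  (true trend 0.10)")
```

The full method repeats this at several levels of the series (the 10%, 50%, and 90% risk levels in the abstract), giving a family of level-dependent trendlines rather than a single one.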
- Published
- 2022
- Full Text
- View/download PDF
15. Upper Bounds of Error Probabilities for Stationary Gaussian Channels With Feedback.
- Author
-
Ihara, Shunsuke
- Subjects
GAUSSIAN channels, GAUSSIAN distribution, DISTRIBUTION (Probability theory), MATHEMATICAL programming, STATISTICAL correlation - Abstract
In this paper, we discuss the coding schemes and error probabilities in information transmission over additive Gaussian noise channels with feedback, where the Gaussian noise processes are stationary but not necessarily white. In the case of the white Gaussian channel it is known that the minimum error probability, under the average power constraint, decreases faster than the exponential of any order. Recently Gallager and Nakiboğlu (2010) proposed a coding scheme for the white Gaussian channel and successfully showed the multiple-exponential decay of the error probability for all rates below capacity. This paper aims to prove that, without any special assumptions on the noise of the stationary Gaussian channel, the minimum error probability decreases multiple-exponentially fast. In general, no explicit formulas are known for the capacity of the stationary Gaussian channel. In this paper, we introduce a lower bound C^* on the capacity C. We then prove that the minimum error probability decreases multiple-exponentially fast for all rates below C^*, but not necessarily for all rates below C. In the course of the proof, the scheme proposed by Gallager and Nakiboğlu proves quite useful, even though the Gaussian channel is not white. [ABSTRACT FROM AUTHOR]
- Published
- 2012
- Full Text
- View/download PDF
16. Asymptotic Performance of Composite Lognormal-X Fading Channels.
- Author
-
Zhu, Bingcheng
- Subjects
ASYMPTOTES, TELECOMMUNICATION systems, LOGNORMAL distribution, DISTRIBUTION (Probability theory), STATISTICAL correlation, WIRELESS communications - Abstract
It is challenging to derive closed-form performance expressions for composite channels considering both the large-scale lognormal fading and small-scale fading, thus most existing analyses of the composite channels are based on numerical approaches such as the Gauss–Hermite quadrature. In this paper, we develop a simple framework to achieve closed-form asymptotic expressions for the outage probabilities and the error rates of the composite lognormal-X fading channels, where the "X" can be Rayleigh, Rician, Nakagami-m, or any other fading channel that has a finite diversity order. It is proved that the lognormal fading contributes an exponential factor to the asymptotic expressions, and the exponential factor is related to the diversity order of the X channel and the lognormal parameters. The new analytical tool is shown to be versatile in evaluating the performance of radio-frequency communications suffering both the large-scale and small-scale fading, including diversity reception systems, relaying systems, and distributed antenna systems. It is also shown to be useful in evaluating free-space optical communication systems with pointing error, and in deriving asymptotic signal-to-noise ratio gaps between different modulation formats over lognormal fading channels. The elegant asymptotic expressions reveal insights into the composite fading channels and can be a criterion for various system designs. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
17. A CAUTIONARY NOTE ON THE RELIABILITY OF ADVERTISING TEST-RETEST SCORES.
- Author
-
Adams, Arthur J.
- Subjects
STUDY & teaching of advertising, STATISTICAL correlation, STATISTICAL reliability, RELIABILITY (Personality trait), STATISTICS, MARKETING research, CONSUMER research, QUANTITATIVE research, DISTRIBUTION (Probability theory), MEASUREMENT errors, STATISTICAL bias, STATISTICAL sampling - Abstract
This paper examines a commonly used indicator of reliability in advertising studies (the test-retest correlation coefficient), and shows that reliability values obtained by this method are not as simple to interpret nor as closely related to the idea of stability or reproducibility as many casual readers in the area might presume them to be. Examples from the advertising literature are provided to demonstrate these points, and several cautions in using test-retest correlations as reliability estimates are offered. It is also suggested that other statistics may prove useful to the researcher as supplements to the limited amount of information contained in the estimated reliability number. [ABSTRACT FROM AUTHOR]
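One concrete reason test-retest correlations are not the same thing as stability, in the spirit of this paper's caution, is that Pearson r is insensitive to systematic shifts: every score can drop substantially between waves while r stays near 1. A toy illustration with made-up ad recall scores (not from the paper):

```python
import math

# Hypothetical ad recall scores for ten ads, test and retest.
test   = [55, 60, 48, 70, 62, 53, 66, 58, 49, 64]
# Retest: every score drops by roughly 12 points, plus tiny perturbations.
retest = [44, 47, 37, 57, 51, 40, 55, 45, 38, 52]

n = len(test)
mt, mr = sum(test) / n, sum(retest) / n
cov = sum((a - mt) * (b - mr) for a, b in zip(test, retest)) / n
sdt = math.sqrt(sum((a - mt) ** 2 for a in test) / n)
sdr = math.sqrt(sum((b - mr) ** 2 for b in retest) / n)
r = cov / (sdt * sdr)

mean_shift = mr - mt
print(f"test-retest r = {r:.3f}, but mean shift = {mean_shift:.1f} points")
```

Here r exceeds 0.98 even though every ad scored about 12 points lower on retest, which is why supplementary statistics (e.g. mean differences or agreement measures) are useful alongside the reliability coefficient.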
- Published
- 1984
- Full Text
- View/download PDF
18. Research on the reliability allocation calculation method of a wind turbine generator set based on a vine copula correlation model.
- Author
-
Wu, Yuanyuan and Sun, Wenlei
- Subjects
TURBINE generators, WIND turbines, STATISTICAL correlation, DISTRIBUTION (Probability theory) - Abstract
To consider the failure correlation among key subsystems, a wind turbine reliability allocation calculation method based on the vine copula correlation model is proposed, building on the reliability allocation method for series systems. The reliability of each subsystem is modeled with the series-system reliability modeling method, combined with the copula function to model the correlation between variables; the multidimensional variable problem is then converted into two-dimensional variable problems through the vine copula function. Under the influence of factors such as the complexity, importance, and failure hazard of each subsystem, the resulting reliability calculation model allocates the reliability of the key component systems of the wind turbine generator set. Finally, the wind turbine generator set is calculated and analyzed through three allocation methods, based respectively on the independence assumption, the multivariate copula function, and the vine copula function, together with other distribution methods. The results show that the allocated control system has the lowest reliability and the tower the highest; compared with the independent allocation method, the allocation result of this paper's method is lower; compared with the multidimensional copula function allocation method, the allocated failure rate is increased by more than 20%. This paper verifies that the proposed method is not only effective and reasonable but also more consistent with the actual situation. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
19. The Output of an M/D/1 Queue.
- Author
-
Pack, Charles D.
- Subjects
QUEUING theory, DISTRIBUTION (Probability theory), PRODUCTION scheduling, PRODUCTION (Economic theory), RANDOM variables, MULTIVARIATE analysis, STATISTICAL correlation, MATHEMATICAL statistics - Abstract
In this paper we investigate the output process of the M/D/1 queuing system. We derive expressions for the distributions and first two moments, in both steady-state and transient conditions, of the following random variables: (1) the time until the nth departure measured from a departure epoch T₀, (2) the time between the (n−1)st and nth departures after T₀, and (3) the number of departures in (T₀, T₀+t]. Further we study the autocorrelation functions of random variables (1) and (2) in the steady state. [ABSTRACT FROM AUTHOR]
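The departure process studied here is easy to simulate: Poisson arrivals at rate λ, deterministic service time D, a single server. This sketch (parameters λ = 0.5, D = 1.0 chosen for illustration, giving utilization ρ = 0.5) estimates the mean inter-departure time, which in steady state must equal the mean inter-arrival time 1/λ, and the lag-1 autocorrelation of inter-departure times, which is generally nonzero because the M/D/1 output is not a renewal process:

```python
import random

random.seed(3)

lam, D = 0.5, 1.0            # arrival rate and deterministic service time (rho = 0.5)
t, server_free = 0.0, 0.0
departures = []
for _ in range(200_000):
    t += random.expovariate(lam)     # next Poisson arrival
    start = max(t, server_free)      # wait if the server is still busy
    server_free = start + D
    departures.append(server_free)

# Discard a warm-up period, then look at inter-departure times.
gaps = [b - a for a, b in zip(departures[1000:], departures[1001:])]
mean_gap = sum(gaps) / len(gaps)     # should approach 1/lam = 2.0

m = mean_gap
num = sum((a - m) * (b - m) for a, b in zip(gaps, gaps[1:]))
den = sum((g - m) ** 2 for g in gaps)
print(f"mean inter-departure time = {mean_gap:.3f}")
print(f"lag-1 autocorrelation     = {num / den:.3f}")
```

The nonzero lag-1 autocorrelation is exactly the kind of output-process structure the paper's autocorrelation functions describe analytically.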
- Published
- 1975
- Full Text
- View/download PDF
20. SCORING RULES FOR CONTINUOUS PROBABILITY DISTRIBUTIONS.
- Author
-
Matheson, James E. and Winkler, Robert L.
- Subjects
DECISION making, DISTRIBUTION (Probability theory), SIMULATION methods & models, STATISTICAL correlation, MATHEMATICAL statistics, STATISTICAL hypothesis testing, PROBABILITY theory, SCORING rubrics, MATHEMATICAL models - Abstract
Personal, or subjective, probabilities are used as inputs to many inferential and decision-making models, and various procedures have been developed for the elicitation of such probabilities. Included among these elicitation procedures are scoring rules, which involve the computation of a score based on the assessor's stated probabilities and on the event that actually occurs. The development of scoring rules has, in general, been restricted to the elicitation of discrete probability distributions. In this paper, families of scoring rules for the elicitation of continuous probability distributions are developed and discussed. [ABSTRACT FROM AUTHOR]
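A widely used modern example of a proper scoring rule for continuous distributions (descended from the family this paper develops) is the continuous ranked probability score (CRPS), which for a forecast distribution represented by samples X and an outcome y can be estimated as E|X−y| − ½E|X−X′|. A minimal Monte Carlo sketch with an assumed standard-normal forecast (values illustrative):

```python
import random

random.seed(11)

def crps_sample(samples, y):
    """Sample-based CRPS estimate: E|X - y| - 0.5 * E|X - X'|."""
    n = len(samples)
    term1 = sum(abs(x - y) for x in samples) / n
    xs = sorted(samples)
    # E|X - X'| computed in O(n log n) via the sorted order:
    # sum over pairs i<j of (x_j - x_i), then normalized by n^2.
    acc, running = 0.0, 0.0
    for i, x in enumerate(xs):
        acc += i * x - running
        running += x
    term2 = 2.0 * acc / (n * n)
    return term1 - 0.5 * term2

forecast = [random.gauss(0.0, 1.0) for _ in range(20_000)]
print(f"CRPS, outcome at the mean: {crps_sample(forecast, 0.0):.3f}")
print(f"CRPS, outcome in the tail: {crps_sample(forecast, 3.0):.3f}")
```

Lower scores are better: an outcome near the center of the stated distribution scores about 0.23 here, while one three standard deviations out scores far worse, which is precisely the incentive structure a proper scoring rule is meant to create.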
- Published
- 1976
- Full Text
- View/download PDF
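One widely used member of the family of continuous scoring rules discussed in entry 20 is the continuous ranked probability score (CRPS). The sketch below evaluates the standard closed form of the CRPS for a Gaussian predictive distribution; it is a minimal illustration, and the checked values are ordinary textbook properties rather than results taken from the paper.

```python
import math

def crps_normal(mu, sigma, x):
    """CRPS of a N(mu, sigma^2) predictive distribution at outcome x
    (closed form; lower is better, 0 only for a point mass on x)."""
    z = (x - mu) / sigma
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2 * math.pi)   # standard normal density
    cdf = 0.5 * (1 + math.erf(z / math.sqrt(2)))            # standard normal CDF
    return sigma * (z * (2 * cdf - 1) + 2 * pdf - 1 / math.sqrt(math.pi))
```

Lower is better; the score rewards both calibration and sharpness, so a tighter forecast centered on the outcome scores below a vaguer one, and a badly shifted forecast scores worse than either.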
21. A QUEUING SYSTEM WITH SERVICE-TIME DISTRIBUTION OF MIXED CHI-SQUARED TYPE.
- Author
-
Wishart, David M. G.
- Subjects
MARKOV processes ,QUEUING theory ,STATISTICAL correlation ,DISTRIBUTION (Probability theory) ,PRODUCTION scheduling ,PROBABILITY theory ,STOCHASTIC processes - Abstract
In this paper Kendall's technique of the embedded Markov chain [5] is applied to a queuing system with general independent input and a wide class of service-time distributions. The matrix of transition probabilities is found to be formally identical with that discussed in our earlier study [9], which will be taken as read in the present paper. Using the results of reference [9] we can write down the equilibrium distribution of waiting-times for customers in the more general system in terms of the roots of a transcendental equation. An example is considered that arose in Bailey's study of hospital systems [1]. [ABSTRACT FROM AUTHOR]
- Published
- 1959
- Full Text
- View/download PDF
22. Correlations among some clay parameters -- the multivariate distribution.
- Author
-
Jianye Ching and Kok-Kwang Phoon
- Subjects
CLAY ,MULTIVARIATE analysis ,DISTRIBUTION (Probability theory) ,DATA analysis ,STATISTICAL correlation ,GEOTECHNICAL engineering ,MATERIAL plasticity ,SHEAR strength - Abstract
Copyright of Canadian Geotechnical Journal is the property of Canadian Science Publishing and its content may not be copied or emailed to multiple sites or posted to a listserv without the copyright holder's express written permission. However, users may print, download, or email articles for individual use. This abstract may be abridged. No warranty is given about the accuracy of the copy. Users should refer to the original published version of the material for the full abstract. (Copyright applies to all Abstracts.)
- Published
- 2014
- Full Text
- View/download PDF
23. Optimal Distribution-Free Tests and Further Evidence of Heteroscedasticity in the Market Model: A Comment.
- Author
-
LEHMANN, BRUCE and WARGA, ARTHUR
- Subjects
ECONOMETRIC models ,HETEROSCEDASTICITY ,RANDOM variables ,DEPENDENCE (Statistics) ,DISTRIBUTION (Probability theory) ,QUANTITATIVE research ,STATISTICAL sampling ,REGRESSION analysis ,RECURSIVE functions ,STATISTICAL correlation - Abstract
The article presents a commentary on the paper "Optimal Distribution-Free Tests and Further Evidence of Heteroscedasticity in the Market Model," by Carmelo Giacotto and Mukhtar Ali. The authors point out a flaw in the analysis' statistical reasoning. They explain that uncorrelated random variables that are not normally distributed are not necessarily independent. This observation renders the paper's claims involving the distribution-free nature of their tests incorrect. The authors contend that it is not clear whether any of the test statistics possess well-defined distributions in large samples when based on recursive residuals.
- Published
- 1985
- Full Text
- View/download PDF
24. Side-Information-Dependent Correlation Channel Estimation in Hash-Based Distributed Video Coding.
- Author
-
Deligiannis, Nikos, Barbarien, Joeri, Jacobs, Marc, Munteanu, Adrian, Skodras, Athanassios, and Schelkens, Peter
- Subjects
VIDEO coding ,STATISTICAL correlation ,HASHING ,DISTRIBUTION (Probability theory) ,NOISE measurement ,ALGORITHMS ,ESTIMATION theory ,HARMONIC functions - Abstract
In the context of low-cost video encoding, distributed video coding (DVC) has recently emerged as a potential candidate for uplink-oriented applications. This paper builds on a concept of correlation channel (CC) modeling, which expresses the correlation noise as being statistically dependent on the side information (SI). Compared with classical side-information-independent (SII) noise modeling adopted in current DVC solutions, it is theoretically proven that side-information-dependent (SID) modeling improves the Wyner–Ziv coding performance. Anchored in this finding, this paper proposes a novel algorithm for online estimation of the SID CC parameters based on already decoded information. The proposed algorithm enables bit-plane-by-bit-plane successive refinement of the channel estimation leading to progressively improved accuracy. Additionally, the proposed algorithm is included in a novel DVC architecture that employs a competitive hash-based motion estimation technique to generate high-quality SI at the decoder. Experimental results corroborate our theoretical gains and validate the accuracy of the channel estimation algorithm. The performance assessment of the proposed architecture shows remarkable and consistent coding gains over a germane group of state-of-the-art distributed and standard video codecs, even under strenuous conditions, i.e., large groups of pictures and highly irregular motion content. [ABSTRACT FROM AUTHOR]
- Published
- 2012
- Full Text
- View/download PDF
25. Efficient Generation of Random Bits From Finite State Markov Chains.
- Author
-
Zhou, Hongchao and Bruck, Jehoshua
- Subjects
MARKOV processes ,RANDOM numbers ,MATHEMATICAL sequences ,VECTOR analysis ,STATISTICAL correlation ,NEUMANN problem ,DISTRIBUTION (Probability theory) ,INFORMATION theory - Abstract
The problem of random number generation from an uncorrelated random source (of unknown probability distribution) dates back to von Neumann's 1951 work. Elias (1972) generalized von Neumann's scheme and showed how to achieve optimal efficiency in unbiased random bits generation. Hence, a natural question is what if the sources are correlated? Both Elias and Samuelson proposed methods for generating unbiased random bits in the case of correlated sources (of unknown probability distribution), specifically, they considered finite Markov chains. However, their proposed methods are not efficient or have implementation difficulties. Blum (1986) devised an algorithm for efficiently generating random bits from degree-2 finite Markov chains in expected linear time, however, his beautiful method is still far from optimality on information-efficiency. In this paper, we generalize Blum's algorithm to arbitrary degree finite Markov chains and combine it with Elias's method for efficient generation of unbiased bits. As a result, we provide the first known algorithm that generates unbiased random bits from an arbitrary finite Markov chain, operates in expected linear time and achieves the information-theoretic upper bound on efficiency. [ABSTRACT FROM AUTHOR]
- Published
- 2012
- Full Text
- View/download PDF
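The starting point of entry 25, von Neumann's 1951 procedure, fits in a few lines: pair up input bits, emit 0 for the pair 01 and 1 for 10, and discard 00 and 11. The sketch below is an illustration only (the bias 0.8 and sample size are arbitrary choices); it shows the output is roughly unbiased even from a heavily biased i.i.d. source, at the cost of discarding most input bits, which is exactly the inefficiency the Elias and Blum generalizations described in the abstract address.

```python
import random

def von_neumann(bits):
    """von Neumann (1951) extractor: map pairs 01 -> 0, 10 -> 1, drop 00/11.
    Output is unbiased for i.i.d. input bits of any fixed bias, because
    P(01) = P(10) regardless of the bias."""
    out = []
    for b0, b1 in zip(bits[::2], bits[1::2]):
        if b0 != b1:
            out.append(b0)
    return out

rng = random.Random(1)
biased = [1 if rng.random() < 0.8 else 0 for _ in range(100_000)]  # 80% ones
fair = von_neumann(biased)
rate = sum(fair) / len(fair)  # should be close to 0.5
```

With bias 0.8 only 2·0.8·0.2 = 32% of pairs survive, i.e. about 0.16 output bits per input bit, far below the source entropy; achieving the information-theoretic rate for correlated (Markov) sources is the subject of the paper.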
26. DURATION DISTRIBUTION OF THE CONJUNCTION OF TWO INDEPENDENT F PROCESSES.
- Author
-
Alodat, M. T., Al-Rawwash, M., and Jebrini, M. A.
- Subjects
DISTRIBUTION (Probability theory) ,GAUSSIAN distribution ,PROBABILITY theory ,STATISTICAL correlation ,MATHEMATICAL variables ,RANDOM variables - Abstract
In this paper we obtain an approximation for the duration distribution of the excursion set generated by the minimum of two independent F random processes above a high threshold u. Moreover, we obtain a closed-form approximation for the mean duration of the conjunction of these two F processes. As an illustration, we conduct a simulation study to compare the performance of the approximated distribution with that of the exact distribution. [ABSTRACT FROM AUTHOR]
- Published
- 2010
- Full Text
- View/download PDF
27. Reduced-Dimension Linear Transform Coding of Distributed Correlated Signals With Incomplete Observations.
- Author
-
Nurdin, Hendra I., Mazumdar, Ravi R., and Bagchi, Aruhabha
- Subjects
STATISTICAL correlation ,RANDOM variables ,DISTRIBUTION (Probability theory) ,GAUSSIAN processes ,MATHEMATICAL models ,HILBERT space - Abstract
We study the problem of optimal reduced-dimension linear transform coding and reconstruction of a signal based on distributed correlated observations of the signal. In the mean square estimation context this involves finding the optimal signal representation based on multiple incomplete or only partial observations that are correlated. In particular, this leads to the study of finding the optimal Karhunen-Loève basis based on the censored observations. The problem has been considered previously by Gastpar, Dragotti, and Vetterli in the context of jointly Gaussian random variables based on using conditional covariances. In this paper, we derive the estimation results in the more general setting of second-order random variables with arbitrary distributions, using entirely different techniques based on the idea of innovations. We explicitly solve the single transform coder case, give a characterization of optimality in the multiple distributed transform coders scenario and provide additional insights into the structure of the problem. [ABSTRACT FROM AUTHOR]
- Published
- 2009
- Full Text
- View/download PDF
28. On the Performance Analysis of Composite Multipath/Shadowing Channels Using the G-Distribution.
- Author
-
Laourine, Amine, Alouini, Mohamed-Slim, Affes, Sofiène, and Stéphenne, Alex
- Subjects
TELECOMMUNICATION systems ,COMMUNICATION methodology ,DIFFERENTIABLE dynamical systems ,DISTRIBUTION (Probability theory) ,SHADOWING theorem (Mathematics) ,GENERATING functions ,STATISTICAL correlation ,COMMUNICATION ,PDF (Computer file format) - Abstract
Composite multipath fading/shadowing environments are frequently encountered in different realistic scenarios. These channels are generally modeled as a mixture of Nakagami-m multipath fading and log-normal shadowing. The resulting composite probability density function (pdf) is not available in closed form, thereby making the performance evaluation of communication links in these channels cumbersome. In this paper, we propose to model composite channels by the G-distribution. This pdf arises when the log-normal shadowing is substituted by the Inverse-Gaussian one. This substitution will prove to be very accurate for several shadowing conditions. In this paper we conduct a performance evaluation of single-user communication systems operating in a composite channel. Our study starts by deriving an analytical expression for the outage probability. Then, we derive the moment generating function of the G-distribution, hence facilitating the calculation of average bit error probabilities. We also derive analytical expressions for the channel capacity under three adaptive transmission techniques, namely, i) optimal rate adaptation with constant power, ii) optimal power and rate adaptation, and iii) channel inversion with fixed rate. [ABSTRACT FROM AUTHOR]
- Published
- 2009
- Full Text
- View/download PDF
29. Bivariate Gompertz generator of distributions: statistical properties and estimation with application to model football data.
- Author
-
Eliwa, M. S., Alhussain, Z. A., Ahmed, E. A., Salah, M. M., Ahmed, H. H., and El-Morshedy, M.
- Subjects
DISTRIBUTION (Probability theory) ,PROBABILITY density function ,MAXIMUM likelihood statistics ,BIVARIATE analysis ,CONDITIONAL probability ,STATISTICAL correlation - Abstract
In this paper, the bivariate extension of the so-called Gompertz-G family was introduced and studied in detail. The Marshall and Olkin shock model was used to build the proposed bivariate family. The new family was constructed from three independent Gompertz-H families using a minimisation process. Some of its statistical properties such as joint probability density function, coefficient of median correlation, moments, product moment, covariance, conditional probability density function, joint reliability function, stress-strength reliability and joint reversed (hazard) rate function were derived. After introducing the general class, three special models of the new family were discussed. The maximum likelihood method was used to estimate the family parameters. A simulation study was carried out to examine the bias and mean square error of the maximum likelihood estimators. Finally, the importance of the proposed bivariate family was illustrated by means of a real dataset, and it was found that the proposed model provides a better fit than other well-known models in the statistical literature such as the bivariate Gompertz, bivariate generalized Gompertz, bivariate Gumbel Gompertz, bivariate Burr X Gompertz and bivariate exponentiated Weibull-Gompertz. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
30. The Correlation Distribution of Quaternary Sequences of Period 2(2^n − 1).
- Author
-
Johansen, Ama, Helleseth, Tor, and Xiaohu Tang
- Subjects
STATISTICAL correlation ,EXPONENTIAL sums ,MATHEMATICAL models ,GALOIS modules (Algebra) ,DISTRIBUTION (Probability theory) ,MULTIPLICITY (Mathematics) - Abstract
Family A is a family of sequences of period 2^n − 1 over Z_4, the ring of integers modulo 4. This family has optimal correlation properties and its correlation distribution is well known. Two related families of quaternary sequences are the families B and C. These are families of sequences over Z_4 of period 2(2^n − 1). In recent years, new families of quaternary sequences of period 2(2^n − 1) have been constructed by modifying the sequence families B and C in a nonlinear way. This has resulted in a new family D of sequences of period 2(2^n − 1) which has optimal correlation properties, but until now the correlation distribution of this family has not been known. In this paper, we completely determine the correlation distribution of family D by making use of properties of exponential sums. [ABSTRACT FROM AUTHOR]
- Published
- 2008
- Full Text
- View/download PDF
31. Asymptotic Distribution of the Number of Isolated Nodes in Wireless Ad Hoc Networks With Bernoulli Nodes.
- Author
-
Chih-Wei Yi, Peng-Jun Wan, Xiang-Yang Li, and Frieder, Ophir
- Subjects
RADIO transmitter-receivers ,RADIO transmitters & transmission ,PROBABILITY theory ,STATISTICAL correlation ,ANALYSIS of variance ,ASYMPTOTIC expansions ,DIFFERENCE equations ,BINOMIAL distribution ,DISTRIBUTION (Probability theory) - Abstract
Nodes in wireless ad hoc networks may become inactive or unavailable due to, for example, internal breakdown or being in the sleeping state. The inactive nodes cannot take part in routing/relaying, and thus may affect the connectivity. A wireless ad hoc network containing inactive nodes is then said to be connected, if each inactive node is adjacent to at least one active node and all active nodes form a connected network. This paper is the first installment of our probabilistic study of the connectivity of wireless ad hoc networks containing inactive nodes. We assume that the wireless ad hoc network consists of n nodes which are distributed independently and uniformly in a unit-area disk, and are active (or available) independently with probability p for some constant 0 < p ≤ 1. We show that if all nodes have a maximum transmission radius r_n = √((ln n + ϵ)/(πpn)) for some constant ϵ, then the total number of isolated nodes is asymptotically Poisson with mean e^(−ϵ), and the total number of isolated active nodes is also asymptotically Poisson with mean pe^(−ϵ). [ABSTRACT FROM AUTHOR]
- Published
- 2006
- Full Text
- View/download PDF
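The Poisson limit stated in entry 31 can be checked by simulation. The sketch below works under modified assumptions: nodes are placed on a unit torus rather than the paper's unit-area disk (to suppress boundary effects, which are substantial at small n), and the parameter values n = 300, p = 0.7, ϵ = 0.5 are arbitrary. With r_n = √((ln n + ϵ)/(πpn)), the mean number of isolated nodes should land near e^(−ϵ) ≈ 0.61, with finite-n values sitting slightly below.

```python
import math
import random

def isolated_count(n, p, eps, rng):
    """One sample of the number of isolated nodes: n nodes uniform on the
    unit torus, each active with probability p, radius
    r_n = sqrt((ln n + eps) / (pi * p * n)).  A node is isolated if no
    *active* node (other than itself) lies within r_n of it."""
    r2 = (math.log(n) + eps) / (math.pi * p * n)
    pts = [(rng.random(), rng.random(), rng.random() < p) for _ in range(n)]
    iso = 0
    for i, (xi, yi, _) in enumerate(pts):
        for j, (xj, yj, active_j) in enumerate(pts):
            if i == j or not active_j:
                continue
            # wrap-around (torus) distance in each coordinate
            dx = min(abs(xi - xj), 1 - abs(xi - xj))
            dy = min(abs(yi - yj), 1 - abs(yi - yj))
            if dx * dx + dy * dy <= r2:
                break           # found an active neighbor: not isolated
        else:
            iso += 1            # scanned everyone, no active neighbor
    return iso

rng = random.Random(7)
trials = 150
mean_iso = sum(isolated_count(300, 0.7, 0.5, rng) for _ in range(trials)) / trials
```

The test bound below is deliberately loose (a few standard errors around e^(−0.5)); a disk-based run at this n would overshoot it because boundary nodes lose roughly half their coverage area.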
32. Parameter Estimation for a Modified Weibull Distribution, for Progressively Type-II Censored Samples.
- Author
-
Ng, H. K. T.
- Subjects
WEIBULL distribution ,DISTRIBUTION (Probability theory) ,REGRESSION analysis ,MONTE Carlo method ,STATISTICAL sampling ,STATISTICAL correlation - Abstract
In this paper, the estimation of parameters based on a progressively Type-II censored sample from a modified Weibull distribution is studied. The likelihood equations, and the maximum likelihood estimators are derived. The estimators based on a least-squares fit of a multiple linear regression on a Weibull probability paper plot are compared with the MLE via Monte Carlo simulations. The observed Fisher information matrix, as well as the asymptotic variance-covariance matrix of the MLE are derived. Approximate confidence intervals for the parameters are constructed based on the s-normal approximation to the asymptotic distribution of MLE, and log-transformed MLE. The coverage probabilities of the individual s-normal-approximation confidence intervals for the parameters are examined numerically. Some recommendations are made from the results of a Monte Carlo simulation study, and a numerical example is presented to illustrate all of the methods of inference developed here. [ABSTRACT FROM AUTHOR]
- Published
- 2005
- Full Text
- View/download PDF
33. Data Mapping and the Prediction of Common Cause Failure Probability.
- Author
-
Liyang Xie, Jinyu Zhou, and Xuemin Wang
- Subjects
DISTRIBUTION (Probability theory) ,PROBABILITY theory ,STATISTICAL correlation ,MULTIPLICITY (Mathematics) ,MATHEMATICS ,ALGORITHMS - Abstract
General failure event data from various sources are often used to estimate the failure probability for the system of interest, especially when s-dependence exists among component failures, where common cause failure plays an important role. Failure event data from different sources must be reasonably explained, and correctly applied, so that the information about load environment, and component/system property can be used correctly. In estimating the probability for s-dependent system failure, both the load distribution, and component strength distribution are much more important than component failure probability index. Based on the relationship among different multiple failures, this paper presents a data mapping approach to estimating dependent system failure probability through multiple failure event data of other systems with different sizes. The underlying assumption of data mapping is that failures of different multiples (including single) are correlated with each other for a group of components if they are subjected to the same or correlated random load (loads). Taking the situation of a group of s-independent components operating under the same random load as an example, the likelihood of a component failure at a trial depends not only on the strength of the individual component but also on the realization of the random load. The likelihood of a specific multiple failure at a trial is also determined by both the component strengths, and the realization of the random load. Furthermore, if a larger load sample appears, the likelihoods of failure are higher. Conversely, if a smaller load sample appears, the likelihoods of failure are lower. We emphasized in this paper that system failure event data should be interpreted and applied under the principle that various multiple failures are distinguished by their respective failure multiplicity and/or system size, and are inherently interrelated through correlated load environments.
The approach starts with determining the load parameter, and component strength parameter according to multiple (including single) failure event data available. Then, these parameters are used to calculate the probability of multiple failures for systems of different sizes. This approach is applicable to predict high multiple failure probability based on low multiple failure event data. Examples of estimating multiple failure probabilities of EDG (emergency diesel generators) with mapped data illustrate that the proposed approach is desirable. [ABSTRACT FROM AUTHOR]
- Published
- 2005
- Full Text
- View/download PDF
34. Semisupervised Classification Through the Bag-of-Paths Group Betweenness.
- Author
-
Lebichot, Bertrand, Kivimaki, Ilkka, Francoisse, Kevin, and Saerens, Marco
- Subjects
BETWEENNESS relations (Mathematics) ,SET theory ,STATISTICAL correlation ,DISTRIBUTION (Probability theory) ,MAXWELL-Boltzmann distribution law - Abstract
This paper introduces a novel and well-founded betweenness measure, called the bag-of-paths (BoP) betweenness, as well as its extension, the BoP group betweenness, to tackle semisupervised classification problems on weighted directed graphs. The objective of semisupervised classification is to assign a label to unlabeled nodes using the whole topology of the graph and the labeled nodes at our disposal. The BoP betweenness relies on a BoP framework, assigning a Boltzmann distribution on the set of all possible paths through the network such that long (high-cost) paths have a low probability of being picked from the bag, while short (low-cost) paths have a high probability of being picked. Within that context, the BoP betweenness of node j is defined as the sum of the a posteriori probabilities that node j lies in between two arbitrary nodes (i, k) when picking a path starting in i and ending in k. Intuitively, a node typically receives a high betweenness if it has a large probability of appearing on paths connecting two arbitrary nodes of the network. This quantity can be computed in closed form by inverting an n × n matrix where n is the number of nodes. For the group betweenness, the paths are constrained to start and end in nodes within the same class, thereby defining a within-class group betweenness for each class. Unlabeled nodes are then classified according to the class showing the highest group betweenness. Experiments on various real-world datasets show that the BoP group betweenness performs competitively compared to all the tested state-of-the-art methods. The benefit of the BoP betweenness is particularly noticeable when only a few labeled nodes are available. [ABSTRACT FROM AUTHOR]
- Published
- 2014
- Full Text
- View/download PDF
35. Unsupervised logic mining with a binary clonal selection algorithm in multi-unit discrete Hopfield neural networks via weighted systematic 2 satisfiability.
- Author
-
Romli, Nurul Atiqah, Syaqina Zulkepli, Nur Fariha, Mohd Kasihmuddin, Mohd Shareduwan, Zamri, Nur Ezlin, Rusdi, Nur 'Afifah, Manoharam, Gaeithry, Mansor, Mohd. Asyraf, Mohd Jamaludin, Siti Zulaikha, and Malik, Amierah Abdul
- Subjects
HOPFIELD networks ,DISTRIBUTION (Probability theory) ,VALUES (Ethics) ,DATA mining ,STATISTICAL correlation - Abstract
Evaluating behavioral patterns through logic mining within a given dataset has become a primary focus in current research. Unfortunately, there are several weaknesses in the research regarding logic mining models, including uncertainty about the attributes selected in the model, random distribution of negative literals in a logical structure, non-optimal computation of the best logic, and the generation of overfitting solutions. Motivated by these limitations, a novel logic mining model incorporating a mechanism to control the negative literals in the systematic Satisfiability, namely Weighted Systematic 2 Satisfiability in Discrete Hopfield Neural Network, is proposed as a logical structure to represent the behavior of the dataset. For the proposed logic mining models, a ratio r was used to control the distribution of the negative literals in the logical structures, to prevent overfitting solutions and optimize synaptic weight values. A new computational approach to the best logic, considering both true and false classification values of the learning system, was applied in this work to preserve the significant behavior of the dataset. Additionally, unsupervised learning techniques such as Topological Data Analysis were proposed to ensure the reliability of the selected attributes in the model. Comparative experiments utilizing 20 real-life repository datasets were conducted to assess the efficiency of the logic mining models. Following the results, the proposed logic mining model dominated in all the metrics for the average rank. The average ranks for each metric were Accuracy (7.95), Sensitivity (7.55), Specificity (7.93), Negative Predictive Value (7.50), and Matthews Correlation Coefficient (7.85). Numerical results and in-depth analysis demonstrated that the proposed logic mining model consistently produced optimal induced logic that best represented the real-life dataset for all the performance metrics used in this study. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
36. DISCRETE APPROXIMATIONS OF PROBABILITY DISTRIBUTIONS.
- Author
-
Miller III, Allen C. and Rice, Thomas R.
- Subjects
APPROXIMATION theory ,DISTRIBUTION (Probability theory) ,PROBABILITY theory ,GAUSSIAN quadrature formulas ,GAUSSIAN distribution ,DECISION making ,MATHEMATICAL models of decision making ,MATHEMATICAL models ,STATISTICAL correlation ,DECISION theory ,MANAGEMENT science ,NUMERICAL analysis - Abstract
Practical limits on the size of most probabilistic models require that probability distributions be approximated by a few representative values and associated probabilities. This paper demonstrates that methods commonly used to determine discrete approximations of probability distributions systematically underestimate the moments of the original distribution. A new procedure based on Gaussian quadrature is developed in this paper. It can be used to decrease the error in the approximation to any desired level. [ABSTRACT FROM AUTHOR]
- Published
- 1983
- Full Text
- View/download PDF
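Both claims in entry 36 are easy to reproduce numerically: equal-probability bracket-median discretizations understate spread, while a moment-matched Gaussian-quadrature rule does not. The sketch below is an illustration for the standard normal; the two three-point rules are standard constructions, not taken from the paper.

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

def norm_ppf(q, lo=-10.0, hi=10.0):
    """Standard normal quantile by plain bisection (fine for illustration)."""
    for _ in range(80):
        mid = 0.5 * (lo + hi)
        if norm_cdf(mid) < q:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

def moment(points, weights, k):
    """k-th moment of a discrete approximation."""
    return sum(w * x ** k for x, w in zip(points, weights))

# Bracket-median rule: 3 equal-probability brackets, each replaced by its median.
bm_pts = [norm_ppf(q) for q in (1 / 6, 1 / 2, 5 / 6)]
bm_wts = [1 / 3] * 3

# 3-point Gauss-Hermite rule for N(0,1): matches moments up to order 5 exactly.
gh_pts = [-math.sqrt(3), 0.0, math.sqrt(3)]
gh_wts = [1 / 6, 2 / 3, 1 / 6]

bm_var = moment(bm_pts, bm_wts, 2)  # noticeably below the true variance 1
gh_var = moment(gh_pts, gh_wts, 2)  # exactly 1
gh_m4 = moment(gh_pts, gh_wts, 4)   # exactly 3, the true 4th moment
```

The bracket-median variance comes out near 0.62 against a true value of 1, which is the systematic underestimation the paper documents; the quadrature rule is exact here, and adding points drives the error in higher moments down to any desired level.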
37. A Voyage of Discovery.
- Author
-
Billard, Lynne
- Subjects
STATISTICS ,ASSOCIATIONS, institutions, etc. ,PERIODICALS ,CENSUS ,ARITHMETIC mean ,VARIATIONAL principles ,DISTRIBUTION (Probability theory) ,STATISTICAL correlation - Abstract
This article highlights the historical events that took place in the 50 years following the founding of the American Statistical Association in 1839 and the first publication of the Journal of the American Statistical Association (JASA) in 1888. For the first years, JASA contained almost exclusively nonmathematical papers. Many were mere repositories of extensive data sets, including many compilations from census counts with interpretations of what these data purportedly revealed. Others were from investigations undertaken by sociologists, economists, political scientists and historians. One area that attracted theoretical attention dealt with the concepts of averages, variation and distributions. The second area that received theoretical attention during these years was correlation and related concepts.
- Published
- 1997
- Full Text
- View/download PDF
38. Characterizing corridor-level travel time distributions based on stochastic flows and segment capacities.
- Author
-
Lei, Hao, Zhou, Xuesong, List, George F., and Taylor, Jeffrey
- Subjects
TRAVEL time (Traffic engineering) ,STOCHASTIC processes ,DISTRIBUTION (Probability theory) ,STATISTICAL correlation ,ESTIMATION theory - Abstract
Trip travel time reliability is an important measure of transportation system performance and a key factor affecting travelers' choices. This paper explores a method for estimating travel time distributions for corridors that contain multiple bottlenecks. A set of analytical equations are used to calculate the number of queued vehicles ahead of a probe vehicle and further capture many important factors affecting travel times: the prevailing congestion level, queue discharge rates at the bottlenecks, and flow rates associated with merges and diverges. Based on multiple random scenarios and a vector of arrival times, the lane-by-lane delay at each bottleneck along the corridor is recursively estimated to produce a route-level travel time distribution. The model incorporates stochastic variations of bottleneck capacity and demand and explains the travel time correlations between sequential links. Its data needs are the entering and exiting flow rates and a sense of the lane-by-lane distribution of traffic at each bottleneck. A detailed vehicle trajectory data-set from the Next Generation SIMulation (NGSIM) project has been used to verify that the estimated distributions are valid, and the sources of estimation error are examined. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
39. On the Time Required for Conception.
- Author
-
Sheps, Mindel C.
- Subjects
DATE of conception ,POPULATION statistics ,HUMAN fertility ,DISTRIBUTION (Probability theory) ,STATISTICAL correlation ,STATISTICS - Abstract
Assuming that the monthly chance of conceiving, fecundability, is constant for any one woman during the period of observation, but varies from one woman to another in an unspecified way, this paper investigates the expected incidence of first conceptions and the distribution of conception delays. The expected monthly incidence of first conceptions is a decreasing function of time. The mean, variance and higher moments of the distribution of fecundability (p) may be estimated from incomplete data referring to only the first few months of exposure. The distribution of conception delays is distinguished from the distribution of the number of trials needed for conception, and the moments of each are derived. When appropriate data are available to calculate the moments of the conception delay or the number of trials, the squared and cubed deviations of 1/p about the harmonic mean fecundability can be estimated. These statistics may be useful descriptions of the discrepancy between an observed distribution and theoretical values expected in a homogeneous population. The correlation between two successive conception delays is also discussed in the article.
- Published
- 1964
- Full Text
- View/download PDF
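The central mechanism in entry 39, that heterogeneous but individually constant fecundability makes the observed monthly incidence of first conceptions decline over time, is easy to demonstrate by simulation. The sketch below adds an assumption the paper deliberately avoids: fecundability is taken as Beta(3, 7)-distributed across women, in which case the month-t incidence among those still exposed is a/(a + b + t − 1), i.e. 0.30 in month 1 falling to 0.20 by month 6, even though no individual woman's chance changes.

```python
import random

def monthly_hazard(a, b, months, n_women, seed=3):
    """Each woman draws a constant fecundability p ~ Beta(a, b); each month she
    conceives with probability p.  Returns the observed per-month conception
    rate among women not yet pregnant (the 'incidence of first conceptions')."""
    rng = random.Random(seed)
    at_risk = [rng.betavariate(a, b) for _ in range(n_women)]
    hazards = []
    for _ in range(months):
        still_exposed = []
        conceived = 0
        for p in at_risk:
            if rng.random() < p:
                conceived += 1
            else:
                still_exposed.append(p)
        hazards.append(conceived / len(at_risk))
        at_risk = still_exposed   # high-p women conceive early and drop out
    return hazards

haz = monthly_hazard(3, 7, months=6, n_women=200_000)
```

The decline is pure selection: women with high p leave the at-risk pool first, so the survivors' average fecundability, and hence the observed incidence, falls month by month.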
40. Probabilistic constitutive law for masonry veneer wall ties.
- Author
-
Muhit, Imrose B., Stewart, Mark G., and Masia, Mark J.
- Subjects
- *
STATISTICAL correlation , *CUMULATIVE distribution function , *DISTRIBUTION (Probability theory) , *MAXIMUM likelihood statistics , *MASONRY , *GOODNESS-of-fit tests , *TENSION loads - Abstract
In a masonry veneer wall system, tie strengths and stiffnesses vary randomly and so are not consistent for all ties throughout the wall. To ensure an economical and safe design, this paper uses a tie calibration experimental approach in accordance with the standard AS2699.1 to investigate the tie failure load under compression and tension loading. Probabilistic wall tie characterisation is accomplished by estimating the mean, coefficient of variation and characteristic axial compressive and tensile strength from 50 specimens. The displacement across the cavity is recorded, which yields the complete load versus displacement response. Using the maximum likelihood method, a range of probability distributions is fitted to the tie strength at different displacement histogram data sets, and a best-fitted probability distribution is selected for each case. Inverse cumulative distribution function plots are also used along with the Anderson-Darling test to assess goodness-of-fit for the probabilistic models. An extensive statistical correlation analysis is also conducted to check the correlation between different tie strengths and associated displacements for both compression and tension loading. Based on the findings, a wall tie constitutive law is proposed to define probabilistic tie behaviour in numerical modelling. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
41. On Measure Transformed Canonical Correlation Analysis.
- Author
-
Todros, Koby and Hero, Alfred O.
- Subjects
STATISTICAL correlation ,DISTRIBUTION (Probability theory) ,PROBABILITY theory ,KERNEL functions ,GRAPHICAL modeling (Statistics) ,MULTIVARIATE analysis ,DATA analysis - Abstract
In this paper, linear canonical correlation analysis (LCCA) is generalized by applying a structured transform to the joint probability distribution of the considered pair of random vectors, i.e., a transformation of the joint probability measure defined on their joint observation space. This framework, called measure transformed canonical correlation analysis (MTCCA), applies LCCA to the data after transformation of the joint probability measure. We show that judicious choice of the transform leads to a modified canonical correlation analysis, which, in contrast to LCCA, is capable of detecting non-linear relationships between the considered pair of random vectors. Unlike kernel canonical correlation analysis, where the transformation is applied to the random vectors, in MTCCA the transformation is applied to their joint probability distribution. This results in performance advantages and reduced implementation complexity. The proposed approach is illustrated for graphical model selection in simulated data having non-linear dependencies, and for measuring long-term associations between companies traded in the NASDAQ and NYSE stock markets. [ABSTRACT FROM PUBLISHER]
- Published
- 2012
- Full Text
- View/download PDF
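The linear CCA (LCCA) step that MTCCA builds on can be sketched directly: the canonical correlations are the singular values of the whitened cross-covariance matrix. The toy data below are illustrative, with one shared latent factor so that one strong canonical direction exists; this shows plain LCCA only, not the paper's measure transformation.

```python
# Minimal LCCA sketch: canonical correlations via SVD of the whitened
# cross-covariance matrix.
import numpy as np

rng = np.random.default_rng(0)
n = 2000
z = rng.standard_normal(n)
# x and y share the latent factor z in their first coordinates.
x = np.column_stack([z + 0.1 * rng.standard_normal(n), rng.standard_normal(n)])
y = np.column_stack([2 * z + 0.1 * rng.standard_normal(n), rng.standard_normal(n)])

def canonical_correlations(x, y, eps=1e-10):
    xc = x - x.mean(axis=0)
    yc = y - y.mean(axis=0)
    cxx = xc.T @ xc / len(xc)
    cyy = yc.T @ yc / len(yc)
    cxy = xc.T @ yc / len(xc)
    # Whiten both blocks (Cholesky), then take singular values of the coupling.
    lx = np.linalg.cholesky(cxx + eps * np.eye(cxx.shape[0]))
    ly = np.linalg.cholesky(cyy + eps * np.eye(cyy.shape[0]))
    m = np.linalg.inv(lx) @ cxy @ np.linalg.inv(ly).T
    return np.linalg.svd(m, compute_uv=False)

rho = canonical_correlations(x, y)
print(rho)  # leading canonical correlation close to 1, second near 0
```

MTCCA would apply the same computation after transforming the joint probability measure; kernel CCA, by contrast, transforms the random vectors themselves.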
42. Multiclass Imbalance Problems: Analysis and Potential Solutions.
- Author
-
Wang, Shuo and Yao, Xin
- Subjects
DISTRIBUTION (Probability theory) ,GENERALIZATION ,PERFORMANCE evaluation ,COMPARATIVE studies ,STATISTICAL correlation ,GENETIC algorithms ,CYBERNETICS - Abstract
Class imbalance problems have drawn growing interest recently because of their classification difficulty caused by the imbalanced class distributions. In particular, many ensemble methods have been proposed to deal with such imbalance. However, most efforts so far are only focused on two-class imbalance problems. There are unsolved issues in multiclass imbalance problems, which exist in real-world applications. This paper studies the challenges posed by the multiclass imbalance problems and investigates the generalization ability of some ensemble solutions, including our recently proposed algorithm AdaBoost.NC, with the aim of handling multiclass and imbalance effectively and directly. We first study the impact of multiminority and multimajority on the performance of two basic resampling techniques. They both present strong negative effects. “Multimajority” tends to be more harmful to the generalization performance. Motivated by the results, we then apply AdaBoost.NC to several real-world multiclass imbalance tasks and compare it to other popular ensemble methods. AdaBoost.NC is shown to be better at recognizing minority class examples and balancing the performance among classes in terms of G-mean without using any class decomposition. [ABSTRACT FROM AUTHOR]
- Published
- 2012
- Full Text
- View/download PDF
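The G-mean metric the abstract uses to balance performance among classes is the geometric mean of per-class recalls; a minimal sketch, with toy labels, is:

```python
# G-mean: geometric mean of per-class recalls. A classifier that ignores a
# minority class gets recall 0 on it, driving the G-mean to 0.
import numpy as np

def g_mean(y_true, y_pred):
    classes = np.unique(y_true)
    recalls = [np.mean(y_pred[y_true == c] == c) for c in classes]
    return float(np.prod(recalls) ** (1.0 / len(recalls)))

y_true = np.array([0, 0, 0, 0, 1, 1, 2, 2])
y_pred = np.array([0, 0, 0, 1, 1, 1, 2, 0])
# Per-class recalls: 0.75, 1.0, 0.5 -> G-mean = (0.375)^(1/3)
print(g_mean(y_true, y_pred))
```

This is why G-mean, unlike plain accuracy, rewards methods such as AdaBoost.NC that recognize minority-class examples.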
43. Generalized Cross-Correlation Properties of Chu Sequences.
- Author
-
Kang, Jae Won, Whang, Younghoon, Ko, Byung Hoon, and Kim, Kwang Soon
- Subjects
GENERALIZATION ,STATISTICAL correlation ,MATHEMATICAL sequences ,DISTRIBUTION (Probability theory) ,NUMERICAL analysis ,ALGORITHMS ,ELECTRICAL engineering ,ARGON - Abstract
In this paper, detailed cross-correlation properties for Chu sequences are investigated. All possible values of the cross-correlation function of Chu sequences are derived for any given sequence length and lag, and the maximum magnitude distribution function ρ_N(x), defined as the number of all Chu sequence pairs of length N whose maximum cross-correlation magnitude is √N · x, is obtained. Also, good lower and upper bounds on the maximum number of available Chu sequences and a construction algorithm for the corresponding partial Chu sequence set are proposed when the maximum magnitude of the cross-correlation among the sequences is constrained. Numerical examples show that the proposed bounds are quite tight and the proposed construction algorithm is near-optimal up to fairly large values of N. [ABSTRACT FROM PUBLISHER]
- Published
- 2012
- Full Text
- View/download PDF
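The basic Chu (Zadoff-Chu) correlation facts underlying the abstract can be checked numerically: for prime length N, any two Chu sequences with distinct roots have cross-correlation of constant magnitude √N at every lag, and each sequence has an ideal periodic autocorrelation. The length and roots below are illustrative.

```python
# Chu sequence correlation sketch: ideal autocorrelation, constant-magnitude
# cross-correlation for distinct roots and prime length.
import numpy as np

def chu(N, r):
    n = np.arange(N)
    return np.exp(-1j * np.pi * r * n * (n + 1) / N)  # odd-N form

def periodic_xcorr(a, b):
    # All cyclic lags at once via the FFT correlation identity.
    return np.fft.ifft(np.fft.fft(a) * np.conj(np.fft.fft(b)))

N = 31  # prime, so every nonzero root difference is coprime to N
a, b = chu(N, 1), chu(N, 2)

auto = np.abs(periodic_xcorr(a, a))
cross = np.abs(periodic_xcorr(a, b))
print(auto[0], auto[1:].max(), cross.mean())  # N, ~0, ~sqrt(N)
```

The paper's ρ_N(x) statistics refine this picture by counting sequence pairs according to their maximum cross-correlation magnitude √N·x.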
44. MEASURES OF DEPENDENCE FOR ORNSTEIN--UHLENBECK PROCESSES WITH TEMPERED STABLE DISTRIBUTION.
- Author
-
WYŁOMAŃSKA, AGNIESZKA
- Subjects
ORNSTEIN-Uhlenbeck process ,DISTRIBUTION (Probability theory) ,GAUSSIAN processes ,STATISTICAL correlation ,CALIBRATION ,DATA analysis - Abstract
In this paper we investigate the dependence structure of the Ornstein-Uhlenbeck process with tempered stable distribution, a natural extension of the classical Ornstein-Uhlenbeck process with Gaussian and α-stable behavior. However, for the α-stable models the correlation is not defined; therefore, in order to compare the dependence structure of Ornstein-Uhlenbeck processes with tempered stable and α-stable distributions, we need other measures of dependence defined for infinitely divisible processes, such as the Lévy correlation cascade or the codifference. We show that for the analyzed tempered stable process the rate of decay of the Lévy correlation cascade is different than in the stable case, while the codifference of the α-stable Ornstein-Uhlenbeck process has the same asymptotic behavior as in the tempered stable case. As motivation for our study, we calibrate the Ornstein-Uhlenbeck process with tempered stable distribution to real financial data. [ABSTRACT FROM AUTHOR]
- Published
- 2011
- Full Text
- View/download PDF
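The classical Gaussian Ornstein-Uhlenbeck case, the baseline the paper extends, has autocorrelation decaying as exp(-θt) and stationary variance σ²/(2θ); a quick check via the exact discretisation (parameter values θ = σ = 1, dt = 0.1 are illustrative) is:

```python
# Gaussian OU sketch: simulate with the exact one-step transition and verify
# the exponential autocorrelation decay and stationary variance.
import numpy as np

rng = np.random.default_rng(1)
theta, sigma, dt, n = 1.0, 1.0, 0.1, 200_000

phi = np.exp(-theta * dt)                          # one-step decay factor
s = sigma * np.sqrt((1 - phi**2) / (2 * theta))    # exact one-step noise scale
x = np.empty(n)
x[0] = 0.0
eps = rng.standard_normal(n - 1)
for k in range(n - 1):
    x[k + 1] = phi * x[k] + s * eps[k]

x = x[1000:]  # discard burn-in
var = x.var()
lag = 10      # corresponds to t = 1.0
r = np.corrcoef(x[:-lag], x[lag:])[0, 1]
print(var, r)  # theory: var = sigma^2/(2*theta) = 0.5, r = exp(-1)
```

For α-stable marginals the second moment (hence this correlation) does not exist, which is exactly why the paper turns to the Lévy correlation cascade and the codifference.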
45. Generalized modified Gold sequences.
- Author
-
Zhengchun Zhou and Xiaohu Tang
- Subjects
MATHEMATICAL sequences ,CODE division multiple access ,STATISTICAL correlation ,QUADRATIC forms ,WIRELESS communications ,DISTRIBUTION (Probability theory) ,SIMULATION methods & models ,DIOPHANTINE analysis ,FINITE fields - Abstract
In this paper, a large family F^k(l) of binary sequences of period 2^n − 1 is constructed for odd n = 2m + 1, where k is any integer with gcd(n, k) = 1 and l is an integer with 1 ≤ l ≤ m. This generalizes the construction of modified Gold sequences by Rothaus. It is shown that F^k(l) has family size 2^{ln} + 2^{(l−1)n} + ⋯ + 2^n + 1 and maximum nontrivial correlation magnitude 1 + 2^{(n+2l−1)/2}. Based on the theory of quadratic forms over finite fields, all exact correlation values between sequences in F^k(l) are determined. Furthermore, the family F^k(2) is discussed in detail to compute its complete correlation distribution. [ABSTRACT FROM AUTHOR]
- Published
- 2011
- Full Text
- View/download PDF
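The classical Gold preferred-pair construction that this line of generalisations starts from can be reproduced in a few lines: for odd n, an m-sequence u and its decimation v = u^(3) form a preferred pair whose cross-correlation takes only the three values {−1, −1 ± 2^{(n+1)/2}}. The primitive polynomial below (x^5 + x^2 + 1, n = 5) is a standard illustrative choice, not taken from the paper.

```python
# Classical Gold pair sketch: m-sequence + decimation by 3, three-valued
# cross-correlation {-1, 7, -9} for n = 5.
import numpy as np

n = 5
N = 2**n - 1  # period 31

# m-sequence from x^5 + x^2 + 1, i.e. recurrence s[k] = s[k-3] XOR s[k-5].
s = [0, 0, 0, 0, 1]
for k in range(5, N):
    s.append(s[k - 3] ^ s[k - 5])
u = np.array(s)
v = u[(3 * np.arange(N)) % N]  # decimation by d = 2^1 + 1 = 3

def xcorr_values(a, b):
    A, B = 1 - 2 * a, 1 - 2 * b  # map bits {0,1} -> symbols {+1,-1}
    return {int(np.dot(A, np.roll(B, -t))) for t in range(N)}

vals = xcorr_values(u, v)
print(sorted(vals))  # {-9, -1, 7}: magnitudes bounded by 1 + 2^((n+1)/2) = 9
```

The generalized families in the abstract trade a larger family size for a larger (but still controlled) correlation magnitude.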
46. On the Correlation Distributions of the Optimal Quaternary Sequence Family U and the Optimal Binary Sequence Family V.
- Author
-
Li, Nian, Tang, Xiaohu, Zeng, Xiangyong, and Hu, Lei
- Subjects
DISTRIBUTION (Probability theory) ,GALOIS theory ,QUATERNARY forms ,STATISTICAL correlation ,EXPONENTIAL sums ,BINARY number system - Abstract
Recently, new optimal families S and U of quaternary sequences have been presented, and the optimal binary sequence family V, obtained from family S under the Gray map, has been investigated as well. The two sequence families U and V are optimal with respect to the well-known Sidelnikov bound and Welch bound, but their exact correlation distributions were not known until now. In this paper, their exact correlation distributions are completely determined in some cases by making use of exponential sums and the theory of Z4-valued quadratic forms. [ABSTRACT FROM AUTHOR]
- Published
- 2011
- Full Text
- View/download PDF
47. A New Data Processing Inequality and Its Applications in Distributed Source and Channel Coding.
- Author
-
Kang, Wei and Ulukus, Sennur
- Subjects
ELECTRONIC data processing ,MATHEMATICAL inequalities ,CODING theory ,DISTRIBUTION (Probability theory) ,RANDOM variables ,STATISTICAL correlation ,MARKOV processes - Abstract
In the distributed coding of correlated sources, the problem of characterizing the joint probability distribution of a pair of random variables satisfying an n-letter Markov chain arises. The exact solution of this problem is intractable. In this paper, we seek a single-letter necessary condition for this n-letter Markov chain. To this end, we propose a new data processing inequality on a new measure of correlation through a spectral method. Based on this new data processing inequality, we provide a single-letter necessary condition for the required joint probability distribution. We apply our results to two specific examples involving the distributed coding of correlated sources: multiple-access channel with correlated sources and multiterminal rate-distortion region, and propose new necessary conditions for these two problems. [ABSTRACT FROM AUTHOR]
- Published
- 2011
- Full Text
- View/download PDF
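A spectral measure of correlation for a finite joint distribution, in the spirit of (though not necessarily identical to) the one the abstract proposes, is the classical Hirschfeld-Gebelein-Rényi maximal correlation: the second singular value of the normalised joint-pmf matrix. A textbook sketch, with toy pmfs:

```python
# Maximal correlation via SVD of Q[i,j] = P(x_i, y_j) / sqrt(P(x_i) P(y_j)).
# The top singular value of Q is always 1; the second one is the maximal
# correlation, which satisfies a data processing inequality along Markov chains.
import numpy as np

def maximal_correlation(p_xy):
    px = p_xy.sum(axis=1)
    py = p_xy.sum(axis=0)
    q = p_xy / np.sqrt(np.outer(px, py))
    sv = np.linalg.svd(q, compute_uv=False)
    return sv[1]

# Independent variables -> maximal correlation 0.
p_indep = np.outer([0.3, 0.7], [0.4, 0.6])
# Deterministic relation (x == y) -> maximal correlation 1.
p_det = np.diag([0.5, 0.5])

mc_indep = maximal_correlation(p_indep)
mc_det = maximal_correlation(p_det)
print(mc_indep, mc_det)  # ~0 and ~1
```

The paper's single-letter necessary condition rests on a new inequality of this spectral type; the sketch above only illustrates the general mechanism.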
48. THE IMPACT OF A HAUSMAN PRETEST ON THE ASYMPTOTIC SIZE OF A HYPOTHESIS TEST.
- Author
-
Guggenberger, Patrik
- Subjects
ASYMPTOTIC theory in statistical hypothesis testing ,T-test (Statistics) ,DISTRIBUTION (Probability theory) ,LEAST squares ,ASYMPTOTIC distribution ,STATISTICAL correlation - Abstract
This paper investigates the asymptotic size properties of a two-stage test in the linear instrumental variables model when in the first stage a Hausman (1978) specification test is used as a pretest of exogeneity of a regressor. In the second stage, a simple hypothesis about a component of the structural parameter vector is tested, using a t-statistic that is based on either the ordinary least squares (OLS) or the two-stage least squares (2SLS) estimator, depending on the outcome of the Hausman pretest. The asymptotic size of the two-stage test is derived in a model where weak instruments are ruled out by imposing a positive lower bound on the strength of the instruments. The asymptotic size equals 1 for empirically relevant choices of the parameter space. The size distortion is caused by a discontinuity of the asymptotic distribution of the test statistic in the correlation parameter between the structural and reduced form error terms. The Hausman pretest does not have sufficient power against correlations that are local to zero, while the OLS-based t-statistic takes on large values for such nonzero correlations. Instead of using the two-stage procedure, the recommendation then is to use a t-statistic based on the 2SLS estimator or, if weak instruments are a concern, the conditional likelihood ratio test by Moreira (2003). [ABSTRACT FROM AUTHOR]
- Published
- 2010
- Full Text
- View/download PDF
49. A Study of the Extreme Effects of Fading Correlation and the Impact of Imperfect MMSE Channel Estimation on the Performance of SIMO Zero-Forcing Receivers and on the Capacity-Maximizing Strategy in SIMO Links.
- Author
-
Nobandegani, Khashayar Salehi and Azmi, Paeiz
- Subjects
STATISTICAL correlation ,RADIO transmitter fading ,ONLINE education ,DISTRIBUTION (Probability theory) ,RAYLEIGH model ,SIGNAL-to-noise ratio ,ERROR analysis in mathematics ,SYSTEM analysis ,PULSE amplitude modulation ,MONTE Carlo method - Abstract
In this paper, we shall consider best- and worst-case fading correlation scenarios, i.e., uncorrelated and fully correlated fading channels, and will study the performance of a single-input multiple-output (SIMO) link that uses a practical imperfect online training-based minimum mean square error (MMSE) channel estimator and a zero-forcing (ZF) receiver under these two conditions. Considering the extremes of correlation and focusing on frequency-nonselective Rayleigh block fading, we will find the probability distribution function (pdf) of the postprocessing signal-to-noise ratio (SNR) in each case and shall provide closed-form expressions for the symbol error probability (SEP) of these systems when using M-ary pulse amplitude modulation (M-ary PAM). Moreover, a lower bound will be provided for the information capacity of fully correlated SIMO links with training-based MMSE channel estimation, which will further be optimized to yield its maximum under different conditions. Monte Carlo simulations perfectly verify the analysis. [ABSTRACT FROM AUTHOR]
- Published
- 2010
- Full Text
- View/download PDF
50. SIMPLE, ROBUST, AND POWERFUL TESTS OF THE BREAKING TREND HYPOTHESIS.
- Author
-
Harvey, David I., Leybourne, Stephen J., and Robert Taylor, A. M.
- Subjects
GAUSSIAN distribution ,DISTRIBUTION (Probability theory) ,TREND analysis ,HYPOTHESIS ,SCIENTIFIC method ,STATISTICAL correlation ,MATHEMATICAL statistics ,PROBABILITY theory ,REGRESSION analysis - Abstract
In this paper we develop a simple procedure that delivers tests for the presence of a broken trend in a univariate time series that do not require knowledge of the form of serial correlation in the data and are robust as to whether the shocks are generated by an I(0) or an I(1) process. Two trend break models are considered: the first holds the level fixed while allowing the trend to break, while the second allows for a simultaneous break in level and trend. For the known break date case, our proposed tests are formed as a weighted average of the optimal tests appropriate for I(0) and I(1) shocks. The weighted statistics are shown to have standard normal limiting null distributions and to attain the Gaussian asymptotic local power envelope, in each case regardless of whether the shocks are I(0) or I(1). In the unknown break date case, we adopt the method of Andrews (1993) and take a weighted average of the statistics formed as the supremum over all possible break dates, subject to a trimming parameter, in both the I(0) and I(1) environments. Monte Carlo evidence suggests that our tests are in most cases more powerful, often substantially so, than the robust broken trend tests of Sayginsoy and Vogelsang (2004). An empirical application highlights the practical usefulness of our proposed tests. [ABSTRACT FROM AUTHOR]
- Published
- 2009
- Full Text
- View/download PDF