41 results for "maximum likelihood estimation"
Search Results
2. An Overview of Discrete Distributions in Modelling COVID-19 Data Sets.
- Author
-
Almetwally, Ehab M., Dey, Sanku, and Nadarajah, Saralees
- Abstract
The mathematical modeling of the coronavirus disease-19 (COVID-19) pandemic has been attempted by a large number of researchers from the very beginning of cases worldwide. The purpose of this research work is to find and classify the modelling of COVID-19 data by determining the optimal statistical modelling to evaluate the regular count of new COVID-19 fatalities, thus requiring discrete distributions. Some discrete models are checked and reviewed, such as Binomial, Poisson, Hypergeometric, discrete negative binomial, beta-binomial, Skellam, beta negative binomial, Burr, discrete Lindley, discrete alpha power inverse Lomax, discrete generalized exponential, discrete Marshall-Olkin Generalized exponential, discrete Gompertz-G-exponential, discrete Weibull, discrete inverse Weibull, exponentiated discrete Weibull, discrete Rayleigh, and new discrete Lindley. The probability mass function and the hazard rate function are addressed. Discrete models are discussed based on the maximum likelihood estimates for the parameters. A numerical analysis uses the regular count of new casualties in the countries of Angola, Ethiopia, French Guiana, El Salvador, Estonia, and Greece. The empirical findings are interpreted in-depth. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
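As a minimal sketch of the MLE-based discrete model fitting this abstract describes, the following compares a Poisson and a negative binomial fit to hypothetical daily fatality counts. The data below are illustrative only, not the country series analysed in the paper:

```python
import numpy as np
from scipy import optimize, stats

# Hypothetical daily new-fatality counts (illustrative only, not the
# country data analysed in the paper).
counts = np.array([0, 1, 2, 1, 3, 0, 2, 4, 1, 2, 5, 3, 2, 1, 0, 2])

# Poisson: the MLE of the rate is the sample mean (closed form).
lam_hat = counts.mean()
ll_pois = stats.poisson.logpmf(counts, lam_hat).sum()

# Negative binomial: maximise the log-likelihood numerically over (r, p).
def nb_negll(params):
    r, p = params
    return -stats.nbinom.logpmf(counts, r, p).sum()

res = optimize.minimize(nb_negll, x0=[1.0, 0.5],
                        bounds=[(1e-6, None), (1e-6, 1 - 1e-6)])
ll_nb = -res.fun

# Compare the fits by AIC (lower is better): Poisson has 1 parameter,
# the negative binomial has 2.
aic_pois = 2 * 1 - 2 * ll_pois
aic_nb = 2 * 2 - 2 * ll_nb
print(aic_pois, aic_nb)
```

The same maximized log-likelihoods can feed any of the comparison criteria the paper uses to rank candidate discrete distributions.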
3. Analysis of Copula Frailty defective models in presence of Cure Fraction.
- Author
-
Ola Abuelamayem
- Subjects
Cure fraction model, Copula, Defective distribution, Frailty model, Censoring, Maximum likelihood estimation, Biology (General), QH301-705.5, Probabilities. Mathematical statistics, QA273-280 - Abstract
Introduction: Long-term survivors, such as diabetic patients, cannot be analyzed with the usual survival models. One approach is to use a defective distribution, which does not force a pre-assumption of a cure fraction on the model. To study more than one random variable interacting together, multivariate distributions may be used. However, most multivariate distributions have complicated forms, which makes computation difficult. Moreover, it may be hard to find a multivariate distribution that fits the data properly, especially in the health care field. To overcome this problem, one can use a copula approach. In the literature, to the best of our knowledge, only one paper has handled copula defective models, and it did not consider the effect of covariates. In this paper, we take into consideration not only observed covariates but also unobserved ones by including a frailty term. Methods: Two new models are introduced. The first uses a Gumbel copula to account for dependence together with the observed covariates. The second accounts not only for dependence but also for unobserved covariates by integrating a frailty term into the model. Results: A diabetic retinopathy data set is analyzed. Both models indicated the existence of long-term survivors through negative parameters, without pre-assuming their existence. Including the frailty term helped capture more dependence between the variables. We compared the results using goodness-of-fit methods, which suggested that the model with the frailty term is preferable. Conclusion: The two introduced models correctly detected the existence of a cure fraction with fewer estimated parameters than mixture cure fraction models, and they have the advantage of not pre-assuming the existence of a cure fraction. Comparing the two, the model with the frailty term fitted the data better.
- Published
- 2023
- Full Text
- View/download PDF
4. Modified signed log-likelihood test for the coefficient of variation of an inverse Gaussian population
- Author
-
Mohammad Reza Kazemi
- Subjects
coefficient of variation, inverse Gaussian population, modified signed log-likelihood method, maximum likelihood estimation, Mathematics, QA1-939 - Abstract
In this paper, we consider the problem of two-sided hypothesis testing for the coefficient of variation of an inverse Gaussian population. The approach used here is the modified signed log-likelihood ratio (MSLR) method, a modification of the traditional signed log-likelihood ratio test. Previous work shows that the MSLR method has third-order accuracy, whereas the traditional approach has only first-order accuracy. Indeed, this method is likelihood-based with a higher order of accuracy, which is why we are interested in using it for inference about the coefficient of variation of an inverse Gaussian distribution. All formulas necessary for obtaining the MSLR statistic are provided. The performance of this method is compared numerically with classical approaches in terms of empirical type-I error rate and empirical power. Simulation results show that the empirical type-I error rates of MSLR are close to the nominal type-I error rate even for small sample sizes, whereas the traditional approaches are reliable only for large sample sizes. Comparing empirical power shows that the MSLR method is superior to the other considered methods in some settings, bearing in mind that the competing approaches cannot control the type-I error probability well because their empirical type-I error rates are far from the nominal rate. Finally, we illustrate the proposed methods using a real data set and conclude the paper.
- Published
- 2022
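The traditional signed log-likelihood ratio statistic that the MSLR method modifies can be sketched as follows for the inverse Gaussian coefficient of variation, CV = sqrt(mu/lambda). The data, sample size, and parameter values here are assumed for illustration, and the third-order MSLR adjustment itself is not shown:

```python
import numpy as np
from scipy import optimize, stats

rng = np.random.default_rng(0)

# Simulate an inverse Gaussian sample IG(mu, lam); CV = sqrt(mu / lam).
mu_true, lam_true = 2.0, 8.0          # true CV = 0.5 (assumed values)
x = stats.invgauss.rvs(mu_true / lam_true, scale=lam_true,
                       size=50, random_state=rng)

def loglik(mu, lam, x):
    # Inverse Gaussian log-likelihood in the (mean, shape) parameterization.
    n = len(x)
    return (n / 2 * np.log(lam) - n / 2 * np.log(2 * np.pi)
            - 1.5 * np.log(x).sum()
            - lam * ((x - mu) ** 2 / (2 * mu ** 2 * x)).sum())

# Unconstrained MLEs (closed form).
mu_hat = x.mean()
lam_hat = 1 / np.mean(1 / x - 1 / mu_hat)
cv_hat = np.sqrt(mu_hat / lam_hat)

# Constrained fit under H0: CV = c0, i.e. lam = mu / c0**2;
# profile the log-likelihood over mu.
c0 = 0.5
neg_profile = lambda mu: -loglik(mu, mu / c0 ** 2, x)
mu0 = optimize.minimize_scalar(neg_profile, bounds=(1e-6, 10 * mu_hat),
                               method="bounded").x

# Traditional signed log-likelihood ratio statistic, approximately
# N(0, 1) under H0 to first order.
r = np.sign(cv_hat - c0) * np.sqrt(
    2 * (loglik(mu_hat, lam_hat, x) - loglik(mu0, mu0 / c0 ** 2, x)))
print(r)
```

Under H0 the statistic r is compared with a standard normal; the paper's modification adjusts r so that this normal approximation holds to third order.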
5. Assessment of noise in time series analysis for Buoy tide observations
- Author
-
Saeed Farzaneh, Mohammad Ali Sharifi, Kamal Parvazi, and Bahare Namazi
- Subjects
buoy station time series, least-squares estimation, least-squares harmonic estimation, tide observation noise analysis, maximum likelihood estimation, Naval architecture. Shipbuilding. Marine engineering, VM1-989 - Abstract
To extract valid results from time-series analysis of tide observations, noise reduction is vital. This study used a precise statistical model to investigate the noise types. The amplitudes of the noise components of the proposed model were studied by least-squares variance component estimation (LS-VCE) under different stochastic models: (1) white noise plus autoregressive noise, (2) white noise plus flicker noise, (3) white noise plus random-walk noise, (4) white noise plus flicker noise plus random walk, and (5) autoregressive noise plus flicker noise. Based on the values of the likelihood function, it was concluded that the appropriate noise model for the buoy time-series observations comprises white noise and flicker noise. In addition, tide forecasting for all stations was carried out by extracting the important frequencies in two cases: (1) the observation weight matrix taken as the identity matrix, i.e., a pure white-noise model, and (2) the observation weight matrix built from a combination of white and flicker noise. The results show that using the precise observation weight matrix produced an 11 mm difference compared with the unit-weight case.
- Published
- 2020
6. Analysis of nonlinear state space model with dependent measurement noises.
- Author
-
Hajrajabi, A.
- Subjects
- *
MEASUREMENT errors , *NOISE measurement , *NONLINEAR analysis , *EXPECTATION-maximization algorithms , *ESTIMATES , *AUTOREGRESSIVE models , *MAXIMUM likelihood statistics - Abstract
This paper presents a nonlinear state space model in which the measurement noises follow a first-order autoregressive model. A recursive method using Taylor-series approximations is designed for the filtering, prediction, and smoothing of the hidden states from the noisy observations. An expectation-maximization algorithm for computing the maximum likelihood estimates of the parameters is also presented. Closed-form solutions are obtained for estimating the hidden states and the unknown parameters. Finally, the performance of the proposed methods is verified in a simulation study. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
7. A New Bivariate Distribution Obtained by Compounding the Bivariate Normal and Geometric Distributions
- Author
-
Eisa Mahmoudi and Hamed Mahmoodian
- Subjects
Normal distribution, Geometric distribution, EM algorithm, Maximum likelihood estimation, Probabilities. Mathematical statistics, QA273-280 - Abstract
Recently, Mahmoudi and Mahmoodian [7] introduced a new class of distributions that contains the univariate normal-geometric distribution as a special case. This class of distributions is very flexible and can be used quite effectively to analyze skewed data. In this paper we propose a new bivariate distribution with normal-geometric marginals. Different properties of this new bivariate distribution are studied. The distribution has five unknown parameters, and the EM algorithm is used to determine their maximum likelihood estimates. One real data set is analyzed for illustrative purposes.
- Published
- 2017
- Full Text
- View/download PDF
8. Transmuted Generalized Gompertz distribution with application
- Author
-
Muhammad Shuaib Khan, Robert King, and Irene Lena Hudson
- Subjects
Reliability functions, moment estimation, maximum likelihood estimation, Probabilities. Mathematical statistics, QA273-280 - Abstract
This paper introduces the four-parameter transmuted generalized Gompertz distribution, which includes the transmuted Gompertz, transmuted generalized exponential, transmuted exponential, Gompertz, generalized exponential, and exponential distributions as special cases, and studies its statistical properties. Explicit expressions are derived for the quantiles, moments, moment generating function, and entropies. Maximum likelihood estimation is used to estimate the model parameters. Finally, two applications of the new distribution are illustrated using reliability data sets.
- Published
- 2017
- Full Text
- View/download PDF
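The quadratic rank transmutation map underlying this and the other transmuted families in these results can be sketched as follows. A plain exponential base is used here for brevity, which is an assumption for illustration; the paper's base is the generalized Gompertz:

```python
import numpy as np

def transmuted_cdf(F, lam):
    """Quadratic rank transmutation map (Shaw et al.):
    F_T(x) = (1 + lam) * F(x) - lam * F(x)**2, with |lam| <= 1."""
    def FT(x):
        u = F(x)
        return (1 + lam) * u - lam * u ** 2
    return FT

# Illustrative exponential(theta) base distribution.
theta = 0.5
F_exp = lambda x: 1 - np.exp(-theta * np.asarray(x, dtype=float))

FT = transmuted_cdf(F_exp, lam=0.3)
x = np.linspace(0, 20, 200)
vals = FT(x)
# A valid CDF: starts at 0 and increases towards 1.
print(vals[0], vals[-1])
```

The transmuted density follows by differentiation, f_T(x) = f(x) * ((1 + lam) - 2 * lam * F(x)), which is what the maximum likelihood estimation in these papers is built on.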
9. An Investigation of Bayes Estimation Procedures for the Two-Parameter Logistic Model
- Author
-
Kim, Seock-Ho, Yanai, H., editor, Okada, A., editor, Shigemasu, K., editor, Kano, Y., editor, and Meulman, J. J., editor
- Published
- 2003
- Full Text
- View/download PDF
10. Generalized inverse Lindley distribution with application to Danish fire insurance data.
- Author
-
Asgharzadeh, A., Nadarajah, S., and Sharafi, F.
- Subjects
- *
DISTRIBUTION (Probability theory) , *INVERSE functions , *FIRE insurance , *DATA analysis , *MAXIMUM likelihood statistics - Abstract
The Danish fire insurance data have recently been modeled by composite distributions, i.e., distributions made up by piecing together two or more distributions. Here, we introduce a new non-composite distribution that performs well with respect to the Danish fire insurance data. It fits better than almost all of the commonly known heavy-tailed distributions and some of the composite distributions. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
11. Non parametric confidence intervals for the extinction probability of a Galton–Watson branching process.
- Author
-
From, Steven G.
- Subjects
- *
CONFIDENCE intervals , *PROBABILITY theory , *BRANCHING processes , *MAXIMUM likelihood statistics , *ANALYSIS of variance - Abstract
It is demonstrated that the confidence intervals (CIs) for the probability of eventual extinction and other parameters of a Galton–Watson branching process based upon the maximum likelihood estimators can often have substantially lower coverage when compared to the desired nominal confidence coefficient, especially in small, more realistic sample sizes. The same conclusion holds for the traditional bootstrap CIs. We propose several adjustments to these CIs, which greatly improves coverage in most cases. We also make a correction in an asymptotic variance formula given in Stigler (1971). The focus here is on implementation of the CIs which have good coverage, in a wide variety of cases. We also consider expected CI lengths. Some recommendations are made. [ABSTRACT FROM PUBLISHER]
- Published
- 2017
- Full Text
- View/download PDF
12. Maximum Likelihood Estimation of the Parameters of Fractional Brownian Traffic with Geometrical Sampling
- Author
-
Vidács, Attila, Virtamo, Jorma T., Tsang, Danny H. K., editor, and Kühn, Paul J., editor
- Published
- 2000
- Full Text
- View/download PDF
13. A Note on Kumaraswamy Exponentiated Rayleigh distribution
- Author
-
Nasr Ibrahim Rashwan
- Subjects
Kumaraswamy exponentiated Rayleigh distribution, quantile function, moment generating function, order statistics, maximum likelihood estimation, Probabilities. Mathematical statistics, QA273-280 - Abstract
In this paper, a new four-parameter continuous distribution, called the Kumaraswamy exponentiated Rayleigh (KW-ER) distribution, is proposed and studied. Some mathematical properties are presented and discussed, including expansions for the cumulative and density functions, skewness and kurtosis based on the quantile function, and explicit expressions for the moments and generating function. The density function of the order statistics is derived and their moments are obtained. An explicit expression for the Rényi entropy is given. The method of maximum likelihood is used to estimate the distribution parameters, and the observed information matrix is derived. A data set is used to illustrate the application of the new distribution.
- Published
- 2016
- Full Text
- View/download PDF
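The Kumaraswamy-G construction behind this distribution, and the inversion used for its quantile function, can be sketched as follows. A plain Rayleigh base is used here for brevity, which is an assumption for illustration; the paper's base is the exponentiated Rayleigh:

```python
import numpy as np

# Kumaraswamy-G construction: F(x) = 1 - (1 - G(x)**a)**b.
# Base here is a plain Rayleigh CDF (an assumption for illustration;
# the paper's base is the exponentiated Rayleigh).
sigma, a, b = 1.0, 2.0, 3.0
G = lambda x: 1 - np.exp(-x ** 2 / (2 * sigma ** 2))
F = lambda x: 1 - (1 - G(x) ** a) ** b

def quantile(u):
    # Invert the composition: first solve for G(x) = g, then invert G.
    g = (1 - (1 - u) ** (1 / b)) ** (1 / a)
    return sigma * np.sqrt(-2 * np.log(1 - g))

u = 0.7
x = quantile(u)
print(F(x))   # recovers u
```

The same closed-form quantile function is what makes the quantile-based skewness and kurtosis measures in the abstract straightforward to compute.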
14. Three parameter Transmuted Rayleigh distribution with application to Reliability data
- Author
-
Muhammad Shuaib Khan, Robert King, and Irene Lena Hudson
- Subjects
2P-Rayleigh distribution, weighted Rayleigh distribution, moment estimation, order statistics, maximum likelihood estimation, Probabilities. Mathematical statistics, QA273-280 - Abstract
This research introduces the three-parameter transmuted Rayleigh distribution with an application to fatigue fracture data. Using the quadratic rank transmutation map proposed by Shaw et al. [25], we develop the three-parameter transmuted Rayleigh distribution. This research also introduces a new class of weighted Rayleigh distributions using Azzalini's [2] method. Some structural properties of the new distribution are derived, such as moments, incomplete moments, probability-weighted moments, the moment generating function, entropies, mean deviation, and the kth moment of the order statistics. The parameters of the proposed model are estimated by maximum likelihood, and the observed information matrix is obtained. The potential of the proposed model is illustrated using fatigue fracture data.
- Published
- 2016
- Full Text
- View/download PDF
15. Parameter Estimation for an Electric Arc Furnace Model Using Maximum Likelihood
- Author
-
Jesser J. Marulanda-Durango, Christian D. Sepúlveda-Londoño, and Mauricio A. Alvarez-López
- Subjects
Arc furnace, harmonics, dynamic models, maximum likelihood estimation, Technology, Engineering (General). Civil engineering (General), TA1-2040 - Abstract
In this paper, we present a methodology for estimating the parameters of an electric arc furnace model using maximum likelihood estimation, one of the most widely employed methods for parameter estimation in practice. The furnace model we consider takes into account the non-periodic and non-linear variations in the voltage-current characteristic. We use NETLAB, an open-source MATLAB® toolbox, to solve the set of non-linear algebraic equations that relate the parameters to be estimated. Results obtained through simulation of the model in PSCAD™ are contrasted against real measurements taken at the furnace's most critical operating point. We show how the model, with appropriate parameter tuning, captures in great detail the real voltage and current waveforms generated by the system. The results show a maximum error of 5% in the current's root mean square value.
- Published
- 2012
16. Ranging and Doppler Localization Using Golay Codes as the Phase Coding Signal
- Author
-
Aamir Hussain and Muhammad Bilal Malik
- Subjects
Correlation, Detection, Doppler Frequency, Golay Codes, Phase Coding, Range Estimation, Quadrature Phase Shift Keying, Side Lobe Suppression, Serial Search, Frequency Banks, Maximum likelihood estimation, Technology, Engineering (General). Civil engineering (General), TA1-2040, Science - Abstract
In this paper we develop a signal processing algorithm for the detection, ranging, and Doppler localization of a moving target using a Golay code as the phase coding signal. We exploit the side-lobe suppression property of Golay codes, which arises from the complementary code pair, for side-lobe-free target detection, range estimation, and Doppler localization. Golay-code-based phase coding is achieved using a QPSK (Quadrature Phase Shift Keying) scheme. When the transmitted signal is reflected from a moving target, its complex demodulation is carried out at the receiver. This is followed by correlating each component of the demodulated signal with one member of the complementary Golay code pair separately. The correlation results of the two channels are added together, leading to effective side-lobe suppression when the phase and frequency of the demodulating carriers match those of the received signal. The shift in the time index of the correlation peak gives the range to the target. The frequency of the demodulating carriers at which a side-lobe-free correlation peak occurs at the output indicates the Doppler frequency corresponding to the target speed. The developed technique exhibits excellent Doppler localization, target detection, and range estimation performance by overcoming the side-lobe limitations of conventional phase coding signals in ranging applications.
- Published
- 2012
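The complementary-pair property the algorithm exploits (the aperiodic autocorrelations of the two codes sum to a delta, cancelling all side lobes) can be checked numerically. The length-8 pair below comes from the standard recursive construction and is not necessarily the code length used in the paper:

```python
import numpy as np

# Length-8 Golay complementary pair from the recursive construction
# (a, b) -> (a|b, a|-b), starting from a = b = [1].
a = np.array([1, 1, 1, -1, 1, 1, -1, 1])
b = np.array([1, 1, 1, -1, -1, -1, 1, -1])

def acorr(x):
    # Full aperiodic autocorrelation, lags -(N-1) .. N-1.
    return np.correlate(x, x, mode="full")

rsum = acorr(a) + acorr(b)
# The side lobes of the two autocorrelations cancel exactly:
# the sum is 2N at zero lag and 0 at every other lag.
print(rsum)
```

This exact cancellation is why summing the two correlation channels at the receiver yields a side-lobe-free peak whose time index gives the range.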
17. Estimating Fatigue Life of Structural Components from Accelerated Data via a Birnbaum-saunders Model with Shape and Scale Stress Dependent Parameters.
- Author
-
D’Anna, Giuseppe, Giorgio, Massimiliano, and Riccio, Aniello
- Subjects
FATIGUE life ,MATERIAL fatigue ,FATIGUE (Physiology) ,PHYSIOLOGY ,FUNCTIONAL analysis - Abstract
The Birnbaum-Saunders model is widely applied to model fatigue failures caused by cyclic stresses, in both standard and accelerated life tests. The latter kind of test is adopted when the product of interest is very reliable, in order to obtain failure data in a reasonably short amount of time. Estimates of the product's reliability or long-term performance at normal use conditions are then obtained from the accelerated failure data by adopting functional relationships that account for the effect of the accelerating variables on the product's lifetime distribution. Customarily, these models are formulated assuming that the accelerating variables affect the values of the lifetime distribution parameters, but not its form. In particular, in the literature the Birnbaum-Saunders distribution is usually applied to accelerated data under the hypothesis that only the scale parameter depends on the stress conditions, while the shape parameter does not. In this paper, an application is presented in which this standard model does not work satisfactorily. Indeed, it is shown that, for the considered real set of accelerated fatigue failure data, the Birnbaum-Saunders distribution in which both scale and shape parameters depend on the stress conditions fits the data significantly better than the standard option. Differences among the lifetime distribution estimates provided by the two models are highlighted and discussed. [ABSTRACT FROM AUTHOR]
- Published
- 2016
- Full Text
- View/download PDF
18. KVANTIFIKACE RIZIK PRO ÚRAZOVÉ POJIŠTĚNÍ (Risk Quantification for Accident Insurance)
- Author
-
Jindrová, Pavla and Kopecká, Lucie
- Abstract
People of all ages are at risk of accidents. One form of protection against the financial consequences of an accident is accident insurance, which is among the most commonly purchased insurance products. This article focuses on the quantification of risks in accident insurance. It describes options for measuring risk and, using suitable models, models the number and amount of claims using real data from Slovak insurance companies, selected specifically for the case of death due to accident. The data contain information on the number of insured persons and the number of insured events by sex. Due to EU regulation, insurance companies cannot estimate premiums separately for men and for women. In this article, the premium is first estimated separately for men and women, and then for both sexes together. The results show how the EU decision affects the amount of the premium.
- Published
- 2020
19. An Overview of Discrete Distributions in Modelling COVID-19 Data Sets.
- Author
-
Almetwally EM, Dey S, and Nadarajah S
- Abstract
The mathematical modeling of the coronavirus disease-19 (COVID-19) pandemic has been attempted by a large number of researchers from the very beginning of cases worldwide. The purpose of this research work is to find and classify the modelling of COVID-19 data by determining the optimal statistical modelling to evaluate the regular count of new COVID-19 fatalities, thus requiring discrete distributions. Some discrete models are checked and reviewed, such as Binomial, Poisson, Hypergeometric, discrete negative binomial, beta-binomial, Skellam, beta negative binomial, Burr, discrete Lindley, discrete alpha power inverse Lomax, discrete generalized exponential, discrete Marshall-Olkin Generalized exponential, discrete Gompertz-G-exponential, discrete Weibull, discrete inverse Weibull, exponentiated discrete Weibull, discrete Rayleigh, and new discrete Lindley. The probability mass function and the hazard rate function are addressed. Discrete models are discussed based on the maximum likelihood estimates for the parameters. A numerical analysis uses the regular count of new casualties in the countries of Angola, Ethiopia, French Guiana, El Salvador, Estonia, and Greece. The empirical findings are interpreted in-depth.
- Published
- 2022
- Full Text
- View/download PDF
20. Transmuted Modified Weibull Distribution: A Generalization of the Modified Weibull Probability Distribution.
- Author
-
Khan, Muhammad Shuaib and King, Robert
- Subjects
- *
WEIBULL distribution , *PROBABILITY theory , *MATHEMATICAL functions , *ESTIMATION theory , *LEAST squares , *ORDER statistics , *MAXIMUM likelihood statistics - Abstract
This paper introduces a transmuted modified Weibull distribution as an important competitive model which contains eleven lifetime distributions as special cases. We generalize the three-parameter modified Weibull distribution using the quadratic rank transmutation map studied by Shaw et al. [12] to develop a transmuted modified Weibull distribution. The properties of the transmuted modified Weibull distribution are discussed. Least-squares estimation is used to evaluate the parameters. Explicit expressions are derived for the quantiles. We derive the moments and examine the order statistics. We propose the method of maximum likelihood for estimating the model parameters and obtain the observed information matrix. This model is capable of modeling various shapes of aging and failure criteria. [ABSTRACT FROM AUTHOR]
- Published
- 2013
21. Robust frequency synchronization for OFDM-based cognitive radio systems.
- Author
-
Morelli, M. and Moretti, M.
- Abstract
Cognitive radio employs spectrum sensing to facilitate the coexistence of different communication systems over the same frequency band. A peculiar feature of this technology is the possible presence of interference within the signal bandwidth, which considerably complicates the synchronization task. This paper investigates the problem of carrier frequency estimation in an orthogonal frequency-division multiplexing (OFDM)-based cognitive radio system that operates in the presence of narrowband interference (NBI). Synchronization algorithms devised for conventional OFDM transmissions are expected to suffer from significant performance degradation when the received signal is plagued by NBI. To overcome this difficulty, we propose a novel scheme in which the carrier frequency offset (CFO) and the interference power on each subcarrier are jointly estimated through maximum likelihood (ML) methods. In doing so we exploit two pilot blocks. The first is composed of several repeated parts in the time domain and provides a CFO estimate which may be affected by a certain residual ambiguity. The second block conveys a known pseudo-noise sequence in the frequency domain and is used to resolve the ambiguity. The performance of the proposed algorithm is assessed by simulation in a scenario inspired by the IEEE 802.11g WLAN system in the presence of a Bluetooth interferer. [ABSTRACT FROM PUBLISHER]
- Published
- 2008
- Full Text
- View/download PDF
22. 4π Compton Imaging Using a 3-D Position-Sensitive CdZnTe Detector Via Weighted List-Mode Maximum Likelihood.
- Author
-
Lehner, Carolyn E., Zhong He, and Feng Zhang
- Subjects
- *
COMPTON effect , *DETECTORS , *ELECTRONS , *IMAGE processing , *OPTICAL resolution , *ELECTRONICS - Abstract
In this paper, we describe a 4π Compton imager composed of a single 15 mm × 15 mm × 10 mm CdZnTe detector. Full 4π images are reconstructed via list-mode maximum likelihood (ML). A new weighting method for ML reconstruction is proposed in which the contributions of small-uncertainty sequences are enhanced relative to sequences with large uncertainties. The new reconstruction method is compared with traditional ML techniques for measured imaging data. The 4π Compton imager has a measured intrinsic imaging efficiency of nearly 2% and an imaging resolution using the weighted ML reconstruction method of 17° at 662 keV after 10 iterations. [ABSTRACT FROM AUTHOR]
- Published
- 2004
- Full Text
- View/download PDF
23. An EM algorithm for the estimation of parametric and nonparametric hierarchical nonlinear models.
- Author
-
Vermunt, Jeroen K.
- Subjects
- *
ALGORITHMS , *NONLINEAR statistical models , *MATHEMATICAL models , *MULTILEVEL models , *EMPIRICAL research , *REGRESSION analysis - Abstract
It is shown how to implement an EM algorithm for maximum likelihood estimation of hierarchical nonlinear models for data sets consisting of more than two levels of nesting. This upward-downward algorithm makes use of the conditional independence assumptions implied by the hierarchical model. It can be used not only for the estimation of models with a parametric specification of the random effects, but also to extend the two-level nonparametric approach, sometimes referred to as latent class regression, to three or more levels. The proposed approach is illustrated with an empirical application. [ABSTRACT FROM AUTHOR]
- Published
- 2004
- Full Text
- View/download PDF
24. Modeling Breastmilk Infectivity in HIV-1 Infected Mothers.
- Author
-
Richardson, Barbra A. and Hughes, James P.
- Subjects
- *
CONTAMINATION of human milk , *HIV-positive women , *BREASTFEEDING , *ESTIMATION theory - Abstract
Summary. Estimation of breastmilk infectivity in HIV-1 infected mothers is difficult because transmission can occur while the fetus is in utero, during delivery, or through breastfeeding. Since transmission can only be detected through periodic testing, however, it may be impossible to determine the actual mode of transmission in any individual child. In this article we develop a model to estimate breastmilk infectivity, along with the probabilities of in-utero and intrapartum transmission. In addition, the model allows separate estimation of early and late breastmilk infectivity, and individual variation in maternal infectivity. Methods for hypothesis testing of binary risk factors and a method for assessing goodness of fit are also described. Data from a randomized trial of breastfeeding versus formula feeding among HIV-1 infected mothers in Nairobi, Kenya, are used to illustrate the methods. [ABSTRACT FROM AUTHOR]
- Published
- 2003
- Full Text
- View/download PDF
25. Statistical inference on series of atmospheric chemistry data.
- Author
-
Mohapl, Jaroslav
- Subjects
MATHEMATICAL statistics ,LINEAR statistical models ,ATMOSPHERIC chemistry ,POLLUTANTS ,GAUSSIAN processes ,MULTIVARIATE analysis ,ESTIMATION theory - Abstract
This study is motivated by the problem of comparing atmospheric chemistry data obtained from two different measurement networks. The data describe pollutant concentrations in air. One network analyzes gases and particles captured on filters daily, the other weekly. The networks use similar but not identical instrumentation and measurement methods. Chemical concentrations usually form non-stationary time series. The goal of the paper is to present a multivariate linear Gaussian model with a time-dependent correlation matrix for the description of such series, a method for parameter estimation and model fitting, and the procedure used for inference about the data in question. [ABSTRACT FROM AUTHOR]
- Published
- 2000
- Full Text
- View/download PDF
26. Nonstationary Binary Choice.
- Author
-
Park, Joon Y. and Phillips, Peter C. B.
- Subjects
STOCHASTIC orders ,TIME series analysis ,LOGISTIC regression analysis ,BROWNIAN motion ,DECISION making - Abstract
This paper develops an asymptotic theory for time series binary choice models with nonstationary explanatory variables generated as integrated processes. Both logit and probit models are covered. The maximum likelihood (ML) estimator is consistent but a new phenomenon arises in its limit distribution theory. The estimator consists of a mixture of two components, one of which is parallel to and the other orthogonal to the direction of the true parameter vector, with the latter being the principal component. The ML estimator is shown to converge at a rate of n3/4 along its principal component but has the slower rate of n1/4 convergence in all other directions. This is the first instance known to the authors of multiple convergence rates in models where the regressors have the same (full rank) stochastic order and where the parameters appear in linear forms of these regressors. It is a consequence of the fact that the estimating equations involve nonlinear integrable transformations of linear forms of integrated processes as well as polynomials in these processes, and the asymptotic behavior of these elements is quite different. The limit distribution of the ML estimator is derived and is shown to be a mixture of two mixed normal distributions with mixing variates that are dependent upon Brownian local time as well as Brownian motion. It is further shown that the sample proportion of binary choices follows an arc sine law and therefore spends most of its time in the neighborhood of zero or unity. The result has implications for policy decision making that involves binary choices and where the decisions depend on economic fundamentals that involve stochastic trends. Our limit theory shows that, in such conditions, policy is likely to manifest streams of little intervention or intensive intervention. [ABSTRACT FROM AUTHOR]
- Published
- 2000
- Full Text
- View/download PDF
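The arc sine behaviour described in this abstract can be illustrated with a small simulation (my own sketch, not the authors' code): binary choices driven by a random-walk regressor produce sample proportions that pile up near zero or one rather than near one half.

```python
import numpy as np

rng = np.random.default_rng(0)

def choice_proportion(n=10_000):
    """Proportion of y_t = 1 when y_t = 1{x_t + e_t > 0} and x_t is a random walk."""
    x = np.cumsum(rng.standard_normal(n))        # integrated (I(1)) regressor
    return np.mean(x + rng.standard_normal(n) > 0)

props = np.array([choice_proportion() for _ in range(500)])
near_edges = np.mean((props < 0.1) | (props > 0.9))  # mass near 0 or 1
near_half = np.mean(np.abs(props - 0.5) < 0.1)       # mass near 1/2
print(f"share of runs near the edges: {near_edges:.2f}")
print(f"share of runs near one half:  {near_half:.2f}")
```

Under the arc sine law the density of the proportion is highest at the endpoints, so `near_edges` substantially exceeds `near_half`.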
27. An empirical Bayes model for Markov-dependent binary sequences with randomly missing observations.
- Author
-
Cole, Bernard F., Lee, Mei-Ling T., Whitmore, G. Alex, and Zaslavsky, Alan M.
- Subjects
- *
MATHEMATICAL models , *MATHEMATICAL statistics , *ESTIMATION theory , *MARKOV processes , *MULTIVARIATE analysis , *STATISTICAL correlation , *STOCHASTIC processes - Abstract
We develop an improved empirical Bayes estimation methodology for the analysis of two-state Markov chains observed from heterogeneous individuals. First, the two transition probabilities corresponding to each chain are assumed to be drawn from a common, bivariate distribution that has beta marginals. Second, randomly missing observations are incorporated into the likelihood for the hyperparameters by efficiently summing over all possible values for the missing observations. A likelihood ratio test is used to test for dependence between the transition probabilities. Posterior distributions for the transition probabilities are also derived, as is an approximation for the equilibrium probabilities. The proposed procedures are illustrated in a numerical example and in an analysis of longitudinal store display data. [ABSTRACT FROM AUTHOR]
- Published
- 1995
- Full Text
- View/download PDF
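A minimal sketch of the empirical Bayes idea in this abstract, using hypothetical data and a simplified setting (one transition probability per individual with beta prior, no missing observations): the hyperparameters are estimated by maximising the beta-binomial marginal likelihood, and each individual's estimate is shrunk toward the prior mean.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import betaln

rng = np.random.default_rng(1)

# Each individual's 0 -> 1 transition probability is drawn from Beta(a, b);
# we observe binomial transition counts (hypothetical data, illustration only).
a_true, b_true, n_trials = 2.0, 5.0, 50
p = rng.beta(a_true, b_true, size=200)
k = rng.binomial(n_trials, p)

def neg_marginal_loglik(theta):
    a, b = np.exp(theta)            # log scale keeps a, b positive
    # Binomial coefficient dropped: it does not depend on (a, b).
    return -np.sum(betaln(k + a, n_trials - k + b) - betaln(a, b))

a_hat, b_hat = np.exp(minimize(neg_marginal_loglik, [0.0, 0.0],
                               method="Nelder-Mead").x)

# Empirical Bayes posterior mean shrinks each raw proportion toward a/(a+b).
post_mean = (k + a_hat) / (n_trials + a_hat + b_hat)
```

The paper's full model is richer (bivariate beta marginals, missing observations summed out of the likelihood); this only shows the marginal-likelihood mechanics for one chain of transitions.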
28. Parameter Orthogonality for a Family of Discrete Distributions.
- Author
-
Willmot, Gordon E.
- Subjects
- *
CONTAGIOUS distributions , *DISTRIBUTION (Probability theory) , *AUTOMOBILE insurance , *ESTIMATION theory , *ORTHOGONALIZATION , *STATISTICAL correlation , *GAUSSIAN distribution , *PROBABILITY theory , *STATISTICS - Abstract
The standard contagious distributions (see Douglas 1980) have been used in such varied fields as biology and automobile insurance, often to model various physical phenomena as well as provide a good fit to count data when other models are inadequate. Unfortunately, the parameterizations often used when working with these distributions normally lead to extremely high correlations of the maximum likelihood estimators (MLEs). This tends to lead to mathematical complexities, and causes difficulty or even errors in their interpretation. Furthermore, numerical difficulties may arise when using numerical procedures to locate the estimates. Some of these difficulties were discussed by Douglas (1980, pp. 171, 204-205), who suggested that a reparameterization to reduce or even eliminate such correlation is desirable. If the MLEs are asymptotically uncorrelated, the parameterization is orthogonal. Philpot (1964) derived an orthogonal parameterization for the Neyman Type A distribution; Stein, Zucchini, and Juritz (1987) derived one for the Poisson distribution mixed by the inverse Gaussian distribution. Parameter orthogonality has several attractive features in the present context. Since there is no correlation asymptotically, the estimates (with their standard errors) provide a simpler summary of the data than in the absence of such orthogonality. The use of a parameterization where the MLEs are highly correlated can lead to a misleading analysis, or at best a more complicated analysis than would be necessary if an orthogonal parameterization had been used. To the extent that a high correlation exists, the parameters involved tend to measure similar quantities, and orthogonality separates information about the parameters from each other. This article gives an... [ABSTRACT FROM AUTHOR]
- Published
- 1988
- Full Text
- View/download PDF
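The correlation effect this abstract describes can be illustrated with a small simulation (my own sketch; moment estimates stand in for MLEs, which is an assumption, not the article's method). For the negative binomial, estimates of the usual size/probability pair (r, p) are strongly correlated across samples, while the mean/size pair (mu, r) is nearly orthogonal.

```python
import numpy as np

rng = np.random.default_rng(5)

# Negative binomial with size r and success probability p.
r_true, p_true = 3.0, 0.4
est_rp, est_mur = [], []
for _ in range(500):
    x = rng.negative_binomial(r_true, p_true, size=500)
    m, v = x.mean(), x.var()
    r_hat = m ** 2 / max(v - m, 1e-9)   # moment estimates stand in for MLEs
    p_hat = m / v
    est_rp.append((r_hat, p_hat))
    est_mur.append((m, r_hat))

corr_rp = np.corrcoef(np.array(est_rp).T)[0, 1]    # (r, p): highly correlated
corr_mur = np.corrcoef(np.array(est_mur).T)[0, 1]  # (mu, r): near zero
print(f"corr(r_hat, p_hat) = {corr_rp:.2f}")
print(f"corr(mu_hat, r_hat) = {corr_mur:.2f}")
```

The near-zero correlation of the mean/size pair is the practical payoff of an orthogonal parameterization: the two estimates summarize separate aspects of the data.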
29. Small Sample Properties of Probit Model Estimators.
- Author
-
Griffiths, William E., Pope, Peter J., and Hill, R. Carter
- Subjects
- *
MATRICES (Mathematics) , *ESTIMATION theory , *MONTE Carlo method , *ALGORITHMS , *STATISTICAL correlation , *REGRESSION analysis , *MULTICOLLINEARITY , *STATISTICS - Abstract
When maximum likelihood estimates of the coefficients in a nonlinear model such as the probit model are obtained there are a number of asymptotically equivalent covariance matrix estimators that can be used. These covariance matrix estimators are typically associated with different computer algorithms. For example, with the Newton-Raphson algorithm the inverse of the negative of the Hessian matrix from the log-likelihood function is used; with the method of scoring the inverse of the information matrix is used; and with a procedure proposed by Berndt, Hall, Hall, and Hausman (1974), the inverse of the outer product of the first derivatives of the log-likelihood function is used. Although these three estimators are asymptotically equivalent, their performance can vary in finite samples. The main objective of this article is to use a Monte Carlo experiment to investigate the finite sample properties of the three covariance matrix estimators, in the context of maximum likelihood estimation of the probit model. Related questions concerning the empirical distributions of test statistics and the properties of a preliminary test estimator are also examined, under varying degrees of multicollinearity. We find that, on average, the Hessian matrix and the information matrix give almost identical results and lead to more accurate estimates of the asymptotic covariance matrix than does the estimator based on first derivatives. The finite sample mean squared error of the maximum likelihood estimator, however, is considerably greater than the asymptotic covariance matrix, and the estimator based on first derivatives provides a better estimate of finite sample mean squared error. All three estimators lead to empirical distributions that can be approximated by an asymptotic normal distribution. The pretest estimator formed by testing for the... [ABSTRACT FROM AUTHOR]
- Published
- 1987
- Full Text
- View/download PDF
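The three asymptotically equivalent covariance estimators compared in this abstract can be computed directly for a simulated probit fit (a sketch under assumed data, not the article's Monte Carlo design; for the probit, both the expected information and the observed Hessian have closed forms).

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(2)

# Simulated probit data: y = 1{x'beta + e > 0}, e ~ N(0, 1).
n = 200
beta_true = np.array([0.5, -1.0])
X = np.column_stack([np.ones(n), rng.standard_normal(n)])
y = (X @ beta_true + rng.standard_normal(n) > 0).astype(float)

def neg_loglik(b):
    p = np.clip(norm.cdf(X @ b), 1e-12, 1 - 1e-12)
    return -np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))

b_hat = minimize(neg_loglik, np.zeros(2), method="BFGS").x
xb = X @ b_hat
# Generalised residual: per-observation score is lam_i * x_i.
lam = norm.pdf(xb) * (y - norm.cdf(xb)) / (norm.cdf(xb) * (1 - norm.cdf(xb)))

# (1) BHHH: inverse of the outer product of the first derivatives.
scores = lam[:, None] * X
V_opg = np.linalg.inv(scores.T @ scores)

# (2) Expected information: for the probit, E[-H] = X' diag(w) X.
w = norm.pdf(xb) ** 2 / (norm.cdf(xb) * (1 - norm.cdf(xb)))
V_info = np.linalg.inv(X.T @ (w[:, None] * X))

# (3) Observed Hessian: for the probit, -H = X' diag(lam * (lam + x'b)) X.
h = lam * (lam + xb)
V_hess = np.linalg.inv(X.T @ (h[:, None] * X))
```

Consistent with the article's finding, the Hessian- and information-based standard errors are close to one another in moderate samples, while the BHHH estimator can diverge more.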
30. On the Bias in Estimates of Forecast Mean Squared Error.
- Author
-
Ansley, Craig F. and Newbold, Paul
- Subjects
- *
FORECASTING , *TIME series analysis , *FINITE simple groups , *MATHEMATICAL statistics , *LEAST squares , *PROBABILITY theory , *BOX-Jenkins forecasting , *ERROR analysis in mathematics , *ESTIMATES , *PARAMETER estimation , *ESTIMATION theory - Abstract
We examine the forecasting problem for a finite series of observations from either a nonseasonal or a seasonal autoregressive-moving average process. Four sources of bias in the usual estimator of forecast mean squared error are identified and analyzed for particular models using maximum likelihood and least squares parameter estimates. The usual estimator is found to be biased downwards, especially near the boundary of a stationary or invertible region, and the bias is severe for least squares estimators. An alternative estimator is proposed for the maximum likelihood case, which is shown generally to have reduced bias. [ABSTRACT FROM AUTHOR]
- Published
- 1981
- Full Text
- View/download PDF
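The downward bias described in this abstract shows up in a simple simulation (my own sketch, not the authors' analysis): for an AR(1) fitted by least squares near the stationarity boundary, the usual residual-variance estimate of one-step forecast MSE sits below the realized squared forecast error.

```python
import numpy as np

rng = np.random.default_rng(3)
phi, n, reps = 0.9, 30, 10_000     # AR(1) near the stationarity boundary

plugin, realized = [], []
for _ in range(reps):
    e = rng.standard_normal(n + 1)
    y = np.empty(n + 1)
    y[0] = e[0]
    for t in range(1, n + 1):
        y[t] = phi * y[t - 1] + e[t]
    # Least squares fit on the first n observations, then a one-step forecast.
    y_est = y[:n]
    phi_hat = np.sum(y_est[1:] * y_est[:-1]) / np.sum(y_est[:-1] ** 2)
    resid = y_est[1:] - phi_hat * y_est[:-1]
    plugin.append(np.sum(resid ** 2) / (len(resid) - 1))  # usual MSE estimate
    realized.append((y[n] - phi_hat * y[n - 1]) ** 2)     # actual squared error

print(f"mean plug-in estimate : {np.mean(plugin):.3f}")
print(f"mean realized sq. err.: {np.mean(realized):.3f}")
```

The gap reflects both parameter-estimation error in the forecast and the finite-sample bias of the least squares estimates near the boundary.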
31. A Structural Probit Model With Latent Variables.
- Author
-
Muthén, Bengt
- Subjects
- *
LATENT structure analysis , *STRUCTURAL frame models , *LATENT variables , *STATISTICAL correlation , *ESTIMATION theory , *MULTIVARIATE analysis , *PROBITS , *MATHEMATICAL variables , *SOCIOLOGY - Abstract
A model with dichotomous indicators of latent variables is developed. The latent variables are related to each other and to a set of exogenous variables in a system of structural relations. Identification and maximum likelihood estimation of the model are treated. A sociological application is presented in which a theoretical construct (an attitude) is related to a set of background variables. The construct is not measured directly, but is indicated by the answers to a pair of questionnaire statements. [ABSTRACT FROM AUTHOR]
- Published
- 1979
- Full Text
- View/download PDF
32. NON-HOMOGENEOUS CONTINUOUS-TIME MARKOV AND SEMI-MARKOV MANPOWER MODELS.
- Author
-
McClean, Sally, Montgomery, Erin, and Ugwuowo, Fidelis
- Subjects
MARKOV processes ,STOCHASTIC processes ,PROBABILITY theory ,DISTRIBUTION (Probability theory) ,RANDOM variables ,LABOR supply - Abstract
We develop estimation methods for continuous-time Markov and semi-Markov non-homogeneous manpower systems using the notion of calendar time divided into 'time windows' by change points. The model parameters may only change at these change points but remain constant between them. Our estimation methods employ a competing risks approach and allow for left truncated and right censored data. Maximum likelihood estimators are given for the hazard and survivor functions describing length of stay in any grade of the manpower system. The models are fitted to data from the Northern Ireland nursing service. [ABSTRACT FROM AUTHOR]
- Published
- 1997
- Full Text
- View/download PDF
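A minimal sketch of the kind of estimator this abstract describes, under strong simplifying assumptions (hypothetical data, a single grade, a constant hazard within one time window, right censoring only): the MLE of a constant hazard with censored data is observed exits divided by total exposure.

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical lengths of stay in one grade, exponential with hazard lam,
# subject to independent right censoring (the observation window ends first).
lam_true = 0.4
stay = rng.exponential(1 / lam_true, size=1000)
censor = rng.exponential(1 / 0.2, size=1000)
obs = np.minimum(stay, censor)
event = (stay <= censor)            # True = exit observed, False = censored

# MLE of a constant hazard with right-censored data: events / total exposure.
lam_hat = event.sum() / obs.sum()

def survivor(t):
    """Fitted survivor function S(t) = exp(-lam_hat * t)."""
    return np.exp(-lam_hat * t)
```

The paper's models are far more general (non-homogeneous, semi-Markov, competing risks, left truncation); this shows only the censored-likelihood building block.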
33. New Algorithmic Developments for Estimation Problems in Time Series
- Author
-
Ansley, C. F., Kohn, R., Havránek, T., editor, Šidák, Z., editor, and Novák, M., editor
- Published
- 1984
- Full Text
- View/download PDF
34. A Bivariate Failure Model.
- Author
-
Tosch, Thomas J. and Holmes, Paul T.
- Subjects
- *
ESTIMATION theory , *EXPONENTIAL families (Statistics) , *DISTRIBUTION (Probability theory) , *STATISTICS , *PARAMETER estimation , *MATHEMATICAL statistics , *STOCHASTIC processes - Abstract
A bivariate failure model is proposed in which the residual lifetime of one component is dependent on the working status of the other. General properties of the model are discussed, and the maximum likelihood estimates of the parameters are found in a bivariate exponential-life special case. [ABSTRACT FROM AUTHOR]
- Published
- 1980
- Full Text
- View/download PDF
35. A Note on the Efficient Estimation of the Linear Expenditure System.
- Author
-
Ham, John C.
- Subjects
- *
ESTIMATION theory , *LINEAR statistical models , *PARAMETER estimation , *ECONOMIC models , *STATISTICS , *VECTOR analysis , *MATHEMATICAL statistics , *MATHEMATICAL models - Abstract
An efficient method of estimating the linear expenditure system (LES) by maximum likelihood is introduced. A test for ensuring that parameter estimates obtained by numerical maximization satisfy the first-order conditions is given. In some experiments the new method of estimating the LES is compared to the more conventional method of estimation and is found to use significantly less computer time than the conventional method. [ABSTRACT FROM AUTHOR]
- Published
- 1978
- Full Text
- View/download PDF
36. The Burr XII Generalized Weibull Distribution with Applications
- Author
-
Ayoku, Sarah A
- Subjects
- Burr XII distribution, Weibull distribution, Burr-XII Weibull distribution, Maximum likelihood estimation., Applied Statistics, Statistical Models, Statistical Theory, Jack N. Averitt College of Graduate Studies, Electronic Theses & Dissertations, ETDs, Student Research
- Abstract
In this thesis, a new class of generalized distributions which includes several new distributions is proposed and developed. This new class of distributions called the Burr-XII Weibull (BXIIW) distribution has several other distributions as special cases and includes several well known distributions such as Burr-XII Rayleigh (BXIIR), Burr-XII Exponential (BXIIE), log-logistic-Weibull (LLoGW), and several others. The Burr-XII Weibull (BXIIW) class of distribution is studied in detail. Some mathematical properties of this new class of BXIIW distribution, including moments, conditional moments, mean deviations, Bonferroni and Lorenz curves, the distribution of the order statistics, and Renyi entropy, are presented. The maximum likelihood estimation technique is used to estimate the BXIIW model parameters. A simulation study is conducted to examine the bias and mean square error of the maximum likelihood estimators and the width of the confidence interval for each parameter. Applications to real data sets illustrating the applicability and usefulness of the proposed distribution are presented.
- Published
- 2017
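The BXIIW family is not available in standard libraries, but the maximum likelihood fitting it relies on can be sketched for its Weibull special case using scipy (simulated data; shape and scale values are my own choices for illustration).

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)

# Simulated two-parameter Weibull data: shape c = 1.5, scale = 2.0.
data = stats.weibull_min.rvs(c=1.5, scale=2.0, size=500, random_state=rng)

# MLE with the location fixed at zero, as in a standard two-parameter Weibull.
c_hat, loc_hat, scale_hat = stats.weibull_min.fit(data, floc=0)
print(f"shape estimate: {c_hat:.3f}, scale estimate: {scale_hat:.3f}")
```

Fitting the full BXIIW model would require coding its log-likelihood and maximising it numerically; the mechanics are the same as in this special case.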
37. Service Life Prediction of Composite Structures Through Fiber Testing.
- Author
-
NAVAL POSTGRADUATE SCHOOL MONTEREY CA and Morin, Gregory S.
- Abstract
Increasing the severity of the stress history of a structure reduces its service life. Feasibility studies to increase the zero fuel weight of the P-3 Orion depend heavily on the resulting decrease in service life of the wing box and airframe. One option for extending the service life of existing aircraft is the replacement or augmentation of critical structural members with composite materials. Since structural composites do not yet have adequate service life statistics, life prediction must be through probability modeling. Such modeling can begin with experimental data on accelerated testing of fiber life under several sustained load levels. These data can be the basis for an appropriate strength-life model of the fiber, which can in turn be related to the strength-life model of the composite by the local load-sharing model. The local load-sharing model captures the physical failure sequence of fiber failure within a composite. Such a strength-life model, when combined with structural analysis, can be used to predict an airframe's service life under the changed conditions associated with the zero fuel weight increase. Service Life, S-N Curve, Weibull, Maximum likelihood estimators, breakdown rule.
- Published
- 1993
38. Exponential Error Bounds on Codes for Noisy Channels with Inaccurately Known Statistics and for Generalized Decision Rules
- Author
-
ARMY BALLISTIC RESEARCH LAB ABERDEEN PROVING GROUND MD, Kazakos, D., and Cooper, A. B., III
- Abstract
Generalized decoding decision rules provide added flexibility in a decoding scheme and some advantages. In a generalized decoding decision rule, the following possibilities are considered: (1) The decoder has the option of not deciding at all, or rejecting all estimates. This is termed an erasure; (2) The decoder has the option of putting out more than one estimate. The resulting output is called a list. Only if the correct code word is not on the list do we have a list error. Forney developed error bounds in his seminal paper of 1968, in which he used Gallager's ingenious 1965 method of bounding error probabilities. In this paper, we consider another realistic factor, the lack of exact knowledge of the channel statistics. We assume a mismatch between the true channel transition probabilities and the nominal probabilities used in the decoding metric. We then develop error bounds under mismatch for generalized decision rules. We also establish conditions under which the error probabilities converge to zero exponentially with the block length, in spite of the presence of mismatch.
- Published
- 1992
39. Data-Based Nonparametric Estimation of the Hazard Function with Applications to Model Diagnostics and Exploratory Analysis
- Author
-
WISCONSIN UNIV-MADISON MATHEMATICS RESEARCH CENTER, Tanner, M. A., and Wong, W. H.
- Published
- 1984
40. Ballistic Reentry Vehicle Aerodynamic Coefficient Estimation Using an Advanced System Identification Technique.
- Author
-
SYSTEMS CONTROL INC (VT) PALO ALTO CA, Gupta, Narendra K., and Hall, W. Earl, Jr
- Abstract
This report develops post-flight data processing methods for aerodynamic coefficient estimation of ballistic reentry vehicles based on advanced system identification technology. This post-flight data processing involves reconstruction of unmeasured channels, aerodynamic coefficient model structure determination and maximum likelihood parameter estimation. The methods are verified on a simulation trajectory and on a flight test trajectory. It is shown that the techniques are general enough to handle a variety of regimes encountered by a typical ballistic reentry vehicle. (Author)
- Published
- 1977
41. Evaluating Weapon System Accuracy from a Classical-Bayesian Approach.
- Author
-
AIR FORCE INST OF TECH WRIGHT-PATTERSON AFB OH SCHOOL OF ENGINEERING and Corbisiero, John V.
- Abstract
A real-world weapon system test situation is used to show how Circular Error Probability (CEP) point estimates may be obtained. The discussion includes the use of the Kolmogorov-Smirnov test for normality, chi-square confidence intervals, non-parametric confidence intervals, a priori sample size determination, stop or continue decision making during testing, and a method for combining classical and Bayesian techniques for estimating CEP. Confidence intervals involving the use of the chi-square distribution are presented in graphical form for ease of implementation. (Author)
- Published
- 1970
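A CEP point estimate of the kind this abstract discusses can be sketched for simulated circular-normal miss distances (my own illustration; the dispersion value is an arbitrary choice, not from the report): CEP is the median radial miss, and for a circular normal it equals sigma times sqrt(2 ln 2).

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical miss distances: independent N(0, sigma^2) in each axis.
sigma = 3.0
x = rng.normal(0.0, sigma, size=2000)
y = rng.normal(0.0, sigma, size=2000)
r = np.hypot(x, y)                       # radial miss distance

# Empirical CEP: the median radial miss.
cep_empirical = np.median(r)
# Parametric point estimate: CEP = sigma * sqrt(2 ln 2) for a circular normal.
sigma_hat = np.sqrt((x.var() + y.var()) / 2)
cep_point = np.sqrt(2 * np.log(2)) * sigma_hat
print(f"empirical CEP: {cep_empirical:.3f}, parametric CEP: {cep_point:.3f}")
```

The report goes further (normality testing, chi-square and non-parametric confidence intervals, Bayesian combination); this shows only the basic point estimate.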