30 results for "Eric V. Slud"
Search Results
2. Combining estimators of a common parameter across samples
- Author
-
Abram Kagan, Eric V. Slud, and Ilia Vonta
- Subjects
Statistics and Probability, Applied Mathematics, Estimator, Estimating equations, Multiple data, Efficient estimator, Computational Theory and Mathematics, Statistics, Statistics, Probability and Uncertainty, Fisher information, Analysis, Mathematics - Abstract
In many settings, multiple data collections and analyses on the same topic are summarised separately through statistical estimators of parameters and variances, and yet there are scientific...
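The classical route to combining independent estimators of a common parameter is inverse-variance (precision) weighting, which is efficient when each per-sample estimator is approximately normal. A minimal sketch of that baseline (illustrative only, not necessarily the estimator studied in the paper):

```python
import numpy as np

def combine_estimates(estimates, variances):
    """Combine independent estimators of a common parameter by
    inverse-variance (precision) weighting; returns the combined
    estimate and its variance."""
    estimates = np.asarray(estimates, dtype=float)
    w = 1.0 / np.asarray(variances, dtype=float)   # precision weights
    combined = np.sum(w * estimates) / np.sum(w)
    return combined, 1.0 / np.sum(w)

# Two samples estimating the same parameter:
est, var = combine_estimates([1.0, 1.2], [0.04, 0.01])
```

Under normality, the combined variance 1/Σw_i is the reciprocal of the total Fisher information across samples.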
- Published
- 2018
- Full Text
- View/download PDF
3. Spatial modeling of rainfall accumulated over short periods of time
- Author
-
Eric V. Slud, Victor De Oliveira, and Binbin Wang
- Subjects
Statistics and Probability, Numerical Analysis, Random field, Stochastic dominance, Estimator, Covariance, Exploratory data analysis, Identifiability, Applied mathematics, Statistics, Probability and Uncertainty, Gauss–Hermite quadrature, Mathematics, Generalized method of moments - Abstract
This article proposes a new random field model to describe the spatial variation of rainfall amounts accumulated over short periods of time. The model is intended to satisfy a set of desiderata motivated by the understanding of rainfall generating mechanisms and exploratory data analysis of datasets of this type. First and second order properties of the proposed model are derived, including the mean and covariance functions, as well as the families of marginal and bivariate distributions. Properties of the proposed model are shown by a mix of analytical derivations and numerical exploration that use Gauss–Hermite quadrature to approximate the required integrals. The proposed model also satisfies a stochastic dominance property, which is argued to be sensible and consistent with most rainfall data of this type. A study of identifiability is carried out, which strongly suggests all model parameters are identifiable. The generalized method of moments is proposed to estimate the parameters, and the properties of these estimators are explored based on simulated data.
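The Gauss–Hermite quadrature mentioned above approximates integrals against a Gaussian weight by a finite weighted sum. A minimal sketch of the standard recipe for E[g(Z)] with Z ~ N(0, 1) (the paper applies the same idea to its own, more involved, integrals):

```python
import numpy as np

def gauss_hermite_expectation(g, n=20):
    """Approximate E[g(Z)] for Z ~ N(0,1) with n-point Gauss-Hermite
    quadrature: E[g(Z)] ~ (1/sqrt(pi)) * sum_i w_i * g(sqrt(2) * x_i)."""
    x, w = np.polynomial.hermite.hermgauss(n)  # nodes/weights for weight e^{-x^2}
    return np.sum(w * g(np.sqrt(2.0) * x)) / np.sqrt(np.pi)

m2 = gauss_hermite_expectation(lambda z: z**2)   # second moment of N(0,1)
```

An n-point rule is exact for polynomial integrands up to degree 2n - 1, so the second moment above is recovered essentially to machine precision.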
- Published
- 2018
- Full Text
- View/download PDF
4. Parametric survival densities from phase-type models
- Author
-
Jiraphan Suntornchost and Eric V. Slud
- Subjects
Computer science, Breast Neoplasms, Type (model theory), Machine learning, Simple (abstract algebra), Expectation–maximization algorithm, Econometrics, Humans, Computer Simulation, Fisher information, Parametric statistics, Likelihood Functions, Models, Statistical, Markov chain, Applied Mathematics, General Medicine, Survival Analysis, Markov Chains, Latent class model, Data Interpretation, Statistical, Parametric model, Female, Artificial intelligence - Abstract
After a brief historical survey of parametric survival models, from actuarial, biomedical, demographical and engineering sources, this paper discusses the persistent reasons why parametric models still play an important role in exploratory statistical research. The phase-type models are advanced as a flexible family of latent-class models with interpretable components. These models are now supported by computational statistical methods that make numerical calculation of likelihoods and statistical estimation of parameters feasible in theory for quite complicated settings. However, consideration of Fisher Information and likelihood-ratio type tests to discriminate between model families indicates that only the simplest phase-type model topologies can be stably estimated in practice, even on rather large datasets. An example of a parametric model with features of mixtures, multiple stages or 'hits', and a trapping-state is given to illustrate simple computational tools in R, both on simulated data and on a large SEER 1992-2002 breast-cancer dataset.
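A phase-type survival density has the closed form f(t) = α exp(Tt) t₀, where α is the initial distribution over transient states, T the transient-state block of the Markov generator, and t₀ = −T·1 the vector of absorption rates. A minimal sketch of that formula (illustrative; the paper's own computational tools are in R):

```python
import numpy as np
from scipy.linalg import expm

def phase_type_density(t, alpha, T):
    """Density of a phase-type distribution: f(t) = alpha @ expm(T*t) @ t0,
    where T is the transient sub-generator and t0 = -T @ 1 the exit rates."""
    alpha = np.asarray(alpha, dtype=float)
    T = np.asarray(T, dtype=float)
    t0 = -T.sum(axis=1)                      # exit rates into the absorbing state
    return float(alpha @ expm(T * t) @ t0)

# One transient phase with rate 2 reduces to an Exponential(2) density:
f1 = phase_type_density(1.0, [1.0], [[-2.0]])
```

The single-phase check against the exponential density 2e^{-2t} is a convenient sanity test before moving to multi-stage or mixture topologies.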
- Published
- 2013
- Full Text
- View/download PDF
5. Small-area estimation based on survey data from a left-censored Fay–Herriot model
- Author
-
Tapabrata Maiti and Eric V. Slud
- Subjects
Statistics and Probability, Estimation, Small area estimation, Mean squared error, Applied Mathematics, Statistics, Econometrics, Survey data collection, Estimator, Bias correction, Context (language use), Statistics, Probability and Uncertainty, Mathematics - Abstract
This paper develops methodology for survey estimation and small-area prediction using Fay–Herriot (1979) models in which the responses are left-censored. Parameter and small-area estimators are derived both by censored-data likelihoods and by an estimating-equation approach which adjusts a Fay–Herriot analysis restricted to the uncensored observations. Formulas for variances of estimators and mean-squared errors of small-area predictions are provided and supported by a simulation study. The methodology is applied to provide diagnostics for the left-censored Fay–Herriot model which are illustrated in the context of the Census Bureau's ongoing Small-Area Income and Poverty Estimation (SAIPE) project.
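The censored-data likelihood underlying such an analysis combines density terms for uncensored responses with a normal-CDF mass for left-censored ones. A minimal sketch for a plain left-censored normal sample (the Fay–Herriot random-effect and sampling-variance structure is omitted):

```python
import numpy as np
from scipy.stats import norm

def left_censored_loglik(mu, sigma, y, c):
    """Log-likelihood for y_i ~ N(mu, sigma^2) observed only when y_i > c;
    each value at or below the censoring point c contributes log Phi((c-mu)/sigma)."""
    y = np.asarray(y, dtype=float)
    obs = y > c
    ll = norm.logpdf(y[obs], mu, sigma).sum()            # uncensored part
    ll += (~obs).sum() * norm.logcdf((c - mu) / sigma)   # censored mass
    return ll

ll = left_censored_loglik(0.0, 1.0, [0.5, 1.5, -9.0], c=0.0)
```

Maximizing this function over (mu, sigma) gives the censored-data MLE; the estimating-equation adjustment in the paper is an alternative route to the same target.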
- Published
- 2011
- Full Text
- View/download PDF
6. Efficient semiparametric estimators via modified profile likelihood
- Author
-
Filia Vonta and Eric V. Slud
- Subjects
Statistics and Probability, Applied Mathematics, Nonparametric statistics, Estimator, Semiparametric model, Efficient estimator, Statistics, Linear regression, Consistent estimator, Applied mathematics, Nuisance parameter, Semiparametric regression, Statistics, Probability and Uncertainty, Mathematics - Abstract
A new strategy is developed for obtaining large-sample efficient estimators of finite-dimensional parameters β within semiparametric statistical models. The key idea is to maximize over β a nonparametric log-likelihood with the infinite-dimensional nuisance parameter λ replaced by a consistent preliminary estimator λ̃_β of the Kullback–Leibler minimizing value λ_β for fixed β. It is shown that the parametric submodel with the Kullback–Leibler minimizer substituted for λ is generally a least-favorable model. Results extending those of Severini and Wong (Ann. Statist. 20 (1992) 1768) then establish efficiency of the estimator of β maximizing the log-likelihood with λ replaced, for fixed β, by λ̃_β. These theoretical results are specialized to censored linear regression and to a class of semiparametric survival-analysis regression models including the proportional hazards models with unobserved random effect or 'frailty', the latter through results of Slud and Vonta (Scand. J. Statist. 31 (2004) 21) characterizing the restricted Kullback–Leibler information minimizers.
- Published
- 2005
- Full Text
- View/download PDF
7. Exact calculation of power and sample size in bioequivalence studies using two one-sided tests
- Author
-
Eric V. Slud, Meiyu Shen, and Estelle Russek-Cohen
- Subjects
Pharmacology, Statistics and Probability, Cross-Over Studies, Models, Statistical, Monte Carlo method, Crossover, Univariate, Context (language use), Bivariate analysis, Bioequivalence, Pharmaceutical Preparations, Therapeutic Equivalency, Sample size determination, Sample Size, Statistics, Applied mathematics, Humans, Pharmacology (medical), Power function, Mathematics - Abstract
The number of subjects in a pharmacokinetic two-period two-treatment crossover bioequivalence study is typically small, most often less than 60. The most common approach to testing for bioequivalence is the two one-sided tests procedure. No explicit mathematical formula for the power function in the context of the two one-sided tests procedure exists in the statistical literature, although the exact power based on Owen's special case of bivariate noncentral t-distribution has been tabulated and graphed. Several approximations have previously been published for the probability of rejection in the two one-sided tests procedure for crossover bioequivalence studies. These approximations and associated sample size formulas are reviewed in this article and compared for various parameter combinations with exact power formulas derived here, which are computed analytically as univariate integrals and which have been validated by Monte Carlo simulations. The exact formulas for power and sample size are shown to improve markedly in realistic parameter settings over the previous approximations.
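The two one-sided tests decision itself is simple to state: equivalence is concluded only when both one-sided t-tests reject. A minimal sketch of that decision rule (the `diff`/`se` parametrization is illustrative; the exact power calculations in the paper require the bivariate noncentral t-distribution):

```python
from scipy.stats import t as t_dist

def tost(diff, se, df, margin, alpha=0.05):
    """Two one-sided tests for equivalence: conclude |true diff| < margin
    only when both one-sided level-alpha t-tests reject."""
    crit = t_dist.ppf(1.0 - alpha, df)
    t_lower = (diff + margin) / se    # tests H0: diff <= -margin
    t_upper = (diff - margin) / se    # tests H0: diff >=  margin
    return bool(t_lower > crit and t_upper < -crit)

ok = tost(diff=0.02, se=0.05, df=20, margin=0.2)
```

Equivalently, the rule accepts bioequivalence exactly when the 100(1 - 2α)% confidence interval for the difference lies inside (-margin, margin).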
- Published
- 2014
8. Miscellanea. Semiparametric two-sample tests in clinical trials with a post-randomisation response indicator
- Author
-
Edward L. Korn and Eric V. Slud
- Subjects
Statistics and Probability, Clinical trial, Applied Mathematics, General Mathematics, Statistics, Econometrics, Estimator, Observational study, Two sample, Statistics, Probability and Uncertainty, General Agricultural and Biological Sciences, Agricultural and Biological Sciences (miscellaneous), Mathematics - Abstract
In many clinical trials involving survival endpoints, one has additional data on some binary indicator of 'response', such as initial tumour shrinkage in cancer trials. This paper studies the case of randomised clinical trials where the response indicator is available shortly after randomisation, and where one can assume that, within each stratum defined by the response indicator, a two-treatment-group proportional-hazards model holds. The same model may also describe some incompletely randomised or observational studies. Asymptotic relative efficiencies for Kaplan-Meier-based estimators versus maximum partial likelihood estimators are examined under this model for estimating either the difference in survival probabilities at a specified time or the parameter estimated by the logrank numerator. It is shown that the efficiency gains using the model are more promising when estimating the difference in survival probabilities. An example is given comparing the long-term survival experience of two groups of patients with advanced Hodgkin's disease.
- Published
- 1997
- Full Text
- View/download PDF
9. Mixing for stationary processes with finite-order multiple Wiener-Itô integral representation
- Author
-
Daniel W. Chambers and Eric V. Slud
- Subjects
Degree (graph theory), Applied Mathematics, General Mathematics, Mathematical analysis, Zero (complex analysis), Process (computing), Term (logic), Order (group theory), Representation (mathematics), Gaussian process, Mixing (physics), Mathematics - Abstract
Necessary and sufficient analytical conditions are given for homogeneous multiple Wiener-Itô integral processes (MWIs) to be mixing, and sufficient conditions are given for mixing of general square-integrable Gaussian-subordinated processes. It is shown that every finite or infinite sum Y of MWIs (i.e. every real square-integrable stationary polynomial form in the variables of an underlying weakly mixing Gaussian process) is mixing if the process defined separately by each homogeneous-order term is mixing, and that this condition is necessary for a large class of Gaussian-subordinated processes. Moreover, for homogeneous MWIs Y, for sums of MWIs of order ≤ 3, and for a large class of square-integrable infinite sums Y of MWIs, mixing holds if and only if Y has correlation function decaying to zero for large lags. Several examples of the criteria for mixing are given, including a second-order homogeneous MWI, i.e. a degree-two polynomial form orthogonal to all linear forms, which has autocorrelations tending to zero for large lags but is not mixing.
- Published
- 1996
- Full Text
- View/download PDF
10. Inaccuracy rates and Hodges-Lehmann large deviation rates for parametric inferences with nuisance parameters
- Author
-
Eric V. Slud and Antonis Koutsoukos
- Subjects
Statistics and Probability, Score test, Hodges–Lehmann estimator, Applied Mathematics, Test score, Scalar (mathematics), Statistics, Nuisance parameter, Statistics, Probability and Uncertainty, Implicit function theorem, Probability measure, Parametric statistics, Mathematics - Abstract
In the context of parametric inference for a scalar parameter β in the presence of a finite-dimensional nuisance parameter λ based on a large random sample X 1 , …, X n , this paper calculates an exact one-sided inaccuracy rate for maximum-likelihood and M-estimators, as well as the Hodges-Lehmann (1956) large deviation rate for type-II error probabilities under fixed alternatives. The method is to couple the large-deviation theorems of Groeneboom et al. (1979) for empirical measures with a characterization via the Implicit Function Theorem of ‘least favorable measures’ extremizing the Kullback-Leibler information functional over statistically interesting sets of measures.
- Published
- 1995
- Full Text
- View/download PDF
11. Maximin efficiency-robust tests and some extensions
- Author
-
Sudip Bose and Eric V. Slud
- Subjects
Statistics and Probability, Asymptotic power, Applied Mathematics, Bayes test, Decision theory, Rank (computer programming), Statistics, Bayesian probability, Score, Statistics, Probability and Uncertainty, Minimax, Mathematics - Abstract
The Maximin Efficiency-Robust Test idea of Gastwirth (1966) was to maximize the minimum asymptotic power (for fixed size) versus special local families of alternatives over some specially chosen families of score statistics. This approach is reviewed from a general decision-theoretical perspective, including some Bayesian variants. For two-sample censored-data rank tests and stochastically ordered but not proportional-hazard alternatives, the MERT approach leads to customized weighted-logrank tests for which the weights depend on estimated random-censoring distributions. Examples include statistics which perform well against both Lehmann and logistic alternatives or against families of alternatives which include increasing, decreasing, and ‘bathtub-shaped’ hazards.
- Published
- 1995
- Full Text
- View/download PDF
12. Stability of stochastic integrals under change of filtration
- Author
-
Eric V. Slud
- Subjects
filtration, Statistics and Probability, Pure mathematics, Applied Mathematics, Mathematical analysis, Stability (probability), Stochastic integral, Quadratic variation, predictable projection, Probability space, Semimartingale, purely discontinuous, Modeling and Simulation, Bounded function, Filtration (mathematics), usual conditions, predictable process, special semimartingale, Mathematics - Abstract
Let (Ω, F, P) be a probability space equipped with two filtrations {F_t} and {G_t} satisfying the usual conditions. Assume that X is a semimartingale and that h is locally bounded and predictable for each of the two filtrations {F_t} and {G_t}. New examples of such processes are given. Utilizing and extending partial results of Zheng (1982), this paper extends the available results on the relationship between the stochastic integral processes ∫_0^t h_s dX_s taken respectively in the sense of {F_t} and of {G_t}. In particular, it is shown that these stochastic integrals differ at most by a continuous process with quadratic variation defined and equal to 0. If both stochastic integrals are {F_t ∩ G_t}-semimartingales, then it is proved that the stochastic integral ∫_0^t h_s dX_s taken in the {F_t} sense is indistinguishable from that taken in the {G_t} sense.
- Published
- 1994
- Full Text
- View/download PDF
13. On autocorrelation estimation in mixed-spectrum Gaussian processes
- Author
-
Eric V. Slud and Benjamin Kedem
- Subjects
Statistics and Probability, filter, Autocorrelation technique, Applied Mathematics, spectral atoms, Mathematical analysis, Autocorrelation, spectral measure, zero-crossing rate, Estimator, Maximum entropy spectral estimation, Wiener–Khinchin theorem, Spectral line, ergodic, Random measure, Modeling and Simulation, Wiener-Itô integrals, Statistical physics, Gaussian process, Mathematics - Abstract
Consistency issues related to autocorrelation estimation for Gaussian processes with mixed spectra are clarified. The sample autocorrelation is known not to be consistent when the spectrum contains spectral atoms. This fact is verified by computing explicitly its mean-square limit in terms of the random measure assigned to the atoms of the process. The alternative estimator constructed from the zero-crossing rate is likewise not consistent in general. However, it is consistent for a spectrum supported at a single frequency, or a spectrum for which the 'signal' and 'noise' have the exact same first-order autocorrelation. This is proved, using recent results from multiple Wiener-Itô integral expansions for level-crossing counts, by direct computation of the spectrum of the zero-crossing indicator process when the underlying process has a mixed spectrum. In general, regardless of the spectral type, the asymptotic zero-crossing rate admits values between the lowest and highest positive frequencies with probability one. For band-limited processes, this fact provides an easy way to assess the precision of functions of the zero-crossing rate.
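The zero-crossing-rate estimator discussed here rests on the classical cosine formula for stationary zero-mean Gaussian sequences: the expected zero-crossing rate D satisfies ρ(1) = cos(πD). A minimal sketch of the resulting estimator (assumes no exact zeros in the sample):

```python
import numpy as np

def zero_crossing_autocorr(x):
    """Estimate the lag-one autocorrelation of a zero-mean stationary
    Gaussian sequence from its zero-crossing rate D via the cosine
    formula rho(1) = cos(pi * D)."""
    s = np.sign(np.asarray(x, dtype=float))
    crossings = np.sum(s[1:] != s[:-1])   # number of sign changes
    D = crossings / (len(x) - 1)          # zero-crossing rate
    return np.cos(np.pi * D)

# A strictly alternating series crosses zero at every step: D = 1, rho = -1.
rho = zero_crossing_autocorr([1.0, -1.0, 1.0, -1.0, 1.0, -1.0])
```

As the abstract notes, this estimator is consistent only for special mixed-spectrum configurations; the sketch shows the mechanics, not a general-purpose tool.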
- Published
- 1994
- Full Text
- View/download PDF
14. Best precedence tests for censored data
- Author
-
Eric V. Slud
- Subjects
Statistics and Probability, Optimal test, Applied Mathematics, Rank (computer programming), Sample (statistics), Extension (predicate logic), Survival data, Fixed duration, Statistics, Econometrics, Statistics, Probability and Uncertainty, Kaplan–Meier estimator, Quantile, Mathematics - Abstract
The prevalence of survival analyses based on a fixed duration of time-on-test, together with the need for generally powerful two-sample censored-data rank tests against stochastically ordered but not proportional-hazard alternatives, is used to motivate an extension of 'precedence tests' (Nelson (1963), Lin and Sukhatme (1989)) to right-censored survival data. The idea is to compare the r-th Kaplan-Meier quantile from the first sample with the s-th Kaplan-Meier quantile from the second sample, where r and s are nearby values chosen to give size α and best power against local proportional-hazards alternatives.
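A precedence comparison of this kind needs Kaplan-Meier quantiles from each sample. A minimal sketch of the Kaplan-Meier survival estimate itself (distinct event times assumed; quantiles can then be read off where S(t) drops below the target level):

```python
import numpy as np

def kaplan_meier(times, events):
    """Kaplan-Meier estimate for right-censored data with distinct times;
    returns (event_time, S(t)) pairs at each observed event time."""
    order = np.argsort(times)
    times = np.asarray(times, dtype=float)[order]
    events = np.asarray(events)[order]          # 1 = event, 0 = censored
    n = len(times)
    surv, s = [], 1.0
    for i, (t, d) in enumerate(zip(times, events)):
        at_risk = n - i                         # subjects still at risk at t
        if d:
            s *= (at_risk - 1) / at_risk        # multiply in the step factor
            surv.append((t, s))
    return surv

km = kaplan_meier([2.0, 1.0, 3.0, 4.0], [1, 1, 0, 1])
```

With one censoring at t = 3, the curve steps to 3/4, 1/2, and finally 0, illustrating how censored subjects leave the risk set without forcing a step.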
- Published
- 1992
- Full Text
- View/download PDF
15. Multiple Wiener-Itô integral expansions for level-crossing-count functionals
- Author
-
Eric V. Slud
- Subjects
Statistics and Probability, Hermite polynomials, Mathematical finance, Mathematical analysis, Stochastic calculus, Recursion (computer science), Asymptotic distribution, Applied mathematics, Statistics, Probability and Uncertainty, Hypergeometric function, Gaussian process, Analysis, Mathematics, Central limit theorem - Abstract
This paper applies the stochastic calculus of multiple Wiener-Itô integral expansions to express the number of crossings of the mean level by a stationary (discrete- or continuous-time) Gaussian process within a fixed time interval [0, T]. The resulting expansions involve a class of hypergeometric functions, for which recursion and differential relations and some asymptotic properties are derived. The representation obtained for level-crossing counts is applied to prove a central limit theorem of Cuzick (1976) for level crossings in continuous time, using a general central limit theorem of Chambers and Slud (1989a) for processes expressed via multiple Wiener-Itô integral expansions in terms of a stationary Gaussian process. Analogous results are given also for discrete-time processes. This approach proves that the limiting variance is strictly positive, without additional assumptions needed by Cuzick.
- Published
- 1991
- Full Text
- View/download PDF
16. Relative efficiency of the log rank test within a multiplicative intensity model
- Author
-
Eric V. Slud
- Subjects
Statistics and Probability, Hazard (logic), Score test, Applied Mathematics, General Mathematics, Multiplicative function, Agricultural and Biological Sciences (miscellaneous), Intensity (physics), Log-rank test, Efficiency, Statistics, Covariate, Log-linear model, Statistics, Probability and Uncertainty, General Agricultural and Biological Sciences, Mathematics - Abstract
For large-sample clinical trials with independent individuals randomly allocated to two treatment groups, in which survival times follow a log-linear multiplicative intensity model with treatment group as one covariate, this paper calculates the asymptotic relative efficiency of the log rank test for treatment effect as compared with the optimal score test. The method is to exhibit the failure hazard intensity, not of proportional hazards form, obtained by ignoring all covariates other than treatment group. The efficiency formulae are illustrated in two examples, and estimation from data of the loss of efficiency is illustrated for two clinical trial datasets.
- Published
- 1991
- Full Text
- View/download PDF
17. Graphical Models: Methods for Data Analysis and Mining
- Author
-
Eric V Slud
- Subjects
Statistics and Probability, Computer science, Applied Mathematics, Modeling and Simulation, Graphical model, Data mining - Published
- 2003
- Full Text
- View/download PDF
18. Analysis of Factorial Survival Experiments
- Author
-
Eric V. Slud
- Subjects
Statistics and Probability, Factorial, General Immunology and Microbiology, Proportional hazards model, Applied Mathematics, General Medicine, Factorial experiment, Asymptotic theory (statistics), General Biochemistry, Genetics and Molecular Biology, Covariate, Statistics, Econometrics, Main effect, General Agricultural and Biological Sciences, Null hypothesis, Mathematics, Statistical hypothesis testing - Abstract
Several new methodological issues that arise within two-way factorial designs for survival experiments are discussed within the framework of asymptotic theory for the proportional hazards model with two binary treatment covariates. These issues include: the proper formulation of null hypotheses and alternatives, the choice among log-rank and adjusted or stratified log-rank statistics, the asymptotic correlation between test statistics for the separate main effects, the asymptotic power (under the various possible methods of analysis) of tests to detect main effects and interactions, the comparison of power to detect main effects within a 2 x 2 factorial design with power in a three-group trial where no patients are randomized simultaneously to both treatments, and the problems of analysis arising when accrual or exposure to one of the treatments is terminated early for ethical reasons.
- Published
- 1994
- Full Text
- View/download PDF
19. Consistency and efficiency of inferences with the partial likelihood
- Author
-
Eric V. Slud
- Subjects
Statistics and Probability, Score test, Consistency (statistics), Applied Mathematics, General Mathematics, Statistics, Statistics, Probability and Uncertainty, General Agricultural and Biological Sciences, Likelihood function, Agricultural and Biological Sciences (miscellaneous), Likelihood principle, Marginal likelihood, Mathematics - Published
- 1982
- Full Text
- View/download PDF
20. Moderate- and large-deviation probabilities in actuarial risk theory
- Author
-
Craig Hoesman and Eric V. Slud
- Subjects
Statistics and Probability, Actuarial science, Mathematical model, Applied Mathematics, Ruin theory, Superposition principle, Bounded function, Portfolio, Renewal theory, Project portfolio management, Constant (mathematics), Mathematics - Abstract
A general model for the actuarial risk-reserve process as a superposition of compound delayed-renewal processes is introduced and related to previous models which have been used in collective risk theory. It is observed that non-stationarity of the portfolio 'age-structure' within this model can have a significant impact upon probabilities of ruin. When the portfolio size is constant and the policy age-distribution is stationary, the moderate- and large-deviation probabilities of ruin are bounded and calculated using the strong approximation results of Csörgő et al. (1987a, b) and a large-deviation theorem of Groeneboom et al. (1979). One consequence is that for non-Poisson claim-arrivals, the large-deviation probabilities of ruin are noticeably affected by the decision to model many parallel policy lines in place of one line with correspondingly faster claim-arrivals. RISK-RESERVE PROCESS; COMPOUND DELAYED-RENEWAL PROCESS; SUPERPOSITION; STRONG APPROXIMATION
- Published
- 1989
- Full Text
- View/download PDF
21. Perturbations of random matrix products in a reducible case
- Author
-
Eric V. Slud and Yuri Kifer
- Subjects
Independent and identically distributed random variables, Pure mathematics, Applied Mathematics, General Mathematics, Matrix function, Symmetric matrix, Nonnegative matrix, Random matrix, Square matrix, Pascal matrix, Eigendecomposition of a matrix, Mathematics - Abstract
It is known that for any sequence X1, X2, … of identically distributed independent random matrices with a common distribution μ, the limit Λ(μ) = lim_{n→∞} n^{-1} log ‖X_n ⋯ X_1‖ exists with probability 1. We study some conditions under which Λ(μ_k) → Λ(μ) provided μ_k → μ in the weak sense.
- Published
- 1982
- Full Text
- View/download PDF
22. Multivariate dependent renewal processes
- Author
-
Eric V. Slud
- Subjects
Statistics and Probability, Multivariate statistics, Multivariate analysis, Mathematical model, Applied Mathematics, Markov process, Regression analysis, Bivariate analysis, Regression, Econometrics, Statistical inference, Applied mathematics, Mathematics - Abstract
A new class of reliability point-process models for dependent components is introduced. The dependence is expressed through a regression, following a form suggested by Cox (1972) for survival data analysis involving the current life-length of the components. After formulating the current-life process as a Markov process with stationary transitions and stating some general results on asymptotic behavior, we describe the stationary distributions in some bivariate examples. Finally, we discuss statistical inference for the new models, exhibiting and justifying full- and partial-likelihood methods for their analysis.
- Published
- 1984
- Full Text
- View/download PDF
23. Stability of Exponential Rate of Growth of Products of Random Matrices Under Local Random Perturbations
- Author
-
Eric V. Slud
- Subjects
Combinatorics, General Mathematics, Applied mathematics, Stability (probability), Random matrix, Mathematics, Exponential function, Rate of growth - Published
- 1986
- Full Text
- View/download PDF
24. Dependent competing risks and summary survival curves
- Author
-
Eric V. Slud and Larry Rubinstein
- Subjects
Statistics and Probability, Waiting time, Applied Mathematics, General Mathematics, Nonparametric statistics, Estimator, Conditional probability distribution, Competing risks, Agricultural and Biological Sciences (miscellaneous), Censoring (clinical trials), Statistics, Econometrics, Statistics, Probability and Uncertainty, Marginal distribution, General Agricultural and Biological Sciences, Survival analysis, Mathematics - Abstract
In many contexts where there is interest in inferring the marginal distribution of a survival time T subject to censoring embodied in a latent waiting time C, the times T and C may not be independent. This paper presents a new class of nonparametric assumptions on the conditional distribution of T given C and shows how they lead to consistent generalizations of the Kaplan & Meier (1958) survival curve estimator. The new survival curve estimators are used under weak assumptions to construct bounds on the marginal survival which can be much narrower than those of Peterson (1976). In stratified populations where T and C are independent only within strata, examples indicate that the Kaplan-Meier estimator is often approximately consistent.
- Published
- 1983
- Full Text
- View/download PDF
25. Generalization of an inequality of Birnbaum and Marshall, with applications to growth rates for submartingales
- Author
-
Eric V. Slud
- Subjects
Statistics and Probability, Inequality, Differential equation, Generalization, Stochastic process, Applied Mathematics, Mathematical analysis, predictable compensator, Doob–Meyer decomposition theorem, Stochastic differential equation, Lenglart Inequality, strong asymptotic growth rate, Modeling and Simulation, continuous-time submartingale, Applied mathematics, Predictable process, occupation times for Brownian Motion, Brownian motion, Mathematics - Abstract
The well-known submartingale maximal inequality of Birnbaum and Marshall (1961) is generalized to provide upper tail inequalities for suprema of processes which are products of a submartingale by a nonincreasing nonnegative predictable process. The new inequalities are proved by applying an inequality of Lenglart (1977), and are then used to provide best-possible universal growth-rates for a general submartingale in terms of the predictable compensator of its positive part. Applications of these growth rates include strong asymptotic upper bounds on solutions to certain stochastic differential equations, and strong asymptotic lower bounds on Brownian-motion occupation-times.
- Published
- 1987
- Full Text
- View/download PDF
26. Necessary conditions for nonlinear functionals of Gaussian processes to satisfy central limit theorems
- Author
-
Daniel W. Chambers and Eric V. Slud
- Subjects
Statistics and Probability, Pure mathematics, Applied Mathematics, Mathematical analysis, Stochastic integral, Nonlinear system, Modeling and Simulation, Martingale (probability theory), Gaussian process, Mathematics, Central limit theorem - Abstract
For X a stationary Gaussian process, let H(X) = L²(Ω, σ(X), P) denote the space of square-integrable functionals of X. Say that Y ∈ H(X) with EY = 0 satisfies the Central Limit Theorem (CLT) if its normalized partial sums converge in law to a normal limit. A family of martingales (Z_n(t), t ≥ 0) is exhibited, and martingale techniques and results are used to provide sufficient conditions on X and Y for the CLT. These conditions are then shown to be necessary for slightly more restrictive central limit behavior of Y. AMS Subject Classifications: 60F05, 60G10, 60G44. Keywords: multiple Wiener-Itô integral; stochastic integral with respect to square-integrable martingale; predictable variance process; martingale Functional Central Limit Theorem; band-weighted Central Limit Theorem.
- Published
- 1989
- Full Text
- View/download PDF
27. Clipped Gaussian processes are never M-step Markov
- Author
-
Eric V. Slud
- Subjects
Statistics and Probability, Numerical Analysis, Markov chain mixing time, Markov kernel, clipped process, Markov chain, Variable-order Markov model, Markov process, Markov model, m-step Markov sequence, Time reversibility, level-crossings process, Calculus, Applied mathematics, Markov property, Statistics, Probability and Uncertainty, Mathematics - Abstract
It is shown that the level-crossings process of zeroes and ones corresponding to a stationary but not independent Gaussian sequence can never be exactly (m-step) Markov, although its correlation-sequence can agree exactly with that of a Markov sequence.
- Full Text
- View/download PDF
28. Time Series Discrimination by Higher Order Crossings
- Author
-
Benjamin Kedem and Eric V. Slud
- Subjects
Statistics and Probability ,Series (mathematics) ,Binary number ,Quadratic form (statistics) ,clipped signal ,62M07 ,Combinatorics ,goodness of fit ,Goodness of fit ,Clipping (photography) ,Simple (abstract algebra) ,level crossings ,Stationary time series ,62M10 ,Applied mathematics ,Statistics, Probability and Uncertainty ,Degeneracy (mathematics) ,higher-order crossings ,62H30 ,Statistic ,discrimination ,Mathematics - Abstract
A new methodology is proposed for discrimination among stationary time series. The time series are transformed into binary arrays by clipping (retaining only the signs of) the $j$th difference series, $j = 0, 1, 2, \cdots$. The degeneracy of clipped $j$th differences is studied as $j$ becomes large. A new goodness of fit statistic is defined as a quadratic form in the counts of axis-crossings by each of the first $k$ differences of the series. Simulations and the degeneracy of high-order differences justify fixing $k$ no larger than 10 for many processes. Empirical simulated distributions (with $k = 9$) of the goodness of fit statistic suggest a gamma approximation for its tail probabilities. Illustrations are given of discrimination between simple models with the new statistic.
- Published
- 1982
- Full Text
- View/download PDF
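The clipping-and-crossing recipe in the abstract above can be sketched in a few lines; `higher_order_crossings`, the white-noise input, and the choice k = 5 are hypothetical illustrations, not the paper's code. Crossing counts of successive differences climb toward the degeneracy mentioned in the abstract, since differencing pushes the series toward high-frequency oscillation.

```python
import numpy as np

def higher_order_crossings(x, k):
    """Counts of zero (axis) crossings in the clipped j-th difference
    series of x, j = 0..k (hypothetical helper, not the paper's code)."""
    counts = []
    d = np.asarray(x, dtype=float)
    for _ in range(k + 1):
        signs = d >= 0                       # clip: keep only the signs
        counts.append(int(np.sum(signs[1:] != signs[:-1])))
        d = np.diff(d)                       # move to the next difference
    return counts

rng = np.random.default_rng(1)
white = rng.standard_normal(1000)
hoc = higher_order_crossings(white, k=5)
print(hoc)  # crossing counts rise with the difference order j
```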
29. How Dependent Causes of Death Can Make Risk Factors Appear Protective
- Author
-
David P. Byar and Eric V. Slud
- Subjects
Statistics and Probability ,General Immunology and Microbiology ,Applied Mathematics ,Statistics ,Covariate ,General Medicine ,General Agricultural and Biological Sciences ,Competing risks ,General Biochemistry, Genetics and Molecular Biology ,Mathematics - Abstract
It is shown, using the results of Slud and Rubinstein (1983, Biometrika 70, 643-649) in a specially constructed theoretical example, that competing latent failure times $T_i$ and $C_i$ and a two-level covariate $V_i$, if analyzed as though $T_i$ and $C_i$ are independent for each $V_i$ level, can lead to exactly the wrong conclusion about the ordering of $\Pr(T_i \ge t \mid V_i = 1)$ and $\Pr(T_i \ge t \mid V_i = 0)$ for every $t$. This phenomenon can never be excluded on purely statistical grounds using such data and should be considered when interpreting data analyses involving competing risks.
- Published
- 1988
- Full Text
- View/download PDF
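A minimal simulation sketch of the mechanism discussed above, assuming a Gaussian-copula dependence between the latent times (all distributions and parameters here are hypothetical, not the paper's construction): when the latent failure and censoring times are dependent, the Kaplan-Meier estimate computed as though they were independent need not agree with the true marginal survival.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 50_000

# Latent failure time T and censoring time C made dependent through a
# Gaussian copula (rho = 0.8); all distributions here are hypothetical.
z = rng.multivariate_normal([0.0, 0.0], [[1.0, 0.8], [0.8, 1.0]], size=n)
T = np.exp(z[:, 0])          # latent failure times
C = np.exp(z[:, 1] + 0.3)    # latent censoring times

time = np.minimum(T, C)      # what would actually be observed
event = T <= C               # failure indicator

def km_survival(time, event, t0):
    """Kaplan-Meier estimate of P(T > t0), which is only valid when
    censoring is independent -- by construction it is not here."""
    order = np.argsort(time)
    time, event = time[order], event[order]
    at_risk, s = len(time), 1.0
    for t, d in zip(time, event):
        if t > t0:
            break
        if d:
            s *= 1.0 - 1.0 / at_risk
        at_risk -= 1
    return s

naive = km_survival(time, event, 1.0)   # analysis assuming independence
truth = np.mean(T > 1.0)                # true marginal survival at t = 1
print(naive, truth)                     # the two need not agree
```

With positive dependence, subjects censored early tend to have small failure times too, so the at-risk set is enriched with long-lived subjects and the naive curve sits above the truth; other dependence structures can bias it the other way.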
30. A Comparison of Reflected Versus Test-Based Confidence Intervals for the Median Survival Time, Based on Censored Data
- Author
-
David P. Byar, Eric V. Slud, and Sylvan B. Green
- Subjects
Statistics and Probability ,General Immunology and Microbiology ,Applied Mathematics ,Nonparametric statistics ,General Medicine ,Time based ,Censoring (statistics) ,General Biochemistry, Genetics and Molecular Biology ,Confidence interval ,Statistics ,Econometrics ,Statistical analysis ,General Agricultural and Biological Sciences ,Median survival ,Survival analysis ,Mathematics - Abstract
The small-sample performance of some recently proposed nonparametric methods of constructing confidence intervals for the median survival time, based on randomly right-censored data, is compared with that of two new methods. Most of these methods are equivalent for large samples. All proposed intervals are either 'test-based' or 'reflected' intervals, in the sense defined in the paper. Coverage probabilities for the interval estimates were obtained by exact calculation for uncensored data, and by simulation for three life distributions and four censoring patterns. In the range of situations studied, 'test-based' methods often have less than nominal coverage, while the coverage of the new 'reflected' confidence intervals is closer to nominal (although somewhat conservative), and these intervals are easy to compute.
- Published
- 1984
- Full Text
- View/download PDF
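The 'test-based' side of the comparison can be sketched by inverting a Greenwood-variance normal test of S(t) = 1/2 along the Kaplan-Meier curve; the toy data, the particular inversion, and the 95% level are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np

def km_curve(time, event):
    """Kaplan-Meier survival values and Greenwood standard errors at
    the event times (continuous data, no ties assumed)."""
    order = np.argsort(time)
    time, event = np.asarray(time)[order], np.asarray(event)[order]
    at_risk, s, gw = len(time), 1.0, 0.0
    ts, surv, se = [], [], []
    for t, d in zip(time, event):
        if d:
            s *= 1.0 - 1.0 / at_risk
            gw += 1.0 / (at_risk * (at_risk - 1))
            ts.append(t)
            surv.append(s)
            se.append(s * np.sqrt(gw))
        at_risk -= 1
    return np.array(ts), np.array(surv), np.array(se)

# Toy right-censored sample: times with 1 = death, 0 = censored.
time  = [2, 3, 4, 5, 6, 7, 8, 10, 12, 15]
event = [1, 1, 0, 1, 1, 0, 1,  1,  1,  0]

ts, surv, se = km_curve(time, event)

# Median survival: first event time where the curve reaches 1/2 or less.
median = ts[surv <= 0.5][0]

# 'Test-based' 95% interval: event times where the normalized distance
# of the curve from 1/2 stays within +/- 1.96.
inside = np.abs(surv - 0.5) <= 1.96 * se
print(median, ts[inside].min(), ts[inside].max())
```

On this toy sample the estimated median is 8 and the test-based interval runs from 5 to 10; the 'reflected' intervals compared in the paper are built differently, by reflecting a confidence band around the estimated median.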