70 results
Search Results
2. The Characterizations for Exponential and Geometric Distributions.
- Author
-
Shanbhag, D. N.
- Subjects
- *
PROBABILITY theory , *DISTRIBUTION (Probability theory) , *GEOMETRIC modeling , *RANDOM variables , *STATISTICAL sampling , *STANDARD deviations , *STATISTICAL hypothesis testing , *STATISTICAL reliability - Abstract
The lack-of-memory property of the exponential distribution plays an important part in applied probability. Using this property, however, presupposes knowledge of the full probability distribution. In the present paper we give a characteristic property of the exponential distribution based on the means of conditional distributions. Considering a random variable T with finite mean and such that P(T > 0) > 0, and denoting by y a positive number such that P(T > y) > 0, we show that T has an exponential distribution if and only if the mean of the conditional distribution, given T > y, exceeds the mean of the unconditional distribution by the quantity y for all such y. A similar characterization is given for the geometric distribution. Since information concerning expected values is easily accessible, we expect these properties to be useful in dealing with practical problems. [ABSTRACT FROM AUTHOR]
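The characterization can be checked numerically: for exponential samples, the conditional mean given T > y should exceed the unconditional mean by y. A minimal simulation sketch, with an arbitrary rate chosen for illustration:

```python
import random

random.seed(0)
rate = 0.5                      # arbitrary rate for illustration
t = [random.expovariate(rate) for _ in range(200_000)]
mean_t = sum(t) / len(t)        # should be close to 1/rate = 2

for y in (0.5, 1.0, 2.0):
    tail = [v for v in t if v > y]
    cond_mean = sum(tail) / len(tail)
    # lack of memory: E[T | T > y] = E[T] + y for the exponential
    print(f"y={y}: conditional mean minus unconditional mean = {cond_mean - mean_t:.3f}")
```

Each printed difference should be close to the corresponding y; for a non-exponential distribution with finite mean the differences would drift away from y.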
- Published
- 1970
- Full Text
- View/download PDF
3. MOMENTS OF THE DISTRIBUTION OF SAMPLE SIZE IN A SPRT.
- Author
-
Ghosh, B. K.
- Subjects
- *
MOMENTS method (Statistics) , *STATISTICAL sampling , *ARITHMETIC , *DISTRIBUTION (Probability theory) , *DIFFERENTIABLE functions , *PROBABILITY theory , *APPROXIMATION theory , *EQUATIONS , *STATISTICS - Abstract
The article discusses the moments of the distribution of the sample size N in a sequential probability ratio test (SPRT). The present paper provides the variance and the third and fourth moments of N, with the details worked out for five common applications of the SPRT. The relation of the variance of N to the truncation of a SPRT is also discussed. A. Wald indicated in passing how one can obtain the moments of N, but the only published work in which the author has encountered a general expression for the variance of N gives an incorrect expression. The correct expressions can be obtained either from J. Wolfowitz's results or by differentiating Wald's fundamental identity twice. In many practical applications of the SPRT, the moments are differentiable functions of a real-valued parameter, and limiting expressions for the moments can then be determined by standard methods of mathematical analysis. For the third and fourth moments, however, the actual technique may involve an excessive amount of arithmetic.
- Published
- 1969
- Full Text
- View/download PDF
4. THE EXCEEDANCE TEST FOR TRUNCATION OF A SUPPLIER'S DATA.
- Author
-
Deely, J. J., Amos, D. E., and Steck, G. P.
- Subjects
- *
DISTRIBUTORS (Commerce) , *STATISTICAL sampling , *DATABASE management , *SUPPLIERS , *ELECTRONIC data processing , *SUPPLY chains , *SAMPLE size (Statistics) , *STATISTICS , *TESTING - Abstract
The purpose of this paper is to present an easily applied test useful in determining whether or not a supplier's data have been truncated. The proposed test has the following desirable properties: (i) it is the uniformly most powerful rank test, (ii) it is asymptotically uniformly most powerful, (iii) power computations can easily be made for arbitrary sample sizes, formulas for such computations being given in the paper. Although formulated in the context of verifying a supplier's data, the test can be applied to other situations in which false representation of data in the form of truncation is important. Such is the case, for example, in reliability demonstrations or legal suits involving physical measurements. [ABSTRACT FROM AUTHOR]
- Published
- 1969
- Full Text
- View/download PDF
5. COMPARISON OF FOUR RATIO-TYPE ESTIMATES UNDER A MODEL.
- Author
-
Rao, Poduri S. R. S.
- Subjects
- *
STATISTICAL sampling , *ERROR analysis in mathematics , *ESTIMATION theory , *MODULAR arithmetic , *RATIO & proportion - Abstract
In this paper four ratio-type estimates are compared: the simple estimate formed by the ratio of the sample means, the estimate obtained by M. H. Quenouille's method of bias reduction, the unbiased estimate of L. A. Goodman and H. O. Hartley, and the estimate proposed by H. O. Hartley. These estimates are denoted by t1, t2, t3 and t4 respectively. Their mean square errors are compared under J. Durbin's model; in practice the model parameter g has often been found to lie between 0 and 2. Expressions for the mean square errors (MSE's) of these estimates are obtained for general values of n, g and h, and they are compared for finite values of n when g = 0, 1 and 2. The method of obtaining exact expressions for the MSE's of these estimates is similar to that of J. N. K. Rao and J. T. Webster. After a considerable amount of simplification, expressions for the biases and MSE's are obtained. When g = 0, Durbin compared t1 with t2, and Rao compared t1 with t4.
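Quenouille's bias reduction (the basis of the estimate t2) can be sketched as follows. The linear population model and its parameters here are hypothetical, used only to generate data; the jackknife step itself is the standard leave-one-out construction:

```python
import random

random.seed(1)
n = 30
x = [random.uniform(1, 10) for _ in range(n)]
# hypothetical population: y roughly proportional to x (true ratio = 2)
y = [2.0 * xi + random.gauss(0, 1) for xi in x]

sx, sy = sum(x), sum(y)

# t1: simple ratio of the sample means (equivalently, of the sample totals)
t1 = sy / sx

# t2: Quenouille's jackknife applied to t1 to reduce its O(1/n) bias
loo = [(sy - y[i]) / (sx - x[i]) for i in range(n)]
t2 = n * t1 - (n - 1) * sum(loo) / n
```

Both t1 and t2 estimate the same ratio; the jackknife correction removes the leading bias term without requiring an explicit bias formula.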
- Published
- 1969
- Full Text
- View/download PDF
6. ASSOCIATION AND ESTIMATION IN CONTINGENCY TABLES.
- Author
-
Mosteller, Frederick
- Subjects
- *
CONTINGENCY tables , *DISTRIBUTION (Probability theory) , *MATHEMATICAL statistics , *SURVEYS , *PROBABILITY theory , *ESTIMATION theory , *STATISTICAL sampling - Abstract
The 1967 Committee on Publications, chaired by David L. Wallace, found that many American Statistical Association members desired more review and survey papers. These have been hard to come by, and so, as his last act before leaving office, the author decided to provide a short survey paper on some related ideas in a field where nearly all statisticians sometimes work: that of contingency tables. These ideas are largely available in the literature, yet they have not often been put together, though I. J. Good's monograph and Leo Goodman's many papers form good sources. The author's paper is not intended as a review of the literature, only as a survey of one set of ideas about estimation in the analysis of contingency tables. The author fears that the first act of most social scientists upon seeing a contingency table is to compute chi-square for it. Sometimes this process is enlightening, sometimes wasteful, but sometimes it does not go quite far enough. The author collected 500 samples of writing published about 1961.
- Published
- 1968
- Full Text
- View/download PDF
7. ORDER STATISTICS ESTIMATORS OF THE LOCATION OF THE CAUCHY DISTRIBUTION.
- Author
-
Barnett, V. D.
- Subjects
- *
ESTIMATION theory , *DISTRIBUTION (Probability theory) , *STATISTICS , *ARITHMETIC mean , *ORDER statistics , *VARIANCES , *STATISTICAL sampling , *MEDIAN (Mathematics) - Abstract
In a recent paper in this Journal, Rothenberg, Fisher and Tilanus [1] discuss a class of estimators of the location parameter of the Cauchy distribution, taking the form of the arithmetic average of a central subset of the sample order statistics. They show that the average of roughly the middle quarter of the ordered sample has minimum asymptotic variance within this class, and that asymptotically it eliminates about 36 per cent of the efficiency loss of the median (the most commonly used estimator) in comparison to the maximum likelihood estimator (m.l.e.). Of course both the m.l.e. and the best linear unbiased estimator based on the order statistics (BLUE) achieve full asymptotic efficiency in the Cramer-Rao sense, and there can be no dispute about the relative merits of the three estimators asymptotically, or about the inferiority of the median (with asymptotic efficiency 8/pi^2, about 0.81, compared with about 0.88 for the estimator of Rothenberg et al.). In any practical situation, however, we will be concerned with estimation from samples of finite size, and asymptotic properties will not necessarily give any guidance here. We are essentially concerned with two points in assessing the relative merits of estimators in small samples: their ease of application and their "small-sample efficiency", which is conveniently measured as the ratio of the Cramer-Rao lower bound to the variance of the estimator. In this paper various estimators of the location of the Cauchy distribution are compared in these two respects for samples of up to 20 observations. The small-sample properties of the m.l.e. have been extensively discussed elsewhere (Barnett [2]) and relevant results are summarized where necessary. The main purpose of the paper is to discuss general linear estimators based on the order statistics, and to assess their utility in the present context.
Since this paper was prepared a further interesting 'quick estimator', b [ABSTRACT FROM AUTHOR]
- Published
- 1966
- Full Text
- View/download PDF
8. ON ROBUST PROCEDURES.
- Author
-
Gastwirth, Joseph L.
- Subjects
- *
DENSITY functionals , *ROBUST statistics , *DISTRIBUTION (Probability theory) , *PITMAN'S measure of closeness , *STATISTICAL sampling , *STATISTICS , *PARAMETER estimation , *RANKING - Abstract
This paper discusses a procedure for finding robust estimators of the location parameter of symmetric unimodal distributions. The estimators are based on robust rank tests and the methods used are applicable to other one-parameter problems. To every density function there corresponds an asymptotically most powerful rank test (a.m.p.r.t.). For a set F of density functions the maximin rank test, R, maximizes the minimum limiting Pitman efficiency of R relative to the a.m.p.r.t. for each member of F. This maximin test, R, can be used to construct estimators according to the proposal of Hodges and Lehmann; it generates another estimator T in the following manner. If the test based on R is the a.m.p.r.t. for samples from a density function g(x - theta), then the estimator T will be the best linear unbiased estimate (b.l.u.e.) of the location parameter for samples from g(x). Unfortunately, the estimator T is not necessarily consistent for all members of F. A class of rank tests which generate linear combinations of a few order statistics is introduced and a simple estimator using the 33 1/3rd, 50th and 66 2/3rd percentiles is proposed. The relationship of the present paper to the work of Huber is discussed and it is shown that the b.l.u.e. corresponding to his least favorable distribution is the trimmed mean. [ABSTRACT FROM AUTHOR]
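The percentile-based estimator mentioned above can be sketched as below. The 0.3/0.4/0.3 weights and the crude order-statistic quantile rule are assumptions of this illustration, not details taken from the abstract:

```python
def gastwirth(sample):
    """Weighted average of the 33 1/3rd, 50th and 66 2/3rd percentiles.

    The 0.3 / 0.4 / 0.3 weights and the simple empirical quantile rule
    below are assumptions of this illustration.
    """
    s = sorted(sample)
    n = len(s)
    q = lambda f: s[min(int(f * n), n - 1)]   # crude empirical quantile
    return 0.3 * q(1 / 3) + 0.4 * q(1 / 2) + 0.3 * q(2 / 3)

# symmetric sample centred at 0, so the estimate should be near 0
print(gastwirth([i - 50 for i in range(101)]))
```

Because it uses only three order statistics, the estimator is trivial to compute by hand from a sorted sample, which is the point of such "quick" procedures.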
- Published
- 1966
- Full Text
- View/download PDF
9. A NOTE ON THE ESTIMATION OF THE LOCATION PARAMETER OF THE CAUCHY DISTRIBUTION.
- Author
-
Bloch, Daniel
- Subjects
- *
ASYMPTOTIC theory in estimation theory , *ESTIMATION theory , *CAUCHY integrals , *LOCATION analysis , *ASYMPTOTIC expansions , *DISTRIBUTION (Probability theory) , *STATISTICAL sampling , *VARIANCES , *STATISTICS - Abstract
Recently Professors T. J. Rothenberg, F. M. Fisher, and C. B. Tilanus published a paper proposing the class of trimmed means as estimators of the location parameter of the Cauchy distribution [5]. They showed that the asymptotic sampling variance of the estimators in this class is essentially minimized by using the middle 24% of the sample order statistics. The corresponding estimate has an asymptotic relative efficiency to the best estimator for complete samples (A.R.E.) of .87796 as compared to an A.R.E. of .81057 for the sample median. In this paper a few "quick estimators" are considered as estimators for the location parameter of the Cauchy Distribution. A "quick estimate" is a linear combination (a weighted average) of one or more order statistics. Our goal is to find a simple estimator, i.e. an estimator based on only a few order statistics, which has an A.R.E. of at least 90%. We found an estimator based on five order statistics which is considerably better than the optimum trimmed mean (using the middle 24% of the sample order statistics) and much better than the sample median. The A.R.E. of the optimum censored estimate with censored fractiles .38 and .62 is also found, and a comparison between the trimmed, censored, and proposed estimators is made. [ABSTRACT FROM AUTHOR]
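The optimum trimmed mean described above (averaging the middle 24% of the order statistics) is easy to simulate. In this sketch the true location, sample size and seed are arbitrary, and Cauchy variates are generated by inversion:

```python
import math
import random
import statistics

random.seed(2)
theta = 3.0   # true location, chosen arbitrarily

def middle_trimmed_mean(sample, keep=0.24):
    """Average the middle `keep` fraction of the order statistics."""
    s = sorted(sample)
    n = len(s)
    k = int(n * (1 - keep) / 2)   # number trimmed from each tail
    middle = s[k:n - k]
    return sum(middle) / len(middle)

# Cauchy variates by inversion of the CDF: theta + tan(pi * (U - 1/2))
sample = [theta + math.tan(math.pi * (random.random() - 0.5))
          for _ in range(10_000)]

est_trim = middle_trimmed_mean(sample)   # optimum trimmed mean
est_median = statistics.median(sample)   # the usual competitor
```

Note that the full sample mean would be useless here, since the Cauchy distribution has no mean; trimming away the tails is what makes the average well behaved.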
- Published
- 1966
- Full Text
- View/download PDF
10. SYSTEMATIC SAMPLING WITH UNEQUAL PROBABILITY AND WITHOUT REPLACEMENT.
- Author
-
Hartley, H. O.
- Subjects
- *
ESTIMATION theory , *STATISTICS , *PROBABILITY theory , *STATISTICAL sampling , *ANALYSIS of variance , *SAMPLE size (Statistics) - Abstract
Given a population of N units, it is required to draw a sample of n distinct units in such a way that the probability for the ith unit to be in the sample is proportional to its 'size' x_i. Of the alternative methods of achieving this we consider here only the so-called systematic method which, to the best of our knowledge, was first developed by W. G. Madow (1949): the units in the population are listed in a 'particular' order, their x_i accumulated, and a systematic selection of n elements from a 'random start' is then made on the accumulation. In a more recent paper (H. O. Hartley and J. N. K. Rao (1962)) an asymptotic estimation theory (for large N) associated with this procedure was developed for the case when the order of the listed units is random. In this paper we draw attention to certain properties of Madow's estimator: we utilize the fact that with systematic sampling the total number of different samples is N (rather than the binomial coefficient C(N, n) as with completely random sampling). This simplification in the definition of the variance of the estimator in repeated sampling enables us to identify the exact variance of Madow's estimator with a 'between sample mean square' in a special analysis of variance (see section 4) and compare it with the variance of the pps estimator in sampling with replacement as well as under other sampling procedures. We also develop two approximate methods of variance estimation (see section 5). We pay particular attention to the case when the units are listed in the order of their size. 
With this particular arrangement our method can be described as 'systematic with random start', and the gain in precision that we accomplish has, of course, analogues in systematic sampling with equal probabilities employing ratio estimators, in which there is a relation between the ratios r_i = y_i/x_i and the x_i. Compared with other methods, the present procedure combines the advantage of ease of systematic sample selection with the availability of exact variance formulas for any n and N. Moreover, it usually leads to a more efficient estimate. Its shortcoming resides in the fact that the estimation of the variance is based on certain assumptions. [ABSTRACT FROM AUTHOR]
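Madow's selection procedure can be sketched directly from the description above: accumulate the sizes, then take every (total/n)-th point from a random start. The unit sizes here are hypothetical; distinctness of the sampled units is guaranteed because every size is smaller than the sampling interval:

```python
import random

random.seed(3)
sizes = [5, 9, 2, 7, 4, 6, 8, 3, 1, 5]   # hypothetical unit sizes x_i
n = 3                                     # required sample size
total = sum(sizes)                        # 50
skip = total / n                          # systematic sampling interval

# accumulate the sizes, as in Madow's procedure
cum = []
running = 0.0
for xi in sizes:
    running += xi
    cum.append(running)

# systematic selection from a random start on the accumulation;
# each selection point falls in exactly one unit's cumulative interval
start = random.uniform(0, skip)
sample = []
for p in (start + j * skip for j in range(n)):
    for i, c in enumerate(cum):
        if p <= c:
            sample.append(i)
            break
```

Under this scheme the ith unit's inclusion probability is n * x_i / total, provided no x_i exceeds the interval `skip`; units larger than the interval would need special handling.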
- Published
- 1966
- Full Text
- View/download PDF
11. SAMUEL S. WILKS.
- Author
-
Stephan, Frederick F., Tukey, John W., Mosteller, Frederick, Mood, Alex M., Hansen, Morris H., Simon, Leslie E., and Dixon, W. J.
- Subjects
- *
STATISTICIANS , *MATHEMATICAL statistics , *RANDOM variables , *MATHEMATICIANS , *NONPARAMETRIC statistics , *STATISTICAL sampling , *STATISTICS - Abstract
The article presents information about Samuel S. Wilks, a great contributor to statistics. Wilks' behavior toward applications was peculiarly split: he encouraged his students to work on applications, not always an easy thing to do in a strongly theoretical mathematics department; he often told students about applications that he regarded as "neat" or "cute" or "clever"; the statistical colloquium he guided often had speakers on practical applications of mathematical statistics; and he himself wrote some practical papers in statistics. In the classroom, however, he rarely discussed applications. Repeatedly, practical problems explicitly influenced both his own publications and those of his students, and Wilks often used applications as motivation for the discussion of distribution theory, usually going well beyond the needs of the original problem. The papers to be discussed exhibit, incompletely and fragmentarily, a major influence on the work of the man and his students. Wilks watched the development of the statistical theory of order statistics closely; indeed, he wrote a masterful summary of the literature. The general area involves the study of the statistical properties of ordered measurements.
- Published
- 1965
- Full Text
- View/download PDF
12. ESTIMATION OF MULTIPLE CONTRASTS USING t-DISTRIBUTIONS.
- Author
-
Dunn, Olive Jean and Massey Jr, Frank J.
- Subjects
- *
TIME series analysis , *CHARACTERISTIC functions , *MATHEMATICAL statistics , *PROBABILITY theory , *CONFIDENCE intervals , *DISTRIBUTION (Probability theory) , *MATHEMATICAL models , *STATISTICAL sampling , *MULTIVARIATE analysis , *STATISTICS - Abstract
Various methods based on Student t variates have been suggested and used for obtaining simultaneous confidence intervals for several means, or for several contrasts among means. Determination of an overall confidence level for such intervals involves evaluating the probability mass of a multivariate t distribution over a hypercube centered at the origin, with sides paralleling the coordinate planes, or obtaining bounds for this probability mass. Since such distributions involve many nuisance parameters, an impossible number of tables would be necessary in order to make exact confidence intervals. In the virtual absence of tables, approximations and bounds become important. In this paper, an attempt has been made to investigate the adequacy of certain suggested approximations [2], [5], [8] by computing the exact distributions for some particular cases. These exact distributions have been compared with approximations. This paper is concerned with two-sided confidence intervals, rather than one-sided intervals. [ABSTRACT FROM AUTHOR]
- Published
- 1965
- Full Text
- View/download PDF
13. R.A. FISHER AND THE LAST FIFTY YEARS OF STATISTICAL METHODOLOGY.
- Author
-
Bartlett, M. S.
- Subjects
- *
STATISTICS , *ECONOMICS , *INTERVAL analysis , *GENETICS , *STATISTICAL sampling , *ESTIMATION theory - Abstract
In this article, the author discusses R. A. Fisher's contributions to statistics. It is well known that his work in genetics was of comparable status. It is largely represented by his book "The Genetical Theory of Natural Selection," though in his subsequent work his further association with ecological and experimental studies in evolutionary genetics, and his share in the development of studies of the human blood groups, might especially be recalled. Fisher's contributions to statistics began with a paper in 1912 advocating the method of maximum likelihood for fitting frequency curves, although the first paper of substance was his 1915 paper in the journal "Biometrika" on the sampling distribution of the correlation coefficient. Fisher tried to pose problems of analysis as the reduction and simplification of statistical data. He put forward his well-known concept of amount of information in estimation theory, such that information might be lost, but never gained, by analysis. This concept has been of great practical value, especially in large-sample theory.
- Published
- 1965
- Full Text
- View/download PDF
14. A HISTORY OF DISTRIBUTION SAMPLING PRIOR TO THE ERA OF THE COMPUTER AND ITS RELEVANCE TO SIMULATION.
- Author
-
Teichroew, Daniel
- Subjects
- *
SIMULATION methods & models , *PROBABILITY theory , *STATISTICAL sampling , *DISTRIBUTION (Probability theory) , *STATISTICS , *TIME series analysis , *DIGITAL computer simulation , *METHODOLOGY - Abstract
The use of simulation, as a technique for attacking difficult problems, has increased greatly with the availability of the digital computer. This is illustrated by the large number of references in Shubik's (1960) bibliography and in the large number of studies published since then. Simulation is essentially an extension of a technique known as empirical sampling, or distribution sampling, which has been used in the field of statistics for many years. The limitations of the technique, which are well known to statisticians, are apparently not as well known, or at least not as well recognized, by those using simulation today. The first part of this paper contains an historical survey of distribution sampling as used by statisticians. The material was originally prepared in 1953 and is reproduced here in slightly revised form to bring this history to the attention of present-day simulators, in order that its lessons can more readily be incorporated in the development of methodology today. The second part of this paper discusses the relevance of empirical sampling to the present-day state of the art of simulation. The technique of generating random numbers, developed for empirical sampling, can be applied directly to simulation. However, in other respects simulation is more difficult than empirical sampling, and here the theory of distribution sampling does not have much to offer. The difficulties are due to lack of independence among time series, non-stationarity of the time series, and the large number of parameters. [ABSTRACT FROM AUTHOR]
- Published
- 1965
- Full Text
- View/download PDF
15. Point Estimation and Risk Preferences.
- Author
-
Baron, David P.
- Subjects
- *
ESTIMATION theory , *MATHEMATICAL statistics , *DISTRIBUTION (Probability theory) , *STATISTICAL decision making , *MATHEMATICAL models , *STATISTICAL sampling , *PROBABILITY theory - Abstract
The decision-theoretic approach to point estimation involves the choice of an estimate to minimize the expected loss associated with the estimate. The purpose of this paper is to indicate the influence of risk aversion on point estimates for classes of payoff functions including the piecewise linear and quadratic payoff functions. For example, increased risk aversion results in a point estimate closer to zero for a quadratic payoff function and a lower estimate with a piecewise linear payoff function. [ABSTRACT FROM AUTHOR]
- Published
- 1973
- Full Text
- View/download PDF
16. A Conservative Confidence Interval for a Likelihood Ratio.
- Author
-
Mitchell, Ann F. S. and Payne, Clive D.
- Subjects
- *
ANALYSIS of variance , *GAUSSIAN distribution , *CONFIDENCE intervals , *PARAMETER estimation , *RATIO analysis , *SIMULATION methods & models , *STATISTICAL sampling , *RATIO measurement , *STATISTICS - Abstract
A method is described for assigning an observation to one of two normal populations with differing, unknown means and differing, unknown variances. The classification procedure rests on the likelihood ratio, which, for a given observation, is a function of four unknown parameters. Sample information is used to obtain a confidence region for these parameters. From this confidence region, a conservative confidence interval for the likelihood ratio is derived. Interpreting the likelihood ratio as a measure of the odds in favor of each population as the source of the observation, the interval can be used, in an obvious manner, for classification purposes. Simulation techniques are employed to examine the conservative nature of the interval. Finally, as an illustration of the method, the results are applied to the determination of authorship of the disputed Federalist papers. [ABSTRACT FROM AUTHOR]
- Published
- 1971
- Full Text
- View/download PDF
17. A CLASS OF SEQUENTIAL TESTS FOR AN EXPONENTIAL PARAMETER.
- Author
-
Hoel, D. G. and Mazumdar, M.
- Subjects
- *
STATISTICAL sampling , *BAYESIAN analysis , *POPULATION , *BERNOULLI numbers , *STATISTICS , *PROBLEM solving , *METHODOLOGY , *RESEARCH - Abstract
Weiss [5] and Freeman and Weiss [2] have shown how to construct sampling plans which at least approximately minimize the maximum expected sample size in the case of Bernoulli and Normal populations. In this paper we construct sequential procedures of this type for the testing of an exponential parameter. An invariance property of the stopping region is presented which simplifies the construction of the Bayes regions. Several sampling plans are given and some of their properties are obtained. [ABSTRACT FROM AUTHOR]
- Published
- 1969
- Full Text
- View/download PDF
18. RELATIVE COSTS OF COMPUTERIZED ERROR INSPECTION PLANS.
- Author
-
O'Reagan, Robert T.
- Subjects
- *
COST effectiveness , *DATA editing , *STATISTICS , *QUALITY control , *ELECTRONIC data processing , *PROBLEM solving , *STATISTICAL sampling , *COMPUTER software - Abstract
Data editing by a computer program can be regarded as a form of statistical quality control. In that perspective, costs and benefits of alternative programs can be compared to one another and, of course, to no control plan at all. This paper outlines one approach to cost-benefit comparison and utilizes a realistic example to suggest that computerized data editing is generally superior to clerical editing. It also warns against edit programs which provide for record rejections and off-line review. Most importantly, it urges that a systematic comparison of costs can assist significantly in edit planning. [ABSTRACT FROM AUTHOR]
- Published
- 1969
- Full Text
- View/download PDF
19. TESTING AND ESTIMATING RATIOS OF SCALE PARAMETERS.
- Author
-
Shorack, Galen R.
- Subjects
- *
ROBUST statistics , *PERMUTATIONS , *RANDOM variables , *STATISTICAL sampling , *TESTING , *MONTE Carlo method , *SAMPLE size (Statistics) , *NUMERICAL analysis , *APPROXIMATION theory - Abstract
Let X_1, ..., X_m and Y_1, ..., Y_n be independent random samples from populations having continuous d.f.'s psi((x - mu)/sigma) and psi((y - nu)/tau) respectively. The classical F-test of a hypothesis concerning theta = tau/sigma is known to be non-robust. This paper examines several robust alternative procedures and compares them on the basis of Pitman a.r.e. and Monte Carlo studies of power functions. An approximate permutation test [13] and a "jackknife" procedure [9] are found to be most satisfactory, while a class of "rank-like" tests [10] are found to be "useful inefficient statistics". [ABSTRACT FROM AUTHOR]
- Published
- 1969
- Full Text
- View/download PDF
20. THE EXACT SAMPLING DISTRIBUTION OF ORDINARY LEAST SQUARES AND TWO-STAGE LEAST SQUARES ESTIMATORS.
- Author
-
Sawa, Takamitsu
- Subjects
- *
STATISTICAL sampling , *STOCHASTIC processes , *DISTRIBUTION (Probability theory) , *STATISTICS , *LEAST squares , *STOCHASTIC analysis , *REGRESSION analysis , *DENSITY functionals - Abstract
This paper presents the exact sampling distributions of the ordinary and the two-stage least squares estimators of a structural parameter in a structural equation with two endogenous variables in a complete system of stochastic equations. The results show that the distributions of the two estimators are essentially similar to each other. It can also be seen that both distributions depend crucially upon the deviation of a regression coefficient of disturbance terms of two endogenous variables from a structural parameter, and that the first estimator possesses moments up to the order N-2, while the second possesses them up to the order K-1, where N is the sample size and K is the number of exogenous variables excluded from the equation to be estimated. The small sample properties of the estimators are investigated by numerical evaluations of the density functions. [ABSTRACT FROM AUTHOR]
- Published
- 1969
- Full Text
- View/download PDF
21. TABLES OF CRITICAL VALUES OF SOME RENYI TYPE STATISTICS FOR FINITE SAMPLE SIZES.
- Author
-
Birnbaum, Z. W. and Lientz, B. P.
- Subjects
- *
FINITE groups , *STATISTICAL sampling , *DISTRIBUTION (Probability theory) , *SAMPLE size (Statistics) , *STATISTICS , *RANDOM variables , *PROBABILITY theory - Abstract
Let X(1) <= X(2) <= ... <= X(n) be an ordered sample of a random variable X which has continuous probability distribution function F(x), and let F_n(x) be the corresponding empirical distribution function. Three statistics introduced by A. Renyi are considered. [Multiple line equation(s) cannot be represented in ASCII text.] The paper presents tables of exact probabilities for these statistics for finite sample sizes. The limiting distributions of these statistics as the sample size n tends to infinity are discussed, and sample sizes are indicated for which these limiting distributions can be used instead of the exact distributions. Numerical examples for the use of the tables are presented, as well as applications to testing hypotheses on life distributions and to one-sided estimation of probability distribution functions from censored data. [ABSTRACT FROM AUTHOR]
- Published
- 1969
- Full Text
- View/download PDF
22. THE COMPUTATION OF THE UNRESTRICTED AOQL WHEN DEFECTIVE MATERIAL IS REMOVED BUT NOT REPLACED.
- Author
-
Endres, Allen C.
- Subjects
- *
STATISTICAL sampling , *PRODUCT quality , *MANUFACTURING defects , *STATISTICS , *HYPOTHESIS , *PLANNING , *FRACTIONS - Abstract
The choice of a Dodge-type continuous sampling plan is usually based upon the requirement of a maximum limit on the expected fraction of accepted defective material (AOQL). This limit is contingent upon the process producing the material being in "a state of statistical control". White (1966) has provided a method by which a corresponding limit (UAOQL) can be obtained without the assumption that the process is in statistical control. White assumed that all defective items were replaced by good items. This paper presents a method for calculating UAOQL's when defective items are removed but not replaced by good units. [ABSTRACT FROM AUTHOR]
- Published
- 1969
- Full Text
- View/download PDF
23. PLANNING SOME TWO-FACTOR COMPARATIVE SURVEYS.
- Author
-
Booth, Gordon and Sedransk, J.
- Subjects
- *
DEMOGRAPHIC surveys , *SURVEYS , *STATISTICAL sampling , *MATHEMATICAL programming , *SAMPLE size (Statistics) , *ESTIMATION theory , *METHODOLOGY , *ALGORITHMS - Abstract
In this paper it is assumed that, using a sample survey, two factors are to be studied, comparisons between the "levels" of the factors are of greatest interest, and there is "interaction" between the factors. Attention is concentrated on situations in which only two levels of each factor are to be compared, but extensions to more complex surveys are discussed. Assuming independent sampling, optimal sample size allocations are obtained. Where these allocations require recourse to programming algorithms, approximate solutions are given. If independent sampling is not feasible, a double sampling procedure is suggested. To indicate how sub-sampling from the first phase sample is to be carried out, a sampling rule (possessing optimal conditional precision properties) is derived. Then, a procedure to determine the optimal first phase sample size is given. Finally, it is demonstrated that this double sampling procedure can be applied to estimation of the (finite) population mean when double sampling with stratification is used. [ABSTRACT FROM AUTHOR]
- Published
- 1969
- Full Text
- View/download PDF
24. THE UNRELATED QUESTION RANDOMIZED RESPONSE MODEL: THEORETICAL FRAMEWORK.
- Author
-
Greenberg, Barnard G., Abul-Ela, Abdel-latif A., Simmons, Walt R., and Horvitz, Daniel G.
- Subjects
- *
RANDOM variables , *STATISTICS , *DEMOGRAPHIC surveys , *STATISTICAL sampling , *MATHEMATICAL models , *PARAMETER estimation , *METHODOLOGY , *MULTILEVEL models , *MATHEMATICAL statistics - Abstract
This paper develops a theoretical framework for the unrelated question randomized response technique suggested by Walt R. Simmons. The statistical efficiency of this technique is compared with the Warner technique under situations of both truthful and untruthful responses. Methods of allocating the total sample to each of two subsamples required by the unrelated question approach are developed. Recommendations are made concerning choices of values for those parameters which can be assigned at the discretion of the investigator. [ABSTRACT FROM AUTHOR]
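The mechanics of the unrelated-question technique can be sketched for the simplest case, in which the "yes" rate of the innocuous question is known (the paper's two-subsample design handles the case where it is not). All parameter values below are illustrative:

```python
import random

random.seed(4)
p = 0.7        # probability the randomizing device selects the sensitive question
pi_true = 0.2  # true (unknown to the surveyor) proportion with the sensitive trait
pi_u = 0.5     # known "yes" rate for the unrelated, innocuous question
n = 100_000    # respondents

# each respondent privately draws which question to answer, then answers truthfully
yes = 0
for _ in range(n):
    if random.random() < p:
        yes += random.random() < pi_true   # sensitive question drawn
    else:
        yes += random.random() < pi_u      # unrelated question drawn
lam = yes / n                              # observed overall "yes" proportion

# moment estimator: lam = p*pi_S + (1 - p)*pi_u, solved for pi_S
pi_hat = (lam - (1 - p) * pi_u) / p
```

Because the interviewer never learns which question was answered, respondents have cover to reply truthfully, yet the aggregate "yes" rate still identifies the sensitive proportion.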
- Published
- 1969
- Full Text
- View/download PDF
25. SPECTRAL PROPERTIES OF NON-STATIONARY SYSTEMS OF LINEAR STOCHASTIC DIFFERENCE EQUATIONS.
- Author
-
Chow, Gregory C. and Levitan, Richard E.
- Subjects
- *
STATISTICAL sampling , *STOCHASTIC difference equations , *LINEAR differential equations , *STATIONARY processes , *DENSITY matrices , *MATHEMATICAL statistics , *STOCHASTIC processes , *MATRICES (Mathematics) , *VARIANCES - Abstract
A method is proposed to eliminate trends from a sample from a nonstationary system of linear stochastic difference equations. The autocovariance matrix and the spectral density matrix of the detrended component of the sample are derived. The latter matrix turns out to have the same form as the spectral density matrix for a stationary system when expressed in terms of the roots of the system. Since the parameters of the system are assumed known throughout the paper, the problem of statistical inference does not arise. [ABSTRACT FROM AUTHOR]
- Published
- 1969
- Full Text
- View/download PDF
26. SHORTER CONFIDENCE INTERVALS USING PRIOR OBSERVATIONS.
- Author
-
Deely, J. J. and Zimmer, W. J.
- Subjects
- *
MATHEMATICAL models , *CONFIDENCE intervals , *STATISTICAL hypothesis testing , *VARIANCES , *ESTIMATES , *STATISTICAL sampling - Abstract
The purpose of this paper is to make the reader aware of the applicability and advantage of a particular mathematical model. The application is typified by an example, and the advantage is shown via confidence intervals; that is, when the model is applicable, shorter confidence intervals are possible using it than if one ignores it. It is also shown that an improved estimate can be obtained through use of the model. Let f(y|mu, sigma) be a normal density with mean mu and variance sigma^2, and let g(mu|lambda, beta) be a normal density with mean lambda and variance beta^2. A sequence y_1, y_2, ..., y_{n+1} of independent observations from the mixture of f and g can be considered as follows: an unobservable mu_i is first drawn from g(mu|lambda, beta), and then the observable y_i is drawn from f(y|mu_i, sigma). Confidence intervals on mu_{n+1} are obtained which are based on the observations y_1, ..., y_{n+1} and which, for any n, are shorter than the standard interval based on y_{n+1} alone. Shorter intervals are obtained for two cases: (i) lambda unknown, sigma and beta known; (ii) only the ratio sigma/beta = c known. [ABSTRACT FROM AUTHOR]
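A minimal sketch of this compound sampling model, with a plug-in shrinkage point estimate of mu_{n+1} for case (i) (lambda unknown, sigma and beta known). The weight beta^2/(sigma^2 + beta^2) is the standard normal-normal posterior weight; the code illustrates the model and the improved point estimate, not the paper's exact interval construction.

```python
import random
import statistics

def shrinkage_estimate(ys, sigma, beta):
    """Plug-in (empirical Bayes) estimate of the latest mu: shrink the
    last observation toward the grand mean of all observations, with
    weight beta^2 / (sigma^2 + beta^2) on the data."""
    lam_hat = statistics.fmean(ys)          # estimate of lambda
    w = beta**2 / (sigma**2 + beta**2)
    return lam_hat + w * (ys[-1] - lam_hat)

# Simulate the two-stage draw: mu_i ~ g(mu|lambda, beta), y_i ~ f(y|mu_i, sigma).
rng = random.Random(7)
sigma, beta, lam = 1.0, 0.5, 10.0
mus = [rng.gauss(lam, beta) for _ in range(50)]
ys = [rng.gauss(m, sigma) for m in mus]
print(shrinkage_estimate(ys, sigma, beta))
```

In the limiting cases the estimate behaves as expected: with beta = 0 it reduces to the grand mean, and with sigma = 0 it reduces to the last observation alone.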
- Published
- 1969
- Full Text
- View/download PDF
27. CORRELATION COEFFICIENTS MEASURED ON THE SAME INDIVIDUALS.
- Author
-
Dunn, Olive Jean and Clark, Virginia
- Subjects
- *
STATISTICAL correlation , *GAUSSIAN distribution , *POPULATION , *MULTIVARIATE analysis , *Z transformation , *STATISTICAL sampling , *ASYMPTOTIC distribution , *MATHEMATICAL statistics - Abstract
When two correlation coefficients are calculated from a single sample, rather than from two samples, they are not statistically independent, and the usual methods for testing equality of the population correlation coefficients no longer apply. This paper considers the situation when the sample is from a multivariate normal distribution. Several possible large sample testing procedures are given, all based on Fisher's z-transformation. Power curves are given for each procedure and for seven values of the asymptotic correlation between the two sample correlation coefficients. [ABSTRACT FROM AUTHOR]
- Published
- 1969
- Full Text
- View/download PDF
28. MISSING OBSERVATIONS IN MULTIVARIATE STATISTICS--IV A NOTE ON SIMPLE LINEAR REGRESSION.
- Author
-
Afifi, A. A. and Elashoff, R. M.
- Subjects
- *
MULTIVARIATE analysis , *PARAMETER estimation , *REGRESSION analysis , *ASYMPTOTIC efficiencies , *STATISTICAL sampling , *ESTIMATION bias , *SAMPLE size (Statistics) , *ESTIMATION theory - Abstract
In this note we examine the bias and small-sample efficiency of certain estimators for the parameters of a linear regression function when some observations are missing. The estimators studied in this paper were suggested by the large-sample study reported in this issue of the Journal. We conclude that our asymptotically unbiased estimators of beta and mu_{y|x} have negligible bias in sample sizes as small as n = 20, and that our asymptotically unbiased estimator of sigma^2 may have an 8% bias when n = 20. The small-sample and asymptotic efficiencies of these estimators are nearly the same for n = 60; when n = 20 the difference between these two efficiencies depends on the correlation coefficient rho and the pattern of missing observations. [ABSTRACT FROM AUTHOR]
- Published
- 1969
- Full Text
- View/download PDF
29. EXACT THREE-ORDER-STATISTIC CONFIDENCE BOUNDS ON RELIABLE LIFE FOR A WEIBULL MODEL WITH PROGRESSIVE CENSORING.
- Author
-
Mann, Nancy R.
- Subjects
- *
WEIBULL distribution , *CONFIDENCE intervals , *ORDER statistics , *STATISTICAL sampling , *DISTRIBUTION (Probability theory) , *FAILURE time data analysis , *MATHEMATICAL statistics - Abstract
A progressive-censoring model arises from a life test of a sample of items in which one or more of the survivors may be removed from the test at the time of any failure. Such a model is often more realistic for actual failure data which must be analyzed by a statistician than one in which all survivors are assumed to be removed from test simultaneously. This paper deals with the situation in which the underlying failure-time distribution for the population sampled is the two-parameter Weibull distribution. The reliable life for the population is defined to be the 100(1-R) percent point of the failure-time distribution, where R is a specified population survival proportion, or reliability. An exact confidence bound on reliable life based on three observed ordered failure times is derived for this progressive-censoring model. The criterion used for selecting the order numbers of the three failure times upon which the bound is based depends upon computed values of the power function of the test associated with the bound. A table from which lower bounds can be obtained is given for R equal to .95, confidence level .90, sample size equal to 2, 3, ..., 6, and all possible censorings. [ABSTRACT FROM AUTHOR]
- Published
- 1969
- Full Text
- View/download PDF
30. SMALL-SAMPLE PROPERTIES OF SEVERAL TWO-STAGE REGRESSION METHODS IN THE CONTEXT OF AUTO-CORRELATED ERRORS.
- Author
-
Rao, Potluri and Griliches, Zvi
- Subjects
- *
REGRESSION analysis , *STATISTICAL sampling , *PARAMETER estimation , *ERRORS , *STATISTICAL correlation , *AUTOCORRELATION (Statistics) , *MONTE Carlo method , *MATHEMATICAL models , *STOCHASTIC processes - Abstract
In a linear regression model, when errors are autocorrelated, several asymptotically efficient estimators of parameters have been suggested in the literature. In this paper we study their small-sample efficiency using Monte Carlo methods. While none of these estimators turns out to be distinctly superior to the others over the entire range of parameters, there is a definite gain in efficiency from using some two-stage procedure in the presence of moderately high levels of serial correlation in the residuals, and very little loss from using such methods when the true rho is small. Where computational costs are a consideration, a mixed strategy of switching to a second stage only if the estimated rho exceeds some critical value is suggested and is shown to perform quite well over the whole parameter range. [ABSTRACT FROM AUTHOR]
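One common two-stage procedure of this kind is a Cochrane-Orcutt style estimator: fit OLS, estimate rho from the residuals, then refit OLS on quasi-differenced data. The sketch below is illustrative of that general technique, not a reproduction of the specific estimators compared in the paper; all simulation parameters are hypothetical.

```python
import random

def ols(x, y):
    """Simple-regression OLS; returns (intercept, slope)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return my - b * mx, b

def two_stage(x, y):
    """Stage 1: OLS and residual-based estimate of rho.
       Stage 2: OLS on the quasi-differenced (y_t - rho*y_{t-1}) data."""
    a, b = ols(x, y)
    e = [yi - a - b * xi for xi, yi in zip(x, y)]
    rho = (sum(e[t] * e[t - 1] for t in range(1, len(e)))
           / sum(ei ** 2 for ei in e[:-1]))
    xs = [x[t] - rho * x[t - 1] for t in range(1, len(x))]
    ys = [y[t] - rho * y[t - 1] for t in range(1, len(y))]
    _, b2 = ols(xs, ys)
    return b2, rho

# Simulate y_t = 1 + 2*x_t + u_t with AR(1) errors, u_t = 0.7*u_{t-1} + e_t.
rng = random.Random(2)
true_b, true_rho = 2.0, 0.7
x = [rng.gauss(0, 1) for _ in range(400)]
u, y = 0.0, []
for t in range(400):
    u = true_rho * u + rng.gauss(0, 1)
    y.append(1.0 + true_b * x[t] + u)
b_hat, rho_hat = two_stage(x, y)
print(round(b_hat, 2), round(rho_hat, 2))
```

The "mixed strategy" of the abstract would simply skip the second stage whenever the estimated rho falls below a chosen critical value.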
- Published
- 1969
- Full Text
- View/download PDF
31. PLAY THE WINNER RULE AND THE CONTROLLED CLINICAL TRIAL.
- Author
-
Zelen, M.
- Subjects
- *
CLINICAL trials , *STATISTICAL sampling , *THERAPEUTICS , *SAMPLE size (Statistics) , *PATIENTS , *STATISTICS - Abstract
Consider a clinical trial to compare two treatments where response is dichotomous and patients enter the trial sequentially. This paper investigates the conduct of such a trial where the "Play the Winner Rule" (PWR) is used to assign patients to the different therapies. The implementation of the PWR in a clinical trial tends to place more patients on the better treatment. Both theoretical and numerical investigations show that over a wide range of situations this rule leads to near optimum results when used in a two-stage manner. Furthermore, these results are insensitive to optimum sample size requirements. [ABSTRACT FROM AUTHOR]
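The rule itself is simple to state and simulate: keep assigning the current treatment after each success, switch treatments after each failure. A hypothetical sketch (the success probabilities and patient count are illustrative, not from the paper):

```python
import random

def play_the_winner(p_success, n_patients, rng):
    """Sequential assignment under the Play the Winner Rule: repeat
    the current treatment after a success, switch after a failure.
    Returns the number of patients assigned to each of the two arms."""
    counts = [0, 0]
    arm = rng.randrange(2)              # random first assignment
    for _ in range(n_patients):
        counts[arm] += 1
        if rng.random() >= p_success[arm]:
            arm = 1 - arm               # failure: switch treatments
    return counts

rng = random.Random(3)
counts = play_the_winner([0.8, 0.4], 10_000, rng)
print(counts)
```

Because the rule forms a two-state Markov chain with switch probabilities equal to the failure probabilities, the long-run share of patients on the better arm here is 0.6/(0.2 + 0.6) = 0.75, illustrating how the rule places more patients on the better treatment.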
- Published
- 1969
- Full Text
- View/download PDF
32. ORDER STATISTICS FOR DISCRETE POPULATIONS AND FOR GROUPED SAMPLES.
- Author
-
David, H. A. and Mishriky, R. S.
- Subjects
- *
ORDER statistics , *DISTRIBUTION (Probability theory) , *PARAMETER estimation , *STATISTICAL sampling , *ANALYSIS of variance , *PROBABILITY theory - Abstract
The aim of this paper is two-fold: (1) To give a unified treatment of the theory of order statistics when the parent distribution is not necessarily continuous. (2) To assess the effects of grouping on the distribution of order statistics and to indicate the convenience, under suitable conditions, of using order statistics for the estimation of parameters from grouped data with or without censoring. [ABSTRACT FROM AUTHOR]
- Published
- 1968
- Full Text
- View/download PDF
33. THE USE OF A STRATIFICATION VARIABLE IN ESTIMATION BY PROPORTIONAL STRATIFIED SAMPLING.
- Author
-
Särndal, Carl-Erik
- Subjects
- *
STATISTICAL sampling , *ESTIMATION theory , *MATHEMATICAL variables , *GAUSSIAN distribution , *EQUATIONS , *DISTRIBUTION (Probability theory) , *MATHEMATICAL statistics - Abstract
This paper deals with proportional stratified sampling in the situation where the estimation variable X is difficult and expensive to observe, while the possibly erroneous stratification variable Y is easy and inexpensive to get at. The usually biased estimate mu_a [equation not reproducible in ASCII text] is compared with the unbiased estimate mu_b [equation not reproducible in ASCII text], where the P_I are stratum weights and y_I and x_I are means of the units sampled from the Ith stratum. The two estimates are similar in that they utilize information from only those population units that make up the sample. While mu_a is the more inexpensive estimate, mu_b is usually preferable if one judges by the size of the mean square error, which, among other things, depends on the number of strata and the location of the stratum boundaries. In particular, the properties of mu_a and mu_b are discussed in connection with an example involving the bivariate normal distribution. [ABSTRACT FROM AUTHOR]
- Published
- 1968
- Full Text
- View/download PDF
34. OPTIMAL ALLOCATION IN STRATIFIED AND MULTISTAGE SAMPLES USING PRIOR INFORMATION.
- Author
-
Ericson, W. A.
- Subjects
- *
GAUSSIAN distribution , *ALGORITHMS , *BUDGET , *OVERHEAD costs , *STATISTICAL sampling , *DISTRIBUTION (Probability theory) - Abstract
The author [1], [2] has given an algorithm for finding that stratified allocation which minimizes the posterior variance of the overall population mean subject to a budget constraint under a model in which a normal prior distribution and independent normal sampling distributions were assumed. The budget constraint assumed a variable per unit cost of observation. In the present paper these results are extended to cover the case where there are fixed costs, as well as variable costs, associated with sampling in the ith stratum. The resulting algorithm is noted to be applicable in finding the optimal allocation of sampling effort (with fixed and variable sampling costs) under a variety of distributional assumptions. An interpretation is also given to two and higher stage design questions when there is differential prior information regarding the first stage units. [ABSTRACT FROM AUTHOR]
- Published
- 1968
- Full Text
- View/download PDF
35. ACCURACY OF AN APPROXIMATION TO THE POWER OF THE CHI-SQUARE GOODNESS OF FIT TEST WITH SMALL BUT EQUAL EXPECTED FREQUENCIES.
- Author
-
Slakter, Malcolm J.
- Subjects
- *
ESTIMATION theory , *SAMPLE size (Statistics) , *MONTE Carlo method , *STATISTICAL sampling , *APPROXIMATION theory , *MATHEMATICAL models - Abstract
This paper presents the results of a Monte Carlo study of the accuracy of an approximation to the power of the chi-square goodness of fit test with small but equal expected frequencies. Various combinations of sample size, number of groups, and alpha level are considered, and in most instances the actual power of the test is estimated to be less than the nominal power. The degree of accuracy appears to be more related to the size of the sample than to the size of the expected frequencies. The following rule of thumb is offered for obtaining crude estimates of the actual power from the nominal power for sample sizes from 10 to 50: The actual power of the test equals about eight-tenths of the nominal power. [ABSTRACT FROM AUTHOR]
- Published
- 1968
- Full Text
- View/download PDF
36. ANALYSIS OF EXTREME-VALUE DATA BY SAMPLE QUANTILES FOR VERY LARGE SAMPLES.
- Author
-
Hassanein, Khatab M.
- Subjects
- *
NONPARAMETRIC statistics , *MATHEMATICAL statistics , *STATISTICAL sampling , *SAMPLE size (Statistics) , *ESTIMATION theory , *DISTRIBUTION (Probability theory) - Abstract
This paper deals with estimating the location and the scale parameters of the extreme value distribution when the sample size is very large, using sample quantiles. An estimator is given for the location parameter when the scale parameter is known, based on one or more (up to 15) order statistics. Also given is an estimator for the scale parameter when the location parameter is known, built on two order statistics. An iterative procedure is utilized to estimate the parameters when both are unknown, using two order statistics. The problem of estimating the percentage point is also dealt with, and a comparison with Lieblein's method is included. [ABSTRACT FROM AUTHOR]
- Published
- 1968
- Full Text
- View/download PDF
37. ESTIMATION OF THE LARGER OF THE TWO NORMAL MEANS.
- Author
-
Blumenthal, Saul and Cohen, Arthur
- Subjects
- *
ESTIMATION theory , *STOCHASTIC processes , *GAUSSIAN distribution , *STATISTICAL sampling , *MATHEMATICAL statistics , *PROBABILITY theory - Abstract
Let X_{i1}, X_{i2}, ..., X_{in}, i = 1, 2, be a pair of random samples from populations which are normally distributed with means theta_i and common known variance tau^2. The problem is to estimate the function psi(theta_1, theta_2) = max(theta_1, theta_2). In this paper we consider five different estimators (or sets of estimators) for psi(theta_1, theta_2) and evaluate their biases and mean square errors. The estimators are (i) psi(X_1, X_2), where X_i is the sample mean of the ith sample; (ii) the analogue of the Pitman estimator, i.e., the a posteriori expected value of psi(theta_1, theta_2) when the generalized prior distribution is the uniform distribution on two-dimensional space; (iii) a class of estimators which are generalized Bayes with respect to generalized priors which are products of uniform and normal priors; (iv) hybrid estimators, i.e., those which estimate by (X_1 + X_2)/2 when |X_1 - X_2| is small and by psi(X_1, X_2) when |X_1 - X_2| is large; (v) the maximum likelihood estimator. The biases and mean square errors for these estimators are tabled, graphed, and compared. The invariance properties of these estimators are also discussed, with a rationale for restricting attention to invariant estimators. [ABSTRACT FROM AUTHOR]
- Published
- 1968
- Full Text
- View/download PDF
38. THE FIRST-MEDIAN TEST: A TWO-SIDED VERSION OF THE CONTROL MEDIAN TEST.
- Author
-
Gastwirth, J. L.
- Subjects
- *
DISTRIBUTION (Probability theory) , *HYPOTHESIS , *STATISTICAL sampling , *MEDIAN (Mathematics) , *STATISTICS , *TESTING - Abstract
Consider two independent samples (called x's and y's) of mutually independent observations from populations with c.d.f.'s F(x) and G(x) respectively. Let v be the number of x's which are smaller than the median of the y-sample, and let u be the number of y's which are less than the median of the x-sample. If the x-sample is regarded as the control group, then the control median test proposed by Kimball et al. [9] rejects the hypothesis that F(x) is equivalent to G(x) if u is small. The present paper discusses a symmetrized version of this test, the first-median test, which is based on u if the median of the x-sample precedes the median of the y-sample, and on v otherwise. The asymptotic distribution theory of both tests is developed. The tests are useful in analyzing life-trial data because they permit the experimenter to reach a decision early. In the life-trial situation it is important to minimize the expected number of observations required to reach a decision. It is shown that in large samples, when curtailed sampling is utilized, these procedures reach a decision before the standard median test [12, 13]. [ABSTRACT FROM AUTHOR]
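The counting statistics u and v are straightforward to compute directly; a minimal sketch (the sample values are illustrative, not data from the paper):

```python
import statistics

def control_median_counts(xs, ys):
    """u = number of y's below the median of the x-sample;
       v = number of x's below the median of the y-sample."""
    mx = statistics.median(xs)
    my = statistics.median(ys)
    u = sum(1 for y in ys if y < mx)
    v = sum(1 for x in xs if x < my)
    return u, v

xs = [1.2, 2.5, 3.1, 4.0, 5.6]   # control group
ys = [2.0, 2.2, 2.9, 3.3, 3.8]
print(control_median_counts(xs, ys))   # → (3, 2)
```

The first-median test would then base its decision on u or v according to which sample's median is observed first.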
- Published
- 1968
- Full Text
- View/download PDF
39. ON A GENERAL SYSTEM OF DISTRIBUTIONS I. ITS CURVE-SHAPE CHARACTERISTICS II. THE SAMPLE MEDIAN.
- Author
-
Burr, Irving W. and Cislak, Peter J.
- Subjects
- *
BETA functions , *DISTRIBUTION (Probability theory) , *CURVE fitting , *TRANSCENDENTAL functions , *LINEAR systems , *SPECIAL functions , *MEDIAN (Mathematics) , *EQUATIONS , *STATISTICAL sampling - Abstract
This paper presents the region of coverage of the curve-shape characteristics alpha_3^2 and delta = (2 alpha_4 - 3 alpha_3^2 - 6)/(alpha_4 + 3) for a certain general system of distributions, Burr (1942). These curve-shape characteristics were chosen for comparison with the Pearson system because of the simplicity with which they map the members of the latter system, Craig (1936). It is shown here that the present system covers almost all of the regions of the main Pearson Types IV and VI, and an important part of that of the main Type I (or beta distribution). The density function of the median for odd-sized samples from the present system is given in closed form. All finite moments of the median are linear combinations of beta functions. Important characteristics of the median (bias, [not reproducible in ASCII text], and efficiency relative to the sample mean) are given for samples of n = 3, 5, 7 and 11, for populations with alpha_{3:x} = [value not reproducible], .50, 1.00, 1.50, and, corresponding to each alpha_{3:x}, two well-separated alpha_{4:x} values. It also appears that for this system the median begins to be more efficient than the mean at about the degree of non-normality of the exponential distribution. [ABSTRACT FROM AUTHOR]
- Published
- 1968
- Full Text
- View/download PDF
40. AN INVESTIGATION INTO THE SMALL SAMPLE PROPERTIES OF A TWO SAMPLE TEST OF LEHMANN'S.
- Author
-
Afifi, A. A., Elashoff, R. M., and Langley, P. G.
- Subjects
- *
ASYMPTOTIC distribution , *STATISTICAL sampling , *GAUSSIAN distribution , *SAMPLE size (Statistics) , *DISTRIBUTION (Probability theory) , *APPROXIMATION theory , *MATHEMATICAL statistics , *STATISTICS - Abstract
In this paper we examine how well the asymptotic null distribution of a two-sample test due to Lehmann approximates the small-sample distribution of the test, compare the validity of the Lehmann test with that of the two-sample t test under the null hypothesis of equal means, and compare the power of the Lehmann test with the power of the t test. Our general conclusion is that experimenters will prefer to use the t test when the underlying distribution is the scale-contaminated compound normal distribution and the sample sizes are less than thirty. [ABSTRACT FROM AUTHOR]
- Published
- 1968
- Full Text
- View/download PDF
41. AN ALGORITHM FOR THE DETERMINATION OF THE ECONOMIC DESIGN OF X-CHARTS BASED ON DUNCAN'S MODEL.
- Author
-
Goel, A. L., Jain, S. C., and Wu, S. M.
- Subjects
- *
ALGORITHMS , *GRAPHIC methods , *MATHEMATICAL variables , *STATISTICAL sampling , *SAMPLE size (Statistics) , *MATHEMATICAL statistics , *MATHEMATICAL models - Abstract
An algorithm for the determination of the economic design of X-charts based on Duncan's model is described in this paper. This algorithm consists of solving an implicit equation in design variables n (sample size) and k (control limit factor) and an explicit equation for h (sampling interval). The use of this algorithm not only yields the exact optimum but also provides valuable information so that the sensitivity of the optimum loss-cost (L*) can be evaluated. Loss-cost contours are used to discuss the nature of the loss-cost surface and the effect of the design variables. The effect of two parameters, the delay factor (e), and the average time for an assignable cause to occur (1/lambda), on the optimum design is evaluated. Numerical examples are used for illustrations. [ABSTRACT FROM AUTHOR]
- Published
- 1968
- Full Text
- View/download PDF
42. ESTIMATES IN SUCCESSIVE SAMPLING USING A MULTI-STAGE DESIGN.
- Author
-
Singh, S.
- Subjects
- *
STATISTICAL sampling , *EXPERIMENTAL design , *STATISTICS , *ESTIMATION theory , *MATHEMATICAL statistics , *SAMPLE variance - Abstract
In this paper, successive sampling procedures using a multi-stage sampling design are developed. It is found that it is generally advantageous to retain a fraction of the sample selected on previous occasions for improving the estimate of a mean on the most recent occasion. This type of partial replacement may sometimes also be recommended for estimating a mean over all occasions, especially if the experimenter is interested not only in obtaining an overall estimate for the entire period but also in separate estimates for each occasion. In a sampling enquiry repeated on three occasions, it is observed that for estimating the mean on the third occasion it is preferable to repeat the same sample fraction from one occasion to the next, while for estimating the mean over all occasions the sample fraction repeated on the second occasion should not be repeated on the third; in its place, a sub-sample from the sample selected afresh on the second occasion should be retained. [ABSTRACT FROM AUTHOR]
- Published
- 1968
- Full Text
- View/download PDF
43. SOME NONRESPONSE SAMPLING THEORY WHEN THE FRAME CONTAINS AN UNKNOWN AMOUNT OF DUPLICATION.
- Author
-
Rao, J. N. K.
- Subjects
- *
STATISTICAL sampling , *STATISTICS , *THEORY , *SURVEYS , *BEEF cattle , *BUSINESS partnerships - Abstract
Hanson and Hurwitz [3] have developed some non-response sampling theory, using the double sampling method. In this paper, this theory is extended to the case where there is an unknown amount of duplication in the available frame. The theory is developed in the context of a sample survey of beef cattle producers. The available frame was a list of names and addresses of beef cattle producers, whereas the unit of interest was a beef cattle producing operation which could be operated individually or in partnership. [ABSTRACT FROM AUTHOR]
- Published
- 1968
- Full Text
- View/download PDF
44. ON SOME MULTISAMPLE PERMUTATION TESTS BASED ON A CLASS OF U-STATISTICS.
- Author
-
Sen, Pranab Kumar
- Subjects
- *
U-statistics , *DISTRIBUTION (Probability theory) , *STATISTICAL sampling , *STATISTICS , *SAMPLE size (Statistics) , *COST analysis , *VARIANCES - Abstract
A class of permutation tests based on appropriate U-statistics is proposed for the multisample location, scale, and association problems. These U-statistics need not be functions solely of the ranks of the different sample observations and for the application of the tests the parent distributions need not be continuous. The proposed tests are based on the existence of some equally likely permutations of the sample observations and are valid for small as well as large sample sizes. Further, the underlying permutation principle yields optimum estimators of the variances and covariances of U-statistics entering into the expressions for the test statistics. The large sample test procedure is thereby simplified. This paper consists partly of expository material and partly of extensions of some allied two-sample results, considered earlier by the author (Sen [28]). [ABSTRACT FROM AUTHOR]
- Published
- 1967
- Full Text
- View/download PDF
45. DESIGNING SOME MULTI-FACTOR ANALYTICAL STUDIES.
- Author
-
Sedransk, J.
- Subjects
- *
STATISTICAL sampling , *COST , *SURVEYS , *POPULATION , *MATHEMATICAL optimization , *MATHEMATICAL analysis - Abstract
The designing of some multi-factor "analytical" studies of survey data is considered in this paper. It is assumed that, for each factor, there are two categories of interest and these are to be compared. The main objective is to allocate the sample so that the desired precision for the specified contrasts is obtained at minimum cost. It is assumed that one may sample independently in each of the sub-populations under investigation. A model is employed to facilitate designing and analyzing such a survey. We discuss the relevance of this model for "analytical studies," and a few examples are presented. Several survey objectives are explored, and an optimal allocation is obtained for each. Two, three and four-factor studies are considered explicitly. [ABSTRACT FROM AUTHOR]
- Published
- 1967
- Full Text
- View/download PDF
46. ASYMPTOTICALLY ROBUST ESTIMATORS OF LOCATION.
- Author
-
Siddiqui, M. M. and Raghunandanan, K.
- Subjects
- *
ESTIMATION theory , *DENSITY functionals , *DISTRIBUTION (Probability theory) , *STATISTICAL sampling , *MEDIAN (Mathematics) , *FUNCTIONAL analysis , *PROBABILITY theory - Abstract
The robustness properties of four estimators of location are studied with respect to eight distribution types. For each type, the probability density function is symmetric about the median and the range of the variate is infinite. For the entire class of distributions, the estimator with the highest guaranteed efficiency is the mean of the middle fifty percent of the sample. This study supplements the paper by Crow and Siddiqui (1967). [ABSTRACT FROM AUTHOR]
- Published
- 1967
- Full Text
- View/download PDF
47. FINITE SAMPLE MONTE CARLO STUDIES: AN AUTOREGRESSIVE ILLUSTRATION.
- Author
-
Thornber, Hodson
- Subjects
- *
FIX-point estimation , *STATISTICAL sampling , *MONTE Carlo method , *ESTIMATION theory , *REGRESSION analysis , *DECISION theory , *MATHEMATICAL models , *STOCHASTIC processes - Abstract
In this paper the problem of choosing among point estimators on the basis of their small sample properties is discussed from the sampling point of view. The indeterminacy of most Monte Carlo studies is analysed and resolved within the framework of statistical decision theory. A first order autoregressive model is worked through in detail both for its own sake and to illustrate how a complete Monte Carlo study might be done. [ABSTRACT FROM AUTHOR]
- Published
- 1967
- Full Text
- View/download PDF
48. WILCOXON CONFIDENCE INTERVALS FOR LOCATION PARAMETERS IN THE DISCRETE CASE.
- Author
-
Noether, Gottfried E.
- Subjects
- *
CONFIDENCE intervals , *STATISTICAL sampling , *FRACTIONAL parentage coefficients , *LEAST squares , *HYPOTHESIS - Abstract
The paper surveys results about the behavior of nonparametric methods in cases where the customary continuity assumptions are not satisfied. A projection approach is used to show that true confidence coefficients associated with confidence intervals for appropriate location parameters derived from Wilcoxon one- and two-sample tests are at least equal to the nominal level for the continuous case, if the confidence intervals are considered closed, at most equal to the nominal level, if they are considered open. The results are interpreted in terms of tests of hypotheses. [ABSTRACT FROM AUTHOR]
- Published
- 1967
- Full Text
- View/download PDF
49. MINIMUM VARIANCE UNBIASED AND MAXIMUM LIKELIHOOD ESTIMATORS OF RELIABILITY FUNCTIONS FOR SYSTEMS IN SERIES AND IN PARALLEL.
- Author
-
Zacks, S. and Even, M.
- Subjects
- *
ESTIMATION theory , *STATISTICS , *ANALYSIS of variance , *VARIANCES , *POISSON processes , *EXPONENTIAL families (Statistics) , *EXPONENTIAL sums , *DISTRIBUTION (Probability theory) , *STATISTICAL sampling - Abstract
This paper investigates the properties of the minimum variance unbiased (M.V.U.) and maximum likelihood (M.L.) estimators of the reliability functions of systems composed of two subsystems connected in series. The study falls into two parts, one for the Poisson case and one for the exponential case. In each case, a distinction is made between situations where the two subsystems are identical and situations where they are different. In the Poisson case, a system A is considered which is composed of two subsystems connected in series; the failure time points of each subsystem follow a Poisson process with a given intensity, and an experiment is performed on n independent replicates of each of the subsystems over a period of fixed length. In the exponential case, the system again consists of two subsystems connected in series whose failure time points follow Poisson processes, but independent observations are available on the interfailure time lengths, namely, the life-lengths of the subsystems.
- Published
- 1966
- Full Text
- View/download PDF
50. THE EFFICIENCIES IN SMALL SAMPLES OF THE MAXIMUM LIKELIHOOD AND BEST UNBIASED ESTIMATORS OF RELIABILITY FUNCTIONS.
- Author
-
Zacks, S. and Even, M.
- Subjects
- *
ESTIMATION theory , *STATISTICAL sampling , *POISSON processes , *EXPONENTIAL families (Statistics) , *STANDARD deviations , *PROBABILITY theory , *VARIANCES , *ERRORS , *RATIO & proportion - Abstract
The paper presents the results of an inquiry concerning the small-sample relative efficiency of maximum likelihood and best unbiased estimators of reliability functions of one-unit systems. Three cases are considered: the Poisson, exponential, and normal (standard deviation known). Two kinds of relative efficiency functions are studied. The first kind is the common ratio of the Cramér-Rao lower bound on the variances of unbiased estimators to the mean square error of the considered estimator. The second kind is a new type of relative efficiency function, called the 'closeness relative efficiency function.' This function is defined as the ratio of the probabilities that the maximum likelihood and the best unbiased estimators yield estimates in a prescribed neighborhood of the unknown reliability value. A substantial part of the study is devoted to the derivation of the required moments of the estimators. [ABSTRACT FROM AUTHOR]
- Published
- 1966
- Full Text
- View/download PDF