36 results
Search Results
2. A Proposal for a Computer-Based Interactive Scientific Community.
- Author
- Pager, David and Lawson, C. L.
- Subjects
- COMPUTER systems; COMPUTER software; PROGRAMMING languages; MULTIMEDIA systems; MATHEMATICS; COMPUTER terminology
- Abstract
Because of the problems created by the explosion of papers in the mathematical sciences and the burden this places on research, it is suggested that a tree of all mathematical results and terminology be maintained in a multiterminal computer system. Users of the system can store in the computer an updated file of their current knowledge, and on selecting a paper to read, they can obtain from the computer the minimum subtree of theorems required to bring them from what they already know to the background knowledge the paper assumes. Under certain conditions, means are also provided for readers of a work to contribute useful comments and for interaction among commentators and with the author. This paper describes how the system can be organized and the roles required of readers, writers, and commentators. [ABSTRACT FROM AUTHOR]
- Published
- 1972
- Full Text
- View/download PDF
3. Interval Arithmetic Determinant Evaluation and Its Use in Testing for a Chebyshev System.
- Author
- Smith, Lyle B.
- Subjects
- INTERVAL analysis; MATHEMATICS; NUMERICAL analysis; CHEBYSHEV systems; CONTINUOUS functions; DETERMINANTS (Mathematics)
- Abstract
Two recent papers, one by Hansen and one by Hansen and R. R. Smith, have shown how interval arithmetic (I.A.) can be used effectively to bound errors in matrix computations. In the present paper a method proposed by Hansen and R. R. Smith is compared with straightforward use of I.A. in determinant evaluation. Computational results show the accuracy and running times that can be expected when using I.A. for determinant evaluation. An application using I.A. determinants in a program to test whether a set of functions forms a Chebyshev system is then presented. [ABSTRACT FROM AUTHOR]
- Published
- 1969
- Full Text
- View/download PDF
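The idea of bounding a determinant with interval arithmetic, as in the record above, can be sketched with a toy interval class (my own minimal version, not the methods compared in the paper; production interval arithmetic also rounds endpoints outward, which this sketch omits):

```python
from dataclasses import dataclass

@dataclass
class Interval:
    lo: float
    hi: float

    def __add__(self, o):
        return Interval(self.lo + o.lo, self.hi + o.hi)

    def __sub__(self, o):
        # Subtracting an interval swaps its endpoints.
        return Interval(self.lo - o.hi, self.hi - o.lo)

    def __mul__(self, o):
        # The product range is bounded by the four endpoint products.
        ps = [self.lo * o.lo, self.lo * o.hi, self.hi * o.lo, self.hi * o.hi]
        return Interval(min(ps), max(ps))

def det2(a, b, c, d):
    """Determinant of [[a, b], [c, d]] with interval entries: a*d - b*c."""
    return a * d - b * c

# Entries known only to within +/- 0.01:
w = 0.01
d = det2(Interval(2 - w, 2 + w), Interval(1 - w, 1 + w),
         Interval(1 - w, 1 + w), Interval(3 - w, 3 + w))
# The exact determinant of the midpoint matrix (2*3 - 1*1 = 5) is
# guaranteed to lie between d.lo and d.hi.
```

The point of the technique is that the computed interval is a guaranteed enclosure: however the entries vary within their bounds, the true determinant cannot escape [d.lo, d.hi].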
4. A SELECTION OF EARLY STATISTICAL PAPERS OF J. NEYMAN (Book).
- Author
- Kempthorne, Oscar
- Subjects
- MATHEMATICS; STATISTICS; NONFICTION
- Abstract
Reviews the book "A Selection of Early Statistical Papers of J. Neyman."
- Published
- 1970
- Full Text
- View/download PDF
5. The Selected Papers of E.S. Pearson (Book).
- Author
- Johnson, Norman L.
- Subjects
- MATHEMATICS; PERIODICAL editors; NONFICTION
- Abstract
Reviews the book "Selected Papers of E.S. Pearson."
- Published
- 1968
- Full Text
- View/download PDF
6. Algorithm 457 Finding All Cliques of an Undirected Graph [ H ].
- Author
- Bron, Coen, Kerbosch, Joep, Roy, Mohit Kumar, Lawrence, E. E., and Williamson, Hugh
- Subjects
- ALGORITHMS; GRAPH theory; BRANCH & bound algorithms; MATHEMATICAL programming; ALGEBRA; MATHEMATICS
- Abstract
The article reports on finding all cliques of an undirected graph [H]. A maximal complete subgraph, also called a clique, is a complete subgraph that is not contained in any other complete subgraph. According to the authors, an earlier paper described techniques for finding maximal complete subgraphs. The authors discuss two algorithms that use a branch-and-bound technique to cut off branches that cannot lead to a clique. The first algorithm produces cliques in lexicographic order, while the second is derived from the first.
- Published
- 1973
- Full Text
- View/download PDF
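The branch-and-bound clique enumeration described in this record is the basis of what is now called the Bron-Kerbosch algorithm. A minimal recursive sketch in Python (my own version, without the pivoting refinement of the published second algorithm):

```python
def bron_kerbosch(R, P, X, adj, cliques):
    """Report every maximal clique extending R, where P holds candidate
    vertices and X holds vertices already tried (so no clique is reported
    twice and non-maximal cliques are cut off)."""
    if not P and not X:
        cliques.append(set(R))  # R can be extended no further: maximal
        return
    for v in list(P):
        # Only neighbors of v can join a clique containing v.
        bron_kerbosch(R | {v}, P & adj[v], X & adj[v], adj, cliques)
        P.remove(v)
        X.add(v)

# Triangle 0-1-2 plus pendant edge 2-3:
adj = {0: {1, 2}, 1: {0, 2}, 2: {0, 1, 3}, 3: {2}}
cliques = []
bron_kerbosch(set(), set(adj), set(), adj, cliques)
# Finds the two maximal cliques {0, 1, 2} and {2, 3}.
```

The set X is what prevents re-reporting: a branch whose candidate set is empty but whose excluded set is not corresponds to a clique already contained in one found earlier.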
7. A Statistical Study of the Accuracy of Floating Point Number Systems.
- Author
- Kuki, H., Cody, W. J., and Timlake, W. P.
- Subjects
- FLOATING-point arithmetic; BINARY number system; COMPUTER arithmetic; COMPUTER programming; MATHEMATICS; NUMBER systems
- Abstract
This paper presents the statistical results of tests of the accuracy of certain arithmetic systems in evaluating sums, products and inner products, and analytic error estimates for some of the computations. The arithmetic systems studied are 6-digit hexadecimal and 22-digit binary floating point number representations combined with the usual chop and round modes of arithmetic with various numbers of guard digits, and with a modified round mode with guard digits. In a certain sense, arithmetic systems differing only in their use of binary or hexadecimal number representations are shown to be approximately statistically equivalent in accuracy. Further, the usual round mode with guard digits is shown to be statistically superior in accuracy to the usual chop mode in all cases save one. The modified round mode is found to be superior to the chop mode in all cases. [ABSTRACT FROM AUTHOR]
- Published
- 1973
- Full Text
- View/download PDF
8. TECHNICAL PROGRAM--FALL JOINT COMPUTER CONFERENCE 1968.
- Subjects
- COMPUTER science; MANAGEMENT information systems; ELECTRONIC data processing; MATHEMATICS
- Abstract
A calendar of events for the Fall Joint Computer Conference 1968 technical program is presented, with information about several papers to be discussed at a symposium, including Dr. Sanford Elkin's "Reliability, Maintenance, Error Recovery in Third Generation Systems" and Dr. Glen Lewis's "Applied Mathematics."
- Published
- 1968
9. Eliminating Monotonous Mathematics with FORMAC.
- Author
- Tobey, Robert G. and Graham, R. M.
- Subjects
- COMPILERS (Computer programs); COMPUTER software; MATHEMATICAL analysis; SYSTEMS software; PROGRAMMING languages; MATHEMATICS
- Abstract
The FORMAC (FORmula MAnipulation Compiler) programming system provides a powerful tool for performing mathematical analysis. It is an extension of FORTRAN IV which permits the use of the computer to perform the tedious algebraic computations that arise in many different fields. Among the areas in which it has been successfully used are: differentiation of complicated expressions, expansion of truncated power series, solution of simultaneous equations with literal coefficients, nonlinear maximum likelihood estimation, tensor analysis, and generation of the coefficients of equations in Keplerian motion. These types of analysis—which arose in the solution of specific practical problems in physics, engineering, astronomy, statistics and astronautics—are discussed in the paper. In addition to its usage for specific problem solutions, FORMAC can also be used to automate the analysis phase in certain production programming. Several such applications are presented. [ABSTRACT FROM AUTHOR]
- Published
- 1966
- Full Text
- View/download PDF
10. APPLIED STATISTICS ALGORITHMS SECTION.
- Subjects
- MATHEMATICS; ALGORITHMS; PAPER; COMPUTER programming; TECHNICAL specifications
- Abstract
The article presents information on the publication of the book "Applied Statistics, Algorithms," relevant to statistics, by the Royal Statistical Society in cooperation with the Science Research Council's Working Party on Statistical Computing. A policy statement describing the editorial policy appears in "Applied Statistics," Vol. 1, No. 1 (1968). A support paper describing the expected contents of the external specification and making recommendations for the layout of algorithms and for programming strategy will appear in the following issue.
- Published
- 1968
11. TIES IN PAIRED-COMPARISON EXPERIMENTS: A GENERALIZATION OF THE BRADLEY-TERRY MODEL.
- Author
- Rao, P. V. and Kupper, L. L.
- Subjects
- PROBABILITY theory; MATHEMATICS; BAYESIAN analysis; PARAMETERS (Statistics); STATISTICS
- Abstract
The Bradley-Terry model for a paired-comparison experiment with t treatments postulates a set of t 'true' treatment ratings π1, π2, ..., πt such that πi ≥ 0, Σπi = 1, and the probability of preferring treatment i to treatment j is πi/(πi + πj). Thus, according to this model, every comparison of two treatments results in a definite preference for one of the two. This is an unrealistic restriction, since when there is no difference between the responses due to two treatments, any method of expressing preference for one over the other is somewhat arbitrary. This paper considers a modification of the Bradley-Terry model that introduces an additional parameter, called the threshold parameter, into the model. This permits 'ties' in the model. The problem of estimation and tests of hypotheses for the parameters of the modified model is also dealt with in the paper. [ABSTRACT FROM AUTHOR]
- Published
- 1967
- Full Text
- View/download PDF
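The modified model in this record is now known as the Rao-Kupper model. A minimal sketch, assuming the standard published form (my assumption, not quoted from the abstract): with threshold parameter θ ≥ 1, treatment i is preferred to j with probability πi/(πi + θπj), and the remaining probability mass is a tie.

```python
def rao_kupper(pi_i, pi_j, theta):
    """Preference and tie probabilities under the (assumed) Rao-Kupper
    extension of Bradley-Terry, with threshold parameter theta >= 1.
    theta = 1 recovers the original no-ties model."""
    p_i = pi_i / (pi_i + theta * pi_j)   # i preferred to j
    p_j = pi_j / (pi_j + theta * pi_i)   # j preferred to i
    p_tie = 1.0 - p_i - p_j              # remaining mass: a tie
    return p_i, p_j, p_tie

p_i, p_j, p_tie = rao_kupper(0.6, 0.4, 1.5)
# The three probabilities are nonnegative and sum to 1, and a larger
# theta widens the band of responses declared a tie.
```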
12. A Car-Following Model Relating Reaction Times and Temporal Headways to Accident Frequency.
- Author
- Brill, Edward A.
- Subjects
- AUTOMOBILE driving; MODEL cars (Toys); AUTOMOBILE drivers; TRAFFIC accidents; ACCELERATION (Mechanics); TRANSPORTATION; TRANSPORTATION accidents; MATHEMATICS; MOTOR vehicle drivers; REACTION time
- Abstract
This paper deals with a car-following model relating driver reaction time, temporal headway and deceleration response to accident frequency. The central goal is to assess the sensitivity of collision probability to a shift in expected reaction time. This problem reduces to determining the sensitivity of the probability of ruin to changes in the drift of the process of cumulative differences between reaction times and temporal headways. A diffusion-type approximation is used and it is shown that additive changes in mean reaction time correspond to multiplicative changes in collision probability. A numerical example is given to illustrate the potential effects of a mere 0.1 sec decrease in mean reaction time. [ABSTRACT FROM AUTHOR]
- Published
- 1972
- Full Text
- View/download PDF
13. A Mathematical Programming Model for the Combined Distribution-Assignment of Traffic.
- Author
- Tomlin, J. A.
- Subjects
- COMMUNICATIONS industries; TRAFFIC assignment; MATHEMATICAL programming; TRAFFIC estimation; LINEAR programming; ASSIGNMENT problems (Programming); ECONOMIC equilibrium; DISTRIBUTION (Probability theory); MATHEMATICS
- Abstract
The traffic assignment and distribution problems are customarily treated as though they were independent rather than related and interacting. In this paper a combined traffic distribution-assignment model is formulated as a mathematical program, based on the equilibrium traffic distribution model and a linear programming assignment model. The behavior of this model using the Dantzig-Wolfe decomposition principle is investigated and illustrated by means of an example. A number of areas for extension are indicated, and an alternative formulation that has certain advantages is also presented. [ABSTRACT FROM AUTHOR]
- Published
- 1971
- Full Text
- View/download PDF
14. An Empirical Model for Multilane Road Traffic.
- Author
- Miller, Alan J.
- Subjects
- COMMUNICATIONS industries; TRAFFIC flow; EMPIRICAL research; TRAFFIC engineering; POINT processes; TRANSPORTATION; STOCHASTIC processes; PROBABILITY theory; MATHEMATICS
- Abstract
Before we can construct models for multilane traffic flow, we must be able to describe single-lane traffic. This paper considers some of these methods of description, including the use of headway and counting distributions. Another method of describing point processes is by means of product densities. These are essentially joint probabilities of two or more vehicles passing a point at different specified times. Product densities have been used in many fields, including particle physics, ecology, and road traffic. The idea of using product densities is simple and attractive, but they have two major disadvantages: (i) except in simple cases, headway distributions are very difficult to derive from product densities, and (ii) product density estimates for different time lags are correlated, so that tests of hypotheses are not easy to perform. An alternative to product densities is proposed. For want of a better name, these are called termination rates. The termination rate after lag τ is the probability of a vehicle passing a point in a small interval of time, divided by the length of the small interval and conditional upon the last vehicle having passed a time τ previously. These are very similar to mortality rates for humans or scrappage rates for commodities with limited lifetimes. The interrelations between these statistics are quoted but not derived. Various sets of data are used to illustrate the less common statistics. Some analysis is given of a sample of data from the Congress (now Eisenhower) Expressway in Chicago. Variance-to-mean ratios are used to show that there is a strong correlation between lanes even though the autocorrelations of headways within lanes are very small. Bivariate termination rates have been estimated for each lane. Given a vehicle in one lane, it was found that the expected number of vehicles in a neighboring lane within 2 or 3 sec was enhanced by about 10-15 percent. [ABSTRACT FROM AUTHOR]
- Published
- 1970
- Full Text
- View/download PDF
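The termination rate defined in this abstract is, in modern terms, the hazard rate of the headway distribution. A rough binned estimator (my own sketch, with made-up headway data):

```python
def termination_rate(headways, tau, dt):
    """Estimate the termination rate after lag tau: among headways still
    'alive' at tau, the fraction that end in [tau, tau + dt), divided by
    the bin width dt."""
    at_risk = [h for h in headways if h >= tau]
    if not at_risk:
        return 0.0
    ended = [h for h in at_risk if h < tau + dt]
    return len(ended) / (len(at_risk) * dt)

headways = [0.8, 1.1, 1.4, 2.0, 2.3, 3.1, 4.5, 6.0]  # seconds (made up)
r = termination_rate(headways, 2.0, 1.0)
# 5 headways reach 2.0 s; 2 of them (2.0 and 2.3) end within the next
# second, so the estimate is 2 / (5 * 1.0) = 0.4 per second.
```

As with mortality rates, the estimate conditions on survival to the lag in question, which is what distinguishes it from a plain headway histogram.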
15. A Stochastic Process for Determining Migration Probability.
- Author
- Rogers, Tommy W.
- Subjects
- STOCHASTIC processes; PROBABILITY theory; EMIGRATION & immigration; POPULATION; POPULATION geography; MATHEMATICS
- Abstract
The article describes a process for determining migration probability. A stochastic process may be defined as a process which develops over time and where events at time 2 are not independent of events at time 1. This lack of independence may be viewed in probabilistic terms. Since the application of stochastic probability processes to problems of predicting individual behavior requires data obtained from case studies where each individual can be classified with respect to a given set of categories at successive points in time, the projected analysis required a set of individual longitudinal migration histories. Consequences of the basic mechanism of the model utilized in the paper are also detailed.
- Published
- 1968
- Full Text
- View/download PDF
16. Functionalism Made Verifiable.
- Author
- Adler, Franz
- Subjects
- FUNCTIONALISM (Social sciences); MATHEMATICS; STATISTICS; SCIENCE; CYBERNETICS; EQUATIONS
- Abstract
Merton points out that "the most precise significance of the word function is of course found in mathematics …. " But, says Martindale, "The work of most of the recent functionalists has been at the opposite pole from this meaning." Consequently, assertions made by functionalists have not always been received with the degree of confidence that their authors might consider appropriate. It is the purpose of this paper to suggest a way in which qualitatively formulated propositions of a functional kind may be translated into a language which points to empirical referents and thus, if all goes well, leads in the end to the formulation of mathematical equations. [ABSTRACT FROM AUTHOR]
- Published
- 1963
- Full Text
- View/download PDF
17. NOUVELLE MÉTHODE DE DÉTERMINATION DE LA SPHÉRICITÉ DES SÉDIMENTS MEUBLES [New Method for Determining the Sphericity of Unconsolidated Sediments].
- Author
- Weydert, Pierre
- Subjects
- SEDIMENTS; SAND; LOGARITHMS; SIEVES (Mathematics); CURVES; MATHEMATICS
- Abstract
In this paper we present a new method for determining the sphericity of sand particles by sedimentary analysis. We use two different sieves: one with square openings, of side a and diagonal d = a√2, and one with circular openings of diameter D. Experimentation shows that particles pass through the square-opening sieve in the diagonal position, so when the cumulative curves are plotted on arithmetic or semi-logarithmic graphs we use the value d rather than the side a. A sediment C, composed of moderately angular or subspherical particles, is represented by two cumulative curves on the same graph. If a sediment A is very angular, the two curves nearly coincide. For a subspherical sediment B, the difference between the two curves is maximal; we call this difference δ, equal to d − D. For C the difference is Δ, which is always less than δ. The absolute sphericity index Sa is the ratio Δ/δ, expressed in %. When the particles are very angular the index is 0, as for A, and when they are spherical it is 1, as for B. The index is determined graphically, and the values obtained are comparable with those of visual charts. [ABSTRACT FROM AUTHOR]
- Published
- 1969
- Full Text
- View/download PDF
18. COINCIDENTAL VARIATION AS A SOURCE OF CONFUSION IN THE EXPERIMENTAL STUDY OF RATE.
- Author
- Clevenger Jr., Theodore and Clark, Margaret Leitner
- Subjects
- TEMPO (Phonetics); MATHEMATICS; EXPERIMENTAL design; MATHEMATICAL variables; SYLLABLE (Grammar); SPEECH
- Abstract
Rate is one of the most broadly useful concepts in mathematics, expressing the relationship between two variables in terms of an artificial unit. Rate can be defined as the average number of units of one variable per unit of another. Rate of speaking is not essentially different from other applications of the concept. The greater difficulties it presents are due in part to the difficulty of selecting an appropriate unit for analysis and in part to the complexity of the variables of which it is composed, since many of these variables are analyzable into other units with complex interrelationships. The present paper concentrates on the relations among rate variables and their consequences for research design and interpretation. Many studies have measured rate in words or syllables per minute. The procedure involves a count of the number of words or syllables in a given utterance, divided by the length of time consumed.
- Published
- 1963
- Full Text
- View/download PDF
19. A Note on the Consistency between Two Approaches to Incorporate Data from Unreliable Sources in Bayesian Analysis.
- Author
- Schaefer, Ralf E. and Borcherding, Katrin
- Subjects
- COGNITIVE consistency; REASONING; HYPOTHESIS; EVIDENCE; SET theory; ALGORITHMS; MATHEMATICS; PROBABILITY theory; SOCIAL sciences
- Abstract
In many cases Bayes' theorem is an appropriate algorithm for the aggregation of probabilistic evidence. As with other statistical procedures, there are restrictions that must be taken into account. In the present paper we comment on several approaches that have been devoted to one of these restrictions: the incorporation of uncertainty about the true state of a datum. A datum is a variable which can be partitioned into equivalence classes. These classes represent the possible data states, which will also be called events. In Bayes' theorem an event is an item of information which will be used for revising the opinion about the relative likelihood of hypotheses. In any specific situation only one of the possible events will be the true event. An event may come from a source whose reporting or observational accuracy is not perfect. An example may illustrate the issue. A medical doctor wants to come to a diagnosis. To achieve this he considers several data. One datum might be the result of a medical test, which has three possible states: positive, negative, inconclusive. If the doctor reports the state of the datum to be positive, he may be wrong for whatever reason. That is, the report of an event need not coincide with the actual or true event of the datum under consideration. This kind of uncertainty about the true event in any specific situation is a characteristic of the source. In most cases it will reduce the diagnostic impact of an event. Whenever the report of an event and the true event coincide imperfectly, measures of source inaccuracy must be incorporated into Bayes' theorem. The task can be considered as two-stage probabilistic induction. The first step is induction from the reported to the actual event; the second is induction from the actual event to the creditation of hypotheses. This is why Gettys and Willke (1969) speak of "cascaded inference." [ABSTRACT FROM AUTHOR]
- Published
- 1973
- Full Text
- View/download PDF
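The two-stage induction this abstract describes can be sketched numerically: weight each possible true event by the probability of the observed report given that event, then combine those weights with the usual likelihoods before normalizing. All names and numbers below are hypothetical; this illustrates the idea, not the authors' algorithm.

```python
def cascaded_posterior(prior, likelihood, report_model, report):
    """Posterior over hypotheses when only a noisy report of the datum
    is available.

    prior[h]           : P(h)
    likelihood[h][e]   : P(true event e | h)
    report_model[e][r] : P(report r | true event e)  -- source accuracy
    """
    events = {e for lk in likelihood.values() for e in lk}
    # Effective likelihood of the report: marginalize over true events.
    post = {
        h: prior[h] * sum(likelihood[h][e] * report_model[e][report]
                          for e in events)
        for h in prior
    }
    z = sum(post.values())
    return {h: p / z for h, p in post.items()}

prior = {"disease": 0.1, "healthy": 0.9}
likelihood = {
    "disease": {"positive": 0.9, "negative": 0.1},
    "healthy": {"positive": 0.2, "negative": 0.8},
}
# The source reads the test correctly 95% of the time (hypothetical):
report_model = {
    "positive": {"positive": 0.95, "negative": 0.05},
    "negative": {"positive": 0.05, "negative": 0.95},
}
post = cascaded_posterior(prior, likelihood, report_model, "positive")
# A noisy positive report raises P(disease), but by less than a
# perfectly observed positive result would.
```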
20. The Application of Bayes's Theorem When the True Data State is Uncertain.
- Author
- Gettys, Charles F. and Willke, T. A.
- Subjects
- ALGORITHMS; ALGEBRA; FOUNDATIONS of arithmetic; PROBABILITY theory; MATHEMATICAL combinations; MATHEMATICS; BAYES' theorem; STATISTICAL decision making; STATISTICS
- Abstract
This paper discusses the application of Bayes's theorem to those cases where the true state of the world is not known with certainty. An algorithm is proposed that relaxes the requirement of Bayes's theorem that the true data state be known with certainty by postulating a true but unobservable elementary event, ω, which gives rise to posterior probabilities that reflect the uncertainty of the data. A derivation is presented for the calculation of Bayesian posterior probabilities which uses as its input these probabilities, rather than the true event, ω, which is assumed to be unavailable. Suggestions are made as to the application of this modification of Bayes's theorem to cascaded inference processes. [ABSTRACT FROM AUTHOR]
- Published
- 1969
- Full Text
- View/download PDF
21. Ideal Multiple-choice Items.
- Author
- Weitzman, R. A.
- Subjects
- MULTIPLE choice examinations; MULTIPLE comparisons (Statistics); EXAMINATIONS; MATHEMATICAL analysis; PROBABILITY measures; MATHEMATICS; MATHEMATICAL economics; LOGIC; QUESTIONS & answers
- Abstract
A multiple-choice item is called ideal if all its alternatives are equally attractive to every person who cannot answer the item correctly without guessing. This paper discusses reasons for attempting to construct ideal items, develops methods for testing whether items are ideal or at least more nearly ideal than other items, and presents examples of both ideal and nonideal items. [ABSTRACT FROM AUTHOR]
- Published
- 1970
- Full Text
- View/download PDF
22. MULTIVARIATE MAXIMA AND MINIMA WITH MATRIX DERIVATIVES.
- Author
- Tracy, Derrick S. and Dwyer, Paul S.
- Subjects
- MATRICES (Mathematics); MAXIMA & minima; MATHEMATICS; KRONECKER products; TENSOR products; PROBLEM solving
- Abstract
The purpose of this paper is the presentation of formulae for obtaining matrix derivatives of the second order for use in making tests for maxima and minima. The theory of such second order derivatives is presented. These formulae require the rearrangement of the parameter elements in vector form, and the transformed results feature Kronecker products, which have certain desirable properties. Application is made to several types of problems. [ABSTRACT FROM AUTHOR]
- Published
- 1969
- Full Text
- View/download PDF
23. FIRST AND SECOND MOMENTS OF THE RANDOMIZATION TEST IN TWO-ASSOCIATE PBIB DESIGNS.
- Author
- Cléroux, Robert
- Subjects
- MOMENTS method (Statistics); MATHEMATICS; STATISTICS; DISTRIBUTION (Probability theory); THEORY; APPROXIMATION theory; ERROR; THEORY of knowledge; PROBLEM solving
- Abstract
In this paper the first two moments of the statistic (treatment sum of squares)/(treatment + error sums of squares) over all possible random assignments of treatments to the experimental plots are obtained for two-associate PBIB designs. They are compared with the corresponding moments of a central beta distribution to study the extent to which the normal theory test may serve as an approximation to the randomization test. It is found that the approximation is reasonable for some classes of PBIB designs. [ABSTRACT FROM AUTHOR]
- Published
- 1969
- Full Text
- View/download PDF
24. A SOLUTION TO THE PROBLEM OF LINKING MULTIVARIATE DOCUMENTS.
- Author
- Du Bois Jr., N. S. D'Andrea
- Subjects
- DOCUMENTATION; MULTIVARIATE analysis; PUBLIC health; CLINICAL medicine; MATHEMATICS; THEORY; HEALTH; MEDICAL care
- Abstract
In many scientific investigations, it is desired to bring together, or link, two or more documents which represent the same individual, even though these documents do not contain a unique identifier and were derived from different sources. In medical and public health research and elsewhere, this problem is known as the document linkage problem. This paper considers some aspects of classifying pairs of documents into one of two populations when their items are identifying information, where each item of information can take on three distinct values: correct, incorrect, or missing. Section 1 identifies three document linkage problems. Sections 2 and 3 deal with the mathematical formulation of the multivariate document linkage problem. Section 4 gives the classification procedure, and Section 5 deals with the application of the theory to a problem in the field of public health. [ABSTRACT FROM AUTHOR]
- Published
- 1969
- Full Text
- View/download PDF
25. FORECASTING SHORT-TERM ECONOMIC CHANGE.
- Author
- Moore, Geoffrey H.
- Subjects
- STATISTICIANS; ECONOMIC forecasting; ECONOMISTS; STATISTICS; MATHEMATICAL statistics; MATHEMATICS
- Abstract
Economic statisticians do not enjoy an untarnished reputation for accurate forecasting. We have managed, over the years, to come up with some memorable failures. While we have also had our share of successes, they are not as well remembered nor as numerous as we should like. Recently, however, we have begun to pay more attention to the record, and a substantial body of evidence on forecasting performance has accumulated. In this paper I propose to review this record, try to arrive at a balanced appraisal, and offer some suggestions for improvement. [ABSTRACT FROM AUTHOR]
- Published
- 1969
- Full Text
- View/download PDF
26. ON SOME INVARIANT CRITERIA FOR GROUPING DATA.
- Author
- Friedman, H. P. and Rubin, J.
- Subjects
- CLUSTER analysis (Statistics); STATISTICAL correlation; RANDOM variables; SPATIAL analysis (Statistics); MULTIVARIATE analysis; MATHEMATICS; MATHEMATICAL optimization
- Abstract
This paper deals with methods of "cluster analysis". In particular we attack the problem of exploring the structure of multivariate data in search of "clusters". The approach taken is to use a computer procedure to obtain the "best" partition of n objects into g groups. A number of mathematical criteria for "best" are discussed and related to statistical theory. A procedure for optimizing the criteria is outlined. Some of the criteria are compared with respect to their behavior on actual data. Results of data analysis are presented and discussed. [ABSTRACT FROM AUTHOR]
- Published
- 1967
- Full Text
- View/download PDF
27. PROBABILITY SAMPLING WITH QUOTAS.
- Author
- Sudman, Seymour
- Subjects
- STATISTICAL sampling; PROBABILITY measures; PROBABILITY theory; STATISTICS; MATHEMATICS; INTERVIEWING; AGE & employment
- Abstract
This paper describes certain quota sampling procedures and attempts to show that they are very close to traditional probability sampling. Quotas are shown to depend on availability for interviewing and evidence is presented to show that sex, age, and employment status are reasonable predictors of availability. Quota sampling methods are not unbiased but data are presented which suggest that the bias is generally of the order of 3 to 5 per cent. It is shown, however, that the cost differentials between these quota samples and call-back samples are small. The major advantage of this new procedure may well be the speed with which interviewing may be completed during crises such as the Kennedy assassination. [ABSTRACT FROM AUTHOR]
- Published
- 1966
- Full Text
- View/download PDF
28. MISSING OBSERVATIONS IN MULTIVARIATE STATISTICS.
- Author
- Afifi, A. A. and Elashoff, R. M.
- Subjects
- MULTIVARIATE analysis; MATHEMATICAL statistics; STATISTICS; ANALYSIS of variance; MATHEMATICAL variables; MATHEMATICS; REGRESSION analysis; ESTIMATION theory
- Abstract
In this paper we review the literature on the problem of handling multivariate data with observations missing on some or all of the variables under study. We examine the ways that statisticians have devised to estimate means, variances, correlations and linear regression functions from such data and refer to specific computer programs for carrying out the estimation. We show how the estimation problems can be simplified if the missing data follows certain patterns. Finally, we outline the statistical properties of the various estimators. [ABSTRACT FROM AUTHOR]
- Published
- 1966
- Full Text
- View/download PDF
29. CONFIDENCE BANDS OF UNIFORM AND PROPORTIONAL WIDTH FOR LINEAR MODELS.
- Author
- Bowden, David C. and Graybill, Franklin A.
- Subjects
- LINEAR statistical models; MATHEMATICAL models; MATHEMATICAL statistics; STATISTICS; MATHEMATICS
- Abstract
In this paper confidence bands are given for the simple linear model. The bands given are straight lines rather than curves and are either (1) parallel or (2) trapezoidal. The confidence bands are over a finite-length interval. [ABSTRACT FROM AUTHOR]
- Published
- 1966
- Full Text
- View/download PDF
30. OPTIMUM ALLOCATION OF SAMPLING UNITS TO STRATA WHEN THERE ARE R RESPONSES OF INTEREST.
- Author
- Folks, John Leroy and Antle, Charles E.
- Subjects
- MATHEMATICAL variables; STATISTICAL sampling; MATHEMATICAL functions; MATHEMATICS; SAMPLE size (Statistics); STATISTICS
- Abstract
A common problem in many areas is to achieve a combination of control variables which will simultaneously maximize all of the responses, which are functions of the control variables. The fact that in general we cannot simultaneously maximize all of the responses has led to the formulation of compromise solutions to this vector maximum problem. A point in the control variable space is better than a second point if each response at the first point is greater than or equal to the corresponding response at the second point. If a point is such that there is no better point, it is said to be admissible, or efficient. A set of points is complete if, given any point not in the set, there is a point in the set which is better. In this paper this formulation is used to consider the allocation of sample size to several strata when there are several characteristics of interest. [ABSTRACT FROM AUTHOR]
- Published
- 1965
- Full Text
- View/download PDF
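The definitions of "better" and "admissible" points in this abstract translate directly into code. A small sketch (function names and data are mine; "better" is taken to require strict improvement in at least one response, so that a point is not better than itself):

```python
def is_better(a, b):
    """a is better than b if every response at a is >= the corresponding
    response at b, and at least one is strictly greater."""
    return (all(x >= y for x, y in zip(a, b))
            and any(x > y for x, y in zip(a, b)))

def admissible(points):
    """The efficient set: points for which no better point exists."""
    return [p for p in points if not any(is_better(q, p) for q in points)]

# Response vectors for four candidate allocations (made-up numbers):
pts = [(3, 1), (2, 2), (1, 3), (1, 1)]
eff = admissible(pts)
# (1, 1) is dominated by every other point; the remaining three are
# admissible, since none of them dominates another.
```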
31. APPLICATIONS OF PROBABILITY THEORY IN CRIMINALISTICS.
- Author
- Kingston, Charles R.
- Subjects
- CRIMINAL investigation; DISTRIBUTION (Probability theory); PROBABILITY theory; FORENSIC sciences; RANDOM variables; LEGAL evidence; MATHEMATICAL variables; MATHEMATICS
- Abstract
This paper considers some problems in the probabilistic analysis of physical evidence in criminal investigations. Two basic assumptions are made: (1) that the number of persons or objects possessing a particular set of properties can be considered as a random variable, and (2) that it is possible to estimate the probability function of this random variable. Two models (one with and one without the assumption that the suspect is a random selection from the set of possible suspects) which are applicable to the evaluation of partial transfer evidence--an important category of physical evidence that is found in most criminal investigations--are developed. A detailed analysis of the models and an example are presented for the case when the estimated probability distribution is binomial with an expected value less than 1. As the expected value becomes smaller, the assumption of randomness in the selection of the suspect becomes immaterial to the evaluation of the evidential significance. [ABSTRACT FROM AUTHOR]
- Published
- 1965
- Full Text
- View/download PDF
32. Accurate Floating-Point Summation.
- Author
- Timlake, W. P. and Linz, Peter
- Subjects
- FLOATING-point arithmetic; COMPUTER arithmetic; RECURSIVE functions; ALGORITHMS; MATHEMATICAL logic; MATHEMATICS
- Abstract
This paper describes an alternate method for summing a set of floating-point numbers. Comparison of the error bound for this method with that of the standard summation method shows that it is considerably less sensitive to propagation of round-off error. [ABSTRACT FROM AUTHOR]
- Published
- 1970
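For comparison with this record's goal, here is a widely used compensated-summation scheme (Kahan's method, not necessarily the authors' algorithm) that is far less sensitive to round-off propagation than naive left-to-right summation:

```python
def kahan_sum(xs):
    """Compensated (Kahan) summation: carry the rounding error of each
    addition forward as a correction term."""
    total = 0.0
    c = 0.0  # running compensation for lost low-order bits
    for x in xs:
        y = x - c
        t = total + y
        c = (t - total) - y  # the part of y that didn't make it into t
        total = t
    return total

# Tiny terms that naive summation rounds away entirely:
xs = [1.0, 1e-16, 1e-16, 1e-16, 1e-16]
naive = sum(xs)          # stays exactly 1.0
accurate = kahan_sum(xs) # accumulates the small terms
```

Each small term is individually below half an ulp of 1.0, so the naive running sum never moves; the compensation variable collects those residuals until they are large enough to register.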
33. Error-Free Methods for Statistical Computations.
- Author
- Rodden, B. E.
- Subjects
- ALGORITHMS; ALGEBRA; MATHEMATICS; MATHEMATICAL statistics; ERROR analysis in mathematics; COMPUTER algorithms
- Abstract
Neely has discussed computational error generated by some algorithms used to compute various statistics. In the present paper methods are described which are error-free, simple in concept, and usually less costly in machine time than those mentioned by Neely. [ABSTRACT FROM AUTHOR]
- Published
- 1967
- Full Text
- View/download PDF
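One modern way to make such statistical computations literally error-free is exact rational arithmetic. A sketch illustrating the goal (not Rodden's actual methods):

```python
from fractions import Fraction

def exact_mean_and_ss(xs):
    """Exact mean and corrected sum of squares in rational arithmetic,
    so no rounding error is introduced at any step. Inputs may be
    decimal strings, ints, or Fractions."""
    vals = [Fraction(x) for x in xs]
    mean = sum(vals) / len(vals)
    ss = sum((v - mean) ** 2 for v in vals)
    return mean, ss

mean, ss = exact_mean_and_ss(["0.1", "0.2", "0.3"])
# mean is exactly 1/5 and ss exactly 1/50, values a floating-point
# two-pass computation can only approximate.
```

The cost is that numerators and denominators can grow with the data, which is the usual trade-off against fixed-precision floating point.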
34. A Note on Cheney's Nonrecursive List-Compacting Algorithm.
- Author
- Walden, David C.
- Subjects
- COMPACTING; SANITATION workers; ISOSTATIC pressing; MATHEMATICAL programming; MATHEMATICS
- Abstract
Presents a note on C.J. Cheney's paper "Nonrecursive List-Compacting Algorithm," which appeared in the November 1970 issue of the periodical "Communications of the ACM." Discusses the effect of a circular list consisting exclusively of non-items on the flow of the list-compacting algorithm, and the structural meaning of the list-compacting algorithm.
- Published
- 1972
- Full Text
- View/download PDF
35. A Note on `A Modification of Nordsieck's Method Using an `Off-Step' Point'.
- Author
- Blumberg, John W. and Foulk, Clinton R.
- Subjects
- DIFFERENTIAL equations; CALCULUS; IBM computers; MATHEMATICS; UNIVERSITIES & colleges
- Abstract
The article presents information on the experimental results presented by researchers J.J. Kohfeld and G.T. Thompson in their paper on a modification of Nordsieck's method for the numerical solution of ordinary differential equations, using a multiple precision arithmetic package available on the IBM 7094 at the Ohio State University Computer Center in Columbus, Ohio. The experimental results agree with those presented by Kohfeld and Thompson at h = 0.10 and 0.15 for the Nordsieck and GSN methods, which is to be expected since round-off error is not critical at those interval lengths.
- Published
- 1971
- Full Text
- View/download PDF
36. SUGGESTIONS FOR MORE ACCURATE MEASUREMENT OF SOME FIGURE DRAWING VARIABLES.
- Author
- Handler, Leonard, Levine, Joseph R., and Potasr, Herbert M.
- Subjects
- DRAWING; ART; MATHEMATICS; PSYCHOLOGICAL factors; FIGURE drawing; PERSPECTIVE (Art)
- Abstract
The article presents suggestions for more accurate measurement of some figure drawing variables, offered to increase the precision and reliability of scoring certain traditional drawing variables. If drawings done along different axes are to be compared for location, they may be scored more accurately using the procedure described. Vertical imbalance may easily be scored with an inexpensive protractor: the score might be the angle the center of the figure makes with the bottom edge of the paper, read from the protractor in degrees.
- Published
- 1965
- Full Text
- View/download PDF