200 results
Search Results
2. Two Papers on the Comparison of Bayesian and Frequentist Approaches to Statistical Problems of Prediction: Bayesian Tolerance Regions
- Author
- J. Aitchison
- Subjects
- Statistics and Probability, Bayesian probability, Frequentist inference, Mathematics
- Published
- 1964
3. A Selection of Early Statistical Papers of J. Neyman
- Author
- Jerzy Neyman
- Subjects
- Mathematics
- Published
- 1967
4. A Paper-Bag Test Cage for Use with the Tobacco Budworm
- Author
- Cheryl K. Lahren and H. M. Flint
- Subjects
- Ecology, Insect Science, General Medicine, Biology
- Published
- 1966
5. A Note on the Paper: An Examination of the Use of Adaptive Filtering in Forecasting
- Author
- R. Harris
- Subjects
- Marketing, Operations research, Strategy and Management, Management Science and Operations Research, Management Information Systems, Management of Technology and Innovation, Adaptive filter, Information system
- Published
- 1973
6. Meeting Papers: Selection and Rejection
- Author
- Luise Margolies
- Subjects
- General Medicine
- Published
- 1971
7. A Selection of Early Statistical Papers of J. Neyman
- Author
- Jerzy Neyman and Oscar Kempthorne
- Subjects
- Statistics and Probability; Statistics, Probability and Uncertainty
- Published
- 1970
8. J. Neyman: Selection of Early Statistical Papers
- Author
- W. R. Buckland and J. Neyman
- Subjects
- Statistics and Probability
- Published
- 1968
9. A flexible effort estimator model based on ASO algorithm
- Author
- Amin Moradbeiky, Vahid Khatibi, and Mehdi Jafari
- Subjects
- atom search optimization, development effort estimation, machine learning, software project, Engineering design, TA174
- Abstract
Accurate estimation of the effort required for software development plays an important role in the success of a software project. This is a persistent challenge because of the intangible nature of software. A large body of research has therefore been devoted to developing accurate tools for estimating the effort required for software development. According to the related work, methods that identify the types of relationship between software project features and the features affecting required effort have a significant impact on estimation accuracy. In addition, different features affect the development effort to different degrees, so determining feature effectiveness helps to increase estimation accuracy. This paper presents a new model consisting of sub-models for analyzing project features; it uses a new and accurate heuristic algorithm, the Atom Search Optimization (ASO) algorithm, to configure its tools and data-modeling methods. The model is designed in multiple layers, with the sub-models organized in separate layers so that each layer improves the performance of the others and ultimately the accuracy of the final estimate. To evaluate the model's accuracy, three data sets from real projects are used and the results are compared with those of other methods. Based on the results, the proposed model significantly improves final effort-estimation accuracy.
- Published
- 1378
- Full Text
- View/download PDF
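The abstract above describes configuring an effort estimator with a search heuristic (ASO). The paper's sub-model layering is not reproduced here, so the following is only a minimal sketch of the general idea, tuning an analogy-based estimator against MMRE on historical project data, with plain exhaustive/random search standing in for Atom Search Optimization and with entirely synthetic data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical project data: feature vectors and actual efforts (person-months).
X = rng.uniform(0, 1, size=(60, 4))
effort = 10 + 25 * X[:, 0] + 8 * X[:, 1] + rng.normal(0, 2, 60)

def knn_estimate(X_train, y_train, x, k):
    """Analogy-based effort estimate: mean effort of the k nearest projects."""
    d = np.linalg.norm(X_train - x, axis=1)
    return y_train[np.argsort(d)[:k]].mean()

def mmre(k, X, y):
    """Mean magnitude of relative error under leave-one-out evaluation."""
    errs = []
    for i in range(len(y)):
        mask = np.arange(len(y)) != i
        pred = knn_estimate(X[mask], y[mask], X[i], k)
        errs.append(abs(pred - y[i]) / y[i])
    return np.mean(errs)

# A simple search over the configuration stands in for the ASO metaheuristic.
best_k = min(range(1, 16), key=lambda k: mmre(k, X, effort))
print("best k:", best_k, "MMRE:", round(mmre(best_k, X, effort), 3))
```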
10. COKO III: The Cooper-Koz Chess Program.
- Author
- Edward W. Kozdrowicki, Dennis W. Cooper, and C. L. Lawson
- Subjects
- Games, Video games, Chess clubs, Algorithms, Board games, IBM computers
- Abstract
COKO III is a chess player written entirely in Fortran. On the IBM 360-65, COKO III plays a minimal chess game at the rate of 0.2 sec CPU time per move, with a level close to lower chess club play. A selective tree-searching procedure controlled by tactical chess logistics allows a deployment of multiple minimal-game calculations to achieve some optimal move selection. The tree-searching algorithms are the heart of COKO's effectiveness, yet they are conceptually simple. In addition, an interesting phenomenon called a tree-searching catastrophe has plagued COKO's entire development, just as it troubles a human player. Standard exponential growth is curbed to a large extent by the definition and trimming of the Fischer set. A clear distinction between tree pruning and selective tree searching is also made. Representation of the chess environment is described along with a strategical preanalysis procedure that maps the Lasker regions. Specific chess algorithms are described which could be used as a command structure by anyone desiring to do some chess program experimentation. A comparison is made of some mysterious actions of human players and COKO III. [ABSTRACT FROM AUTHOR]
- Published
- 1973
- Full Text
- View/download PDF
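COKO III's own Fortran logic is not given in the abstract; as a generic illustration of the distinction it draws between unchecked exponential tree growth and pruning, here is a minimal alpha-beta minimax sketch over an abstract game tree (all names and the demo tree are hypothetical, not from the paper):

```python
def alphabeta(node, depth, alpha, beta, maximizing, children, value):
    """Minimax with alpha-beta pruning: skips branches that cannot
    affect the final choice, curbing exponential tree growth."""
    kids = children(node)
    if depth == 0 or not kids:
        return value(node)
    if maximizing:
        best = float("-inf")
        for child in kids:
            best = max(best, alphabeta(child, depth - 1, alpha, beta, False, children, value))
            alpha = max(alpha, best)
            if alpha >= beta:      # remaining siblings cannot matter
                break
        return best
    best = float("inf")
    for child in kids:
        best = min(best, alphabeta(child, depth - 1, alpha, beta, True, children, value))
        beta = min(beta, best)
        if alpha >= beta:
            break
    return best

# Tiny demo tree encoded as nested lists; leaves are position values.
tree = [[3, 5], [6, [9, 1]], [1, 2]]
children = lambda n: n if isinstance(n, list) else []
value = lambda n: n if not isinstance(n, list) else 0
print(alphabeta(tree, 4, float("-inf"), float("inf"), True, children, value))  # 6
```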
11. Learning characteristics of human operator in man-machine system
- Author
- H. Hashimoto and Hajime Akashi
- Subjects
- Frequency response, Step response, Control system
- Abstract
This paper deals with the learning characteristics of a human operator in a compensatory tracking control task. It is already known that the human operator can learn and adapt to changes in control dynamics. It is therefore essential for a system designer to analyze the learning behavior of the human operator; however, the process of learning has not been investigated in much detail. A new method to assess the learning process of a human operator is proposed here by fitting the experimental curve to a logarithmic function. Variations of the coefficients are then considered to provide a measure of the learning speed. The new method of evaluating the learning process is applied to the cases of step response and frequency response. Also, in order to reduce the training interval, a performance-score display is used to detect the optimal switching point in a relay control system. The results of the investigations presented in this paper are expected to be of use in the design of man-machine systems.
- Published
- 1972
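The assessment method in this abstract, fitting an experimental learning curve to a logarithmic function and reading learning speed off a coefficient, can be sketched in a few lines (synthetic scores; the paper's actual data are not available here):

```python
import numpy as np

trials = np.arange(1, 21)
# Hypothetical tracking-error scores that improve with practice.
rng = np.random.default_rng(1)
score = 5.0 - 1.2 * np.log(trials) + rng.normal(0, 0.1, trials.size)

# Fit score = a + b*log(trial); |b| serves as a learning-speed measure.
b, a = np.polyfit(np.log(trials), score, 1)
print(f"fitted a={a:.2f}, b={b:.2f} (more negative b = faster learning)")
```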
12. Observer Reliability and Human Inference
- Author
- David A. Schum and Paul E. Pfeiffer
- Subjects
- Inference; Conditional independence; Data system; Empirical research; Electrical and Electronic Engineering; Safety, Risk, Reliability and Quality; Reliability (statistics)
- Abstract
This paper presents a formal analysis of the problem of determining the inferential impact of the information in a composite report from a collection of unreliable observers or sensors. Each sensor reports one of a finite number of possible states of a data system linked probabilistically with an "objective system" whose condition is to be inferred from the data state. The principal assumptions are that the sensors do not "collaborate" in making their reports and that their reports are conditioned only by the existing data state and not by the actual, unobservable state of the objective system. Use of the notion of conditional independence to express these assumptions gives the analytic expressions a tractable form which sheds light on various inference issues. The paper also briefly discusses current empirical research on the question of how well people actually adjust the impact of inferential evidence to correspond to the unreliability of the sources of information.
- Published
- 1973
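Under the abstract's two assumptions, reports conditioned only on the data state and no collaboration between observers, the posterior over data states factors into a product of per-observer likelihoods. A minimal numeric sketch with made-up reliabilities (not the paper's formulas or numbers):

```python
import numpy as np

states = ["S1", "S2"]            # possible data states
prior = np.array([0.5, 0.5])

# P(report | state) per observer; rows = true state, cols = reported state.
obs_a = np.array([[0.9, 0.1], [0.2, 0.8]])
obs_b = np.array([[0.7, 0.3], [0.4, 0.6]])

# Both unreliable observers happen to report "S1" (column index 0).
likelihood = obs_a[:, 0] * obs_b[:, 0]   # conditional independence
posterior = prior * likelihood
posterior /= posterior.sum()
print(dict(zip(states, posterior.round(3))))   # {'S1': 0.887, 'S2': 0.113}
```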
13. Event Classification
- Author
- Robert T. Nash
- Subjects
- Management Science and Operations Research, Computer Science Applications, Course of action, Event (probability theory)
- Abstract
Choosing a suitable set of signals for measurement in classification procedures is poorly understood. Since classification is generally associated with a decision process, this paper shows how the entire question is most sensibly formulated as a two-stage statistical decision process. The first decision involves the detection of some suitable inferential signal, while the second is the selection of a suitable course of action or a control policy. The paper discusses techniques for choosing a set of signals for measurement, subject to certain constraints that are imposed upon this selection process, and demonstrates that the classification procedure must be considered in conjunction with the associated decision or control process, if the most appropriate inferential signals are to be selected. In this formulation, pure classification occurs if the possible actions are chosen in a particular manner. The selection of the best signal or signals for measurement may be influenced significantly by the time-dependent nature of the statistical risk.
- Published
- 1969
14. Numerical Diagnosis Using 'Statrules'
- Author
- Richard McFee and Gerhard M. Baule
- Subjects
- Biomedical Engineering, Numerical models, Pattern recognition (psychology)
- Abstract
In some cases disease causes widespread changes in a number of physical and chemical indexes, without any one index being grossly altered. The difficulty inherent in evaluating numerous small changes may obscure the diagnosis, even though the evidence is conclusive. We show that subtle patterns of change may be recognized using marked cards. These consist of a "patient data card" on which is typed a number of scales, each representing one of the measurements, and a "statrule" card which contains the statistics for these measurements for the diseases of concern. These cards implement procedures often used in studies of numerical diagnosis made with digital computers. The statistical basis of statrules is discussed in the first part of the paper using primarily a geometric point of view. The second part of the paper shows how to use and construct statrules.
- Published
- 1969
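The "statrule" cards above implement, by hand, the kind of per-measurement statistical scoring a digital computer would do. As a rough computerized analogue only (the paper's actual card statistics are not given here), one can score each disease by the joint Gaussian likelihood of the patient's measurements:

```python
import numpy as np

# Hypothetical per-disease means and standard deviations for 3 lab indexes.
diseases = {
    "disease A": (np.array([5.0, 140.0, 1.2]), np.array([0.5, 5.0, 0.2])),
    "disease B": (np.array([6.0, 132.0, 1.6]), np.array([0.4, 4.0, 0.3])),
}
patient = np.array([5.8, 135.0, 1.5])   # many small deviations, none gross

def log_likelihood(x, mean, sd):
    """Sum of independent Gaussian log-densities, one per measurement."""
    return float(np.sum(-0.5 * ((x - mean) / sd) ** 2 - np.log(sd)))

scores = {d: log_likelihood(patient, m, s) for d, (m, s) in diseases.items()}
print(max(scores, key=scores.get), scores)   # subtle pattern favors disease B
```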
15. Selection of Lives
- Author
- E. A. J. Heath
- Abstract
In dealing with the selection of lives we leave the actuarial idea of selection and come down to the practical consideration of the individual. The actuarial student is accustomed to dealing with select lives and we want to find out how these lives are classified. In this practical sense there are two types of selection, medical and non-medical, the latter referring not to a non-medical scheme of assurance but to the points arising from the papers and circumstances of the case apart from the medical report. It is proposed to consider both these aspects and to show that the actuary can take his part in the selection just as much as the Principal Medical Officer. Some considerations need the actuary alone, others the P.M.O. alone, and the best results can only be obtained by whole-hearted co-operation between the two. No attempt is made to discuss extra risks, this being outside the scope of the paper, and we shall merely examine the various phases of the selection and outline points which the actuary must be ready to detect so that he can assist the P.M.O. as far as possible.
- Published
- 1931
16. Some Psychological Properties of Digital Learning Nets
- Author
- Igor Aleksander
- Subjects
- Recall, Generalization, General Engineering, Digital learning
- Abstract
The paper initially discusses some of the differences between conventional computer learning algorithms and digital learning nets. Particular attention is given to the division of all sensory information into pattern information and identity information and the relationships of these quantities during a learning operation. Generalization in learning nets is discussed where digital nets are compared with threshold elements. The enhancement of the net's behaviour by means of feedback is the salient topic in the second half of the paper. A new definition of “recognition” is discussed: this is based on the generation of a stable oscillation in the feedback network. Experiments on small learning nets are described to illustrate that nets with feedback can recognize and recall sequences. The final section of the paper draws some tentative comparisons between digital learning nets and the human brain.
- Published
- 1970
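Digital learning nets of the kind the abstract contrasts with threshold elements are commonly built from addressable memory elements. The following is a minimal n-tuple-style sketch in that spirit (my reconstruction, not the paper's circuits): random tuples of input bits address small lookup tables set during training, and recognition is the fraction of tables that respond, which is where generalization comes from.

```python
import random

random.seed(0)
N_BITS, TUPLE, N_NODES = 16, 4, 8
# Each node samples a fixed random 4-tuple of input bit positions.
maps = [random.sample(range(N_BITS), TUPLE) for _ in range(N_NODES)]
memory = [set() for _ in range(N_NODES)]   # addresses seen during training

def address(bits, positions):
    return tuple(bits[p] for p in positions)

def train(bits):
    for mem, pos in zip(memory, maps):
        mem.add(address(bits, pos))

def response(bits):
    """Fraction of nodes whose stored addresses match; unseen patterns
    can still score highly because most nodes miss a single flipped bit."""
    return sum(address(bits, pos) in mem for mem, pos in zip(memory, maps)) / N_NODES

pattern = [1, 0] * 8
train(pattern)
noisy = pattern[:]; noisy[0] ^= 1          # one flipped bit
print(response(pattern), response(noisy))  # 1.0 vs. a value close to 1.0
```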
17. A Bayesian Sequential Life Test
- Author
- V. D. Barnett
- Subjects
- Statistics and Probability, Applied Mathematics, Exponential distribution, Bayesian probability, Life test, Prior information, Modeling and Simulation
- Abstract
Detailed examination of the lifetimes of components and assemblies by means of the usual sequential probability ratio tests is often not feasible because of the prohibitive time or cost involved in such an examination. Situations commonly arise, however, where extraneous information is available (perhaps in the form of experience of similar situations or concerning the reputation of the supplier of the components or assemblies) which reflects on the current situation. Such information might be incorporated in a Bayesian analysis, but little work seems to have been done in this area. This paper presents a possible sequential life test incorporating prior information, applied to the simplest situation of components with exponentially distributed lifetimes, tested individually. The results for conventional sequential life tests are used as a rough yardstick against which to measure the properties of the Bayesian life test discussed in the paper.
- Published
- 1972
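For exponentially distributed lifetimes, a Gamma prior on the failure rate is conjugate, so the posterior after each observed lifetime is available in closed form, and a sequential rule can stop when the posterior probability of an unacceptable rate is extreme enough. Barnett's exact test statistic is not reproduced in the abstract, so this is only a minimal conjugate-update sketch with invented thresholds:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
true_rate = 0.5                       # failures per unit time (hidden)
a, b = 2.0, 4.0                       # Gamma(shape a, rate b) prior on the rate
limit, accept, reject = 0.4, 0.05, 0.95

for n, t in enumerate(rng.exponential(1 / true_rate, size=50), start=1):
    a, b = a + 1, b + t               # conjugate update per tested component
    p_bad = 1 - stats.gamma.cdf(limit, a, scale=1 / b)
    if p_bad < accept or p_bad > reject:
        break
print(f"stopped after {n} items, P(rate > {limit}) = {p_bad:.3f}")
```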
18. A FACET-FACTORIAL APPROACH TO THE CONSTRUCTION OF RATING SCALES TO MEASURE COMPLEX BEHAVIORS
- Author
- Harold F. Abeles
- Subjects
- Factorial, Behaviorally anchored rating scales, Test validity, Rating scale, Achievement test, Performance rating, Education
- Abstract
The purpose of this study was to examine a technique for the development of performance rating scales to measure achievement in courses whose objectives require complex behaviors not easily measurable with paper-and-pencil achievement tests. A facet-factorial approach to rating scale construction (Butt & Fiske, 1968) was employed (i.e., the behavior was conceptualized as multidimensional and items for the scales were selected by employing factor-analytic techniques) to construct scales to measure clarinet music performance. The three major results of the study were: (1) a thirty-item rating scale based on a six-factor structure of clarinet music performance; (2) high inter-judge reliability estimates for both the total score (above .90) and the scale scores (above .60); and (3) criterion-related validity coefficients greater than .80. Results of the investigation suggest that the facet-factorial approach can be an effective technique for the construction of rating scales to measure complex behavior such as music performance. One of the main difficulties in the evaluation of complex behaviors is that the measures employed are typically subjective judgments based on irregular and uncontrolled observations (Ebel, 1965; Whybrew, 1962), and a consensus among judges concerning the adequacy of a performance is difficult to obtain.
- Published
- 1973
19. Stimulus programming in psychophysics
- Author
- J. E. Keith Smith
- Subjects
- Markov chain, Applied Mathematics, Stimulus (physiology), Ogive, Sensitivity testing, Statistics, Psychophysics, Logistic function, General Psychology
- Abstract
The major thesis of this paper has been that one should choose a psychophysical method appropriate to the question which prompted the inquiry. It has been suggested that the method of constant stimuli, although not inappropriate for most questions, is by no means optimal for any particular question. A class of stimulus programming techniques called Markov designs has been briefly described. These have the principal advantage of using data as they are gathered to improve the selection of stimuli for the remainder of the experiment. This improvement is achieved by concentrating observations in the region of most interest. They are also reasonably easy to carry out, can be tailored to a wide range of experimental objectives, and pose no new problems in data analysis. Finally, data analysis is discussed. It is pointed out that the question of which mathematical form to use in graduating the data is a minor one if the data are efficiently gathered and that, within reason, mathematical convenience is a good criterion. An advantage of the logistic function over the normal ogive is mentioned. Few of the ideas advanced here are new, but many have not been appreciated fully by psychophysicists. The emphasis in this paper has been on the areas of sensitivity testing in which psychophysics has lagged behind best statistical practice, although it was pointed out that no efficient methods of analysis of the continuous response procedures have yet been devised by the statisticians. Better communication in both directions is clearly desirable.
- Published
- 1961
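A simple member of the Markov-design family described above is the up-down staircase: each response moves the next stimulus one step, so the stimulus sequence forms a Markov chain that concentrates observations near threshold. A small simulation sketch with a hypothetical logistic observer (my example, not the paper's):

```python
import numpy as np

rng = np.random.default_rng(3)
threshold, slope = 0.0, 1.0
p_detect = lambda x: 1 / (1 + np.exp(-(x - threshold) / slope))  # logistic observer

x, step, trail = 3.0, 0.5, []
for _ in range(200):
    detected = rng.random() < p_detect(x)
    trail.append(x)
    x += -step if detected else step   # simple up-down rule -> ~50% point
print("mean stimulus over last 100 trials:", np.mean(trail[-100:]).round(2))
```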
20. Information reduction in the analysis of sequential tasks
- Author
- Michael I. Posner
- Subjects
- Concept Formation, Information Theory, Stimulus (physiology), Classification, Concept learning, Humans, Learning, General Psychology
- Abstract
This paper proposes a taxonomy of information-processing tasks. Information conserving, reducing, and creating operations are viewed as different methods of processing. The main concern of this paper is information reduction which, it is suggested, represents a kind of thinking in which the solution is in some way implicit in the problem, but in which the input information must be reflected in a reduced or condensed output. A number of tasks within the areas of concept identification and utilization are shown to have this character. If the tasks require complete representation of the stimulus in the response (condensation), the amount of information reduced is directly related to difficulty both during learning and in utilization of previously learned rules. If the tasks allow Ss to ignore information in the stimulus (gating), the direct relation between reduction and difficulty is found during learning but may not occur after the rule is learned.
- Published
- 1964
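The "amount of information reduced" in this framework is just the entropy of the input minus the entropy of the required output. A one-line worked example under the usual equiprobable assumption (my illustration, not the paper's stimuli):

```python
from math import log2

stimuli, responses = 8, 2   # classify 8 equally likely items into 2 bins
reduced = log2(stimuli) - log2(responses)
print(f"{reduced} bits reduced per trial")   # 3 - 1 = 2.0 bits
```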
21. Perimetry—The information theoretical basis for its automation
- Author
- A. Roulier, Franz Fankhauser, and Pierre Koch
- Subjects
- Automated perimetry; Information Theory; Electronic Data Processing; Models, Biological; Automation; Sensory Systems; Ophthalmology; Humans; Visual Field Tests; Visual Fields; Patient simulation
- Abstract
This paper is concerned with communication between patient and perimetrist. The problem is considered in terms of information theory with the aim of finding criteria for the design of a largely automated system of perimetry for the acquisition and processing of data. The agreement of theoretical expectations and results obtained by patient simulation, as presented in a foregoing paper, is encouraging. It is shown that a memory of 8000 words is probably sufficient for a control computer in automated perimetry.
- Published
- 1972
22. On the Asymptotic Improvement in the Outcome of Supervised Learning Provided by Additional Nonsupervised Learning
- Author
- D. B. Cooper and J. H. Freeman
- Subjects
- Supervised learning, Unsupervised learning, Semi-supervised learning, Generalization error, Cluster analysis, Theoretical Computer Science, Computational Theory and Mathematics
- Abstract
This paper treats an aspect of the learning or estimation phase of statistical pattern recognition (and adaptive statistical decision making in general). Simple mathematical expressions are derived for the improvement in supervised learning provided by additional nonsupervised learning when the number of learning samples is large so that asymptotic approximations are appropriate. The paper consists largely of the examination of a specific example, but, as is briefly discussed, the same procedure can be applied to other parametric problems and generalization to nonparametric problems seems possible. The example treated has the additional interesting aspect that the data does not have structure that would enable the machine to learn in the nonsupervised mode alone; but the additional nonsupervised learning can provide substantial improvement over the results obtainable by supervised learning alone. A second purpose of the paper is to suggest that a new fruitful area of research is the analytical study of the possible benefits of combining supervised and nonsupervised learning.
- Published
- 1970
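A minimal numeric illustration of the abstract's claim, under assumptions of my own choosing (not the paper's example): two equiprobable classes at ±θ with unit-variance Gaussian noise. Labeled samples estimate θ directly, while unlabeled samples also constrain θ because E[x²] = θ² + 1, and pooling the two estimates reduces error.

```python
import numpy as np

rng = np.random.default_rng(4)
theta = 1.0
n_lab, n_unlab = 20, 2000

labels = rng.integers(0, 2, n_lab) * 2 - 1            # -1 or +1
x_lab = theta * labels + rng.normal(0, 1, n_lab)
x_unlab = theta * (rng.integers(0, 2, n_unlab) * 2 - 1) + rng.normal(0, 1, n_unlab)

est_sup = np.mean(x_lab * labels)                     # supervised estimate of theta
est_unsup = np.sqrt(max(np.mean(x_unlab**2) - 1, 0))  # from E[x^2] = theta^2 + 1
est_comb = 0.5 * (est_sup + est_unsup)                # naive pooling for illustration

print(f"supervised {est_sup:.3f}, +unsupervised {est_comb:.3f}, true {theta}")
```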
23. Error Categorization and Analysis in Man-Computer Communication Systems
- Author
- Leon H. Nawrocki, Michael H. Strub, and Ross M. Cecil
- Subjects
- Human reliability, Categorization, Communications system, Reliability engineering, Reliability (statistics)
- Abstract
This paper briefly examines traditional approaches to human reliability and presents a technique which permits the system designer to derive a mutually exclusive and exhaustive set of operator error categories in a man-computer system. These error categories are defined in terms of process failures and provide the system designer with a qualitative index suitable for determining error causes and consequences. The technique is demonstrated, and the utility of the resulting error categories is evaluated in the context of two studies on a military information processing system. The paper concludes with a brief discussion of detectable and non-detectable errors and a suggestion for determining the impact of errors on ultimate system goals.
- Published
- 1973
24. Measurement of subjective probability
- Author
- Carl-Axel S. Staël von Holstein
- Subjects
- Scoring rule, Incentive, Honesty, Experimental and Cognitive Psychology, General Medicine
- Abstract
Assessments of subjective probabilities do not always correspond with the assessor's true beliefs. It is therefore essential to provide him with an incentive to make honest assessments. This could be accomplished by using a scoring rule (a function of the assessed probabilities and the event which eventually occurs). The paper discusses various properties that scoring rules should possess in order to encourage honesty. Examples of scoring rules are given together with some practical and experimental experience with scoring rules. The paper also includes a survey of assessment techniques which do not rely on scoring rules.
- Published
- 1970
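A quick check of the honesty property discussed above, for one classic scoring rule: under the quadratic (Brier) rule, an assessor who believes the event has probability q maximizes expected score only by reporting p = q. The grid search below verifies this numerically (a standard result, not an example from the paper):

```python
import numpy as np

def brier_score(p, occurred):
    """Quadratic scoring rule; higher is better."""
    return -(p - occurred) ** 2

q = 0.7                                   # assessor's true belief
p_grid = np.linspace(0, 1, 101)
expected = q * brier_score(p_grid, 1) + (1 - q) * brier_score(p_grid, 0)
print("best report:", p_grid[np.argmax(expected)])   # 0.7
```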
25. Decision processes in perception
- Author
- Theodore G. Birdsall, John A. Swets, and Wilson P. Tanner
- Subjects
- Visual perception, Decision theory, Detection theory, Psychophysics, Perception, General Psychology
- Abstract
About 5 years ago, the theory of statistical decision was translated into a theory of signal detection. Although the translation was motivated by problems in radar, the detection theory that resulted is a general theory for, like the decision theory, it specifies an ideal process. The generality of the theory suggested to us that it might also be relevant to the detection of signals by human observers. Beyond this, we were struck by several analogies between this description of ideal behavior and various aspects of the perceptual process. The detection theory seemed to provide a framework for a realistic description of the behavior of the human observer in a variety of perceptual tasks. The particular feature of the theory that was of greatest interest to us was the promise that it held of solving an old problem in the field of psychophysics. This is the problem of controlling or specifying the criterion that the observer uses in making a perceptual judgment. The classical methods of psychophysics make effective provision for only a single free parameter, one that is associated with the sensitivity of the observer. They contain no analytical procedure for specifying independently the observer's criterion. These two aspects of performance are confounded, for example, in an experiment in which the dependent variable is the intensity of the stimulus that is required for a threshold response. The present theory provides a quantitative measure of the criterion. There is left, as a result, a relatively pure measure of sensitivity. The theory, therefore, promised to be of value to the student of personal and social processes in perception as well as to the student of sensory functions. A second feature of the theory that attracted us is that it is a normative theory. We believed that having a standard with which to compare the behavior of the human observer would aid in the description and in the interpretation of experimental results, and would be fruitful in suggesting new experiments. This paper begins with a brief review of the theory of statistical decision and then presents a description of the elements of the theory of signal detection appropriate to human observers. [Footnote 1: This paper is based upon Technical Report No. 40, issued by the Electronic Defense Group of the University of Michigan in 1955. The research was conducted in the Vision Research Laboratory of the University of Michigan with support from the United States Army Signal Corps and the Naval Bureau of Ships. Our thanks are due H. R. Blackwell and W. M. Kincaid for their assistance in the research, and D. H. Howes for suggestions concerning the presentation of this material. This paper was prepared in the Research Laboratory of Electronics, Massachusetts Institute of Technology, with support from the Signal Corps, Air Force (Operational Applications Laboratory and Office of Scientific Research), and Office of Naval Research. This is Technical Report No. ESD-TR-61-20. Footnote 2: For a formal treatment of statistical decision theory, see Wald (1950); for a brief and highly readable survey of the essentials, see Bross (1953). Parallel accounts of the detection theory may be found in Peterson, Birdsall, and Fox (1954) and in Van Meter and Middleton (1954).]
- Published
- 1961
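The separation this paper introduced, sensitivity versus response criterion, is now textbook signal detection theory. A minimal computation of d' and criterion c from hit and false-alarm rates, using the standard formulas rather than the paper's own notation:

```python
from scipy.stats import norm

hit_rate, fa_rate = 0.84, 0.16
z = norm.ppf
d_prime = z(hit_rate) - z(fa_rate)             # sensitivity
criterion = -0.5 * (z(hit_rate) + z(fa_rate))  # response bias
print(f"d' = {d_prime:.2f}, c = {criterion:.2f}")   # ~1.99, ~0.00
```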
26. On some new and simple methods of detecting manganese in natural and artificial compounds, and of obtaining its combinations for œconomical or other uses
- Author
- Edmund Davy
- Subjects
- Manganese, Chemistry, General Medicine
- Abstract
In this paper the growing importance of manganese since its discovery, and its extensive distribution in Nature are noticed. Manganese is chiefly found combined with oxygen, but its oxides are commonly mixed with those of iron, and though different methods of separating them have been recommended, yet no very simple or unobjectionable test for manganese seems to be known. Two methods for detecting manganese are recommended, viz.— 1. The pure hydrated fixed alkalies, potash and soda, and especially potash. 2. Sulphur.
- Published
- 1854
27. Methods of Quality Control
- Author
- T. P. Whitehead and Lorna O. Morris
- Subjects
- Clinical Biochemistry, General Medicine, Medicine
- Abstract
In this paper no attempt is made to review the many analytical and statistical techniques which have been suggested for the maintenance of reproducibility and accuracy in a clinical chemistry laboratory. Instead a description is given of the techniques which have been introduced into this laboratory over the past 7 years, with examples of their usefulness. Since no single technique of quality control is adequate in itself, six methods are suggested in this paper. More would be used if they demonstrated errors which were not shown by our present techniques, provided they did not make impossible demands on labour and personnel. Experience has shown that quality control techniques are failing if it becomes necessary to reject a batch of tests following analysis and calculations. Prevention, based upon experience with quality control techniques, is possible and, particularly with the AutoAnalyzer, the batch can be controlled during processing of the specimens. Retrospective information of analytical methods drifting out of control is often not interpretable on one day's results, because no statistical technique detects a significant change on one day. A computer is used in two of the techniques described, but at least one of these is widely practised in laboratories without a computer. Quality control techniques are not computer dependent but are easier with one. In the future there may be quality control techniques which cannot be performed without a computer. Display of results: the greatest problem in quality control is not performing the techniques but interpreting them, and this is very dependent on the method of displaying the results. In this laboratory, the technician records the results obtained from control sera on a pre-printed form (Fig. 1) which is sent to the biochemist responsible for quality control, who records the results and draws the appropriate graphs. Each section of the laboratory has a bench log book in which dates of changes in solutions, apparatus and techniques are recorded.
- Published
- 1969
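One of the simplest techniques of the kind this paper describes is plotting control-serum results against limits derived from their own history. A minimal Shewhart-style sketch with invented values (the paper's six specific methods are not reproduced here):

```python
import numpy as np

# Hypothetical daily results for one control serum (e.g., glucose, mg/100 ml).
history = np.array([98, 101, 99, 100, 102, 97, 100, 101, 99, 98])
mean, sd = history.mean(), history.std(ddof=1)

today = 105.5
flag = abs(today - mean) > 2 * sd   # common warning rule: outside mean +/- 2 SD
print(f"limits {mean - 2*sd:.1f}-{mean + 2*sd:.1f}, today {today}, flag={flag}")
```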
28. Statistical Analysis: Theory Versus Practice
- Author
- Robert L. Winkler
- Subjects
- Experimental psychology, Bayesian probability, Prior probability, Statistical inference, Statistical analysis, Likelihood function, Scientific reporting
- Abstract
In this paper the gap between theory and practice in statistical analysis is investigated, with particular attention given to the Bayesian approach to statistical analysis. The primary concern is with statistical analysis in experimental psychology, although the paper has implications for all areas in which statistical methods are used. Current statistical practice in experimental psychology and various factors contributing to the theory-practice gap in statistical analysis are considered. Finally, some general questions involving scientific reporting and the use of Bayesian procedures in statistical inference are discussed.
- Published
- 1974
29. Application of Machine Computation of Well Logs in California
- Author
- G. L. Marquis and C. A. Strozier
- Subjects
- Computation, Well logging
- Abstract
This paper was prepared for the California Regional Meeting of the Society of Petroleum Engineers of AIME, held in Santa Barbara, Calif., Nov. 17-18, 1966. Abstract: Machine computation from well bore measurements can provide more data for analysis than is practical, in many cases, from manual computation. Machine computation minimizes arithmetic time for the individual studying well logs. His time is freed for analyzing the computed data, and he can usually command more data to use in his evaluation. The result is a more comprehensive well evaluation than is normally obtained. Machine computation lends itself to high-density information evaluation and precision calculations. While this is not a normal requirement of routine well logs, there are applications for this capability. In most areas in the State of California, the engineer, geologist, or log analyst is faced with the problem on each well of evaluating hundreds of feet of sand. From the standpoint of time alone, this makes a detailed evaluation very difficult and at times impractical. This paper discusses applications of machine computation to process well log measurements and tabulate the resulting data. On a standard open-hole suite of fresh-mud logs, emphasis is placed on several values of porosity determined independently and in combination from measured values of formation acoustic travel time and bulk density. Water saturation evaluation is approached from computed formation resistivity factor ratios (F ratios) and apparent formation water resistivities (Rwa) or water saturations (Sw). Examples of the application of this program are presented. A second important application of machine computation is the analysis of Neutron Lifetime Logs. The connate waters encountered in California are relatively fresh (30,000 ppm NaCl) with formation porosities from 20-30%. In using this log in California or any other area of difficult interpretation, machine computation greatly reduces the analyst's work and improves his evaluation. A third area of application discussed is high-precision wireline measurements.
- Published
- 1966
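The quantities named in the abstract (formation factor F, apparent water resistivity Rwa, water saturation Sw) follow from the standard Archie relations, which a log-computation program of this kind would evaluate foot by foot. A sketch with generic constants (a = 1, m = 2, n = 2 are common defaults, not values from the paper; all log readings below are invented):

```python
import numpy as np

porosity = np.array([0.22, 0.25, 0.30])   # from density/acoustic logs
rt = np.array([20.0, 9.0, 4.0])           # deep resistivity, ohm-m
rw = 0.25                                  # formation water resistivity, ohm-m
a, m, n = 1.0, 2.0, 2.0                    # generic Archie constants

F = a / porosity**m                        # formation resistivity factor
rwa = rt / F                               # apparent water resistivity
sw = (rw / rwa) ** (1.0 / n)               # Archie water saturation
print(np.round(rwa, 3), np.round(sw, 2))
```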
30. Simulation methodology II
- Author
- Grace Carter, J. W. Schmidt, Michael Stonebraker, Michael A. Crane, William E. Biles, R. E. Taylor, Donald L. Iglehart, and V. Chachra
- Subjects
- Computer science, Confidence interval
- Abstract
This session focuses on new techniques to assist practitioners of simulation in obtaining desired results efficiently. Many such users are attempting to find optimum performance of a simulated system. In this situation, the problem of selecting a procedure to search for the best choice is a challenging one. Two papers in this session compare alternate strategies for attacking this question. Other users face the task of finding confidence intervals for quantities obtained from simulation experiments. This job is often complicated by statistical dependence of successive observations. The third paper in this session suggests a way around this difficulty by utilizing properties found in many stable stochastic systems.
- Published
- 1973
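The difficulty the session description mentions, confidence intervals from statistically dependent simulation output, is classically handled by batching: successive observations are grouped into batches whose means are nearly independent. A minimal batch-means sketch, with an AR(1) process standing in for simulation output (my example, not one from the session papers):

```python
import numpy as np

rng = np.random.default_rng(5)
# Autocorrelated "simulation output": an AR(1) process.
x = np.empty(10_000); x[0] = 0.0
for t in range(1, x.size):
    x[t] = 0.8 * x[t - 1] + rng.normal()

n_batches = 20
batch_means = x.reshape(n_batches, -1).mean(axis=1)   # 20 batches of 500
m, se = batch_means.mean(), batch_means.std(ddof=1) / np.sqrt(n_batches)
print(f"95% CI roughly {m - 2*se:.3f} .. {m + 2*se:.3f}")
```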
31. A pattern recognition program that generates, evaluates, and adjusts its own operators
- Author
- Leonard Uhr and Charles Vossler
- Subjects
- Pattern recognition (psychology), Problem space
- Abstract
This paper describes an attempt to make use of machine learning or self-organizing processes in the design of a pattern-recognition program. The program starts not only without any knowledge of specific patterns to be input, but also without any operators for processing inputs. Operators are generated and refined by the program itself as a function of the problem space and of its own successes and failures in dealing with the problem space. Not only does the program learn information about different patterns, it also learns or constructs, in part at least, a secondary code appropriate for the analysis of the particular set of patterns input to it.
- Published
- 1961
32. COMPUTER PATTERN RECOGNITION TECHNIQUES: SOME RESULTS WITH REAL ELECTROCARDIOGRAPHIC DATA
- Author
- Mitsuharu Okajima, S. Yasui, Lawrence Stark, and Gerald H. Whipple
- Subjects
- Electrocardiography, Adaptive filter, Adaptive system, Electronic Data Processing, Biomedical Engineering, General Medicine, Automation, Artificial Intelligence, Medical diagnoses, Humans
- Abstract
Automatic interpretation of electrocardiograms is a particular example of the application of digital computers to medical diagnosis; this paper describes our experience with a new approach involving pattern recognition techniques. The program employs a multiple adaptive matched filter system with a variety of normalization, weighting, comparison, decision, modification, and adapting operations. The flexibility of the method has permitted study of the effects of experimental variations of these operations on the pattern classification process, to simulate human interpretation of electrocardiograms more closely. These programs have been successfully applied to actual electrocardiograms from cardiac patients. This research in the application of computer pattern recognition techniques to the automatic interpretation of electrocardiograms has been undertaken because it joins together three fields of great interest. First, an example of artificial intelligence or a self-organized system is represented by the adaptive filter memory, together with the related decision operations. Second, we consider our program to be a model of complex sensory discrimination and use our intuition of human psychology as a guide when selecting one of several possible program mechanisms to overcome temporary obstacles. Third, the automation of medical diagnosis is a rapidly developing and promising field contributing to medical progress. This paper pays particular attention to the third of these objectives. The present state of computer analysis of electrocardiograms is mainly one of orthogonalization of the spatial vector, point recognition to separate the various component waves, parameterization, in one case via Fourier techniques, and then statistical matrix analysis.
- Published
- 1963
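At the core of the program described above is a bank of matched filters: each stored template is correlated with the incoming waveform and the best-matching class wins. A toy sketch with synthetic waveforms (not ECG data, and not the paper's normalization or adaptation machinery):

```python
import numpy as np

t = np.linspace(0, 1, 200)
templates = {
    "normal": np.exp(-((t - 0.5) / 0.05) ** 2),    # narrow "QRS-like" bump
    "abnormal": np.exp(-((t - 0.5) / 0.15) ** 2),  # widened bump
}
# Normalize templates so matched-filter scores are comparable across classes.
templates = {k: v / np.linalg.norm(v) for k, v in templates.items()}

rng = np.random.default_rng(6)
signal = np.exp(-((t - 0.5) / 0.15) ** 2) + rng.normal(0, 0.05, t.size)

scores = {k: float(np.dot(v, signal)) for k, v in templates.items()}
print(max(scores, key=scores.get), {k: round(s, 2) for k, s in scores.items()})
```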
33. An experimental anatomy computer system
- Author
- H. Reynold Fiege, Robert S. Smith, Elaine Kruger, and Dawn Palit
- Subjects
- Computers; Medicine (miscellaneous); Cognition; Models, Structural; Human–computer interaction; Humans; Male; Anatomy; Information Systems
- Abstract
Future computer systems, to assist the practicing physician, may take many forms. Some may, as independent automata, prepare and present specific diagnoses or identify and recommend specified courses of therapy. Others may assist by augmenting the physician's processes of reasoning to enhance both his diagnostic ability and his therapeutic decisions. Such systems will serve as prosthetic extensions of human cognition and will function as integrated man-machine systems combining the unique features of both the man and the machine in the solution of diagnostic and therapeutic problems. This latter type of synergetic automaton must have the ability to answer any question submitted by the physician within its information content. Thus, it must be able to interpret the question and derive the answer in a simulation of the human ability to use knowledge. To achieve this, basic research is needed to develop a general system for the organization of disciplinary information and general programs for its utilization. Such research involves the development of a simulation, suitable for computer application, of the primitive pattern of human objectification and the fundamental processes of organization. This paper presents the first operational stage of a computer system constructed for the study of one aspect of this problem. The study involved the modeling of gross objects of the physical world, their attributes and relationships; the subject area used was human gross anatomy and the object set, a male human body. The output of the computer system is demonstrated, comments are given, the system is described, and plans for subsequent research are discussed. The paper is written primarily as a progress report on the first operational stage of an experimental system.
- Published
- 1969
34. XI. Letter addressed to the Secretary R. S. by Dr. W. Roberts, F. R. S
- Author
- William Roberts
- Subjects
- Pancreatic Extracts; General Economics, Econometrics and Finance
- Abstract
In deference to the request of Mr. W. R. Dunstan, I wish to correct an error of omission in my paper “On the Estimation of the Amylolytic and Proteolytic Activity of Pancreatic Extracts,” printed in “Proc. Roy. Soc.,” vol. 32, p. 145. Mr. Dunstan points out to me that I had overlooked a paper by himself and Mr. A. F. Dimmock on the “Estimation of Diastase, published in the “Pharmaceutical Journal” for March 8th, 1879, wherein he described a process, in which (as in my method) the cessation of the iodine reaction is utilised for the purpose of gauging the activity of diastasic solutions on starch gelatine. I had not previously seen this paper, and am now glad to have the opportunity of referring to it those who are interested in diastasimetry.
- Published
- 1882
35. Statistical Inference Regarding Markov Chain Models
- Author
- Chris Chatfield
- Subjects
- Statistics and Probability; Markov chain; Markov model; Statistical inference; Statistics, Probability and Uncertainty
- Abstract
The paper reviews different techniques for examining sequential dependencies in a series of observations each of which can have c possible outcomes. The relationship between the likelihood ratio test and information theory is described. The paper also considers how the techniques need to be modified in situations where two successive outcomes are always different.
- Published
- 1973
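One of the techniques this review covers, the likelihood-ratio test of first-order dependence against independence, compares observed transition counts with those expected from the marginal frequencies; 2·log LR is asymptotically chi-squared with (c-1)² degrees of freedom for c outcomes. A compact sketch with invented counts:

```python
import numpy as np
from scipy.stats import chi2

# Observed transition counts n[i, j] between c = 2 outcomes.
n = np.array([[60, 20],
              [25, 45]], dtype=float)
row, col, total = n.sum(1, keepdims=True), n.sum(0, keepdims=True), n.sum()

expected = row * col / total                 # independence model
lr = 2 * np.sum(n * np.log(n / expected))    # 2 log likelihood ratio
df = (n.shape[0] - 1) ** 2
print(f"2logLR = {lr:.1f}, p = {chi2.sf(lr, df):.2g}")
```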
36. A BAYESIAN SOLUTION FOR SOME EDUCATIONAL PREDICTION PROBLEMS
- Author
- D. V. Lindley
- Subjects
- Operations research, Bayesian solution, Statistical analysis, Mathematics
- Abstract
The paper considers a single college selecting students from a variety of high schools. It criticizes the usual statistical analysis and proposes an alternative procedure which uses all the information present in the system more efficiently. The bulk of the paper consists of a mathematical appendix in which some general theory is developed which, it is expected, will be of value in studying other, more complex prediction systems besides the one discussed here.
- Published
- 1969
37. Computer Augmented Organizational Problem Solving
- Author
- Robert Joyner and Kenneth Tunstall
- Published
- 1970
38. What Has Cybernetics to Do with Operational Research?
- Author
- Stafford Beer
- Published
- 1959
- Full Text
- View/download PDF
39. An Improved Algorithm for the Generation of Nonparametric Curves
- Author
- W. J. Lennon, B. W. Jordan, and B. D. Holm
- Subjects
- Nonparametric statistics, Decision variables, Plotter, Numerical control, Algorithm, Theoretical Computer Science, Computational Theory and Mathematics, Mathematics
- Abstract
Generation of curves using incremental steps along fixed coordinate axes is important in such diverse areas as computer displays, digital plotters, and numerical control. Direct implementation of a nonparametric representation of a curve, f(x, y) = 0, has been shown to be attractive for digital generation. The algorithm in this paper is developed directly from the nonparametric representation of the curve, allows steps to be taken to any point adjacent to the current one, and uses decision variables closely related to an error criterion. Consequently, the algorithm is more general and produces curves closer to the actual curve than do previously reported algorithms.
- Published
- 1973
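The idea in the abstract, stepping incrementally while evaluating f(x, y) = 0, can be shown with the classic circle case, f(x, y) = x² + y² - r². This sketch picks each neighboring point by a simple magnitude-of-f test; it illustrates the principle only, not the paper's improved decision variables:

```python
def circle_steps(r):
    """Generate one octant of x^2 + y^2 = r^2 with unit steps,
    choosing at each step the neighbor that keeps f(x, y) nearest zero."""
    f = lambda x, y: x * x + y * y - r * r
    x, y, pts = 0, r, [(0, r)]
    while x < y:
        x += 1                                  # always step east...
        if abs(f(x, y - 1)) < abs(f(x, y)):     # ...optionally also south
            y -= 1
        pts.append((x, y))
    return pts

print(circle_steps(5))   # [(0, 5), (1, 5), (2, 5), (3, 4), (4, 3)]
```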
40. Pedestrian Traffic Planning and the Perception of the Urban Environment: A French Example
- Author
- B. Marchand
- Subjects
- Visual perception; Geography, Planning and Development; Pedestrian; Urban area; Environmental Science (miscellaneous); Perception; Mental mapping; Multidimensional scaling
- Abstract
Planning pedestrian traffic in the city involves a better understanding of pedestrians' behavior and their perception of the urban environment. Lynch's (1961) studies proposed a qualitative method. An alternative method, essentially quantitative, is proposed here: surveyed pedestrians are asked to locate on paper some well-known landmarks (six in all). Distances between each pair of them are measured. The model allows (1) interpretation of the degree of agreement between mental maps, and (2) a study of the 'mean' map. Perception seems to make space more symmetrical. Distortions can be explained by two effects: differences in transportation modes, and particular knowledge of the neighborhood. The mental map recovered through multidimensional scaling is compared with the topographic one. It does not have the metric topology.
- Published
- 1974
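Recovering a "mean" mental map from inter-landmark distances, as described above, is what metric multidimensional scaling does. A minimal sketch with a made-up distance matrix for six landmarks (not the survey data from the paper):

```python
import numpy as np
from sklearn.manifold import MDS

# Hypothetical mean perceived distances between 6 landmarks (symmetric, zero diagonal).
d = np.array([
    [0, 2, 5, 6, 8, 7],
    [2, 0, 4, 5, 7, 6],
    [5, 4, 0, 2, 5, 5],
    [6, 5, 2, 0, 4, 4],
    [8, 7, 5, 4, 0, 2],
    [7, 6, 5, 4, 2, 0],
], dtype=float)

mds = MDS(n_components=2, dissimilarity="precomputed", random_state=0)
coords = mds.fit_transform(d)          # 2-D "mental map" coordinates
print(np.round(coords, 2))
```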
41. The Measurement of Attention Capacity through Concurrent Task Performance with Individual Difficulty Levels and Shifting Priorities
- Author
- Robert A. North and Daniel Gopher
- Subjects
- Engineering, Secondary task, General Medicine, Task analysis, Job analysis, Compensatory tracking, Human factors, Simulation
- Abstract
Some of the unsolved problems in the application of secondary task techniques include: (a) the evaluation of relative changes in performance in dual-task situations; (b) the prediction of possible interactions between different tasks and their components; and (c) the extent of voluntary control of capacity allocation. The present paper describes a three-phase experiment in which an effort was made to attack these problems by a new methodological approach. The three successive phases included separate performance of the experimental tasks (one-dimensional compensatory tracking and a digit-processing reaction-time task) with adaptive adjustment of difficulty, simultaneous performance of the tasks with equal task priorities, and simultaneous performance with several manipulations of the two task priorities. The results have demonstrated the usefulness of the general methodological approach for the assessment of capacity limitations as well as for the evaluation of possible interactions between tasks. With regard to the allocation of capacity, the experimental results proved that, in general, subjects were able to adjust their allocation of capacity to the various changes in task priorities.
- Published
- 1974
42. A Tentative Classification of Decision Making
- Author
- Rita Loeb
- Subjects
- Typology, Subjectivity, Similarity (psychology), Cognitive dissonance, Objectivity (science), Sociology and Political Science, Social psychology
- Abstract
This paper presents a two-dimensional classification of decision making based on two continuous variables which interact, namely, degree of subjectivity (objectivity) of attitude and degree of dissimilarity (similarity) of function of choice alternatives. On the basis of this typology two new ways of dissonance resolution are proposed.
- Published
- 1974
43. Statistical methods for the analysis of genotype-environment interactions
- Author
- G. H. Freeman
- Subjects
- Analysis of Variance, Multivariate statistics, Genotype, Statistics as Topic, Inference, Regression analysis, Environment, Biology, Genes, Genetics, Genetics (clinical), Mathematics
- Abstract
This is largely a review paper, describing various statistical methods for analysing interactions in general and genotype-environment interactions in particular, and giving nearly 100 references to previous work. The joint regression analysis approach introduced by Yates and Cochran (1938) is considered in some detail; alternatives to regression are discussed, as are various stability parameters. Much work has been done on statistical methods for testing for interactions in general, and this also is reviewed, from Tukey's (1949) one degree of freedom for non-additivity to Milliken and Graybill's (1970) generalisation to testing for various types of possible interaction. The difficulties of testing and inference in the presence of interaction are discussed. Data from a two-way table may be regarded as a multivariate set, as first shown by Williams (1952) and later extended by others, particularly Mandel (1969b and 1971). These methods are only just beginning to be used in studies of genotype-environment interactions, and several recent references are given. External measurements may be used to measure the environment, and these may be either physical or biological. Again, the appropriate methods of analysis are fairly new. The interpretation of interactions is considered in relation to the use to be made of the results. It is suggested that various multivariate techniques may be used to assist in the elucidation of interactions, especially when these are not easy to explain by simpler methods of analysis.
- Published
- 1973
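The joint regression (Yates-Cochran) approach the review treats in detail regresses each genotype's yields on an environmental index, usually the environment means over all genotypes, and reads the slope as a stability parameter. A compact sketch with invented yields:

```python
import numpy as np

# yields[g, e]: 3 genotypes grown in 5 environments (hypothetical data).
yields = np.array([
    [4.0, 4.8, 5.5, 6.1, 7.0],
    [3.0, 4.2, 5.6, 6.8, 8.2],
    [5.0, 5.1, 5.3, 5.6, 5.9],
])
env_index = yields.mean(axis=0)          # environment means over genotypes

for g, y in enumerate(yields):
    slope, intercept = np.polyfit(env_index, y, 1)
    # slope ~ 1: average stability; < 1: stable; > 1: responsive to environment
    print(f"genotype {g}: slope {slope:.2f}")
```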
44. Parameter identification in models of hydrologic systems using the digital simulation language PDEL
- Author
- Walter J. Karplus and Wing Cheung Tam
- Subjects
- Partial differential equation, Hydrological modelling, Simulation language, Identification (information), System parameters, Environmental systems, Numerical Analysis, General Computer Science, Applied Mathematics, Modeling and Simulation
- Abstract
Hydrologic systems, like most environmental systems, are usually characterized by partial differential equations. In order to develop a suitable mathematical model for the analysis of such systems, it is first necessary to make use of observed data in order to identify or estimate the system parameters. The digital simulation language PDEL has been extended so as to facilitate this type of parameter identification. The extended version of the language has been named PDEL-ID. In this paper, two hydrologic modeling problems taken from recent water-resource studies are used as examples to illustrate the parameter identification facilities of PDEL. The solution of these two problems demonstrates that little programming effort and virtually no knowledge of optimization techniques are required.
- Published
- 1974
45. A general Bayesian model for hierarchical inference
- Author
-
Scott Barclay and Clinton W. Kelly
- Subjects
Structure (mathematical logic) ,Theoretical computer science ,business.industry ,Probabilistic logic ,Inference ,General Medicine ,Inductive reasoning ,Bayesian inference ,Machine learning ,computer.software_genre ,Variable (computer science) ,Conditional independence ,Artificial intelligence ,Special case ,business ,computer ,Mathematics - Abstract
An inductive inference problem will often be structured so that the target variables (hypotheses) are logically distant from the observable events (data). In this situation it may be difficult or impossible to assess the probabilistic connection between them, but it may be possible to decompose the problem through the use of intermediate or explanatory variables. That is, it will often be possible to assess the likelihood of the observed data given some intermediate variable, and the likelihood of that intermediate variable given another, and so on, until the hypotheses of interest are reached. Inferences which incorporate one or more intermediate variables are called hierarchical, cascaded, or multistage inferences. The present paper gives a normative model for the solution of the general hierarchical inference problem. The formulation begins with a formal description of the hierarchical inference tree, including a discussion of various simplifying conditional independence assumptions. The solution is first derived for three special-case models of differing structure, and then the algorithm for the general solution is given for two cases: one in which the conditional independence assumptions have been made and one in which they have not.
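A minimal numerical sketch of the cascaded update the abstract describes, assuming a single intermediate variable X with the conditional-independence chain H to X to D (given X, the datum D carries no further information about H); all probabilities below are invented for illustration.

```python
import numpy as np

prior_H = np.array([0.5, 0.5])           # P(H), two hypotheses
P_X_given_H = np.array([[0.8, 0.2],      # rows: H, cols: X
                        [0.3, 0.7]])
P_D_given_X = np.array([0.9, 0.1])       # likelihood of observed D per X

# Likelihood of D given H, marginalizing the intermediate variable:
# P(D|H) = sum_x P(D|x) P(x|H)
like_H = P_X_given_H @ P_D_given_X

posterior_H = prior_H * like_H
posterior_H /= posterior_H.sum()
print(posterior_H)                       # approx. [0.685, 0.315]
```

Longer chains of intermediate variables compose the same way, one matrix product per stage, which is exactly what makes the decomposition tractable.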
- Published
- 1973
46. Unsupervised root locus gain selection
- Author
-
Charles J. Kocourek and Richard A. Northouse
- Subjects
business.industry ,Root locus ,Machine learning ,computer.software_genre ,Computer Science Applications ,ComputingMethodologies_PATTERNRECOGNITION ,Control and Systems Engineering ,ComputingMethodologies_SYMBOLICANDALGEBRAICMANIPULATION ,Point (geometry) ,Artificial intelligence ,business ,computer ,Algorithm ,Selection (genetic algorithm) ,Mathematics - Abstract
This paper describes an algorithm for the unsupervised selection of gains for a root locus analysis. The gains selected include breakaway-point gains, critical gains, and additive increments.
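For readers unfamiliar with the terminology, a toy sketch of how breakaway-point gains can be computed for a given open-loop transfer function; the transfer function and tolerance are illustrative, and this is not the authors' unsupervised algorithm.

```python
import numpy as np

# Toy open-loop transfer function G(s)H(s) = N(s)/D(s). Breakaway
# points satisfy D'(s)N(s) - D(s)N'(s) = 0, and the root-locus gain
# there is K = -D(s)/N(s).
num = np.poly1d([1.0])                   # N(s) = 1
den = np.poly1d([1.0, 3.0, 2.0, 0.0])    # D(s) = s(s+1)(s+2)

dK = den.deriv() * num - den * num.deriv()
for s in dK.roots:
    if abs(s.imag) < 1e-9:               # keep real-axis candidates
        K = -den(s.real) / num(s.real)
        if K >= 0:                       # gain must be non-negative
            print(f"breakaway at s = {s.real:.3f}, gain K = {K:.3f}")
```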
- Published
- 1974
47. Glossary and index to remotely sensed image pattern recognition concepts
- Author
-
Robert M. Haralick
- Subjects
Information retrieval ,Glossary ,business.industry ,Computer science ,Perspective (graphical) ,Meaning (non-linguistic) ,Resolution (logic) ,Machine learning ,computer.software_genre ,Field (computer science) ,Index (publishing) ,Artificial Intelligence ,Signal Processing ,Word usage ,Computer Vision and Pattern Recognition ,Artificial intelligence ,business ,computer ,Software - Abstract
The purpose of the glossary is to state in the simplest possible way the general meaning or word usage for many of the terms in image pattern recognition. There is no intent to provide definitive statements for terms such as “resolution” but rather only statements about the general nature of what resolution is. There is no intent to provide mathematical formulas involving integrals or derivatives in any of the statements. Those who need the mathematics can get it from technical papers or texts. The glossary is designed to be read by those generally unfamiliar with the area and to provide them with an overall perspective. The organization approaches that of programmed learning material and can be read smoothly (I hope) from beginning to end. Those needing to look up a specific term can do so via the index. There is some overlap between terms in this glossary and those in glossaries or definitions in radiometry and aerial photography. There is no intent that the way the terms are described here should replace the way they are described in those glossaries and definitions. The overlap is provided here so that the reader can get a perspective on a cluster of terms frequently used in our field. The perspective is intended to start from what the image concept is, through the recording of an image by some sensor, the possible conversion of image format, and the simple analog or more complex digital processing which must be done on the imagery. In short, the perspective is one of image pattern recognition.
- Published
- 1973
48. On hierarchical structure adaptation and systems identification†
- Author
-
C.M. Fry and Andrew P. Sage
- Subjects
Structure (mathematical logic) ,business.industry ,Computer science ,Distributed computing ,System identification ,Machine learning ,computer.software_genre ,Computer Science Applications ,Reduction (complexity) ,Identification (information) ,Parallel processing (DSP implementation) ,Dimension (vector space) ,Control and Systems Engineering ,Artificial intelligence ,business ,Adaptation (computer science) ,computer - Abstract
This paper demonstrates that Magill's method of structure adaptation can be incorporated into the framework of hierarchical estimation and system identification. The associated reduction in computational requirements potentially allows efficient parallel processing of these algorithms for use with systems of high dimension.
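Magill's method maintains a bank of candidate models run in parallel, updating each model's posterior probability from the likelihood of its prediction residual. A scalar toy version of that recursion, with invented parameters rather than the paper's hierarchical formulation, might look like this:

```python
import numpy as np

# Toy system x[k+1] = a * x[k] + noise, with the true a unknown.
rng = np.random.default_rng(0)
a_true, sigma = 0.8, 0.1
candidates = np.array([0.5, 0.8, 0.95])  # hypothesized values of a
prob = np.ones(3) / 3                    # uniform prior over models

x = 1.0
for _ in range(50):
    x_next = a_true * x + sigma * rng.standard_normal()
    resid = x_next - candidates * x      # one residual per model
    like = np.exp(-0.5 * (resid / sigma) ** 2)
    prob *= like                         # Bayesian model update
    prob /= prob.sum()
    x = x_next

print(candidates[np.argmax(prob)])       # converges to the true 0.8
```

Because each candidate model's filter runs independently until the final weighting step, the bank parallelizes naturally, which is the computational point the abstract makes.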
- Published
- 1974
49. Prediction logic: A method for empirical evaluation of formal theory†
- Author
-
David K. Hildebrand, James D. Laing, and Howard Rosenthal
- Subjects
Algebra and Number Theory ,Sociology and Political Science ,Computer science ,business.industry ,Theory ,Artificial intelligence ,Data mining ,Machine learning ,computer.software_genre ,business ,computer ,Social Sciences (miscellaneous) - Abstract
This paper proposes an approach to data analysis that assists the investigator in discriminating among specific relations corresponding to alternative scientific predictions about qualitative variates.
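The centerpiece of prediction logic is a proportionate-reduction-in-error measure (the "del" statistic) that compares the observed probability of the cells a theory forbids with the probability expected under independence. A sketch for a hypothetical 2x2 table, taking the off-diagonal cells as the theory's error cells:

```python
import numpy as np

counts = np.array([[30, 5],
                   [10, 55]], dtype=float)
error = np.array([[0, 1],
                  [1, 0]])       # 1 marks cells the prediction forbids

p = counts / counts.sum()
expected = np.outer(p.sum(axis=1), p.sum(axis=0))  # independence model

observed_err = (error * p).sum()
expected_err = (error * expected).sum()
del_measure = 1.0 - observed_err / expected_err
print(f"del = {del_measure:.3f}")        # approx. 0.681 here
```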
- Published
- 1974
50. A comparison and evaluation of three machine learning procedures as applied to the game of checkers
- Author
-
Arnold K. Griffith
- Subjects
Linguistics and Language ,Polynomial ,Theoretical computer science ,business.industry ,Heuristic ,Computer science ,media_common.quotation_subject ,Machine learning ,computer.software_genre ,Language and Linguistics ,Artificial Intelligence ,Simple (abstract algebra) ,Linear polynomial ,Simplicity ,Artificial intelligence ,business ,computer ,media_common - Abstract
This paper presents two new machine learning procedures used to arrive at “knowledgeable” static evaluators for checker board positions. The static evaluators are compared with each other, and with the linear polynomial used by Samuel [9], using two different numerical indices reflecting the extent to which they agree with the choices of checker experts in the course of tabulated book games. The new static evaluators are found to perform about equally well, despite the relative simplicity of the second, and they perform noticeably better than the linear polynomial. An indication of the significance of the absolute values of these two numerical indices is provided by a discussion of a simple, purely heuristic static evaluator whose performance indices lie between those of the polynomial and those of the other two static evaluators.
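The abstract does not give the exact form of the two indices, but one plausible shape for such an agreement index (the fraction of tabulated book positions in which an evaluator's top-rated move matches the expert's choice) is sketched below; the evaluator interface and data format are hypothetical.

```python
def agreement_index(evaluator, book_positions):
    """Fraction of book positions where the evaluator's preferred
    move coincides with the expert's recorded move.

    book_positions: iterable of (position, expert_move, legal_moves);
    evaluator(position, move) returns a numeric score.
    """
    hits = 0
    for position, expert_move, legal_moves in book_positions:
        scores = {m: evaluator(position, m) for m in legal_moves}
        if max(scores, key=scores.get) == expert_move:
            hits += 1
    return hits / len(book_positions)
```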
- Published
- 1974