46 results
Search Results
2. Letters to the editor: On 'prime phrase' in Feldman and Gries paper
- Author
-
Paul W. Abrahams
- Subjects
Order of operations ,Phrase ,General Computer Science ,Computer science ,business.industry ,Speech recognition ,Compiler ,Artificial intelligence ,computer.software_genre ,business ,computer ,Prime (order theory) ,Natural language processing - Published
- 1968
3. Commentary on Mr. Mooers' paper
- Author
-
T. B. Steel
- Subjects
General Computer Science ,Computer science ,business.industry ,Programming language ,Second-generation programming language ,computer.software_genre ,Programming paradigm ,Fourth-generation programming language ,Artificial intelligence ,Fifth-generation programming language ,First-generation programming language ,business ,computer ,Natural language processing ,Programming language theory - Published
- 1968
4. Variable-Precision Exponentiation.
- Author
-
Richman, P. L. and Timlake, W. P.
- Subjects
COMPUTER algorithms ,COMPUTER programming ,ARTIFICIAL intelligence ,PROGRAMMING languages ,ELECTRONIC data processing ,COMPUTER science - Abstract
A previous paper presented an efficient algorithm, called the Recomputation Algorithm, for evaluating a rational expression to within any desired tolerance on a computer which performs variable-precision arithmetic operations. The Recomputation Algorithm can be applied to expressions involving any variable-precision operations having O(10^(-p) + Σ|ε_i|) error bounds, where p denotes the operation's precision and ε_i denotes the error in the operation's ith argument. This paper presents an efficient variable-precision exponential operation with an error bound of the above order. Other operations, such as log, sin, and cos, which have simple series expansions, can be handled similarly. [ABSTRACT FROM AUTHOR]
- Published
- 1973
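The precision-raising idea in the abstract above can be imitated with Python's decimal module: evaluate at increasing working precision until two successive results agree to within the tolerance. This is a minimal recomputation-style sketch, not the paper's Recomputation Algorithm or its error analysis.

```python
from decimal import Decimal, getcontext

def exp_to_tolerance(x, tol):
    """Evaluate exp(x) by raising the working precision until two
    successive evaluations agree to within tol. Illustrative only:
    the paper derives rigorous error bounds instead of comparing
    successive results. Mutates the global decimal context."""
    x = Decimal(str(x))
    tol = Decimal(str(tol))
    precision = 10
    getcontext().prec = precision
    prev = x.exp()
    while True:
        precision += 10
        getcontext().prec = precision
        cur = x.exp()
        if abs(cur - prev) <= tol:
            return cur
        prev = cur

print(exp_to_tolerance(1, 1e-20))  # e to well past 20 digits
```

The same loop works for any variable-precision operation (log, sin, cos) that the arithmetic package supplies.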
5. A Theorem-Proving Language for Experimentation.
- Author
-
Standish, T. A., Henschen, L., Overbeek, Ross, and Wos, L.
- Subjects
AUTOMATIC theorem proving ,PROGRAMMING languages ,ARTIFICIAL intelligence ,INFERENCE (Logic) ,COMPUTER logic ,COMPUTER software - Abstract
Because of the large number of strategies and inference rules presently under consideration in automated theorem proving, there is a need for developing a language especially oriented toward automated theorem proving. This paper discusses some of the features and instructions of this language. The use of this language permits easy extension of automated theorem-proving programs to include new strategies and/or new inference rules. Such extendability will permit general experimentation with the various alternative systems. [ABSTRACT FROM AUTHOR]
- Published
- 1974
6. Letters to the Editor.
- Author
-
Boas, R. P., Henrici, Peter, Sammet, Jean E., Swanson, Rowena, and Potter, Stephen
- Subjects
LETTERS to the editor ,PROGRAMMING languages ,DIFFERENTIAL equations ,ARTIFICIAL intelligence ,INFORMATION retrieval ,MACHINE theory - Abstract
Presents several letters to the editor on issues related to computer languages. Comments on the usage of heavy sprinkling of capital letters in the computing literature; Discussion on the paper related to the statistical method used in the study of roundoff propagation in the solution of ordinary differential equations; View that vagueness of the terms "information retrieval," "cognitive processes," "artificial intelligence," etc., has led to too many conferences that have tried to span so wide a field that they have not been effective in any one area.
- Published
- 1966
7. Technical Program Sessions and Chairmen.
- Subjects
CONFERENCES & conventions ,COMPUTER science ,CYBERNETICS ,ARTIFICIAL intelligence ,MACHINE theory - Abstract
The article presents information on technical program and sessions of the "25th Anniversary Conference" of the Association for Computing Machinery (ACM), which would be held in Boston, from August 14-16, 1972. John J. Donovan would hold Feature session on "Current Research in Computer Science." The following papers would be presented in the paper sessions: "Artificial Intelligence: Theoretical Papers," by George W. Ernst; "Artificial Intelligence: General Paper," by Thomas G. Evans; "Implementation of Medical Information Systems," by Lael Gatewood; "Computer Languages for Interactive Health Services," by Allan H. Levy; and others.
- Published
- 1972
8. professional activities.
- Subjects
CONFERENCES & conventions ,COMPUTER industry ,COMPUTER science ,ARTIFICIAL intelligence ,COMPUTABLE functions ,INTELLIGENT agents - Abstract
The article presents information about some forthcoming conferences related to the computer industry. The Third Technical Symposium of the Special Interest Group on Computer Science Education of the Association for Computing Machinery (ACM) will be held at the Neil House Hotel in Columbus, Ohio, during February 22-23, 1973. A conference on Principles of Programming Languages, sponsored jointly by the ACM Special Interest Group on Automata and Computability Theory and the ACM Special Interest Group for Programming Languages, will be held at the Copley Plaza Hotel in Boston, Massachusetts, during October 1-3, 1973. A conference on Cognitive Verfahren und Systeme (Artificial Intelligence) will be held in Hamburg, Federal Republic of Germany, during April 11-13, 1973. The Annual Meeting of the American Society for Information Science will be held at the Hilton Hotel, Los Angeles, California, during October 21-25, 1973. The First Annual Symposium of the newly formed ACM Special Interest Group on Measurement and Evaluation will be held in Palo Alto, California, during February 26-28, 1973.
- Published
- 1973
9. Toward an Automata Theory of Brains.
- Author
-
Arbib, Michael A.
- Subjects
MACHINE theory ,ARTIFICIAL intelligence ,HUMAN information processing ,BIONICS ,BRAIN ,ROBOTICS ,AUTOMATION ,LOGIC - Abstract
A source of ideas for automata theory — the study of the brain — has been pushed aside in mathematical development of the theory. This paper suggests the ways in which automata theory might evolve over the next 25 years if it is to contribute to an understanding of how the brain processes information. [ABSTRACT FROM AUTHOR]
- Published
- 1972
10. A Man-Machine Approach Toward Solving the Traveling Salesman Problem.
- Author
-
Krolak, Patrick, Felts, Wayne, and Marble, George
- Subjects
PRODUCTION scheduling ,HUMAN-computer interaction ,ARTIFICIAL intelligence - Abstract
Describes a man-machine approach toward solving the traveling salesman problem, a class of scheduling and routing problem. Features of a computer-aided heuristic technique adopted for the problem; New directions in the fields of man-machine interaction and artificial intelligence arising from the approach.
- Published
- 1971
11. COMING EVENTS.
- Subjects
CONFERENCES & conventions ,BIOMATHEMATICS ,MACHINE theory ,ARTIFICIAL intelligence - Abstract
The article offers information on upcoming congresses and symposiums in the U.S., including the Seventh Annual Symposium on Switching and Automata Theory to be held at the University of California, Berkeley, California, on October 26-28, 1966, the Coins Symposium on Learning, Adaptation, and Control in Information Systems to be held on August 22-24, 1966, and the Symposium on Biomathematics and Computer Science in the Life sciences to be held on March 24-26, 1966.
- Published
- 1966
12. Comments on Moorer's Music and Computer Composition.
- Author
-
Smoliar, Stephen W. and McGuire, Michael R.
- Subjects
COMPUTER composition ,COMPUTER sound processing ,ELECTRONIC music ,ARTIFICIAL intelligence ,DIGITAL computer simulation ,MUSICAL composition mechanical aids - Abstract
Comments on J.A. Moorer's article entitled "Music and Computer Composition." Criticism of the views of the author on musical composition; Results of the simulation of the compositional abilities of humans; Reply by Moorer on the criticism of his views.
- Published
- 1972
13. A Method for Composing Simple Traditional Music by Computer.
- Author
-
Rader, Gary M. and Montgomery, C. A.
- Subjects
COMPUTER simulation ,COMPUTER programming ,COMPUTER scientists ,OPERATIONS research ,COMPUTER-aided design ,MUSICIANS - Abstract
A method is described for composing musical rounds by computer. This method uses some music theory plus additional heuristics. Fundamental to the method is a set of productions together with sets of applicability rules and weight rules which operate on the productions deciding when and to what extent they are available for use. Several rounds generated by the computer implementation of the method are presented. Generally, the resultant music sounds mediocre to the professional although usually pleasing to the layman. It appears that full-blown music theory is not needed for rounds—all the hardware required for structural levels is not necessary for these pieces. The author has tried to address both musicians and computer scientists. [ABSTRACT FROM AUTHOR]
- Published
- 1974
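The production-plus-weight-rule scheme described in the Rader entry can be sketched in a few lines: each rule inspects the current context and returns a non-negative weight, with weight 0 meaning "not applicable here." The production names and rules below are invented for illustration; the paper's actual rules encode real music theory.

```python
import random

# Hypothetical production set; the paper's productions generate
# actual melodic material, not labels like these.
PRODUCTIONS = ["repeat previous note", "step up", "step down", "leap"]

def weight_rules(prev_move):
    """Return one weight per production given the context (here,
    just the previous melodic move). Favors stepwise motion and
    forbids a leap immediately after a leap."""
    return [1, 3, 3, 0 if prev_move == "leap" else 1]

def choose_production(prev_move, rng=random):
    weights = weight_rules(prev_move)
    applicable = [(p, w) for p, w in zip(PRODUCTIONS, weights) if w > 0]
    names = [p for p, _ in applicable]
    w = [w for _, w in applicable]
    return rng.choices(names, weights=w, k=1)[0]

rng = random.Random(0)
print([choose_production("step up", rng) for _ in range(8)])
```

Iterating this choice, feeding each chosen production back in as the next context, yields a weighted-grammar melody generator in miniature.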
14. Order-n Correction for Regular Languages.
- Author
-
Wagner, Robert A. and Standish, T.A.
- Subjects
PROGRAMMING languages ,INFORMATION retrieval ,ARTIFICIAL intelligence - Abstract
Examines a method for calculating the correction of regular computer languages. Requirements for the calculation; Application of method in information retrieval, artificial intelligence and spelling correction systems; Definition of a set of edit operations.
- Published
- 1974
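The "set of edit operations" mentioned in the Wagner and Standish entry is, in spirit, the insert/delete/replace repertoire of string edit distance. A minimal dynamic-programming sketch of that building block follows; the paper's actual contribution, finding a least-cost correction to an entire regular language, goes well beyond this.

```python
def edit_distance(a, b):
    """Minimum number of single-character insertions, deletions,
    and replacements turning string a into string b (one-row
    dynamic program)."""
    m, n = len(a), len(b)
    dp = list(range(n + 1))          # distances from a[:0] to b[:j]
    for i in range(1, m + 1):
        prev_diag, dp[0] = dp[0], i  # dp[0]: distance from a[:i] to ""
        for j in range(1, n + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            prev_diag, dp[j] = dp[j], min(
                dp[j] + 1,           # delete a[i-1]
                dp[j - 1] + 1,       # insert b[j-1]
                prev_diag + cost,    # replace (or match) last chars
            )
    return dp[n]

print(edit_distance("kitten", "sitting"))  # 3
```

Spelling-correction systems of the kind the abstract mentions pick, from a set of candidates, the one at minimum edit distance from the misspelled input.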
15. A Learning Program Which Plays Partnership Dominoes.
- Author
-
Smith, Michael H.
- Subjects
DOMINOES ,COMPUTER game programming ,ARTIFICIAL intelligence ,BASIC (Computer program language) - Abstract
Describes a learning program which has been written in BASIC computer language to play four-player partnership dominoes. Application of different principles of artificial intelligence and problem solving; Description of the rules; Utilization of a strategy signature table which classifies board situations throughout the interactions of game parameters.
- Published
- 1973
16. COKO III: The Cooper-Koz Chess Program.
- Author
-
Kozdrowicki, Edward W., Cooper, Dennis W., and Lawson, C. L.
- Subjects
GAMES ,VIDEO games ,CHESS clubs ,ALGORITHMS ,BOARD games ,IBM computers - Abstract
COKO III is a chess player written entirely in Fortran. On the IBM 360-65, COKO III plays a minimal chess game at the rate of .2 sec cpu time per move, with a level close to lower chess club play. A selective tree searching procedure controlled by tactical chess logistics allows a deployment of multiple minimal game calculations to achieve some optimal move selection. The tree searching algorithms are the heart of COKO's effectiveness, yet they are conceptually simple. In addition, an interesting phenomenon called a tree-searching catastrophe has plagued COKO's entire development just as it troubles a human player. Standard exponential growth is curbed to a large extent by the definition and trimming of the Fischer set. A clear distinction between tree pruning and selective tree searching is also made. Representation of the chess environment is described along with a strategical preanalysis procedure that maps the Lasker regions. Specific chess algorithms are described which could be used as a command structure by anyone desiring to do some chess program experimentation. A comparison is made of some mysterious actions of human players and COKO III. [ABSTRACT FROM AUTHOR]
- Published
- 1973
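The tree pruning the COKO III abstract contrasts with selective searching can be illustrated by generic alpha-beta minimax, the standard way to curb exponential growth in game trees. This is a textbook sketch, not COKO's algorithm.

```python
def alphabeta(node, maximizing=True, alpha=float("-inf"), beta=float("inf")):
    """Alpha-beta minimax over a game tree given as nested lists;
    leaves are numeric scores from the maximizing player's view."""
    if not isinstance(node, list):       # leaf: static evaluation
        return node
    best = float("-inf") if maximizing else float("inf")
    for child in node:
        score = alphabeta(child, not maximizing, alpha, beta)
        if maximizing:
            best = max(best, score)
            alpha = max(alpha, best)
        else:
            best = min(best, score)
            beta = min(beta, best)
        if beta <= alpha:                # remaining siblings cannot matter
            break
    return best

tree = [[3, 5], [2, 9]]   # two replies to each of our two candidate moves
print(alphabeta(tree))    # 3: opponent minimizes within each pair
```

Selective searching, by contrast, decides which children to generate at all, rather than cutting off siblings after the fact; the paper's point is that the two ideas are distinct.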
17. Music and Computer Composition.
- Author
-
Moorer, James Anderson
- Subjects
COMPUTER composition ,COMPUTER programming ,COMPUTER music ,HARMONY in music ,POPULAR music ,MELODY - Abstract
The problem discussed is that of simulating human composition of Western popular music by computer and some relevant theories of music and harmony are given. Problems with this kind of program and several schemes that are known not to work are discussed. Several previous computer compositions are discussed, including the ILLIAC Suite. A program to generate short melody fragments was written to simulate some of the aspects of human composition. Five samples of its output are presented and discussed. It was discovered that although the fragments show many of the characteristics of popular melodies, they have a strangely alien sound. It is theorized that this is because the relevant probabilities which would discriminate against unfamiliar sequences were not used. [ABSTRACT FROM AUTHOR]
- Published
- 1972
18. On Shrinking Binary Picture Patterns.
- Author
-
Levialdi, S. and Newman, W.
- Subjects
PARALLEL processing ,ARTIFICIAL neural networks ,ALGORITHMS ,COMPUTER networks ,ARTIFICIAL intelligence ,ELECTRONIC data processing - Abstract
A parallel processing algorithm for shrinking binary patterns to obtain single isolated elements, one for each pattern, is presented. This procedure may be used for counting patterns on a matrix, and a hardware implementation of the algorithm using large scale integrated technology is envisioned. The principal features of this method are the very small window employed (two-by-two elements), the parallel nature of the process, and the possibility of shrinking any pattern, regardless of the complexity of its configuration. Problems regarding merging and disconnection of patterns during the process as well as the determination of the maximum number of steps necessary to obtain a single isolated element from a pattern, are reviewed and discussed. An analogy with a neural network description, in terms of McCulloch-Pitts "neurons" is presented. [ABSTRACT FROM AUTHOR]
- Published
- 1972
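The Levialdi and Newman entry describes a parallel 2×2-window shrinking process whose end result is a count of patterns. As a baseline for what that computes, here is a sequential stand-in that counts 8-connected patterns by flood fill; it is emphatically not the paper's parallel algorithm, only a reference for the same answer.

```python
def count_patterns(grid):
    """Count 8-connected patterns of 1s in a binary matrix by flood
    fill (sequential stand-in for parallel shrinking)."""
    rows, cols = len(grid), len(grid[0])
    seen = [[False] * cols for _ in range(rows)]

    def flood(r, c):
        stack = [(r, c)]
        while stack:
            i, j = stack.pop()
            if 0 <= i < rows and 0 <= j < cols and grid[i][j] and not seen[i][j]:
                seen[i][j] = True
                # push all 8 neighbors (the (0,0) offset is filtered by seen)
                stack.extend((i + di, j + dj)
                             for di in (-1, 0, 1) for dj in (-1, 0, 1))

    count = 0
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] and not seen[r][c]:
                count += 1
                flood(r, c)
    return count

print(count_patterns([[1, 1, 0],
                      [0, 0, 0],
                      [0, 0, 1]]))  # 2 patterns
```

The paper's contribution is doing this with purely local 2×2 operations applied in parallel everywhere at once, which is what makes a hardware implementation attractive.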
19. Symbolic Integration: The Stormy Decade.
- Author
-
Moses, Joel
- Subjects
MATHEMATICAL logic ,INTEGRATORS ,ARTIFICIAL intelligence - Abstract
Describes approaches to symbolic integration in the 1960s. Work on artificial intelligence which led to Slagle's Symbolic Automatic Integrator and Moses' Symbolic Integrator; Implementations from algebraic manipulation; Proof of the unsolvability of the problem for a class of functions derived from the mathematics approach; Generalizations of Risch algorithms for solving differential equations.
- Published
- 1971
20. Toward Automatic Program Synthesis.
- Author
-
Manna, Zohar, Waldinger, Richard I., and Gries, D.
- Subjects
PROGRAMMING languages ,PROBLEM solving ,COMPUTER programming ,C (Computer program language) ,MATHEMATICAL induction ,ARTIFICIAL intelligence - Abstract
An elementary outline of the theorem-proving approach to automatic program synthesis is given, without dwelling on technical details. The method is illustrated by the automatic construction of both recursive and iterative programs operating on natural numbers, lists, and trees. In order to construct a program satisfying certain specifications, a theorem induced by those specifications is proved, and the desired program is extracted from the proof. The same technique is applied to transform recursively defined functions into iterative programs, frequently with a major gain in efficiency. It is emphasized that in order to construct a program with loops or with recursion, the principle of mathematical induction must be applied. The relation between the version of the induction rule used and the form of the program constructed is explored in some detail. [ABSTRACT FROM AUTHOR]
- Published
- 1971
21. Experiments in Automatic Learning for a Multipurpose Heuristic Program.
- Author
-
Lawson, C. L., Slagle, James R., and Farrell, Carl D.
- Subjects
COMPUTER programming ,CALCULUS software ,MATHEMATICAL analysis ,REGRESSION analysis ,MATHEMATICAL statistics ,MULTIVARIATE analysis - Abstract
An automatic learning capability has been developed and implemented for use with the MULTIPLE (MULTIpurpose Program that LEarns) heuristic tree-searching program, which is presently being applied to resolution theorem-proving in predicate calculus. MULTIPLE's proving program (PP) uses two evaluation functions to guide its search for a proof of whether or not a particular goal is achievable. Thirteen general features of predicate calculus clauses were created for use in the automatic learning of better evaluation functions for PP. A multiple regression program was used to produce optimal coefficients for linear polynomial functions in terms of the features. Also, automatic data-handling routines were written for passing data between the learning program and the proving program, and for analyzing and summarizing results. Data was generally collected for learning (regression analysis) from the experience of PP. A number of experiments were performed to test the effectiveness and generality of the learning program. Results showed that the learning produced dramatic improvements in the solutions to problems which were in the same domain as those used for collecting learning data. Learning was also shown to generalize successfully to domains other than those used for data collection. Another experiment demonstrated that the learning program could simultaneously improve performance on problems in a specific domain and on problems in a variety of domains. Some variations of the learning program were also tested. [ABSTRACT FROM AUTHOR]
- Published
- 1971
22. Transition Network Grammars for Natural Language Analysis.
- Author
-
Woods, W. A. and Bobrow, D. G.
- Subjects
NATURAL language processing ,NETWORK grammar ,PROGRAMMING languages ,ARTIFICIAL intelligence ,ELECTRONIC data processing ,HUMAN-computer interaction - Abstract
The use of augmented transition network grammars for the analysis of natural language sentences is described. Structure-building actions associated with the arcs of the grammar network allow for the reordering, restructuring, and copying of constituents necessary to produce deep-structure representations of the type normally obtained from a transformational analysis, and conditions on the arcs allow for a powerful selectivity which can rule out meaningless analyses and take advantage of semantic information to guide the parsing. The advantages of this model for natural language analysis are discussed in detail and illustrated by examples. An implementation of an experimental parsing system for transition network grammars is briefly described. [ABSTRACT FROM AUTHOR]
- Published
- 1970
23. Natural Language Question-Answering Systems: 1969.
- Author
-
Bobrow, D. G. and Simmons, Robert F.
- Subjects
COMPUTATIONAL linguistics ,NATURAL language processing ,ARTIFICIAL intelligence ,PROGRAMMING languages ,DATA structures ,QUESTION answering systems - Abstract
Recent experiments in programming natural language question-answering systems are reviewed to summarize the methods that have been developed for syntactic, semantic, and logical analysis of English strings. It is concluded that at least minimally effective techniques have been devised for answering questions from natural language subsets in small scale experimental systems and that a useful paradigm has evolved to guide research efforts in the field. Current approaches to semantic analysis and logical inference are seen to be effective beginnings but of questionable generality with respect either to subtle aspects of meaning or to applications over large subsets of English. Generalizing from current small-scale experiments to language-processing systems based on dictionaries with thousands of entries, with correspondingly large grammars and semantic systems, may entail a new order of complexity and require the invention and development of entirely different approaches to semantic analysis and question answering. [ABSTRACT FROM AUTHOR]
- Published
- 1970
24. A Global Parser for Context-Free Phrase Structure Grammars.
- Author
-
Unger, Stephen H. and Wirth, N.
- Subjects
COMPUTER algorithms ,HEURISTIC programming ,SNOBOL (Computer program language) ,PARSING (Computer grammar) ,ARTIFICIAL intelligence ,PROGRAMMING languages - Abstract
An algorithm for analyzing any context-free phrase structure grammar and for generating a program which can then parse any sentence in the language (or indicate that the given sentence is invalid) is described. The parser is of the "top-to-bottom" type and is recursive. A number of heuristic procedures whose purpose is to shorten the basic algorithm by quickly ascertaining that certain substrings of the input sentence cannot correspond to the target nonterminal symbols are included. Both the generating algorithm and the parser have been implemented in RCA SNOBOL and have been tested successfully on a number of artificial grammars and on a subset of ALGOL. A number of the routines for extracting data about a grammar, such as minimum lengths of N-derivable strings and possible prefixes, are given and may be of interest apart from their application in this particular context. [ABSTRACT FROM AUTHOR]
- Published
- 1968
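The recursive "top-to-bottom" strategy the Unger and Wirth entry describes can be sketched as an exhaustive-partition recognizer: for each production, try every way of splitting the input among the right-hand-side symbols. The toy grammar below (a^n b^n) is invented for illustration; the actual system adds the heuristic substring tests and was written in SNOBOL, not Python.

```python
GRAMMAR = {"S": [["a", "S", "b"], ["a", "b"]]}   # derives a^n b^n, n >= 1

def recognize(symbol, s, grammar=GRAMMAR):
    """True if the grammar symbol derives the string s."""
    if symbol not in grammar:          # terminal: must match exactly
        return s == symbol
    return any(match_seq(rhs, s, grammar) for rhs in grammar[symbol])

def match_seq(symbols, s, grammar):
    """True if the sequence of symbols derives s, trying every
    split point between the first symbol and the rest."""
    if not symbols:
        return s == ""
    head, rest = symbols[0], symbols[1:]
    return any(recognize(head, s[:i], grammar)
               and match_seq(rest, s[i:], grammar)
               for i in range(len(s) + 1))

print(recognize("S", "aabb"))   # True
print(recognize("S", "aab"))    # False
```

Without the paper's pruning heuristics (minimum derivable lengths, possible prefixes), this brute-force partitioning is exponential in the worst case, which is exactly why those heuristics matter.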
25. Storage and Retrieval of Aspects of Meaning in Directed Graph Structures.
- Author
-
Simmons, R. F.
- Subjects
INFORMATION retrieval ,ARTIFICIAL intelligence ,PROGRAMMING languages ,ELECTRONIC data processing ,ELECTRONIC file management ,SORTING (Electronic computers) - Abstract
An experimental system that uses LISP to make a conceptual dictionary is described. The dictionary associates with each English word the syntactic information, definitional material, and references to the contexts in which it has been used to define other words. Such relations as class inclusion, possession, and active or passive actions are used as definitional material. The resulting structure serves as a powerful vehicle for research on the logic of question answering. Examples of methods of inputting information and answering simple English questions are given. An important conclusion is that, although LISP and other list processing languages are ideally suited for producing complex associative structures, they are inadequate vehicles for language processing on any large scale—at least until they can use auxiliary memory as a continuous extension of core memory. [ABSTRACT FROM AUTHOR]
- Published
- 1966
26. Experiments with a Deductive Question-Answering Program.
- Author
-
Gillies, D. B. and Slagle, James R.
- Subjects
ARTIFICIAL intelligence ,HEURISTIC programming ,NEURAL computers ,MATHEMATICAL analysis ,MATHEMATICAL programming ,CYBERNETICS - Abstract
As an investigation in artificial intelligence, computer experiments on deductive question-answering were run with a LISP program called DEDUCOM, an acronym for DEDUctive COMmunicator. When given 68 facts, DEDUCOM answered 10 questions answerable from the facts. A fact tells DEDUCOM either some specific information or a method of answering a general kind of question. Some conclusions drawn in the article are: (1) DEDUCOM can answer a wide variety of questions. (2) A human can increase the deductive power of DEDUCOM by telling it more facts. (3) DEDUCOM can write very simple programs (it is hoped that this ability is the forerunner of an ability to self-program, which is a way to learn). (4) DEDUCOM is very slow in answering questions. (5) DEDUCOM's search procedure at present has two bad defects: some questions answerable from the given facts cannot be answered and some other answerable questions can be answered only if the relevant facts are given in the "right" order. (6) At present, DEDUCOM's method of making logical deductions in predicate calculus has two bad defects: some facts have to be changed to logically equivalent ones before being given to DEDUCOM, and some redundant facts have to be given to DEDUCOM. [ABSTRACT FROM AUTHOR]
- Published
- 1965
27. A compiler-building system developed by Brooker and Morris
- Author
-
Saul Rosen
- Subjects
Flexibility (engineering) ,General Computer Science ,Object code ,business.industry ,Computer science ,Compiler ,Artificial intelligence ,computer.software_genre ,business ,Software engineering ,computer - Abstract
In a number of articles published during the past two years, R. A. Brooker and D. Morris (joined by J. S. Rohl in their most recent paper) have presented a very interesting programming system that they have developed for the Ferranti Atlas computer. The present paper describes some of the major features of their system. It expands on some points that the original authors cover briefly, and treats only very lightly some topics to which they devote considerable space. The purpose of this paper is purely expository. Except in some very small details, and in some comments, it does not intentionally depart from or add to the material published in the listed references. In the opinion of the writer, systems of this kind are well worth implementing and will provide useful research tools in the development of languages and techniques. This opinion is true even when such systems turn out to be of limited usefulness in producing “production” compilers, where compiling speed and object code optimization may be considered more important than language flexibility and elegance or generality of system organization.
- Published
- 1964
28. Generating parsers for affix grammars
- Author
-
David Crowe
- Subjects
General Computer Science ,Computer science ,Affix ,Context-sensitive grammar ,computer.software_genre ,Rule-based machine translation ,Indexed grammar ,Phrase structure grammar ,c-command ,Parsing ,business.industry ,Programming language ,Deterministic context-free grammar ,Parsing expression grammar ,Context-free grammar ,Embedded pushdown automaton ,Tree-adjoining grammar ,Extended Affix Grammar ,Affix grammar ,Stochastic context-free grammar ,S-attributed grammar ,Artificial intelligence ,Definite clause grammar ,L-attributed grammar ,business ,computer ,Natural language processing ,Van Wijngaarden grammar ,Bottom-up parsing - Abstract
Affix grammars are two-level grammars which are similar to van Wijngaarden's two-level grammars used in the definition of Algol 68. Affix grammars are shown by Koster to be equal in power to van Wijngaarden grammars. They are much more suited to parsing than are the latter, however. Koster, the inventor of affix grammars, suggests a top-down scheme for parsing them, based on recursive procedures. This paper presents a bottom-up scheme for parsing them, based on an extension of Floyd Production Language (FPL). Included is an algorithm, similar to that of DeRemer's, for converting a large class of affix grammars into FPL. The paper concludes by discussing briefly the applicabilities of the conversion algorithm and affix grammars in general, and some possible extensions to Koster's definition of affix grammars.
- Published
- 1972
29. Summary remarks
- Author
-
S. Gorn
- Subjects
Parsing ,General Computer Science ,Semantics (computer science) ,Computer science ,business.industry ,computer.software_genre ,Concatenation (mathematics) ,Semantics ,Syntax ,Expression (mathematics) ,Session (web analytics) ,TheoryofComputation_MATHEMATICALLOGICANDFORMALLANGUAGES ,Artificial intelligence ,business ,computer ,Natural language processing ,Generative grammar - Abstract
The topics began with discussion of almost exclusively syntactic analysis and methods. Beginning with context-free phrase-structure languages, we considered limitations thereof to remove generative syntactic ambiguities (Floyd), and extensions thereto to introduce more context-dependence (Rose). As the conference proceeded we ran through a spectrum of considerations in which the expressions in the languages considered were examined less and less as meaningless objects (the formal, or purely syntactic approach, as in the paper by Steel) and required more and more meaningful interpretations. In other words, we became more and more involved with semantic considerations. It is clear, then, that applications of the study of mechanical languages to programming must involve semantic questions; ADD must mean something more than the concatenation of three (not two) characters. The papers beyond Session 1 were therefore discussing the mechanization of semantics, but in only one case did we hear about the formalization (and hence mechanization) of the specification of the semantics of a language (McCarthy).
- Published
- 1964
30. A computer system for transformational grammar
- Author
-
Joyce Friedman
- Subjects
General Computer Science ,Syntax (programming languages) ,Programming language ,Computer science ,business.industry ,Generalized phrase structure grammar ,Attribute grammar ,Phrase structure rules ,Emergent grammar ,Government and binding theory ,computer.software_genre ,Syntax ,Transformational grammar ,Affix grammar ,Abstract syntax ,Lexical grammar ,Relational grammar ,Artificial intelligence ,Computational linguistics ,business ,computer ,Natural language processing ,Generative grammar - Abstract
A comprehensive system for transformational grammar has been designed and implemented on the IBM 360/67 computer. The system deals with the transformational model of syntax, along the lines of Chomsky's Aspects of the Theory of Syntax. The major innovations include a full, formal description of the syntax of a transformational grammar, a directed random phrase structure generator, a lexical insertion algorithm, an extended definition of analysis, and a simple problem-oriented programming language in which the algorithm for application of transformations can be expressed. In this paper we present the system as a whole, first discussing the general attitudes underlying the development of the system, then outlining the system and discussing its more important special features. References are given to papers which consider some particular aspect of the system in detail.
- Published
- 1969
31. Multiword list items
- Author
-
W. T. Comfort
- Subjects
General Computer Science ,Process (engineering) ,Computer science ,business.industry ,Artificial intelligence ,Self-organizing list ,Space (commercial competition) ,Element (category theory) ,computer.software_genre ,business ,Execution time ,computer ,Natural language processing - Abstract
The list concept as originally proposed by Newell, Simon and Shaw specified single computer words as elements of a list. This report describes the use of two or more consecutive words as one element. Such use results in a considerable saving in both the space required to hold a given amount of data, and in the execution time required to perform a given process on the data.Following a brief description of standard list structures with single-word items, the multiword items are introduced. Then variable-length items are described, along with the corresponding space-utilization problems. Finally, several examples are given to illustrate the use of multiword lists.This paper attempts to draw together various recent papers which have applied some of these concepts in different ways, and indicate how they relate to the more general problem.
- Published
- 1964
32. acm news.
- Subjects
AUDIOCASSETTES in education ,COMPUTER science ,INFORMATION processing ,ARTIFICIAL intelligence ,VIRTUAL storage (Computer science) - Abstract
The article presents news briefs related to the Association for Computing Machinery (ACM). The association is making available to its membership and others, selected speakers' presentations on audiocassettes. This service is an extension of the ACM audiocassette program. The previous cassette programs are available upon request. With the addition of these 19 tapes, the complete library will cover over 40 topics in computer science and information processing, such as database, virtual memory and computer research. The service was designed to provide the members and others interested with a source of expert audio reference material. In another development, made possible by an ACM artificial intelligence award, the second "Computers and Thought" public lecture was presented by computer scientist Patrick H. Winston at the International Joint Conference on Artificial Intelligence held at Stanford University, Palo Alto, California in August 1973.
- Published
- 1973
33. "Tell It Like It Is"
- Subjects
COMPUTER systems ,OFFICE practice automation ,ELECTRONIC systems ,COMPUTER industry ,ARTIFICIAL intelligence ,TECHNOLOGICAL innovations - Abstract
Discusses the significance of computer systems; observes that in many fields some practitioners were reluctant to move rapidly into the new technology while younger members were extremely enthusiastic; narrates the use of the office's computer terminal by the author's friend; and argues that computer systems are doing such fantastic things today that one need not claim tomorrow's achievements now.
- Published
- 1969
- Full Text
- View/download PDF
34. CALENDAR.
- Subjects
CONFERENCES & conventions ,NUCLEAR energy ,UNIVERSITIES & colleges ,ARTIFICIAL intelligence ,NEURAL computers ,ELECTRONIC data processing - Abstract
The article presents information related to upcoming events organized by the Association for Computing Machinery (ACM). The 1969 ACM San Francisco Bay Area Technical Symposium will take place on April 18, 1969 in San Francisco, California. The Conference on Effective Use of Computers in the Nuclear Industry will be held on April 21-23, 1969 at the University of Tennessee, Knoxville, Tennessee. The Computer Workshop for Civil Engineers will take place on April 28-30, 1969 at Purdue University, Lafayette, Indiana. The 24th ACM National Conference will take place on August 26-28, 1969 at the San Francisco Hilton Hotel and San Francisco Civic Center in San Francisco, California. The ACM Extensible Languages Symposium will be held on May 18, 1969 in Boston, Massachusetts. The Association of Educational Data Systems Annual Convention will take place on May 6-9, 1969 in Portland, Oregon. The International Joint Conference on Artificial Intelligence will be held on May 7-9, 1969 at the Statler Hilton Hotel, Washington D.C. The ACM Symposium on Theory of Computing will take place on May 5-6, 1969 at the Marina del Rey Hotel, Marina del Rey, California.
- Published
- 1969
35. A locally-organized parser for spoken input
- Author
-
Perry Lowell Miller
- Subjects
Parsing ,General Computer Science ,business.industry ,Computer science ,media_common.quotation_subject ,Speech recognition ,String (computer science) ,Speech corpus ,Ambiguity ,computer.software_genre ,Top-down parsing ,Artificial intelligence ,business ,computer ,Natural language processing ,Utterance ,Word (computer architecture) ,media_common ,Bottom-up parsing - Abstract
This paper describes LPARS, a locally-organized parsing system, designed for use in a continuous speech recognizer. LPARS processes a string of phonemes which contains ambiguity and error. The system is locally-organized in the sense that it builds local parse structures from reliable word candidates recognized anywhere in an input utterance. These local structures are used as “islands of reliability” to guide the search for more highly garbled words which might complete the utterance.
- Published
- 1974
36. Report on proposed American standard flowchart symbols for information processing
- Author
-
Robert J. Rossheim
- Subjects
Flowchart ,General Computer Science ,Computer science ,business.industry ,Association (object-oriented programming) ,Information processing ,computer.software_genre ,law.invention ,law ,Artificial intelligence ,business ,computer ,Algorithm ,Natural language processing - Abstract
This paper presents the essential contents of the Proposed American Standard Flowchart Symbols for Information Processing. This is the first proposed standard prepared by Subcommittee X3.6 on Problem Description and Analysis of the American Standards Association (ASA).
- Published
- 1963
37. Syntactic analysis by digital computer
- Author
-
Robert P. Futrelle and M. P. Barnett
- Subjects
Parsing ,General Computer Science ,Syntax (programming languages) ,Relation (database) ,Computer science ,Programming language ,business.industry ,Subroutine ,String (computer science) ,computer.software_genre ,Syntax ,Artificial intelligence ,Abstract syntax tree ,business ,computer ,Natural language processing - Abstract
This paper provides an account of the Shadow language that is used to describe syntax and of a corresponding subroutine that enables a computer to perform syntactic analysis. The input to this subroutine consists of a string to be analyzed and a description of the syntax that is to be used. The syntax is expressed in the Shadow language. The output consists of a trace table that expresses the results of the syntactic analysis in a tabular form. Several versions of the subroutine and some associated programs have been in use now for over three years. The present account of the language and the subroutine contains a summary of material that has been described previously in unpublished reports and also some additional discussion of the work in relation to the more general questions of problem-oriented languages and string transformations.
- Published
- 1962
38. On the problem of communicating complex information
- Author
-
David Pager
- Subjects
Cognitive models of information retrieval ,General Computer Science ,Computer science ,business.industry ,media_common.quotation_subject ,Information processing ,computer.software_genre ,Management information systems ,Presentation ,Human–computer interaction ,Human–computer information retrieval ,Question answering ,Automated information system ,Artificial intelligence ,business ,computer ,Natural language ,Natural language processing ,media_common - Abstract
The nature of the difficulty involved in communicating mathematical results between scientists using a computer-based information retrieval system is examined. The problem is analyzed in psychological and information-processing terms, and what turns out to be a vicious circle of effects is described. The paper then considers how the presentation of information by a computer-based information retrieval system, or by other media, can be improved. Some trade-offs which affect the design of the presentation are mentioned, and a number of ideas for improvement are described. These include ways of augmenting written natural language by various notational and linguistic devices, the exhibition of the structure inherent in the information we are communicating, and a sophisticated interactive system controlled by computer.
- Published
- 1973
39. Application of game tree searching techniques to sequential pattern recognition
- Author
-
James R. Slagle and Richard C. T. Lee
- Subjects
Dynamic programming ,General Computer Science ,Computer science ,business.industry ,Pattern recognition (psychology) ,Feature (machine learning) ,Pattern recognition ,Artificial intelligence ,Medical diagnosis ,Minimax ,Game tree ,business ,Algorithm - Abstract
A sequential pattern recognition (SPR) procedure does not test all the features of a pattern at once. Instead, it selects a feature to be tested. After receiving the result of that test, the procedure either classifies the unknown pattern or selects another feature to be tested, etc. Medical diagnosis is an example of SPR. In this paper the authors suggest that SPR be viewed as a one-person game played against nature (chance). Virtually all the powerful techniques developed for searching two-person, strictly competitive game trees can easily be incorporated either directly or by analogy into SPR procedures. In particular, one can incorporate the “miniaverage backing-up procedure” and the “gamma procedure,” which are the analogues of the “minimax backing-up procedure” and the “alpha-beta procedure,” respectively. Some computer simulated experiments in character recognition are presented. The results indicate that the approach is promising.
- Published
- 1971
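The "miniaverage backing-up procedure" in the abstract above can be sketched as a recursion over a decision tree played against chance: decision nodes minimize expected cost over candidate feature tests, and chance nodes average over test outcomes weighted by probability. This is a hypothetical illustration; the node representation, cost model, and tree shape are assumptions, not taken from the paper.

```python
# Hypothetical sketch of a miniaverage backing-up over an SPR tree
# (representation is an assumption, not from the paper): decision nodes
# minimize expected cost over feature tests; chance nodes average over
# outcomes weighted by probability; leaves carry classification cost.

def miniaverage(node):
    """Return the backed-up expected cost of `node`."""
    kind = node[0]
    if kind == "leaf":      # ("leaf", classification_cost)
        return node[1]
    if kind == "decision":  # ("decision", [(test_cost, chance_subtree), ...])
        return min(cost + miniaverage(sub) for cost, sub in node[1])
    if kind == "chance":    # ("chance", [(probability, subtree), ...])
        return sum(p * miniaverage(sub) for p, sub in node[1])
    raise ValueError(kind)

# Tiny example: choose between two feature tests, each with two
# equally likely outcomes.
tree = ("decision", [
    (1.0, ("chance", [(0.5, ("leaf", 0.0)), (0.5, ("leaf", 2.0))])),  # test A
    (0.5, ("chance", [(0.5, ("leaf", 1.0)), (0.5, ("leaf", 1.0))])),  # test B
])
print(miniaverage(tree))  # test B is cheaper in expectation: 1.5
```

The analogy the authors draw is that replacing the averaging at chance nodes with a maximization recovers the familiar minimax backing-up for two-person games.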
40. The working set model for program behavior
- Author
-
Peter J. Denning
- Subjects
Operations research ,General Computer Science ,business.industry ,Process (engineering) ,Computer science ,Distributed computing ,Most recently used ,Working set ,Working set size ,Multiprocessing ,Scheduling (computing) ,Set (abstract data type) ,Dynamic management ,Resource allocation ,Resource allocation (computer) ,Decision function ,Program behavior ,Computer multitasking ,Artificial intelligence ,business - Abstract
Probably the most basic reason behind the absence of a general treatment of resource allocation in modern computer systems is the lack of an adequate model for program behavior. In this paper a new model is developed, the “working set model”, which enables us to decide which information is in use by a running program and which is not. Such knowledge is vital for dynamic management of paged memories. The working set of pages associated with a process, defined to be the collection of its most recently used pages, is a useful allocation concept. A proposal for an easy-to-implement allocation policy is set forth; this policy is unique, inasmuch as it blends into one decision function the heretofore independent activities of process-scheduling and memory-management.
- Published
- 1968
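The working set defined in the abstract above has a compact formulation: W(t, τ) is the set of distinct pages a process referenced in its last τ references up to time t. A minimal sketch (the interface is an assumption; the definition follows the abstract):

```python
# Minimal sketch of the working set W(t, tau): the set of distinct pages
# referenced in the window (t - tau, t] of a process's page-reference
# string.  Interface is illustrative, not from the paper.

def working_set(refs, t, tau):
    """Distinct pages among the last `tau` references up to position `t`."""
    start = max(0, t - tau)
    return set(refs[start:t])

refs = [1, 2, 3, 2, 1, 4, 4, 4, 2]
print(working_set(refs, t=9, tau=4))  # last four references are 4, 4, 4, 2
```

An allocation policy in this spirit keeps a process's working set resident and uses the working set sizes of all active processes to decide how many of them memory can accommodate at once.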
41. Toward an automata theory of brains
- Author
-
Michael A. Arbib
- Subjects
Network complexity ,General Computer Science ,Computer science ,business.industry ,Automata theory ,Artificial intelligence ,business ,Automaton - Abstract
A source of ideas for automata theory—the study of the brain—has been pushed aside in mathematical development of the theory. This paper suggests the ways in which automata theory might evolve over the next 25 years if it is to contribute to an understanding of how the brain processes information.
- Published
- 1972
42. On coordination reduction and sentence analysis
- Author
-
Peter S. Rosenbaum, Paul M. Postal, and Stanley R. Petrick
- Subjects
Parsing ,General Computer Science ,Computer science ,business.industry ,Phrase structure rules ,Emergent grammar ,Computer Science::Computation and Language (Computational Linguistics and Natural Language and Speech Processing) ,computer.software_genre ,Syntax ,Transformational grammar ,TheoryofComputation_MATHEMATICALLOGICANDFORMALLANGUAGES ,Rule-based machine translation ,Lisp ,Artificial intelligence ,Relational grammar ,business ,computer ,Natural language ,Natural language processing ,Generative grammar ,computer.programming_language - Abstract
A class of coordination phenomena in natural languages is considered within the framework of transformational theory. To account for these phenomena it is proposed that certain machinery be added to the syntactic component of a transformational grammar. This machinery includes certain rule schemata, the conditions under which they are to be applied, and conditions determining the sequence of subtrees on which they are to be performed. A solution to the syntactic analysis problem for this class of grammars is outlined. Precise specification of both the generative procedure of this paper and its inverse is given in the form of LISP function definitions.
- Published
- 1969
43. Some comments on the use of ambiguous decision tables and their conversion to computer programs
- Author
-
R. G. Johnson and P. J. H. King
- Subjects
Set (abstract data type) ,Information retrieval ,General Computer Science ,Decision engineering ,business.industry ,Computer science ,Decision tree ,Table (database) ,Artificial intelligence ,Meaning (existential) ,business ,Decision table ,Decision analysis - Abstract
This paper comments upon recently published work on decision table translation using methods similar to the rule-mask technique. The applicability of these methods under various possible conventions on overall table meaning is discussed, and it is argued that there is a place both for the multi-rule and the single-rule (or action set) convention in decision table usage.
- Published
- 1973
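A rule-mask style lookup of the kind the abstract above discusses can be sketched with bit vectors: each rule records which conditions it cares about (a mask) and the required outcomes for those conditions, and a rule fires when the masked condition vector matches. The representation below is a hypothetical illustration, not taken from the paper; it also shows where ambiguity arises under the multi-rule convention, when several rules match one condition vector.

```python
# Hypothetical sketch of a rule-mask style decision-table lookup
# (representation is an assumption, not from the paper).  Each rule is a
# (mask, want) pair: `mask` marks the conditions the rule tests, `want`
# gives the required outcomes for those conditions.

def matching_rules(cond_bits, rules):
    """Indices of all rules satisfied by the condition bit vector."""
    return [i for i, (mask, want) in enumerate(rules)
            if cond_bits & mask == want]

# Conditions as bits: bit 0 = C1 is true, bit 1 = C2 is true.
rules = [
    (0b01, 0b01),  # rule 0: C1 true, C2 irrelevant
    (0b11, 0b01),  # rule 1: C1 true, C2 false
    (0b10, 0b10),  # rule 2: C2 true, C1 irrelevant
]
print(matching_rules(0b01, rules))  # C1 true, C2 false -> rules 0 and 1
```

Here the condition vector satisfies both rule 0 and rule 1, so the table is ambiguous under the multi-rule convention; a single-rule (action set) convention would instead require that at most one rule match, or that matching rules be ordered.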
44. Hidden lines elimination for a rotating object
- Author
-
Yutaka Matsushita
- Subjects
General Computer Science ,Computer science ,business.industry ,Smoothing group ,Computer Science::Computational Geometry ,Rotation ,Tree (graph theory) ,Point in polygon ,Painter's algorithm ,Polygon (computer graphics) ,Line (geometry) ,Polygon ,Polygon mesh ,Computer vision ,Artificial intelligence ,business - Abstract
A method is presented of determining which parts of three-dimensional objects are visible and which are invisible when the objects are rotated about some axis. This paper describes a polygon comparison scheme in which the relationships of two polygons can be classified into three types, and also discusses how the relationship is changed for each pair of polygons under rotation about some axis. A rotation table is defined for each pair of polygons, which remains fixed as long as rotation is about one axis and provides a means of rapidly determining the visible and hidden line relationship between two polygons. Additional work must be done to extend this approach to simultaneous rotation about several axes.
- Published
- 1972
45. 'Structural connections' in formal languages
- Author
-
E. T. Irons
- Subjects
Parsing ,General Computer Science ,Syntax (programming languages) ,Programming language ,Chomsky hierarchy ,business.industry ,Computer science ,Comparison of multi-paradigm programming languages ,Object language ,Context-free language ,Abstract family of languages ,Second-generation programming language ,Ontology language ,computer.software_genre ,Cone (formal languages) ,Picture language ,Formal grammar ,Third-generation programming language ,Formal language ,Artificial intelligence ,Fifth-generation programming language ,business ,computer ,Natural language ,Natural language processing - Abstract
This paper defines the concept of “structural connection” in a mechanical language in an attempt to classify various formal languages according to the complexity of parsing structures on strings in the languages. Languages discussed vary in complexity from those with essentially no structure at all to languages which are self-defining. The relationship between some existing recognition techniques for several language classes is examined, as well as implications of language structure on the complexity of automatic recognizers.
- Published
- 1964
46. An application of FORMAC
- Author
-
L. D. Neidleman
- Subjects
General Computer Science ,Computer science ,Programming language ,business.industry ,FORMAC ,Artificial intelligence ,computer.software_genre ,business ,computer ,computer.programming_language - Abstract
A nonlinear circuit analysis problem is stated and the way in which it was solved using FORMAC is indicated. The solution of the problem using FORMAC was notable since several other methods that were tried failed. The problem is straightforward (although untenable by hand) but nevertheless involved an elaborate use of the FORMAC language. The program was fairly large and utilized practically every command. In particular, it made extensive use of the PART command. Several tricks were necessary in order to circumvent some of the shortcomings of the FORMAC system. This paper is more concerned with the use of programming techniques in FORMAC than with the actual engineering problem, although readers may be interested in the problem because it is stated in a general (mathematical) sense and could be of interest in areas other than circuit analysis.
- Published
- 1967