Results 1–10 of 36
Computational Approaches to Analogical Reasoning: A Comparative Analysis
 Artificial Intelligence, 1989
Abstract

Cited by 81 (0 self)
Analogical reasoning has a long history in artificial intelligence research, primarily because of its promise for the acquisition and effective use of knowledge. Defined as a representational mapping from a known "source" domain into a novel "target" domain, analogy provides a basic mechanism for effectively connecting a reasoner's past and present experience. Using a four-component process model of analogical reasoning, this paper reviews sixteen computational studies of analogy. These studies are organized chronologically within broadly defined task domains of automated deduction, problem solving and planning, natural language comprehension, and machine learning. Drawing on these detailed reviews, a comparative analysis of diverse contributions to basic analogy processes identifies recurrent problems for studies of analogy and common approaches to their solution. The paper concludes by arguing that computational studies of analogy are in a state of adolescence: looking to more mature research areas in artificial intelligence for robust accounts of basic reasoning processes and drawing upon a long tradition of research in other disciplines.
Nonresolution theorem proving
 Artificial Intelligence, 1977
Abstract

Cited by 60 (3 self)
This talk reviews those efforts in automatic theorem proving, during the past few years, which have emphasized techniques other than resolution. These include: knowledge bases, natural deduction, reduction (rewrite rules), typing, procedures, advice, controlled forward chaining, algebraic simplification, built-in associativity and commutativity, models, analogy, and man-machine systems. Examples are given and suggestions are made for future work.
The computational modeling of analogy-making
 Trends in Cognitive Sciences, 2002
Abstract

Cited by 49 (2 self)
Our ability to see a particular object or situation in one context as being “the same as” another object or situation in another context is the essence of analogy-making. It encompasses our ability to explain new concepts in terms of already-familiar ones, to emphasize particular aspects of situations, to generalize, to characterize situations, to explain ...
System Identification, Approximation and Complexity
 International Journal of General Systems, 1977
Abstract

Cited by 36 (22 self)
This paper is concerned with establishing broadly-based, system-theoretic foundations and practical techniques for the problem of system identification that are rigorous, intuitively clear and conceptually powerful. A general formulation is first given in which two order relations are postulated on a class of models: a constant one of complexity; and a variable one of approximation induced by an observed behaviour. An admissible model is such that any less complex model is a worse approximation. The general problem of identification is that of finding the admissible subspace of models induced by a given behaviour. It is proved under very general assumptions that, if deterministic models are required, then nearly all behaviours require models of nearly maximum complexity. A general theory of approximation between models and behaviour is then developed based on subjective probability concepts and semantic information theory. The roles of structural constraints such as causality, locality, finite memory, etc., are then discussed as rules of the game. These concepts and results are applied to the specific problem of stochastic automaton, or grammar, inference. Computational results are given to demonstrate that the theory is complete and fully operational. Finally the formulation of identification proposed in this paper is analysed in terms of Klir’s epistemological hierarchy and both are discussed in terms of the rich philosophical literature on the acquisition of knowledge.
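The admissibility criterion stated in this abstract (a model is admissible if every strictly less complex model is a strictly worse approximation) can be made concrete as a small filter over (complexity, error) pairs. This is only an illustrative sketch: the candidate models and their scores below are hypothetical, not data from the paper.

```python
# Sketch of the admissibility criterion from the abstract above:
# a model is admissible iff every strictly less complex model is a
# strictly worse approximation of the observed behaviour.
# The (complexity, error) pairs are hypothetical illustration data.

def admissible(models):
    """Return the models for which no simpler model approximates as well."""
    result = []
    for c, e in models:
        # Admissible: every model with lower complexity has higher error.
        if all(e2 > e for c2, e2 in models if c2 < c):
            result.append((c, e))
    return result

# (complexity, approximation error)
candidates = [(1, 0.9), (2, 0.5), (2, 0.7), (3, 0.5), (4, 0.2)]
print(admissible(candidates))  # [(1, 0.9), (2, 0.5), (2, 0.7), (4, 0.2)]
```

Note that (3, 0.5) is rejected: a less complex model, (2, 0.5), approximates the behaviour equally well, so the extra complexity buys nothing.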
A Formal Definition of Intelligence Based on an Intensional Variant of Algorithmic Complexity
 In Proceedings of the International Symposium of Engineering of Intelligent Systems (EIS'98), 1998
Abstract

Cited by 29 (18 self)
Due to the current technology of the computers we can use, we have chosen an extremely abridged emulation of the machine that will effectively run the programs, instead of more proper languages, like λ-calculus (or LISP). We have adapted the "toy RISC" machine of [Hernández & Hernández 1993] with two remarkable features inherited from its object-oriented coding in C++: it is easily tunable for our needs, and it is efficient. We have made it even more reduced, removing any operand in the instruction set, even for the loop operations. We have only three registers, which are AX (the accumulator), BX and CX. The operations Qb we have used for our experiment are in Table 1: LOOPTOP decrements CX; if it is not equal to the first element, jump to the program top.
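The three-register, operand-free machine described here can be illustrated with a minimal interpreter. This is a hedged sketch, not the paper's actual instruction set: only LOOPTOP's decrement-and-conditional-jump behaviour comes from the text (and its exact jump condition is garbled there, so "jump while CX is nonzero" is an assumed simplification); INCAX and the register layout details are hypothetical stand-ins added so the sketch runs.

```python
# Minimal sketch of a three-register (AX, BX, CX) machine in the spirit
# of the abstract above. Only LOOPTOP's decrement-and-jump behaviour is
# from the text; its jump condition is simplified here to "CX != 0",
# and INCAX is a hypothetical operand-free instruction.

def run(program, cx):
    regs = {"AX": 0, "BX": 0, "CX": cx}
    pc = 0
    while pc < len(program):
        op = program[pc]
        if op == "INCAX":
            regs["AX"] += 1
        elif op == "LOOPTOP":
            regs["CX"] -= 1
            if regs["CX"] != 0:
                pc = 0          # jump back to the program top
                continue
        pc += 1
    return regs

# Increment AX once per loop pass, CX times in total.
print(run(["INCAX", "LOOPTOP"], cx=3))  # {'AX': 3, 'BX': 0, 'CX': 0}
```

Because the instructions take no operands, loop counts and data must flow entirely through the three registers, which is what makes the instruction set so compact.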
Experiments in the Heuristic Use of Past Proof Experience
 Proc. CADE-13, 1996
Abstract

Cited by 16 (4 self)
Problems stemming from the study of logic calculi in connection with an inference rule called "condensed detachment" are widely acknowledged as prominent test sets for automated deduction systems and their search guiding heuristics. It is in the light of these problems that we demonstrate the power of heuristics that make use of past proof experience with numerous experiments. We present two such heuristics. The first heuristic attempts to re-enact a proof of a proof problem found in the past in a flexible way in order to find a proof of a similar problem. The second heuristic employs "features" in connection with past proof experience to prune the search space. Both these heuristics not only allow for substantial speedups, but also make it possible to prove problems that were out of reach when using so-called basic heuristics. Moreover, a combination of these two heuristics can further increase performance. We compare our results with the results the creators of Otter obtained with t...
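The second, feature-based heuristic can be sketched as a filter that discards derived clauses whose features never occurred in a recorded past proof. This is a hypothetical simplification for illustration: the clause encoding and the feature function (the set of symbols occurring in a clause) are assumptions, not the feature definitions used in the cited system.

```python
# Hedged sketch of feature-based search-space pruning in the spirit of
# the abstract above: keep a derived clause only if all of its features
# occurred in the proof of a similar past problem. The feature function
# (symbol occurrences in a clause string) is a hypothetical simplification.

def features(clause):
    """Feature set: the symbols occurring in the clause string."""
    return set(clause.replace("(", " ").replace(")", " ").split())

def prune(candidates, past_proof_clauses):
    """Discard candidates that introduce features unseen in the past proof."""
    seen = set()
    for clause in past_proof_clauses:
        seen |= features(clause)
    return [c for c in candidates if features(c) <= seen]

past = ["P(a b)", "Q(b)"]
new = ["P(b a)", "R(a)", "Q(a)"]
print(prune(new, past))  # ['P(b a)', 'Q(a)']
```

Here `R(a)` is pruned because the symbol `R` never appeared in the past proof, while the other two candidates recombine only familiar symbols and survive.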
Managing the Evolution of Object-Oriented Systems
, 1994
Abstract

Cited by 14 (1 self)
Class organizations (schemas) evolve over the life cycle of object-oriented systems for a variety of reasons. This issue has recently been a subject of increasing attention in the literature of both object-oriented languages and object-oriented database systems. One of the most common forms of evolution involves the extension of an existing system by addition of new classes of objects or the addition of attributes to the original objects. Sometimes class structures are reorganized even when the set of objects is unchanged. In this case the reorganization might represent an optimization of the system, or just a change in the users' perspective. At the other extreme, a class reorganization might reflect not only the extension and reclassification of existing objects, but also structural changes (other than addition of attributes) in the original objects. This work provides a mathematical treatment of a calculus of class transformations. Three kinds of transformations that commonly occur in the evolution of class structures are considered: object-extending, object-preserving, and language-preserving. For each kind of transformation, methods for automating the maintenance of systems based on the evolving class structure are discussed. The language-preserving transformations are a special case of transformations that change the structure of existing objects. If an object schema is decorated with concrete syntax, it defines not only a class structure, but also a language for describing the objects. When two schemas define the same language but different classes, the language may be used to guide the discovery of analogies between the classes. The resulting analogies may then be used to transport functionality between domains.
Acknowledgments: I would like to thank my advisor, Karl Lieberherr, for his generous support, guidance, and feedback. I would also like to thank my wife, Vickie, for her constant encouragement and understanding, without which this work would not have been possible.
Variant design for mechanical artifacts: A state-of-the-art survey
 Engineering with Computers, 12: 115, 1996
Abstract

Cited by 12 (0 self)
Variant design refers to the technique of adapting existing design specifications to satisfy new design goals and constraints. Specific support of variant design techniques in current computer aided design systems would help to realize a rapid response manufacturing environment. A survey of approaches supporting variant design is presented. Capabilities used in current commercial computer aided design systems are discussed along with approaches used in recent research efforts. Information standards applicable to variant design are also identified. Barriers to variant design in current systems are identified and ideas are presented for augmentation of current systems to support variant design.
Change of Representation in Theorem Proving by Analogy
, 1993
Abstract

Cited by 11 (9 self)
Constructing an analogy between a known and already proven theorem (the base case) and another yet to be proven theorem (the target case) often amounts to finding the appropriate representation in which the base and the target are similar. This is a well-known fact in mathematics, and it was corroborated by our empirical study of a mathematical textbook, which showed that a reformulation of the representation of a theorem and its proof is indeed more often than not a necessary prerequisite for an analogical inference. Thus machine-supported reformulation becomes an important component of automated analogy-driven theorem proving, too. The reformulation component proposed in this paper is embedded into a proof plan methodology based on methods and meta-methods, where the latter are used to change and appropriately adapt the methods. A theorem and its proof are both represented as a method and then reformulated by the set of meta-methods presented in this paper. Our approach supports analog...
How Mathematicians Prove Theorems
 In Proc. of the Annual Conference of the Cognitive Science Society, 1994
Abstract

Cited by 11 (7 self)
This paper analyzes how mathematicians prove theorems. The analysis is based upon several empirical sources such as reports of mathematicians and mathematical proofs by analogy. In order to combine the strength of traditional automated theorem provers with human-like capabilities, the questions arise: Which problem solving strategies are appropriate? Which representations have to be employed? As a result of our analysis, the following reasoning strategies are recognized: proof planning with partially instantiated methods, structuring of proofs, and the transfer of subproofs and of reformulated subproofs. We discuss the representation of a component of these reasoning strategies, as well as its properties. We find some mechanisms needed for theorem proving by analogy that are not provided by previous approaches to analogy. This leads us to a computational representation of new components and procedures for automated theorem proving systems.