Results 1–10 of 10
Principles of Metareasoning
Artificial Intelligence, 1991
Abstract

Cited by 162 (10 self)
In this paper we outline a general approach to the study of metareasoning, not in the sense of explicating the semantics of explicitly specified metalevel control policies, but in the sense of providing a basis for selecting and justifying computational actions. This research contributes to a developing attack on the problem of resource-bounded rationality by providing a means for analysing and generating optimal computational strategies. Because reasoning about a computation without doing it necessarily involves uncertainty as to its outcome, probability and decision theory will be our main tools. We develop a general formula for the utility of computations, this utility being derived directly from the ability of computations to affect an agent's external actions. We address some philosophical difficulties that arise in specifying this formula, given our assumption of limited rationality. We also describe a methodology for applying the theory to particular problem-solving systems, a...
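The utility formula described in this abstract can be illustrated with a minimal sketch. The model below is an assumption for illustration, not the paper's formulation: a computation's net value is the expected utility of the action chosen after the computation, minus the utility of acting immediately on current estimates, minus the time cost of computing. The function name and outcome representation are ours.

```python
def net_value_of_computation(current_utilities, outcomes, time_cost):
    """Estimate the net value of performing a computation S (illustrative model).

    current_utilities: dict mapping each action to its current utility estimate
    outcomes: list of (probability, revised_utilities) pairs describing the
              possible results of S, where revised_utilities has the same shape
              as current_utilities
    time_cost: utility lost by spending time on S instead of acting
    """
    # Utility of acting now, on the current best estimate.
    u_now = max(current_utilities.values())
    # Expected utility of acting after S: in each possible outcome the agent
    # picks whichever action looks best under the revised estimates.
    u_after = sum(p * max(revised.values()) for p, revised in outcomes)
    return u_after - u_now - time_cost
```

A computation is worth doing only when its potential to change the chosen action outweighs its cost; a computation that cannot change any estimate is worth exactly its negative time cost.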
Exploiting the deep structure of constraint problems
Artificial Intelligence, 1994
Abstract

Cited by 73 (8 self)
We introduce a technique for analyzing the behavior of sophisticated A.I. search programs working on realistic, large-scale problems. This approach allows us to predict where, in a space of problem instances, the hardest problems are to be found and where the fluctuations in difficulty are greatest. Our key insight is to shift emphasis from modelling sophisticated algorithms directly to modelling a search space that captures their principal effects. We compare our model's predictions with actual data on real problems obtained independently and show that the agreement is quite good. By systematically relaxing our underlying modelling assumptions we identify their relative contribution to the remaining error and then remedy it. We also discuss further applications of our model and suggest how this type of analysis can be generalized to other kinds of A.I. problems.
Pruning Duplicate Nodes in Depth-First Search
In AAAI National Conference, 1993
Abstract

Cited by 37 (3 self)
Best-first search algorithms require exponential memory, while depth-first algorithms require only linear memory. On graphs with cycles, however, depth-first searches do not detect duplicate nodes, and hence may generate asymptotically more nodes than best-first searches. We present a technique for reducing the asymptotic complexity of depth-first search by eliminating the generation of duplicate nodes. The automatic discovery and application of a finite state machine (FSM) that enforces pruning rules in a depth-first search has significantly extended the power of search in several domains. We have implemented and tested the technique on a grid, the Fifteen Puzzle, the Twenty-Four Puzzle, and two versions of Rubik's Cube. In each case, the effective branching factor of the depth-first search is reduced, reducing the asymptotic time complexity. Introduction: The Problem. Search techniques are fundamental to artificial intelligence. Best-first search algorithms such as breadth-first se...
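The effect of FSM-based pruning can be sketched with the simplest possible rule on a grid: a one-state machine that forbids undoing the previous move. This is an illustrative reduction only (the paper's automatically discovered FSMs prune far more duplicates); the function name is ours.

```python
MOVES = ("U", "D", "L", "R")
INVERSE = {"U": "D", "D": "U", "L": "R", "R": "L"}

def count_nodes(depth, last=None, prune=True):
    """Count the nodes in a depth-first search tree of the given depth on an
    infinite 4-connected grid. With prune=True, the move that undoes the
    previous move is never generated (a minimal FSM pruning rule)."""
    if depth == 0:
        return 1
    total = 1  # count the current node
    for move in MOVES:
        if prune and last is not None and move == INVERSE[last]:
            continue  # pruned: this move would merely revisit the parent
        total += count_nodes(depth - 1, move, prune)
    return total
```

Even this one rule cuts the effective branching factor from 4 to 3 below the root: `count_nodes(2, prune=False)` is 21 while `count_nodes(2)` is 17, and the gap widens exponentially with depth.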
Autarky pruning in propositional model elimination reduces failure redundancy
Journal of Automated Reasoning, 1999
Abstract

Cited by 12 (3 self)
Goal-sensitive resolution methods, such as Model Elimination, have been observed to have a higher degree of search redundancy than model-search methods; therefore, resolution methods have not been seen in high-performance propositional satisfiability testers. A method to reduce search redundancy in goal-sensitive resolution methods is introduced. The idea at the heart of the method is to attempt to construct a refutation and a model simultaneously and incrementally, based on subsearch outcomes. The method exploits the concept of "autarky", which can be informally described as a "self-sufficient" model for some clauses, but which does not affect the remaining clauses of the formula. Incorporating this method into Model Elimination leads to an algorithm called Modoc. Modoc is shown, both analytically and experimentally, to be faster than Model Elimination by an exponential factor. Modoc, unlike Model Elimination, is able to find a model if it fails to find a refutation, essentially by combining autarkies. Unlike the pruning strategies of most refinements of resolution, autarky-related pruning does not prune any successful refutation; it only prunes attempts that ultimately will be unsuccessful; consequently, it will not force the underlying Modoc search to find an unnecessarily long refutation. To prove correctness and other properties, a game characterization of refutation search is introduced, which demonstrates ...
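The autarky property at the heart of this method has a simple operational statement: a partial assignment is an autarky if it satisfies every clause it touches, so the untouched clauses can be solved independently. A minimal sketch of the check (the DIMACS-style literal encoding and the function name are our assumptions, not the paper's notation):

```python
def is_autarky(assignment, clauses):
    """Check whether a partial assignment is an autarky for a CNF formula.

    clauses: iterable of clauses, each a tuple of nonzero ints in DIMACS
             style (v means variable v is true, -v means it is false)
    assignment: dict mapping a subset of the variables to booleans
    """
    for clause in clauses:
        # Does the assignment mention any variable of this clause?
        touched = any(abs(lit) in assignment for lit in clause)
        # Does some assigned literal satisfy the clause?
        satisfied = any(assignment.get(abs(lit)) == (lit > 0) for lit in clause)
        if touched and not satisfied:
            return False  # a touched-but-unsatisfied clause breaks autarky
    return True
```

Because an autarky never falsifies or shortens an untouched clause, detecting one lets the search discard the satisfied clauses outright, which is why autarky-related pruning cannot remove any successful refutation.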
Simultaneous Construction of Refutations and Models for Propositional Formulas
1995
Abstract

Cited by 8 (5 self)
Methodology is developed to attempt to construct simultaneously either a refutation or a model for a propositional formula in conjunctive normal form. The method exploits the concept of "autarky", which was introduced by Monien and Speckenmeyer. Informally, an autarky is a "self-sufficient" model for some clauses, but one which does not affect the remaining clauses of the formula. Whereas their work was oriented toward finding a model, our method has as its primary goal to find a refutation in the style of model elimination. It also finds a model if it fails to find a refutation, essentially by combining autarkies. However, the autarky-related processing is integrated with the refutation search, and can greatly improve the efficiency of that search even when a refutation does exist. Unlike the pruning strategies of most refinements of resolution, autarky-related pruning does not prune any successful refutation; it only prunes attempts that ultimately will be unsuccessful; conseque...
An Analysis of Search Techniques for a Totally-Ordered Nonlinear Planner
In Proceedings of the First International Conference on AI Planning Systems, 1992
Abstract

Cited by 7 (5 self)
In this paper we present several domain-independent search optimizations and heuristics that have been developed in a totally-ordered nonlinear planner in PRODIGY. We also describe the extension of the system into a full hierarchical planner with the ability to search among the different levels of abstraction. We analyze and illustrate the performance of the system with its different search capabilities in a few domains.
The Partial Rehabilitation of Propositional Resolution
1996
Abstract

Cited by 6 (3 self)
Resolution has not been an effective tool for deciding satisfiability of propositional CNF formulas, due to explosion of the search space, particularly when the formula is satisfiable. A new pruning method is described, which is designed to eliminate certain refutation attempts that cannot succeed. The method exploits the concept of "autarky", which was introduced by Monien and Speckenmeyer. New forms of lemma creation are also introduced, which eliminate the need to carry out refutation attempts that must succeed. The resulting algorithm, called "Modoc", is a modification of propositional model elimination. Informally, an autarky is a "self-sufficient" model for some clauses, but one which does not affect the remaining clauses of the formula. Whereas Monien and Speckenmeyer's work was oriented toward finding a model, our method has as its primary goal to find a refutation in the style of model elimination. However, Modoc finds a model if it fails to find a refutation, essentially by combi...
A Prolog Technique of Implementing Search of A/O Graphs with Constraints
1997
Abstract

Cited by 1 (1 self)
Our research has been motivated by the task of forming a solution subgraph which satisfies given constraints. The problem is represented by an A/O graph. Our approach is to apply a suitably modified technique of dependency-directed backtracking. We present our formulation of the standard chronological backtracking algorithm in Prolog. Based on it, we have developed an enhanced algorithm which makes use of special heuristic knowledge. It also involves the technique of node marking. We have gathered experience with the prototype Prolog implementation of the algorithm in applying it to (one step of) the problem of building a software configuration. Our experience shows that Prolog programming techniques offer considerable flexibility in implementing the above outlined tasks. Keywords: A/O graph, non-chronological backtracking, Prolog. 1 PROBLEM AREA AND GOAL. Many problems to which artificial intelligence techniques are often applied can be described as constraint satisfaction problems. We...
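The standard chronological backtracking algorithm this abstract builds on can be sketched as follows. The paper's formulation is in Prolog; this is an illustrative restatement in Python under our own naming, with constraints given as predicates that tolerate partial assignments.

```python
def backtrack(variables, domains, constraints, assignment=None):
    """Chronological backtracking search for a constraint satisfaction
    problem (illustrative sketch).

    variables:   list of variable names, assigned in this fixed order
    domains:     dict mapping each variable to its list of candidate values
    constraints: list of predicates over a (possibly partial) assignment
                 dict; each returns False only when violated
    Returns a complete satisfying assignment, or None if none exists.
    """
    if assignment is None:
        assignment = {}
    if len(assignment) == len(variables):
        return dict(assignment)  # all variables assigned consistently
    var = variables[len(assignment)]
    for value in domains[var]:
        assignment[var] = value
        if all(check(assignment) for check in constraints):
            result = backtrack(variables, domains, constraints, assignment)
            if result is not None:
                return result
        del assignment[var]  # chronological undo: retract most recent choice
    return None  # every value failed; backtrack to the previous variable
```

On failure the algorithm always retracts the most recently assigned variable, regardless of which constraint failed; the dependency-directed variant the abstract describes improves on this by jumping back to a variable actually involved in the conflict.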
PRODIGY4.0: The Manual and Tutorial
1992
Abstract
PRODIGY is a general-purpose problem-solving architecture that serves as a basis for research in planning, machine learning, apprentice-type knowledge-refinement interfaces, and expert systems. This document is a manual for the latest version of the PRODIGY system, PRODIGY4.0, and includes descriptions of the PRODIGY representation language, control structure, user interface, abstraction module, and other features. The tutorial style is meant to provide the reader with the ability to run PRODIGY and make use of all the basic features, as well as gradually learning the more esoteric aspects of PRODIGY4.0. This research was sponsored by the Avionics Laboratory, Wright Research and Development Center, Aeronautical Systems Division (AFSC), U.S. Air Force, Wright-Patterson AFB, OH 45433-6543 under Contract F33615-90-C-1465, ARPA Order No. 7597. The views and conclusions contained in this document are those of the authors and should not be interpreted as representing the official polici...
Pruning Duplicate Nodes in Depth-First Search
Abstract
Best-first search algorithms require exponential memory, while depth-first algorithms require only linear memory. On graphs with cycles, however, depth-first searches do not detect duplicate nodes, and hence may generate asymptotically more nodes than best-first searches. We present a technique for reducing the asymptotic complexity of depth-first search by eliminating the generation of duplicate nodes. The automatic discovery and application of a finite state machine (FSM) that enforces pruning rules in a depth-first search has significantly extended the power of search in several domains. We have implemented and tested the technique on a grid, the Fifteen Puzzle, the ...