Results 1–10 of 34
Finding Optimal Solutions to Rubik's Cube Using Pattern Databases
, 1997
"... We have found the first optimal solutions to random instances of Rubik's Cube. The median optimal solution length appears to be 18 moves. The algorithm used is iterativedeepeningA* (IDA*), with a lowerbound heuristic function based on large memorybased lookup tables, or "pattern databas ..."
Abstract

Cited by 160 (7 self)
We have found the first optimal solutions to random instances of Rubik's Cube. The median optimal solution length appears to be 18 moves. The algorithm used is iterative-deepening-A* (IDA*), with a lower-bound heuristic function based on large memory-based lookup tables, or "pattern databases" (Culberson and Schaeffer 1996). These tables store the exact number of moves required to solve various subgoals of the problem, in this case subsets of the individual movable cubies. We characterize the effectiveness of an admissible heuristic function by its expected value, and hypothesize that the overall performance of the program obeys a relation in which the product of the time and space used equals the size of the state space. Thus, the speed of the program increases linearly with the amount of memory available. As computer memories become larger and cheaper, we believe that this approach will become increasingly cost-effective.
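The pattern-database idea above can be sketched in a few lines. The sketch below is illustrative, not Korf's Rubik's Cube implementation: it uses the Eight Puzzle and a hypothetical pattern of tiles {1, 2, 3}, builds the table by breadth-first search backward from the abstracted goal, and looks up exact subgoal costs as an admissible heuristic suitable for IDA*.

```python
from collections import deque

GOAL = (0, 1, 2, 3, 4, 5, 6, 7, 8)   # Eight Puzzle; 0 is the blank
PATTERN = {1, 2, 3}                  # hypothetical pattern tiles

def abstract(state):
    # Tiles outside the pattern become indistinguishable wildcards (-1).
    return tuple(t if t in PATTERN or t == 0 else -1 for t in state)

def neighbors(state):
    # Slide a tile adjacent to the blank into the blank position.
    i = state.index(0)
    r, c = divmod(i, 3)
    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        nr, nc = r + dr, c + dc
        if 0 <= nr < 3 and 0 <= nc < 3:
            j = nr * 3 + nc
            s = list(state)
            s[i], s[j] = s[j], s[i]
            yield tuple(s)

def build_pdb():
    # Backward breadth-first search from the abstract goal records the exact
    # number of moves needed to solve the pattern subgoal from each abstract state.
    dist = {abstract(GOAL): 0}
    frontier = deque(dist)
    while frontier:
        s = frontier.popleft()
        for n in neighbors(s):
            if n not in dist:
                dist[n] = dist[s] + 1
                frontier.append(n)
    return dist

PDB = build_pdb()

def h(state):
    # Admissible lower bound: exact cost of the pattern subproblem.
    return PDB[abstract(state)]
```

Plugging h into IDA* prunes any node whose depth plus h exceeds the current threshold; a larger pattern gives a larger table and a stronger bound.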
Abstraction via Approximate Symmetry
 In Proc. of the 13th IJCAI
, 1993
"... Abstraction techniques are important for solving constraint satisfaction problems with global constraints and low solution density. In the presence of global constraints, backtracking search is unable to prune partial solutions. It therefore operates like pure generateandtest. Abstraction improves ..."
Abstract

Cited by 45 (4 self)
Abstraction techniques are important for solving constraint satisfaction problems with global constraints and low solution density. In the presence of global constraints, backtracking search is unable to prune partial solutions. It therefore operates like pure generate-and-test. Abstraction improves on generate-and-test by enabling entire subsets of the solution space to be pruned early in a search process. This paper describes how abstraction spaces can be characterized in terms of approximate symmetries of the original, concrete search space. It defines two special types of approximate symmetry, called "range symmetry" and "domain symmetry", which apply to function finding problems. It also presents algorithms for automatically synthesizing hierarchic problem solvers based on range or domain symmetry. The algorithms operate by analyzing declarative descriptions of classes of constraint satisfaction problems. Both algorithms have been fully implemented. This paper concludes by presenting data from experiments testing the two synthesis algorithms and the resulting problem solvers on NP-hard scheduling and partitioning problems.
Artificial Intelligence Search Algorithms
 In Algorithms and Theory of Computation Handbook
, 1996
"... Introduction Search is a universal problemsolving mechanism in artificial intelligence (AI). In AI problems, the sequence of steps required for solution of a problem are not known a priori, but often must be determined by a systematic trialanderror exploration of alternatives. The problems that h ..."
Abstract

Cited by 39 (0 self)
Introduction Search is a universal problem-solving mechanism in artificial intelligence (AI). In AI problems, the sequence of steps required to solve a problem is not known a priori, but often must be determined by a systematic trial-and-error exploration of alternatives. The problems that have been addressed by AI search algorithms fall into three general classes: single-agent pathfinding problems, two-player games, and constraint-satisfaction problems. Classic examples in the AI literature of pathfinding problems are the sliding-tile puzzles, including the 3 × 3 Eight Puzzle (see Fig. 1) and its larger relatives, the 4 × 4 Fifteen Puzzle and 5 × 5 Twenty-Four Puzzle. The Eight Puzzle consists of a 3 × 3 square frame containing eight numbered square tiles and an empty position called the blank. The legal operators are to slide any tile that is h...
A general theory of additive state space abstractions
 JAIR
"... Informally, a set of abstractions of a state space S is additive if the distance between any two states in S is always greater than or equal to the sum of the corresponding distances in the abstract spaces. The first known additive abstractions, called disjoint pattern databases, were experimentally ..."
Abstract

Cited by 25 (15 self)
Informally, a set of abstractions of a state space S is additive if the distance between any two states in S is always greater than or equal to the sum of the corresponding distances in the abstract spaces. The first known additive abstractions, called disjoint pattern databases, were experimentally demonstrated to produce state-of-the-art performance on certain state spaces. However, previous applications were restricted to state spaces with special properties, which precludes disjoint pattern databases from being defined for several commonly used testbeds, such as Rubik’s Cube, TopSpin and the Pancake puzzle. In this paper we give a general definition of additive abstractions that can be applied to any state space and prove that heuristics based on additive abstractions are consistent as well as admissible. We use this new definition to create additive abstractions for these testbeds and show experimentally that well-chosen additive abstractions can reduce search time substantially for the (18,4)-TopSpin puzzle and by three orders of magnitude over state-of-the-art methods for the 17-Pancake puzzle. We also derive a way of testing if the heuristic value returned by additive abstractions is provably too low and show that the use of this test can reduce search time for the 15-puzzle and TopSpin by roughly a factor of two.
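The additivity condition can be illustrated with disjoint pattern databases on the Eight Puzzle (a hypothetical two-group partition, not the paper's general construction). The key point is the cost split: each database counts only moves of its own group's tiles, via a 0/1-cost search, so the summed lookups never exceed the true distance.

```python
from collections import deque

GOAL = (0, 1, 2, 3, 4, 5, 6, 7, 8)       # 0 is the blank
GROUPS = [{1, 2, 3, 4}, {5, 6, 7, 8}]    # hypothetical disjoint tile groups

def abstract(state, group):
    # Tiles outside the group become indistinguishable wildcards (-1).
    return tuple(t if t in group or t == 0 else -1 for t in state)

def moves(state):
    # Yield (successor, moved tile) for every legal blank move.
    i = state.index(0)
    r, c = divmod(i, 3)
    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        nr, nc = r + dr, c + dc
        if 0 <= nr < 3 and 0 <= nc < 3:
            j = nr * 3 + nc
            s = list(state)
            s[i], s[j] = s[j], s[i]
            yield tuple(s), state[j]

def build_additive_pdb(group):
    # 0-1 BFS from the abstract goal: moving a tile outside the group costs 0,
    # so this database counts only the group's own moves.
    start = abstract(GOAL, group)
    dist = {start: 0}
    dq = deque([start])
    while dq:
        s = dq.popleft()
        for n, tile in moves(s):
            cost = 1 if tile in group else 0
            if dist[s] + cost < dist.get(n, float("inf")):
                dist[n] = dist[s] + cost
                (dq.appendleft if cost == 0 else dq.append)(n)
    return dist

PDBS = [build_additive_pdb(g) for g in GROUPS]

def h_additive(state):
    # Every real move is counted by exactly one database, so the sum is admissible.
    return sum(pdb[abstract(state, g)] for pdb, g in zip(PDBS, GROUPS))
```

Because the groups partition the tiles, no move is counted twice, which is what makes the summed heuristic a lower bound rather than merely the max of the parts.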
PSVN: A Vector Representation for Production Systems
, 1999
"... In this paper we present a production system which acts on fixed length vectors of labels. Our goal is to automatically generate heuristics to search the state space for shortest paths between states efficiently. The heuristic values which guide search in the state space are obtained by searching fo ..."
Abstract

Cited by 22 (11 self)
In this paper we present a production system which acts on fixed-length vectors of labels. Our goal is to automatically generate heuristics that search the state space for shortest paths between states efficiently. The heuristic values which guide search in the state space are obtained by searching for the shortest path in an abstract space derived from the definition of the original space. In PSVN, a state is a fixed-length vector of labels, and abstractions are generated by simply mapping the set of labels to another, smaller set of labels (domain abstraction). A domain abstraction on labels induces an abstraction of the state space that preserves important properties of the original space while usually being significantly smaller in size. It is guaranteed that the shortest path between two states in the original space is at least as long as the shortest path between their images in the abstract space. Hence, such abstractions provide admissible heuristics for search algorithms such as A* and IDA*. The mapping of states and operators can be efficiently obtained by applying the domain map to the labels. We explore important properties of state spaces defined in PSVN and of abstractions generated by domain maps. Despite its simplicity, PSVN is capable of defining all finitely generated permutation groups and such benchmark problems as Rubik's Cube, the sliding-tile puzzles and the Blocks World.
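The guarantee stated above (abstract shortest paths never exceed concrete ones) is easy to verify exhaustively on a toy PSVN-style space. The example below is hypothetical: states are length-4 vectors, the operators swap adjacent entries, and the domain map merges labels 2 and 3 into a single abstract label.

```python
from collections import deque

def bfs_dists(start, succ):
    # Distances from start to every reachable state.
    dist = {start: 0}
    q = deque([start])
    while q:
        s = q.popleft()
        for n in succ(s):
            if n not in dist:
                dist[n] = dist[s] + 1
                q.append(n)
    return dist

def swaps(state):
    # Operators: swap any two adjacent entries of the vector.
    for i in range(len(state) - 1):
        s = list(state)
        s[i], s[i + 1] = s[i + 1], s[i]
        yield tuple(s)

PHI = {0: 0, 1: 1, 2: 'x', 3: 'x'}   # hypothetical domain map merging labels 2 and 3

def abstract(state):
    # Apply the domain map componentwise to the state vector.
    return tuple(PHI[v] for v in state)

goal = (0, 1, 2, 3)
concrete = bfs_dists(goal, swaps)
abstracted = bfs_dists(abstract(goal), swaps)

# Relabeling commutes with the swap operators, so every concrete path maps to
# an abstract path of the same length: the abstract distance is admissible.
assert all(abstracted[abstract(s)] <= d for s, d in concrete.items())
```

The abstract space here has 12 states versus 24 in the concrete space; merging more labels shrinks it further at the cost of a weaker heuristic.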
Compiling Comp Ling: Practical weighted dynamic programming and the Dyna language
 In Advances in Probabilistic and Other Parsing
, 2005
"... Weighted deduction with aggregation is a powerful theoretical formalism that encompasses many NLP algorithms. This paper proposes a declarative specification language, Dyna; gives general agendabased algorithms for computing weights and gradients; briefly discusses DynatoDyna program transformati ..."
Abstract

Cited by 21 (12 self)
Weighted deduction with aggregation is a powerful theoretical formalism that encompasses many NLP algorithms. This paper proposes a declarative specification language, Dyna; gives general agenda-based algorithms for computing weights and gradients; briefly discusses Dyna-to-Dyna program transformations; and shows that a first implementation of a Dyna-to-C++ compiler produces code that is efficient enough for real NLP research, though still several times slower than handcrafted code.
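As an illustration of the agenda-based evaluation the abstract describes (a hypothetical Python sketch, not Dyna's implementation), consider the classic weighted-deduction program for shortest paths, dist(v) min= dist(u) + w(u, v). With a priority-ordered agenda and min-plus weights, the evaluation coincides with Dijkstra's algorithm:

```python
import heapq

def solve_min_plus(edges, source):
    # Agenda-based evaluation of the weighted-deduction program
    #   dist(source) min= 0
    #   dist(v)      min= dist(u) + w(u, v)
    # edges maps u -> list of (v, w) pairs.
    dist = {source: 0.0}
    agenda = [(0.0, source)]
    while agenda:
        d, u = heapq.heappop(agenda)
        if d > dist.get(u, float("inf")):
            continue            # stale agenda item; a better weight already propagated
        for v, w in edges.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd    # aggregation: keep the minimum weight seen
                heapq.heappush(agenda, (nd, v))
    return dist
```

Other semirings (e.g. max-times for Viterbi-style parsing) fit the same agenda loop with a different aggregation operator in place of min.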
Implicit abstraction heuristics
"... Statespace search with explicit abstraction heuristics is at the state of the art of costoptimal planning. These heuristics are inherently limited, nonetheless, because the size of the abstract space must be bounded by some, even if a very large, constant. Targeting this shortcoming, we introduce t ..."
Abstract

Cited by 17 (9 self)
State-space search with explicit abstraction heuristics is at the state of the art of cost-optimal planning. These heuristics are inherently limited, nonetheless, because the size of the abstract space must be bounded by some, even if very large, constant. Targeting this shortcoming, we introduce the notion of (additive) implicit abstractions, in which the planning task is abstracted by instances of tractable fragments of optimal planning. We then introduce a concrete setting of this framework, called fork decomposition, that is based on two novel fragments of tractable cost-optimal planning. The induced admissible heuristics are then studied formally and empirically. This study testifies to the accuracy of the fork-decomposition heuristics, yet our empirical evaluation also stresses the tradeoff between their accuracy and the runtime complexity of computing them. Indeed, some of the power of the explicit abstraction heuristics comes from precomputing the heuristic function offline and then determining h(s) for each evaluated state s by a very fast lookup in a “database.” By contrast, while fork-decomposition heuristics can be calculated in polynomial time, computing them is far from being fast. To address this problem, we show that the time-per-node complexity bottleneck of the fork-decomposition heuristics can be successfully overcome. We demonstrate that an equivalent of the explicit abstraction notion of a “database” exists for the fork-decomposition abstractions as well, despite their exponential-size abstract spaces. We then verify empirically that heuristic search with the “databased” fork-decomposition heuristics favorably competes with the state of the art of cost-optimal planning.
A Space-Time Tradeoff for Memory-Based Heuristics
 Proceedings of the Sixteenth National Conference on Artificial Intelligence (AAAI-99)
, 1999
"... A memorybased heuristic is a function, h(s), stored in the form of a lookup table (pattern database): h(s) is computed by mapping s to an index and then retrieving the appropriate entry in the table. (Korf 1997) conjectures for search using memorybased heuristics that m \Delta t is a constant ..."
Abstract

Cited by 15 (2 self)
A memory-based heuristic is a function, h(s), stored in the form of a lookup table (pattern database): h(s) is computed by mapping s to an index and then retrieving the appropriate entry in the table. Korf (1997) conjectures, for search using memory-based heuristics, that m · t is a constant, where m is the size of the heuristic's lookup table and t is search time. In this paper we present a method for automatically generating memory-based heuristics and use this to test Korf's conjecture in a large-scale experiment. Our results confirm that there is a direct relationship between m and t. Introduction A heuristic is a function, h(s), that computes an estimate of the distance from state s to a goal state. In a memory-based heuristic this computation consists of mapping s to an index which is then used to look up h(s) in a table. Even heuristics that have a normal functional definition are often precomputed and stored in a lookup table in order to speed up search (Priedi...
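The index computation described above (map s to an index, then one table retrieval) is typically a partial-permutation rank. The sketch below is a hypothetical illustration for an Eight-Puzzle-style pattern of three tiles; the table entries themselves would be filled by a backward search in the abstract space, which this sketch omits.

```python
def index_of(positions, n=9):
    # Rank a tuple of k distinct cell positions (out of n) as a unique
    # integer in 0 .. n*(n-1)*...*(n-k+1) - 1 (a Lehmer-style ranking).
    idx = 0
    seen = []
    for p in positions:
        rank = p - sum(1 for q in seen if q < p)
        idx = idx * (n - len(seen)) + rank
        seen.append(p)
    return idx

PATTERN = (1, 2, 3)          # hypothetical pattern tiles
M = 9 * 8 * 7                # table size m: one entry per abstract state
table = bytearray(M)         # entries would be filled by a backward search

def h(state):
    # h(s): map s to an index, then a single array retrieval.
    positions = tuple(state.index(t) for t in PATTERN)
    return table[index_of(positions)]
```

Tracking more tiles multiplies m, and the m · t conjecture predicts a corresponding drop in search time t.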
The Compression Power of Symbolic Pattern Databases
"... The heuristics used for planning and search often take the form of pattern databases generated from abstracted versions of the given state space. Pattern databases are typically stored space, which limits the size of the abstract state space and therefore the quality of the heuristic that can be use ..."
Abstract

Cited by 11 (3 self)
The heuristics used for planning and search often take the form of pattern databases generated from abstracted versions of the given state space. Pattern databases are typically stored as lookup tables, which limits the size of the abstract state space and therefore the quality of the heuristic that can be used with a given amount of memory. In the AIPS-2002 conference Stefan Edelkamp introduced an alternative representation, called symbolic pattern databases, which, for the Blocks World, required two orders of magnitude less memory than a lookup table to store a pattern database. This paper presents experimental evidence that Edelkamp’s result is not restricted to a single domain. Symbolic pattern databases, in the form of Algebraic Decision Diagrams, are one or more orders of magnitude smaller than lookup tables on a wide variety of problem domains and abstractions.
Improving heuristics through relaxed search – an analysis of TP4 and hsp∗a in the 2004 planning competition
 Journal of AI Research
"... The hm admissible heuristics for (sequential and temporal) regression planning are defined by a parameterized relaxation of the optimal cost function in the regression search space, where the parameter m offers a tradeoff between the accuracy and computational cost of the heuristic. Existing method ..."
Abstract

Cited by 10 (3 self)
The h^m admissible heuristics for (sequential and temporal) regression planning are defined by a parameterized relaxation of the optimal cost function in the regression search space, where the parameter m offers a tradeoff between the accuracy and computational cost of the heuristic. Existing methods for computing the h^m heuristic require time exponential in m, limiting them to small values (m ≤ 2). The h^m heuristic can also be viewed as the optimal cost function in a relaxation of the search space: this paper presents relaxed search, a method for computing this function partially by searching in the relaxed space. The relaxed search method, because it computes h^m only partially, is computationally cheaper and therefore usable for higher values of m. The (complete) h^2 heuristic is combined with partial h^m heuristics, for m = 3,..., computed by relaxed search, resulting in a more accurate heuristic. This use of the relaxed search method to improve on the h^2 heuristic is evaluated by comparing two optimal temporal planners: TP4, which does not use it, and hsp∗a, which uses it but is otherwise identical to TP4. The comparison is made on the domains used in the 2004 International Planning Competition, in which both planners participated. Relaxed search is found to be cost-effective in some of these domains, but not all. Analysis reveals a characterization of the domains in which relaxed search can be expected to be cost-effective, in terms of two measures on the original and relaxed search spaces. In the domains where relaxed search is cost-effective, expanding small states is computationally cheaper than expanding large states, and small states tend to have small successor states.