Better k-best parsing
, 2005
Abstract

Cited by 147 (16 self)
We discuss the relevance of k-best parsing to recent applications in natural language processing, and develop efficient algorithms for k-best trees in the framework of hypergraph parsing. To demonstrate the efficiency, scalability and accuracy of these algorithms, we present experiments on Bikel’s implementation of Collins’ lexicalized PCFG model, and on Chiang’s CFG-based decoder for hierarchical phrase-based translation. We show in particular how the improved output of our algorithms has the potential to improve results from parse reranking systems and other applications.
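The paper's algorithms operate on hypergraphs; as a much simpler illustration of the k-best idea, here is a sketch of enumerating the k shortest source-to-target walks in an ordinary graph with a priority queue. The function name and the adjacency-list encoding are our own, not from the paper, and walks may revisit nodes.

```python
import heapq
from collections import defaultdict

def k_shortest_paths(graph, src, dst, k):
    """Return the lengths of up to k shortest src->dst walks.
    graph: dict node -> list of (neighbor, nonnegative weight)."""
    pops = defaultdict(int)   # how often each node has been settled
    heap = [(0, src)]
    results = []
    while heap and len(results) < k:
        cost, node = heapq.heappop(heap)
        if pops[node] >= k:
            continue          # this node already contributed k walks
        pops[node] += 1
        if node == dst:
            results.append(cost)
        for nbr, w in graph.get(node, []):
            heapq.heappush(heap, (cost + w, nbr))
    return results
```

Each node is allowed to be settled up to k times, so the i-th pop of the target yields the i-th best cost; hypergraph k-best algorithms generalize this from paths to derivation trees.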
An Incremental Algorithm for a Generalization of the Shortest-Path Problem
, 1992
Abstract

Cited by 116 (1 self)
The grammar problem, a generalization of the single-source shortest-path problem introduced by Knuth, is to compute the minimum-cost derivation of a terminal string from each nonterminal of a given context-free grammar, with the cost of a derivation being suitably defined. This problem also subsumes the problem of finding optimal hyperpaths in directed hypergraphs (under varying optimization criteria) that has received attention recently. In this paper we present an incremental algorithm for a version of the grammar problem. As a special case of this algorithm we obtain an efficient incremental algorithm for the single-source shortest-path problem with positive edge lengths. The aspect of our work that distinguishes it from other work on the dynamic shortest-path problem is its ability to handle "multiple heterogeneous modifications": between updates, the input graph is allowed to be restructured by an arbitrary mixture of edge insertions, edge deletions, and edge-length changes.
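Knuth's (non-incremental) generalization of Dijkstra can be sketched for the special case where a rule's derivation cost is its weight plus the sum of its children's costs, which is one example of an admissible "superior" cost function. Names and the rule encoding here are illustrative, not the paper's.

```python
import heapq

def knuth(rules, axioms):
    # Knuth's grammar-problem algorithm, specialized to additive costs:
    # cost(head via rule) = weight + sum of finalized body-symbol costs.
    # rules: list of (head, body, weight), body a tuple of symbols,
    # weights nonnegative. axioms: dict symbol -> base cost.
    best = {}                                   # finalized minimum costs
    heap = [(c, s) for s, c in axioms.items()]
    heapq.heapify(heap)
    uses = {}                                   # symbol -> indices of rules using it
    need = []                                   # per rule: distinct body symbols not yet final
    for i, (head, body, weight) in enumerate(rules):
        distinct = set(body)
        need.append(len(distinct))
        for s in distinct:
            uses.setdefault(s, []).append(i)
    while heap:
        c, s = heapq.heappop(heap)
        if s in best:
            continue                            # stale queue entry
        best[s] = c
        for i in uses.get(s, []):
            need[i] -= 1
            if need[i] == 0:                    # all children final: relax the head
                head, body, weight = rules[i]
                heapq.heappush(heap, (weight + sum(best[b] for b in body), head))
    return best
```

With one-symbol bodies this degenerates to textbook Dijkstra; the incremental algorithm of the paper additionally repairs `best` after the rule set changes instead of recomputing from scratch.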
Hierarchical matching of deformable shapes
 In CVPR
, 2007
Abstract

Cited by 74 (0 self)
We describe a new hierarchical representation for two-dimensional objects that captures shape information at multiple levels of resolution. The representation is based on a hierarchical description of an object’s boundary, and can be used in an elastic matching framework, both for comparing pairs of objects and for detecting objects in cluttered images. In contrast to classical elastic models, our representation explicitly captures global shape information. This leads to richer geometric models and more accurate recognition results. Our experiments demonstrate classification results that are significantly better than the current state-of-the-art in several shape datasets. We also show initial experiments in matching shapes to cluttered images.
Finding the Hidden Path: Time Bounds for All-Pairs Shortest Paths
, 1993
Abstract

Cited by 64 (0 self)
We investigate the all-pairs shortest paths problem in weighted graphs. We present an algorithm, the Hidden Paths Algorithm, that finds these paths in time O(m*n + n² log n), where m* is the number of edges participating in shortest paths. Our algorithm is a practical substitute for Dijkstra's algorithm. We argue that m* is likely to be small in practice, since m* = O(n log n) with high probability for many probability distributions on edge weights. We also prove an Ω(mn) lower bound on the running time of any path-comparison based algorithm for the all-pairs shortest paths problem. Path-comparison based algorithms form a natural class containing the Hidden Paths Algorithm, as well as the algorithms of Dijkstra and Floyd. Lastly, we consider generalized forms of the shortest paths problem, and show that many of the standard shortest paths algorithms are effective in this more general setting.
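For contrast, here is a sketch of the textbook baseline that the Hidden Paths Algorithm refines: one Dijkstra run per source. The function name and adjacency-list encoding are ours, and this version does no m* pruning; it relaxes every edge from every source.

```python
import heapq

def all_pairs_dijkstra(graph, nodes):
    # graph: dict node -> list of (neighbor, nonnegative weight).
    # Runs an independent Dijkstra from each source; O(mn + n^2 log n)
    # overall with Fibonacci heaps (this sketch uses a binary heap).
    dist = {}
    for src in nodes:
        d = {src: 0}
        heap = [(0, src)]
        while heap:
            c, u = heapq.heappop(heap)
            if c > d.get(u, float('inf')):
                continue                      # stale heap entry
            for v, w in graph.get(u, []):
                if c + w < d.get(v, float('inf')):
                    d[v] = c + w
                    heapq.heappush(heap, (c + w, v))
        dist[src] = d
    return dist
```

The paper's insight is that only the m* edges that actually lie on shortest paths need to be relaxed across the n runs, which is what brings the bound down to O(m*n + n² log n).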
Escape Analysis: Correctness Proof, Implementation and Experimental Results
 In Conference Record of the 25th Annual ACM Symposium on Principles of Programming Languages
, 1998
Abstract

Cited by 61 (2 self)
We describe an escape analysis [32, 14], used to determine whether the lifetime of data exceeds its static scope. We give a new correctness proof starting directly from a semantics. Contrary to previous proofs, it takes into account all the features of functional languages, including imperative features and polymorphism. The analysis has been designed so that it can be implemented under the small complexity bound of O(n log² n) where n is the size of the analyzed program. We have included it in the Caml Special Light compiler (an implementation of ML), and applied it to very large programs. We plan to apply these techniques to the Java programming language. Escape analysis has been applied to stack allocation. We improve the optimization technique by determining minimal lifetime for stack-allocated data, and using inlining. We manage to stack-allocate 25% of data in the theorem prover Coq. We analyzed the effect of this optimization, and noticed that its main effect is to improve ...
Parsing and hypergraphs
 In IWPT
, 2001
Abstract

Cited by 56 (3 self)
While symbolic parsers can be viewed as deduction systems, this view is less natural for probabilistic parsers. We present a view of parsing as directed hypergraph analysis which naturally covers both symbolic and probabilistic parsing. We illustrate the approach by showing how a dynamic extension of Dijkstra’s algorithm can be used to construct a probabilistic chart parser with an O(n³) time bound for arbitrary PCFGs, while preserving as much of the flexibility of symbolic chart parsers as allowed by the inherent ordering of probabilistic dependencies.
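As a point of reference, the exhaustive bottom-up Viterbi chart parser that best-first methods improve on can be sketched for a PCFG in Chomsky normal form. The encoding below (lexicon and rule tables, function name) is our own minimal version, not the paper's implementation.

```python
from collections import defaultdict

def viterbi_cky(words, lexicon, rules):
    # lexicon: dict (tag, word) -> P(tag -> word)
    # rules: list of (parent, left, right, prob) binary PCFG rules
    # Returns chart[(i, j)][symbol] = best inside probability of
    # symbol spanning words[i:j]. O(n^3 * |rules|) time.
    n = len(words)
    chart = defaultdict(dict)
    for i, w in enumerate(words):                 # lexical cells
        for (tag, word), p in lexicon.items():
            if word == w and p > chart[(i, i + 1)].get(tag, 0.0):
                chart[(i, i + 1)][tag] = p
    for span in range(2, n + 1):                  # wider spans bottom-up
        for i in range(n - span + 1):
            j = i + span
            for k in range(i + 1, j):             # split point
                for parent, left, right, p in rules:
                    score = (p * chart[(i, k)].get(left, 0.0)
                               * chart[(k, j)].get(right, 0.0))
                    if score > chart[(i, j)].get(parent, 0.0):
                        chart[(i, j)][parent] = score
    return chart
```

The hypergraph view identifies each chart cell entry with a node and each rule application with a hyperedge; a Dijkstra-style agenda then fills cells in order of decreasing probability instead of sweeping every span exhaustively.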
Preference Logic Programming
 In Proc. 12th Intl. Conf. on Logic Programming
, 1995
Abstract

Cited by 32 (7 self)
Preference logic programming (PLP) is an extension of constraint logic programming (CLP) for declaratively specifying problems requiring optimization or comparison and selection among alternative solutions to a query. In the PLP framework, the definite clauses of a constraint logic program are augmented by two new kinds of clauses, which we call optimization clauses and arbiter clauses. Optimization clauses specify which predicates are to be optimized and arbiter clauses specify the criteria to be used for optimization. We illustrate their use with representative examples: one from dynamic programming and another from ambiguity resolution in grammars. We formalize the semantics of PLP using concepts from modal logic: Essentially, each world in the possible-worlds semantics for a preference logic program is a model of the program, and an ordering over these worlds is enforced by the arbiter clauses in the program. We introduce a new notion called preferential consequence to refer to tru...
The generalized A* architecture
 Journal of Artificial Intelligence Research
, 2007
Abstract

Cited by 31 (6 self)
We consider the problem of computing a lightest derivation of a global structure using a set of weighted rules. A large variety of inference problems in AI can be formulated in this framework. We generalize A* search and heuristics derived from abstractions to a broad class of lightest derivation problems. We also describe a new algorithm that searches for lightest derivations using a hierarchy of abstractions. Our generalization of A* gives a new algorithm for searching AND/OR graphs in a bottom-up fashion. We discuss how the algorithms described here provide a general architecture for addressing the pipeline problem — the problem of passing information back and forth between various stages of processing in a perceptual system. We consider examples in computer vision and natural language processing. We apply the hierarchical search algorithm to the problem of estimating the boundaries of convex objects in grayscale images and compare it to other search methods. A second set of experiments demonstrates the use of a new compositional model for finding salient curves in images.
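The classical A* that the paper generalizes, restricted to ordinary graphs, is short enough to sketch. Names are illustrative; the heuristic h is assumed admissible (it never overestimates the remaining cost), which is what guarantees the returned cost is optimal.

```python
import heapq

def astar(start, goal, neighbors, h):
    # neighbors(u) -> list of (v, nonnegative edge weight)
    # h(u) -> admissible lower bound on the cost from u to goal.
    # Expands nodes in order of f = g + h; returns the optimal
    # start->goal cost, or None if goal is unreachable.
    g = {start: 0}
    heap = [(h(start), start)]
    while heap:
        f, u = heapq.heappop(heap)
        if u == goal:
            return g[u]
        if f > g[u] + h(u):
            continue                      # stale entry: g[u] improved since push
        for v, w in neighbors(u):
            if g[u] + w < g.get(v, float('inf')):
                g[v] = g[u] + w
                heapq.heappush(heap, (g[v] + h(v), v))
    return None
```

The paper's generalization replaces paths with derivations built by weighted rules, so the agenda holds partial derivations rather than frontier nodes, and the heuristics come from abstractions of the rule set.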
Binarization of Synchronous Context-Free Grammars
Abstract

Cited by 24 (5 self)
Systems based on synchronous grammars and tree transducers promise to improve the quality of statistical machine translation output, but are often very computationally intensive. The complexity is exponential in the size of individual grammar rules due to arbitrary reorderings between the two languages. We develop a theory of binarization for synchronous context-free grammars and present a linear-time algorithm for binarizing synchronous rules when possible. In our large-scale experiments, we found that almost all rules are binarizable and the resulting binarized rule set significantly improves the speed and accuracy of a state-of-the-art syntax-based machine translation system. We also discuss the more general, and computationally more difficult, problem of finding good parsing strategies for non-binarizable rules, and present an approximate polynomial-time algorithm for this problem.
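Viewing a synchronous rule's reordering as a permutation of its nonterminals, a linear-time binarizability check can be sketched as a shift-reduce pass that greedily merges adjacent items whose target-side spans are contiguous. This is a sketch of the standard permutation view, not the paper's code.

```python
def binarizable(perm):
    # perm: the target-side positions of the rule's nonterminals,
    # in source-side order. The rule's reordering is binarizable iff
    # repeated merging of adjacent contiguous spans reduces the whole
    # permutation to a single span.
    stack = []
    for x in perm:
        stack.append((x, x))                  # span covering just x
        while len(stack) >= 2:
            (a1, b1), (a2, b2) = stack[-2], stack[-1]
            lo, hi = min(a1, a2), max(b1, b2)
            if hi - lo + 1 == (b1 - a1 + 1) + (b2 - a2 + 1):
                stack[-2:] = [(lo, hi)]       # contiguous: reduce to one span
            else:
                break
    return len(stack) == 1
```

Each successful reduction corresponds to introducing one binary virtual nonterminal; permutations like 2-4-1-3 admit no such reduction, which is why some rules resist binarization.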
Speeding Up the Calculation of Heuristics for Heuristic Search-Based Planning
, 2002
Abstract

Cited by 23 (2 self)
Heuristic search-based planners, such as HSP 2.0, solve STRIPS-style planning problems efficiently but spend about eighty percent of their planning time on calculating the heuristic values. In this paper, we systematically evaluate alternative methods for calculating the heuristic values for HSP 2.0 and demonstrate that the resulting planning times differ substantially. HSP 2.0 calculates each heuristic value by solving a relaxed planning problem with a dynamic programming method similar to value iteration. We identify two different approaches for speeding up the calculation of heuristic values, namely to order the value updates and to reuse information from the calculation of previous heuristic values. We then show how these two approaches can be combined, resulting in our PINCH method. PINCH outperforms both of the other approaches individually as well as the methods used by HSP 1.0 and HSP 2.0 for most of the large planning problems tested. In fact, it speeds up the planning time of HSP 2.0 by up to eighty percent in several domains and, in general, the amount of savings grows with the size of the domains, allowing HSP 2.0 to solve larger planning problems than was possible before in the same amount of time and without changing its overall operation.
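The relaxed-problem computation can be sketched as naive value iteration for the h_max heuristic over a delete-free STRIPS abstraction; this is the kind of unordered repeated sweeping that ordering the updates (as in PINCH) avoids. The encoding of states and actions below is our own.

```python
def hmax(initial, actions, goal):
    # initial: set of propositions true in the current state.
    # actions: list of (preconditions, add_effects, cost), each a set
    # of propositions; delete effects are ignored (the relaxation).
    # Sweeps all actions until no value changes, then returns the
    # h_max estimate: the cost of the most expensive goal proposition.
    INF = float('inf')
    val = {p: 0.0 for p in initial}
    changed = True
    while changed:
        changed = False
        for pre, adds, cost in actions:
            if all(p in val for p in pre):
                c = cost + max((val[p] for p in pre), default=0.0)
                for p in adds:
                    if c < val.get(p, INF):
                        val[p] = c
                        changed = True
    return max((val.get(p, INF) for p in goal), default=0.0)
```

Because every heuristic evaluation resolves one such relaxed problem, ordering the updates (Dijkstra-style) and reusing values across successive states is where the reported savings of up to eighty percent come from.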