Results 1–10 of 900,484
A New Method for Solving Hard Satisfiability Problems
AAAI, 1992
"... We introduce a greedy local search procedure called GSAT for solving propositional satisfiability problems. Our experiments show that this procedure can be used to solve hard, randomly generated problems that are an order of magnitude larger than those that can be handled by more traditional approac ..."
Cited by 734 (21 self)
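As a hedged illustration of the procedure this abstract describes, here is a minimal GSAT sketch in Python; the DIMACS-style clause encoding, parameter names, and restart budget are our assumptions, not details from the paper.

```python
import random

def gsat(clauses, n_vars, max_tries=10, max_flips=1000, seed=0):
    """Greedy local search for SAT. A clause is a list of nonzero ints,
    DIMACS-style: literal v means variable v is true, -v means false."""
    rng = random.Random(seed)

    def n_sat(assign):
        return sum(any(assign[abs(l)] == (l > 0) for l in c) for c in clauses)

    for _ in range(max_tries):
        # Fresh random truth assignment (index 0 unused).
        assign = [None] + [rng.random() < 0.5 for _ in range(n_vars)]
        for _ in range(max_flips):
            if n_sat(assign) == len(clauses):
                return assign  # model found
            # Greedy step: flip the variable yielding the most satisfied
            # clauses, even if that is no improvement (sideways/downhill).
            best_v, best_score = None, -1
            for v in range(1, n_vars + 1):
                assign[v] = not assign[v]
                score = n_sat(assign)
                assign[v] = not assign[v]
                if score > best_score:
                    best_v, best_score = v, score
            assign[best_v] = not assign[best_v]
    return None  # give up after max_tries restarts

# (x1 or x2) and (not x1 or x3) and (not x2 or not x3)
print(gsat([[1, 2], [-1, 3], [-2, -3]], n_vars=3))
```

The flip is taken even when no variable improves the count, which is what lets GSAT move across plateaus rather than stalling at the first local optimum.
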
SNOPT: An SQP Algorithm for Large-Scale Constrained Optimization
2002
"... Sequential quadratic programming (SQP) methods have proved highly effective for solving constrained optimization problems with smooth nonlinear functions in the objective and constraints. Here we consider problems with general inequality constraints (linear and nonlinear). We assume that first deriv ..."
Cited by 582 (23 self)
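SNOPT itself is a commercial Fortran package, so as a stand-in sketch of the SQP idea the abstract describes, here is SciPy's SLSQP (a different SQP implementation) applied to a small problem with one linear and one nonlinear inequality constraint; the problem data is invented for illustration.

```python
import numpy as np
from scipy.optimize import minimize

# Toy problem (our invention):
#   minimize (x0 - 1)^2 + (x1 - 2.5)^2
#   s.t.     x0 - 2*x1 + 2 >= 0     (linear inequality)
#            x0^2 + x1^2   <= 4     (nonlinear inequality)
objective = lambda x: (x[0] - 1) ** 2 + (x[1] - 2.5) ** 2
constraints = [
    {"type": "ineq", "fun": lambda x: x[0] - 2 * x[1] + 2},
    {"type": "ineq", "fun": lambda x: 4 - x[0] ** 2 - x[1] ** 2},
]
res = minimize(objective, x0=np.zeros(2), method="SLSQP",
               constraints=constraints)
print(res.x, res.fun)
```

With no `jac` argument, SLSQP estimates the first derivatives by finite differences; the abstract's setting assumes first derivatives are available, in which case you would pass them explicitly.
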
Making Large-Scale Support Vector Machine Learning Practical
1998
"... Training a support vector machine (SVM) leads to a quadratic optimization problem with bound constraints and one linear equality constraint. Despite the fact that this type of problem is well understood, there are many issues to be considered in designing an SVM learner. In particular, for large lea ..."
Cited by 620 (1 self)
learning tasks with many training examples, off-the-shelf optimization techniques for general quadratic programs quickly become intractable in their memory and time requirements. SVM light is an implementation of an SVM learner which addresses the problem of large tasks. This chapter presents
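To make the quadratic program concrete, here is a toy sketch of the SVM dual the abstract refers to, solved by projected gradient ascent on a bias-free formulation (so the single equality constraint drops out); this simplification and all names are ours, and SVM light's actual contribution is a working-set decomposition that avoids materializing the full kernel matrix.

```python
import numpy as np

def svm_dual_pg(X, y, C=1.0, lr=0.01, iters=2000):
    """Maximize sum(a) - 0.5 * a^T Q a over the box 0 <= a <= C,
    where Q_ij = y_i y_j <x_i, x_j> (linear kernel, no bias term)."""
    Q = (y[:, None] * y[None, :]) * (X @ X.T)
    a = np.zeros(len(y))
    for _ in range(iters):
        grad = 1.0 - Q @ a                  # gradient of the dual objective
        a = np.clip(a + lr * grad, 0.0, C)  # project back onto the box
    w = (a * y) @ X                         # recover primal weight vector
    return a, w

X = np.array([[2.0, 2.0], [1.5, 1.8], [-1.0, -1.2], [-2.0, -1.5]])
y = np.array([1.0, 1.0, -1.0, -1.0])
a, w = svm_dual_pg(X, y)
print(np.sign(X @ w))  # should reproduce y on this separable toy set
```

Note that this sketch stores the full n-by-n matrix Q, which is exactly the memory bottleneck the paper is about avoiding on large tasks.
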
Large margin methods for structured and interdependent output variables
Journal of Machine Learning Research, 2005
"... Learning general functional dependencies between arbitrary input and output spaces is one of the key challenges in computational intelligence. While recent progress in machine learning has mainly focused on designing flexible and powerful input representations, this paper addresses the complementary ..."
Cited by 612 (12 self)
that solves the optimization problem in polynomial time for a large class of problems. The proposed method has important applications in areas such as computational biology, natural language processing, information retrieval/extraction, and optical character recognition. Experiments from various domains
Efficient and Effective Querying by Image Content
Journal of Intelligent Information Systems, 1994
"... In the QBIC (Query By Image Content) project we are studying methods to query large online image databases using the images' content as the basis of the queries. Examples of the content we use include color, texture, and shape of image objects and regions. Potential applications include med ..."
Cited by 500 (13 self)
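As a hedged sketch of one kind of content-based query mentioned here (color), the snippet below ranks a toy image database by L1 distance between quantized RGB histograms; the features and distance are generic stand-ins, not QBIC's actual ones.

```python
import numpy as np

def color_histogram(img, bins=4):
    """Normalized joint RGB histogram of an (H, W, 3) uint8 image,
    with each channel quantized into `bins` levels."""
    q = img.astype(np.uint32) * bins // 256
    codes = q[..., 0] * bins * bins + q[..., 1] * bins + q[..., 2]
    h = np.bincount(codes.ravel(), minlength=bins ** 3).astype(float)
    return h / h.sum()

def query_by_color(query_img, database):
    """Rank database images by L1 distance between color histograms."""
    qh = color_histogram(query_img)
    dists = [np.abs(qh - color_histogram(img)).sum() for img in database]
    return np.argsort(dists)

rng = np.random.default_rng(0)
db = [rng.integers(0, 256, (32, 32, 3), dtype=np.uint8) for _ in range(5)]
print(query_by_color(db[2], db))  # index 2 ranks first (distance zero)
```
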
Efficient semantic matching
2004
"... We think of Match as an operator which takes two graphlike structures and produces a mapping between semantically related nodes. We concentrate on classifications with tree structures. In semantic matching, correspondences are discovered by translating the natural language labels of nodes into prop ..."
Cited by 817 (67 self)
into propositional formulas, and by codifying matching into a propositional unsatisfiability problem. We distinguish between problems with conjunctive formulas and problems with disjunctive formulas, and present various optimizations. For instance, we propose a linear time algorithm which solves the first class
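A toy rendering of the reduction this abstract describes: treat two node concepts as propositional formulas and decide subsumption by checking that A and not-B is unsatisfiable. The brute-force truth-table check below is our simplification, not the paper's optimized (e.g. linear-time) procedures, and the example labels are invented.

```python
from itertools import product

def unsatisfiable(formula, atoms):
    """Brute-force check that `formula` is false under every assignment."""
    return not any(formula(dict(zip(atoms, vals)))
                   for vals in product([False, True], repeat=len(atoms)))

# Two node concepts as propositional formulas over shared atoms:
A = lambda v: v["images"] and v["italy"]  # e.g. a node labeled "images of Italy"
B = lambda v: v["images"]                 # e.g. a node labeled "images"

# A is subsumed by B iff (A and not B) is unsatisfiable.
print(unsatisfiable(lambda v: A(v) and not B(v), ["images", "italy"]))  # True
```
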
Efficient Variants of the ICP Algorithm
International Conference on 3D Digital Imaging and Modeling, 2001
"... The ICP (Iterative Closest Point) algorithm is widely used for geometric alignment of threedimensional models when an initial estimate of the relative pose is known. Many variants of ICP have been proposed, affecting all phases of the algorithm from the selection and matching of points to the minim ..."
Cited by 702 (5 self)
to the minimization strategy. We enumerate and classify many of these variants, and evaluate their effect on the speed with which the correct alignment is reached. In order to improve convergence for nearly-flat meshes with small features, such as inscribed surfaces, we introduce a new variant based on uniform
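For concreteness, here is a minimal point-to-point ICP sketch (closest-point matching plus SVD-based rigid alignment); as the abstract notes, real variants differ in how points are selected, matched, weighted, and rejected, and none of the parameter choices below come from the paper.

```python
import numpy as np
from scipy.spatial import cKDTree

def icp(src, dst, iters=50):
    """Align point set src (N, 3) to dst (M, 3); returns R, t, aligned src."""
    tree = cKDTree(dst)
    R, t, cur = np.eye(3), np.zeros(3), src.copy()
    for _ in range(iters):
        # 1. Match each source point to its closest destination point.
        _, idx = tree.query(cur)
        matched = dst[idx]
        # 2. Best rigid transform for these pairs (SVD / Kabsch solution).
        mu_s, mu_d = cur.mean(0), matched.mean(0)
        U, _, Vt = np.linalg.svd((cur - mu_s).T @ (matched - mu_d))
        D = np.diag([1.0, 1.0, np.linalg.det(Vt.T @ U.T)])  # no reflections
        R_step = Vt.T @ D @ U.T
        t_step = mu_d - R_step @ mu_s
        # 3. Apply the step and accumulate the total transform.
        cur = cur @ R_step.T + t_step
        R, t = R_step @ R, R_step @ t + t_step
    return R, t, cur

rng = np.random.default_rng(1)
P = rng.normal(size=(100, 3))
c, s = np.cos(0.1), np.sin(0.1)
Rz = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
Q = P @ Rz.T + np.array([0.5, -0.2, 0.1])
_, _, aligned = icp(P, Q)
print(np.abs(aligned - Q).max())  # typically near zero: easy, noise-free case
```
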
Global Optimization with Polynomials and the Problem of Moments
SIAM Journal on Optimization, 2001
"... We consider the problem of finding the unconstrained global minimum of a realvalued polynomial p(x) : R R, as well as the global minimum of p(x), in a compact set K defined by polynomial inequalities. It is shown that this problem reduces to solving an (often finite) sequence of convex linear mat ..."
Cited by 569 (47 self)
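A hedged sketch of the dual, sum-of-squares side of this moment approach, on a univariate toy polynomial of our invention: maximize lambda such that p(x) - lambda is a sum of squares, which is a single linear matrix inequality. For univariate polynomials SOS coincides with nonnegativity, so this first relaxation is already exact; in general one climbs the hierarchy of LMIs the abstract alludes to.

```python
import cvxpy as cp

# p(x) = x^4 - 3x^2 + 2 (our toy example); maximize lam such that
# p(x) - lam = [1, x, x^2] Q [1, x, x^2]^T for some PSD matrix Q,
# matching coefficients degree by degree (Q is symmetric).
Q = cp.Variable((3, 3), PSD=True)
lam = cp.Variable()
constraints = [
    Q[2, 2] == 1,                 # x^4 coefficient
    2 * Q[1, 2] == 0,             # x^3
    2 * Q[0, 2] + Q[1, 1] == -3,  # x^2
    2 * Q[0, 1] == 0,             # x^1
    Q[0, 0] == 2 - lam,           # constant term
]
cp.Problem(cp.Maximize(lam), constraints).solve()
print(lam.value)  # approx -0.25, the true global minimum of p
```
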
GMRES: A generalized minimal residual algorithm for solving nonsymmetric linear systems
SIAM J. Sci. Stat. Comput., 1986
"... We present an iterative method for solving linear systems, which has the property ofminimizing at every step the norm of the residual vector over a Krylov subspace. The algorithm is derived from the Arnoldi process for constructing an l2orthogonal basis of Krylov subspaces. It can be considered a ..."
Cited by 2046 (40 self)
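Since the abstract states the method precisely, here is a minimal GMRES(m) sketch: build an l2-orthonormal Krylov basis with the Arnoldi process, then pick the iterate in that subspace minimizing the residual norm via a small least-squares problem. Restarting and the Givens-rotation updates of practical implementations are omitted; parameter names are ours.

```python
import numpy as np

def gmres(A, b, m=50):
    """One GMRES(m) cycle starting from x0 = 0 (no restarts)."""
    n = len(b)
    Q, H = np.zeros((n, m + 1)), np.zeros((m + 1, m))
    beta = np.linalg.norm(b)
    Q[:, 0] = b / beta                       # r0 = b since x0 = 0
    for j in range(m):
        v = A @ Q[:, j]                      # Arnoldi step ...
        for i in range(j + 1):               # ... with modified Gram-Schmidt
            H[i, j] = Q[:, i] @ v
            v -= H[i, j] * Q[:, i]
        H[j + 1, j] = np.linalg.norm(v)
        if H[j + 1, j] < 1e-12:              # happy breakdown: exact solution
            m = j + 1
            break
        Q[:, j + 1] = v / H[j + 1, j]
    # Residual-minimizing iterate: min over y of || beta*e1 - H y ||.
    e1 = np.zeros(m + 1)
    e1[0] = beta
    y, *_ = np.linalg.lstsq(H[:m + 1, :m], e1, rcond=None)
    return Q[:, :m] @ y

A = np.array([[4.0, 1.0, 0.0], [2.0, 3.0, 1.0], [0.0, 1.0, 5.0]])  # nonsymmetric
b = np.array([1.0, 2.0, 3.0])
x = gmres(A, b, m=3)
print(np.linalg.norm(A @ x - b))  # ~ 1e-15
```
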