Results 1 - 10 of 7,033
Minimax Programs - University of California Press, 1997
"... We introduce an optimization problem called a minimax program that is similar to a linear program, except that the addition operator is replaced in the constraint equations by the maximum operator. We clarify the relation of this problem to some better-known problems. We identify an interesting spec ..."
Cited by 482 (5 self)
"... Over the last fifty years, thousands of problems of practical interest have been formulated as a linear program. Not only has the linear programming model proven to be widely applicable, but ongoing research has discovered ..."
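To make the snippet's one-line definition concrete, here is a hedged rendering of the substitution it describes; the paper's exact normal form (equality versus inequality constraints, sign restrictions on the variables) is not visible in the excerpt:

```latex
% An ordinary LP constraint sums its terms:
%   \sum_j a_{ij} x_j \ge b_i
% A minimax program, per the abstract, replaces the sum with a max:
\min_{x}\; c^\top x
\quad\text{s.t.}\quad
\max_{j}\,\bigl(a_{ij} x_j\bigr) \;\ge\; b_i,
\qquad i = 1,\dots,m.
```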
Gradient projection for sparse reconstruction: Application to compressed sensing and other inverse problems - IEEE Journal of Selected Topics in Signal Processing, 2007
"... Many problems in signal processing and statistical inference involve finding sparse solutions to under-determined, or ill-conditioned, linear systems of equations. A standard approach consists in minimizing an objective function which includes a quadratic (squared ℓ2) error term combined with a spa ..."
Cited by 539 (17 self)
"... the bound-constrained quadratic programming (BCQP) formulation of these problems. We test variants of this approach that select the line search parameters in different ways, including techniques based on the Barzilai-Borwein method. Computational experiments show that these GP approaches perform well in a wide range ..."
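As a flavor of the approach, here is a minimal projected-gradient sketch with a Barzilai-Borwein step on a generic nonnegatively constrained quadratic. The function name and the toy problem are illustrative; GPSR's actual line-search safeguards, debiasing, and stopping criteria are omitted:

```python
import numpy as np

def gp_bb(B, c, iters=200):
    """Projected gradient with a Barzilai-Borwein step for
    min 0.5*z'Bz + c'z  subject to  z >= 0  (a toy BCQP)."""
    z = np.zeros_like(c)
    alpha = 1.0                       # initial step length
    g = B @ z + c                     # gradient of the quadratic
    for _ in range(iters):
        z_new = np.maximum(z - alpha * g, 0.0)   # project onto z >= 0
        g_new = B @ z_new + c
        s, y = z_new - z, g_new - g
        sy = s @ y
        if sy > 1e-12:                # BB1 step: <s,s>/<s,y>
            alpha = (s @ s) / sy
        z, g = z_new, g_new
    return z

# Tiny usage example on a random positive-definite quadratic.
rng = np.random.default_rng(0)
A = rng.standard_normal((8, 8))
B = A.T @ A + np.eye(8)
c = rng.standard_normal(8)
z = gp_bb(B, c)
```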
Benchmarking Least Squares Support Vector Machine Classifiers - Neural Processing Letters, 2001
"... In Support Vector Machines (SVMs), the solution of the classification problem is characterized by a (convex) quadratic programming (QP) problem. In a modified version of SVMs, called Least Squares SVM classifiers (LS-SVMs), a least squares cost function is proposed so as to obtain a linear set of eq ..."
Cited by 476 (46 self)
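For intuition, here is a sketch of the kind of linear system involved, following the commonly cited Suykens-style LS-SVM dual; the RBF kernel choice and the parameters `gamma` and `sigma` are assumptions for illustration, not taken from the paper:

```python
import numpy as np

def lssvm_train(X, y, gamma=1.0, sigma=1.0):
    """Train an LS-SVM classifier by solving the linear system that
    replaces the SVM's QP (Suykens-style dual, sketched):
        [ 0    y^T         ] [ b ]   [ 0 ]
        [ y    Omega + I/g ] [ a ] = [ 1 ]
    with Omega_kl = y_k * y_l * K(x_k, x_l), K an RBF kernel."""
    n = len(y)
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-d2 / (2 * sigma ** 2))            # RBF Gram matrix
    Omega = np.outer(y, y) * K
    A = np.zeros((n + 1, n + 1))
    A[0, 1:], A[1:, 0] = y, y
    A[1:, 1:] = Omega + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], np.ones(n)))
    sol = np.linalg.solve(A, rhs)
    return sol[1:], sol[0], X, y, sigma           # alpha, b, ...

def lssvm_predict(model, Xt):
    alpha, b, X, y, sigma = model
    d2 = ((Xt[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-d2 / (2 * sigma ** 2))
    return np.sign(K @ (alpha * y) + b)
```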
Multiple kernel learning, conic duality, and the SMO algorithm - In Proceedings of the 21st International Conference on Machine Learning (ICML), 2004
"... While classical kernel-based classifiers are based on a single kernel, in practice it is often desirable to base classifiers on combinations of multiple kernels. Lanckriet et al. (2004) considered conic combinations of kernel matrices for the support vector machine (SVM), and showed that the optimiz ..."
Cited by 445 (31 self)
"... Moreover, the sequential minimal optimization (SMO) techniques that are essential in large-scale implementations of the SVM cannot be applied because the cost function is non-differentiable. We propose a novel dual formulation of the QCQP as a second-order cone programming problem, and show how to exploit ..."
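A naive sketch of what a conic combination of kernel matrices looks like in practice. The weights `eta` are fixed by hand here, whereas the paper learns them jointly with the classifier via its SOCP dual and SMO-style solver; this is only the object being optimized, not the algorithm:

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.standard_normal((40, 3))
y = np.sign(X[:, 0] + 0.1 * rng.standard_normal(40))

def rbf(X, sigma):
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

kernels = [rbf(X, s) for s in (0.5, 1.0, 2.0)]
eta = np.array([0.2, 0.5, 0.3])          # conic weights, eta_k >= 0
K = sum(w * Kk for w, Kk in zip(eta, kernels))

# Train a standard SVM on the combined Gram matrix.
clf = SVC(kernel="precomputed").fit(K, y)
print(clf.score(K, y))
```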
Logic Programming in a Fragment of Intuitionistic Linear Logic, 1994
"... When logic programming is based on the proof theory of intuitionistic logic, it is natural to allow implications in goals and in the bodies of clauses. Attempting to prove a goal of the form D ⊃ G from the context (set of formulas) Γ leads to an attempt to prove the goal G in the extended context Γ ..."
Cited by 340 (44 self)
"... if they are based on linear logic. After presenting two equivalent formulations of a fragment of linear logic, we show that the fragment has a goal-directed interpretation, thereby partially justifying calling it a logic programming language. Logic programs based on the intuitionistic theory of hereditary Harrop ..."
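The quoted rule for implication goals is easy to sketch. The encoding below is a hypothetical toy for the intuitionistic case only; it deliberately ignores the use-once resource discipline that the paper's linear-logic fragment adds:

```python
def prove(goal, ctx):
    """Goal-directed proof search for a tiny hereditary-Harrop-style
    fragment: a goal is an atom (str) or ('imp', clause, goal); a
    clause is a (head_atom, body_goal_or_None) pair."""
    if isinstance(goal, tuple) and goal[0] == 'imp':
        _, d, g = goal
        return prove(g, ctx + [d])   # D => G: extend context, prove G
    # atomic goal: backchain on any clause whose head matches
    for head, body in ctx:
        if head == goal and (body is None or prove(body, ctx)):
            return True
    return False

# Example: from a program with q :- p, the goal p => q is provable.
program = [('q', 'p')]
print(prove(('imp', ('p', None), 'q'), program))   # True
```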
Large scale multiple kernel learning - Journal of Machine Learning Research, 2006
"... While classical kernel-based learning algorithms are based on a single kernel, in practice it is often desirable to use multiple kernels. Lanckriet et al. (2004) considered conic combinations of kernel matrices for classification, leading to a convex quadratically constrained quadratic program. We s ..."
Cited by 340 (20 self)
"... We show that it can be rewritten as a semi-infinite linear program that can be efficiently solved by recycling the standard SVM implementations. Moreover, we generalize the formulation and our method to a larger class of problems, including regression and one-class classification. Experimental results ..."
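One standard way such a semi-infinite linear program is written for multiple kernel learning (a sketch; the paper's exact notation may differ):

```latex
\max_{\theta,\,\beta}\; \theta
\quad\text{s.t.}\quad
\beta_k \ge 0,\;\;
\sum_k \beta_k = 1,\;\;
\sum_k \beta_k\, S_k(\alpha) \;\ge\; \theta
\;\;\text{for every feasible SVM dual } \alpha,
\qquad
S_k(\alpha) = \tfrac{1}{2}\sum_{i,j}\alpha_i\alpha_j y_i y_j K_k(x_i,x_j) - \sum_i \alpha_i .
```

The infinitely many α-constraints are handled by column generation: alternate between an LP restricted to the constraints generated so far and a standard SVM solve (the "recycled" implementation), which either certifies optimality or produces the next violated constraint.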
Robust Linear Programming Discrimination of Two Linearly Inseparable Sets, 1992
"... INTRODUCTION We consider the two point-sets A and B in the n-dimensional real space R n represented by the m \Theta n matrix A and the k \Theta n matrix B respectively. Our principal objective here is to formulate a single linear program with the following properties: (i) If the convex hulls of A ..."
Cited by 239 (32 self)
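A sketch of the robust-LP formulation this line of work is known for, solved with an off-the-shelf LP routine; treat the exact normalization as an assumption, since the abstract is truncated before the program is stated:

```python
import numpy as np
from scipy.optimize import linprog

def rlp_separate(A, B):
    """Robust LP discrimination (sketch):
        min (1/m) sum(y) + (1/k) sum(z)
        s.t. A w - gamma + y >= 1,  -B w + gamma + z >= 1,  y, z >= 0.
    The slacks y, z vanish iff the plane w'x = gamma separates the
    convex hulls of the point sets A and B."""
    m, n = A.shape
    k, _ = B.shape
    # variable order: [w (n), gamma (1), y (m), z (k)]
    c = np.concatenate([np.zeros(n + 1), np.full(m, 1 / m), np.full(k, 1 / k)])
    # rewrite the >= constraints as A_ub @ x <= b_ub
    top = np.hstack([-A, np.ones((m, 1)), -np.eye(m), np.zeros((m, k))])
    bot = np.hstack([B, -np.ones((k, 1)), np.zeros((k, m)), -np.eye(k)])
    res = linprog(c, A_ub=np.vstack([top, bot]), b_ub=-np.ones(m + k),
                  bounds=[(None, None)] * (n + 1) + [(0, None)] * (m + k))
    return res.x[:n], res.x[n]           # w, gamma

# Points are then classified by sign(x @ w - gamma).
A = np.array([[2.0, 2.0], [3.0, 1.0]])
B = np.array([[-1.0, -1.0], [-2.0, 0.0]])
w, gamma = rlp_separate(A, B)
```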
Smodels - an Implementation of the Stable Model and Well-Founded Semantics for Normal Logic Programs, 1997
"... The Smodels system is a C++ implementation of the wellfounded and stable model semantics for range-restricted function-free normal programs. The system includes two modules: (i) smodels which implements the two semantics for ground programs and (ii) parse which computes a grounded version of a range ..."
Cited by 294 (9 self)
"... a bottom-up backtracking search where a powerful pruning method is employed. The pruning method exploits an approximation technique for stable models which is closely related to the well-founded semantics. One of the advantages of this novel technique is that it can be implemented to work in linear space. This makes ..."
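For context, a tiny sketch of the semantics Smodels computes, not its pruned backtracking algorithm: M is a stable model of a ground normal program iff M equals the least model of the Gelfond-Lifschitz reduct of the program with respect to M. Rules are (head, positive body, negative body) triples over ground atoms:

```python
from itertools import chain, combinations

def least_model(definite_rules):
    """Fixpoint of the immediate-consequence operator for a
    negation-free program given as (head, pos_body) pairs."""
    model, changed = set(), True
    while changed:
        changed = False
        for head, pos in definite_rules:
            if pos <= model and head not in model:
                model.add(head)
                changed = True
    return model

def is_stable(program, M):
    # Reduct: drop rules whose negative body intersects M, then
    # delete the negative literals from the remaining rules.
    reduct = [(h, pos) for h, pos, neg in program if not (neg & M)]
    return least_model(reduct) == M

# p :- not q.   q :- not p.   -> two stable models, {p} and {q}.
prog = [('p', set(), {'q'}), ('q', set(), {'p'})]
atoms = {'p', 'q'}
subsets = chain.from_iterable(combinations(atoms, r) for r in range(3))
print([set(s) for s in subsets if is_stable(prog, set(s))])
```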
Support vector machines for multiple-instance learning - Advances in Neural Information Processing Systems 15, 2003
"... This paper presents two new formulations of multiple-instance learning as a maximum margin problem. The proposed extensions of the Support Vector Machine (SVM) learning approach lead to mixed integer quadratic programs that can be solved heuristically. Our generalization of SVMs makes a state-of-the ..."
Cited by 314 (2 self)
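A sketch of the kind of alternating heuristic used for such mixed-integer MIL formulations: fix instance labels, train a standard SVM, re-impute the labels of instances in positive bags, and repeat. This is a hypothetical simplification for illustration, not the paper's exact procedures:

```python
import numpy as np
from sklearn.svm import SVC

def mi_svm(bags, bag_labels, iters=10):
    """bags: list of (n_i, d) arrays; bag_labels: +1/-1 per bag.
    A bag is positive iff at least one of its instances is positive."""
    X = np.vstack(bags)
    idx = np.cumsum([0] + [len(b) for b in bags])
    y = np.concatenate([np.full(len(b), l) for b, l in zip(bags, bag_labels)])
    for _ in range(iters):
        clf = SVC(kernel='linear').fit(X, y)
        f = clf.decision_function(X)
        y_new = y.copy()
        for i, l in enumerate(bag_labels):
            if l == 1:                       # re-impute positive bags only
                s = slice(idx[i], idx[i + 1])
                y_new[s] = np.where(f[s] > 0, 1, -1)
                if (y_new[s] == -1).all():   # a positive bag must keep
                    y_new[idx[i] + np.argmax(f[s])] = 1   # one positive
        if (y_new == y).all():               # labels stable: stop
            break
        y = y_new
    return clf

rng = np.random.default_rng(0)
bags = [rng.standard_normal((5, 2)) + (2 if l == 1 else 0)
        for l in (1, 1, -1, -1)]
clf = mi_svm(bags, [1, 1, -1, -1])
```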
The Space of Human Body Shapes: Reconstruction and Parameterization from Range Scans - ACM Transactions on Graphics, 2003
"... We develop a novel method for fitting high-resolution template meshes to detailed human body range scans with sparse 3D markers. We formulate an optimization problem in which the degrees of freedom are an affine transformation at each template vertex. The objective function is a weighted combination ..."
Cited by 290 (4 self)
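In symbols, the parameterization the abstract describes gives each template vertex its own affine map; which error terms enter the weighted combination is cut off in the excerpt, so they are left abstract here:

```latex
\tilde v_i \;=\; T_i \begin{bmatrix} v_i \\ 1 \end{bmatrix},
\qquad T_i \in \mathbb{R}^{3\times 4},
\qquad
E(T_1,\dots,T_N) \;=\; \sum_j w_j\, E_j(T_1,\dots,T_N).
```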