Results 1 – 10 of 2,577,943
A global solution to sparse correspondence problems
 IEEE Transactions on Pattern Analysis and Machine Intelligence
, 2003
"... Abstract—We propose a new methodology for reliably solving the correspondence problem between sparse sets of points of two or more images. This is a key step in most problems of computer vision and, so far, no general method exists to solve it. Our methodology is able to handle most of the commonly ..."
Cited by 72 (3 self)
optimization problem. This is a blunt formulation, which considers the whole combinatorial space of possible point selections and correspondences. To find its globally optimal solution, we build a concave objective function and relax the search domain into its convex hull. The special structure of this extended
Gradient projection for sparse reconstruction: Application to compressed sensing and other inverse problems
 IEEE Journal of Selected Topics in Signal Processing
, 2007
"... Many problems in signal processing and statistical inference involve finding sparse solutions to underdetermined, or ill-conditioned, linear systems of equations. A standard approach consists of minimizing an objective function which includes a quadratic (squared ℓ2) error term combined with a spa ..."
Cited by 539 (17 self)
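The quadratic-plus-ℓ1 objective quoted in this abstract can be sketched with a simple proximal-gradient (ISTA) loop. The paper's own GPSR method uses gradient projection on a bound-constrained reformulation instead, so this is only an illustrative solver for the same objective; all names, parameters, and the demo data are ours.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of the l1 norm: shrink each entry toward zero.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(A, y, tau, steps=3000):
    # Minimize 0.5 * ||y - A x||^2 + tau * ||x||_1 by iterative
    # shrinkage-thresholding (illustrative stand-in for GPSR).
    L = np.linalg.norm(A, 2) ** 2        # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(steps):
        grad = A.T @ (A @ x - y)         # gradient of the quadratic term
        x = soft_threshold(x - grad / L, tau / L)
    return x

# Tiny demo: recover a sparse vector from an underdetermined system.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 50))
x_true = np.zeros(50)
x_true[3], x_true[17] = 1.5, -2.0
y = A @ x_true
x_hat = ista(A, y, tau=0.05)
```

With a noiseless system and a small τ, the two dominant entries of the estimate land on the true support.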
Learning with local and global consistency.
 In NIPS
, 2003
"... Abstract: We consider the general problem of learning from labeled and unlabeled data, which is often called semi-supervised learning or transductive inference. A principled approach to semi-supervised learning is to design a classifying function which is sufficiently smooth with respect to the intr ..."
Cited by 669 (21 self)
to the intrinsic structure collectively revealed by known labeled and unlabeled points. We present a simple algorithm to obtain such a smooth solution. Our method yields encouraging experimental results on a number of classification problems and demonstrates effective use of unlabeled data.
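The "simple algorithm" mentioned in this entry has a well-known closed form: F* = (1 − α)(I − αS)⁻¹Y with S the symmetrically normalized affinity matrix. The sketch below follows that construction under our own illustrative choices (Gaussian affinities, parameter names, toy data).

```python
import numpy as np

def label_spread(X, y, alpha=0.9, sigma=0.5):
    # Closed-form solution F* = (1 - alpha) * (I - alpha * S)^-1 * Y of the
    # local-and-global-consistency iteration. y uses -1 for unlabeled points.
    n = X.shape[0]
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.exp(-d2 / (2 * sigma ** 2))     # Gaussian affinity
    np.fill_diagonal(W, 0.0)
    Dm = 1.0 / np.sqrt(W.sum(1))
    S = Dm[:, None] * W * Dm[None, :]      # symmetric normalization
    classes = sorted(c for c in set(y) if c >= 0)
    Y = np.zeros((n, len(classes)))
    for j, c in enumerate(classes):
        Y[y == c, j] = 1.0
    F = (1 - alpha) * np.linalg.solve(np.eye(n) - alpha * S, Y)
    return np.array(classes)[F.argmax(1)]

# Two tight clusters, one labeled point each: labels should propagate.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.1, (10, 2)), rng.normal(3, 0.1, (10, 2))])
y = -np.ones(20, dtype=int)
y[0], y[10] = 0, 1
pred = label_spread(X, y)
```

Because αS has spectral radius below 1, the iterative update F ← αSF + (1 − α)Y converges to exactly this linear solve.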
Estimation of probabilities from sparse data for the language model component of a speech recognizer
 IEEE Transactions on Acoustics, Speech and Signal Processing
, 1987
"... Abstract: The description of a novel type of m-gram language model is given. The model offers, via a nonlinear recursive procedure, a computation- and space-efficient solution to the problem of estimating probabilities from sparse data. This solution compares favorably to other proposed methods. Wh ..."
Cited by 798 (2 self)
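The back-off idea behind this model can be shown with a toy bigram estimator: discount seen bigram counts and redistribute the freed probability mass via the unigram distribution. The paper's model uses Good-Turing discounting inside an m-gram recursion; this sketch replaces that with a single fixed discount, so the discount value and all names are illustrative only.

```python
from collections import Counter

def backoff_bigram(tokens, discount=0.5):
    # Toy Katz-style back-off: discounted ML estimate for seen bigrams,
    # back off to scaled unigram probabilities for unseen ones.
    unigrams = Counter(tokens)
    bigrams = Counter(zip(tokens, tokens[1:]))
    total = len(tokens)

    def p_uni(w):
        return unigrams[w] / total

    def p_bi(w1, w2):
        if (w1, w2) in bigrams:
            # Discounted maximum-likelihood estimate for seen bigrams.
            return (bigrams[(w1, w2)] - discount) / unigrams[w1]
        # Mass freed by discounting, redistributed over unseen successors
        # in proportion to their unigram probabilities.
        seen = [b for b in bigrams if b[0] == w1]
        freed = discount * len(seen) / unigrams[w1]
        unseen_mass = sum(p_uni(w) for w in unigrams if (w1, w) not in bigrams)
        return freed * p_uni(w2) / unseen_mass

    return p_bi

tokens = "the cat sat on the mat the cat ran".split()
p = backoff_bigram(tokens)
```

The key property, preserved by the construction, is that the conditional probabilities for any context still sum to one over the vocabulary.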
An Efficient Solution to the Five-Point Relative Pose Problem
, 2004
"... An efficient algorithmic solution to the classical five-point relative pose problem is presented. The problem is to find the possible solutions for relative camera pose between two calibrated views given five corresponding points. The algorithm consists of computing the coefficients of a tenth degre ..."
Cited by 482 (13 self)
From Sparse Solutions of Systems of Equations to Sparse Modeling of Signals and Images
, 2007
"... A full-rank matrix A ∈ ℝ^{n×m} with n < m generates an underdetermined system of linear equations Ax = b having infinitely many solutions. Suppose we seek the sparsest solution, i.e., the one with the fewest nonzero entries: can it ever be unique? If so, when? As optimization of sparsity is combin ..."
Cited by 427 (35 self)
is combinatorial in nature, are there efficient methods for finding the sparsest solution? These questions have been answered positively and constructively in recent years, exposing a wide variety of surprising phenomena; in particular, the existence of easily verifiable conditions under which optimally sparse
A solution to Plato’s problem: The latent semantic analysis theory of acquisition, induction, and representation of knowledge
 Psychological Review
, 1997
"... How do people know as much as they do with as little information as they get? The problem takes many forms; learning vocabulary from text is an especially dramatic and convenient case for research. A new general theory of acquired similarity and knowledge representation, latent semantic analysis (LS ..."
Cited by 1810 (10 self)
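The core mechanism of latent semantic analysis, as this entry describes it, is a truncated SVD of a term-document matrix: terms that never co-occur but share contexts end up with similar low-rank representations. A minimal sketch on toy data of our own making:

```python
import numpy as np

# Toy term-document count matrix (rows: car, auto, engine, flower, petal).
# "car" and "auto" never co-occur, but both appear alongside "engine".
terms = ["car", "auto", "engine", "flower", "petal"]
X = np.array([
    [1, 0, 0],   # car:    doc1
    [0, 1, 0],   # auto:   doc2
    [1, 1, 0],   # engine: doc1 and doc2
    [0, 0, 1],   # flower: doc3
    [0, 0, 1],   # petal:  doc3
], dtype=float)

# Rank-k LSA: keep the top-k singular directions; rows of U[:, :k] * s[:k]
# are k-dimensional term vectors in the latent semantic space.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
k = 2
term_vecs = U[:, :k] * s[:k]

def cos(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

sim_car_auto = cos(term_vecs[0], term_vecs[1])    # synonyms: high similarity
sim_car_flower = cos(term_vecs[0], term_vecs[3])  # unrelated: near zero
```

Even though "car" and "auto" share no document, their shared association with "engine" pulls their rank-2 vectors together, which is the induction effect the theory emphasizes.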
Trace Scheduling: A Technique for Global Microcode Compaction
 IEEE Transactions on Computers
, 1981
"... Microcode compaction is the conversion of sequential microcode into efficient parallel (horizontal) microcode. Local compaction techniques are those whose domain is basic blocks of code, while global methods attack code with general flow control. Compilation of high-level microcode languages int ..."
Cited by 682 (5 self)
into efficient horizontal microcode and good hand coding probably both require effective global compaction techniques. In this paper "trace scheduling" is developed as a solution to the global compaction problem. Trace scheduling works on traces (or paths) through microprograms. Compacting is thus done
For Most Large Underdetermined Systems of Linear Equations the Minimal ℓ1-norm Solution is also the Sparsest Solution
 Comm. Pure Appl. Math
, 2004
"... We consider linear equations y = Φα where y is a given vector in ℝ^n, Φ is a given n × m matrix with n < m ≤ An, and we wish to solve for α ∈ ℝ^m. We suppose that the columns of Φ are normalized to unit ℓ2 norm, and we place uniform measure on such Φ. We prove the existence of ρ = ρ(A) so that ..."
Cited by 567 (10 self)
that for large n, and for all Φ’s except a negligible fraction, the following property holds: for every y having a representation y = Φα0 by a coefficient vector α0 ∈ ℝ^m with fewer than ρ·n nonzeros, the solution α1 of the ℓ1 minimization problem min ‖α‖1 subject to Φα = y is unique and equal to α0
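The ℓ1 minimization problem in this snippet is a linear program and can be solved with an off-the-shelf LP solver via the standard split α = u − v with u, v ≥ 0. A sketch using `scipy.optimize.linprog` (the variable names and demo data are ours, and this illustrates the problem the paper analyzes, not the paper's proof technique):

```python
import numpy as np
from scipy.optimize import linprog

def basis_pursuit(Phi, y):
    # min ||a||_1  s.t.  Phi a = y, as an LP over a = u - v with u, v >= 0:
    # minimize sum(u) + sum(v) subject to [Phi, -Phi] [u; v] = y.
    n, m = Phi.shape
    c = np.ones(2 * m)
    A_eq = np.hstack([Phi, -Phi])
    res = linprog(c, A_eq=A_eq, b_eq=y, bounds=[(0, None)] * (2 * m))
    return res.x[:m] - res.x[m:]

# For a sufficiently sparse a0 and random unit-column Phi, the l1 solution
# recovers a0 exactly, matching the property the paper proves.
rng = np.random.default_rng(2)
Phi = rng.standard_normal((30, 80))
Phi /= np.linalg.norm(Phi, axis=0)        # unit l2-normalized columns
a0 = np.zeros(80)
a0[[5, 40, 66]] = [1.0, -2.0, 0.5]
a1 = basis_pursuit(Phi, Phi @ a0)
```

Here n = 30, m = 80, and 3 nonzeros is far inside the recovery region, so the LP returns a0 itself.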
Closed-form solution of absolute orientation using unit quaternions
 J. Opt. Soc. Am. A
, 1987
"... Finding the relationship between two coordinate systems using pairs of measurements of the coordinates of a number of points in both systems is a classic photogrammetric task. It finds applications in stereophotogrammetry and in robotics. I present here a closed-form solution to the least-squares pr ..."
Cited by 990 (4 self)
the least-squares problem for three or more points. Currently, various empirical, graphical, and numerical iterative methods are in use. Derivation of the solution is simplified by use of unit quaternions to represent rotation. I emphasize a symmetry property that a solution to this problem ought to possess. The best
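The closed-form construction this entry describes fits in a few lines: center both point sets, build their 3×3 cross-covariance, assemble the symmetric 4×4 matrix whose dominant eigenvector is the optimal unit quaternion, and convert back to a rotation. A sketch following that construction (the demo data is ours):

```python
import numpy as np

def absolute_orientation(P, Q):
    # Closed-form rotation R and translation t minimizing sum ||Q_i - (R P_i + t)||^2,
    # via the unit-quaternion eigenvector construction.
    pc, qc = P.mean(0), Q.mean(0)
    M = (P - pc).T @ (Q - qc)              # 3x3 cross-covariance
    Sxx, Sxy, Sxz = M[0]
    Syx, Syy, Syz = M[1]
    Szx, Szy, Szz = M[2]
    # Symmetric 4x4 matrix; its top eigenvector is the optimal quaternion.
    N = np.array([
        [Sxx + Syy + Szz, Syz - Szy,        Szx - Sxz,        Sxy - Syx],
        [Syz - Szy,       Sxx - Syy - Szz,  Sxy + Syx,        Szx + Sxz],
        [Szx - Sxz,       Sxy + Syx,       -Sxx + Syy - Szz,  Syz + Szy],
        [Sxy - Syx,       Szx + Sxz,        Syz + Szy,       -Sxx - Syy + Szz],
    ])
    evals, evecs = np.linalg.eigh(N)
    q0, qx, qy, qz = evecs[:, -1]          # eigenvector of largest eigenvalue
    R = np.array([                         # quaternion -> rotation matrix
        [1 - 2*(qy*qy + qz*qz), 2*(qx*qy - q0*qz),     2*(qx*qz + q0*qy)],
        [2*(qx*qy + q0*qz),     1 - 2*(qx*qx + qz*qz), 2*(qy*qz - q0*qx)],
        [2*(qx*qz - q0*qy),     2*(qy*qz + q0*qx),     1 - 2*(qx*qx + qy*qy)],
    ])
    t = qc - R @ pc
    return R, t

# Demo: recover a known rigid motion from exact correspondences.
c1, s1 = np.cos(0.7), np.sin(0.7)
c2, s2 = np.cos(-0.4), np.sin(-0.4)
Rz = np.array([[c1, -s1, 0], [s1, c1, 0], [0, 0, 1]])
Rx = np.array([[1, 0, 0], [0, c2, -s2], [0, s2, c2]])
R_true = Rz @ Rx
t_true = np.array([1.0, -2.0, 0.5])
rng = np.random.default_rng(3)
P = rng.standard_normal((6, 3))
Q = P @ R_true.T + t_true
R, t = absolute_orientation(P, P @ R_true.T + t_true)
```

Since the rotation is quadratic in the quaternion, the sign ambiguity of the eigenvector is harmless, and noise-free correspondences are recovered exactly.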