Results 1–10 of 50
Applications of parametric maxflow in computer vision
"... The maximum flow algorithm for minimizing energy functions of binary variables has become a standard tool in computer vision. In many cases, unary costs of the energy depend linearly on parameter λ. In this paper we study vision applications for which it is important to solve the maxflow problem for ..."
Cited by 39 (7 self)
The maximum flow algorithm for minimizing energy functions of binary variables has become a standard tool in computer vision. In many cases, unary costs of the energy depend linearly on a parameter λ. In this paper we study vision applications for which it is important to solve the maxflow problem for different values of λ. An example is the weighting between data and regularization terms in image segmentation or stereo: it is desirable to vary it both during training (to learn λ from ground-truth data) and testing (to select the best λ using high-level constraints, e.g. user input). We review algorithmic aspects of this parametric maximum flow problem previously unknown in vision, such as the ability to compute all breakpoints of λ and the corresponding optimal configurations in finite time. These results make it possible, in particular, to minimize the ratio of certain geometric functionals, such as flux of a vector field over length (or area). Previously, such functionals were tackled with shortest-path techniques applicable only in 2D. We give theoretical improvements for “PDE cuts” [5]. We present experimental results for image segmentation, 3D reconstruction, and the cosegmentation problem.
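The ratio minimization of geometric functionals mentioned above is an instance of the classic Dinkelbach scheme: repeatedly solve the parametric problem min_x num(x) − λ·den(x), then update λ to the ratio at the current minimizer. A minimal numeric sketch over a finite candidate set (the toy numerator/denominator functions and configurations are hypothetical stand-ins; in the paper each parametric subproblem is a single maxflow computation):

```python
def dinkelbach(num, den, configs, tol=1e-9):
    """Minimize num(x)/den(x) (den > 0) over a finite set of configurations
    via Dinkelbach iterations on the parametric problem num(x) - lam*den(x)."""
    lam = 0.0
    while True:
        # parametric subproblem; in the vision setting this is one maxflow call
        x = min(configs, key=lambda c: num(c) - lam * den(c))
        if abs(num(x) - lam * den(x)) <= tol:
            return x, lam          # lam now equals the optimal ratio
        lam = num(x) / den(x)      # update lam to the ratio of the minimizer

# toy instance: minimize (a^2 + 1) / b over the candidate pairs below
best, ratio = dinkelbach(lambda c: c[0] ** 2 + 1,
                         lambda c: c[1],
                         [(1, 4), (2, 5), (3, 4)])
```

Dinkelbach's update converges monotonically; the paper's stronger result is that all breakpoints of λ can be enumerated exactly, which this sketch does not attempt.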
A Push-Relabel Framework for Submodular Function Minimization and Applications to Parametric Optimization
Discrete Applied Mathematics, 2001
"... Recently, the first combinatorial strongly polynomial algorithms for submodular function minimization have been devised independently by Iwata, Fleischer, and Fujishige and by Schrijver. In this paper, we improve the running time of Schrijver's algorithm by designing a pushrelabel framework for ..."
Cited by 24 (4 self)
Recently, the first combinatorial strongly polynomial algorithms for submodular function minimization have been devised independently by Iwata, Fleischer, and Fujishige and by Schrijver. In this paper, we improve the running time of Schrijver's algorithm by designing a push-relabel framework for submodular function minimization (SFM). We also extend this algorithm to carry out parametric minimization for a strong map sequence of submodular functions in the same asymptotic running time as a single SFM. Applications include an efficient algorithm for finding a lexicographically optimal base.
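The push-relabel idea that the framework above generalizes is easiest to see in its original network-flow setting. A compact maxflow sketch over a dense capacity matrix, with no performance heuristics (this illustrates the classical push/relabel operations, not the paper's SFM algorithm):

```python
def push_relabel_maxflow(cap, s, t):
    """Basic push-relabel maximum flow on a dense capacity matrix cap."""
    n = len(cap)
    flow = [[0] * n for _ in range(n)]
    height = [0] * n
    excess = [0] * n
    height[s] = n
    for v in range(n):                 # saturate every edge leaving the source
        flow[s][v] = cap[s][v]
        flow[v][s] = -cap[s][v]
        excess[v] = cap[s][v]
    active = [v for v in range(n) if v not in (s, t) and excess[v] > 0]
    while active:
        u = active.pop()
        while excess[u] > 0:           # discharge u completely
            pushed = False
            for v in range(n):
                # push along admissible residual edges (height drops by 1)
                if cap[u][v] - flow[u][v] > 0 and height[u] == height[v] + 1:
                    d = min(excess[u], cap[u][v] - flow[u][v])
                    flow[u][v] += d
                    flow[v][u] -= d
                    excess[u] -= d
                    excess[v] += d
                    if v not in (s, t) and v not in active:
                        active.append(v)
                    pushed = True
                    if excess[u] == 0:
                        break
            if not pushed:             # relabel: no admissible edge remains
                height[u] = 1 + min(height[v] for v in range(n)
                                    if cap[u][v] - flow[u][v] > 0)
    return sum(flow[s][v] for v in range(n))

# toy network: s=0, t=3; the minimum cut {s} has capacity 3 + 2 = 5
cap = [[0, 3, 2, 0],
       [0, 0, 1, 2],
       [0, 0, 0, 4],
       [0, 0, 0, 0]]
maxflow = push_relabel_maxflow(cap, 0, 3)
```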
An interior point algorithm for minimum sum of squares clustering
SIAM J. Sci. Comput., 1997
"... Abstract. An exact algorithm is proposed for minimum sumofsquares nonhierarchical clustering, i.e., for partitioning a given set of points from a Euclidean mspace into a given number of clusters in order to minimize the sum of squared distances from all points to the centroid of the cluster to wh ..."
Cited by 20 (8 self)
Abstract. An exact algorithm is proposed for minimum sum-of-squares nonhierarchical clustering, i.e., for partitioning a given set of points from a Euclidean m-space into a given number of clusters in order to minimize the sum of squared distances from all points to the centroid of the cluster to which they belong. This problem is expressed as a constrained hyperbolic program in 0-1 variables. The resolution method combines an interior point algorithm, i.e., a weighted analytic center column generation method, with branch-and-bound. The auxiliary problem of determining the entering column (i.e., the oracle) is an unconstrained hyperbolic program in 0-1 variables with a quadratic numerator and linear denominator. It is solved through a sequence of unconstrained quadratic programs in 0-1 variables. To accelerate resolution, variable neighborhood search heuristics are used both to get a good initial solution and to solve the auxiliary problem quickly as long as global optimality is not reached. Estimated bounds for the dual variables are deduced from the heuristic solution and used in the resolution process as a trust region. Provably minimum sum-of-squares partitions are determined for the first time for several fairly large data sets from the literature, including Fisher’s 150 iris. Key words. classification and discrimination, cluster analysis, interior-point methods, combinatorial optimization
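The objective this abstract optimizes exactly, the within-cluster sum of squares, is simple to state in code. A sketch of evaluating it for a fixed partition (toy coordinates, not the branch-and-bound machinery of the paper):

```python
from collections import defaultdict

def wcss(points, labels):
    """Within-cluster sum of squares: total squared distance from each
    point to the centroid of the cluster it is assigned to."""
    groups = defaultdict(list)
    for p, lab in zip(points, labels):
        groups[lab].append(p)
    total = 0.0
    for pts in groups.values():
        dim = len(pts[0])
        centroid = [sum(p[d] for p in pts) / len(pts) for d in range(dim)]
        total += sum((p[d] - centroid[d]) ** 2 for p in pts for d in range(dim))
    return total

# toy data: two obvious clusters in the plane (made-up coordinates);
# every point lies exactly 1 unit from its cluster centroid
pts = [(0.0, 0.0), (0.0, 2.0), (10.0, 0.0), (10.0, 2.0)]
obj = wcss(pts, [0, 0, 1, 1])
```

The hard part, which the paper addresses, is searching over all partitions for the one minimizing this value with a proof of optimality.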
Finding a global optimal solution for a quadratically constrained fractional quadratic problem with applications to the regularized total least squares
 SIAM J. Matrix Anal. Appl
"... Abstract. We consider the problem of minimizing a fractional quadratic problem involving the ratio of two indefinite quadratic functions, subject to a twosided quadratic form constraint. This formulation is motivated by the socalled regularized total least squares (RTLS) problem. A key difficulty ..."
Cited by 12 (5 self)
Abstract. We consider the problem of minimizing a fractional quadratic objective involving the ratio of two indefinite quadratic functions, subject to a two-sided quadratic form constraint. This formulation is motivated by the so-called regularized total least squares (RTLS) problem. A key difficulty with this problem is its nonconvexity, and all currently known methods for solving it are guaranteed only to converge to a point satisfying first-order necessary optimality conditions. We prove that a global optimal solution to this problem can be found by solving a sequence of very simple convex minimization problems parameterized by a single parameter. As a result, we derive an efficient algorithm that produces an ε-global optimal solution with a computational effort of O(n³ log ε⁻¹). The algorithm is tested on problems arising from the inverse Laplace transform and image deblurring. Comparison to other well-known RTLS solvers illustrates the attractiveness of our new method. Key words. regularized total least squares, fractional programming, nonconvex quadratic optimization, convex programming
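The "sequence of convex problems parameterized by a single parameter" strategy can be illustrated on a toy one-dimensional ratio: a candidate ratio value α is achievable iff min_x f1(x) − α·f2(x) ≤ 0, so bisection on α reaches an ε-global answer in O(log(1/ε)) inner solves. The functions below are made-up stand-ins, not the paper's RTLS objective:

```python
def min_ratio_bisect(lo, hi, inner_min, eps=1e-8):
    """epsilon-global minimization of f1/f2 (f2 > 0) by bisection on alpha:
    the ratio alpha is achievable iff min_x f1(x) - alpha*f2(x) <= 0."""
    while hi - lo > eps:
        mid = 0.5 * (lo + hi)
        if inner_min(mid) <= 0:
            hi = mid       # mid is achievable: shrink from above
        else:
            lo = mid       # mid is too small: shrink from below
    return 0.5 * (lo + hi)

# toy instance: minimize (x^2 + 1) / (x + 2) over x in [0, 2];
# the inner convex problem has closed-form minimizer x = clamp(a/2, 0, 2)
def inner(a):
    x = min(max(a / 2.0, 0.0), 2.0)
    return (x * x + 1.0) - a * (x + 2.0)

ratio = min_ratio_bisect(0.0, 1.0, inner)   # true optimum is 2*sqrt(5) - 4
```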
On the solution of the Tikhonov regularization of the total least squares problem
 SIAM J. Optim
"... Abstract. Total least squares (TLS) is a method for treating an overdetermined system of linear equations Ax ≈ b, where both the matrix A and the vector b are contaminated by noise. Tikhonov regularization of the TLS (TRTLS) leads to an optimization problem of minimizing the sum of fractional quadra ..."
Cited by 11 (6 self)
Abstract. Total least squares (TLS) is a method for treating an overdetermined system of linear equations Ax ≈ b, where both the matrix A and the vector b are contaminated by noise. Tikhonov regularization of the TLS problem (TRTLS) leads to an optimization problem of minimizing the sum of a fractional quadratic function and a quadratic function. As such, the problem is nonconvex. We show how to reduce the problem to a single-variable minimization of a function G over a closed interval. Computing a value and a derivative of G amounts to solving a single trust-region subproblem. For the special case of regularization with a squared Euclidean norm, we show that G is unimodal and provide an alternative algorithm, which requires only one spectral decomposition. A numerical example is given to illustrate the effectiveness of our method.
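Once a problem is reduced to one-dimensional minimization of a unimodal G, any derivative-free interval method applies. A golden-section sketch (the quadratic `g` below is a stand-in: evaluating the paper's actual G would require solving a trust-region subproblem per call):

```python
import math

def golden_section_min(g, a, b, tol=1e-10):
    """Minimize a unimodal function g on [a, b] by golden-section search."""
    inv_phi = (math.sqrt(5.0) - 1.0) / 2.0
    c = b - inv_phi * (b - a)
    d = a + inv_phi * (b - a)
    while b - a > tol:
        if g(c) < g(d):            # minimizer lies in [a, d]
            b, d = d, c
            c = b - inv_phi * (b - a)
        else:                      # minimizer lies in [c, b]
            a, c = c, d
            d = a + inv_phi * (b - a)
    return 0.5 * (a + b)

# stand-in unimodal function with known minimizer t = 1.3
x_star = golden_section_min(lambda t: (t - 1.3) ** 2 + 0.5, 0.0, 4.0)
```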
Optimal Time Domain Equalization Design for Maximizing Data Rate of Discrete Multi-Tone Systems
, 2003
"... The traditional discrete multitone equalizer is a cascade of a time domain equalizer (TEQ) as a single finite impulse response filter, a multicarrier demodulator as a fast Fourier transform (FFT), and a frequency domain equalizer (FEQ) as a onetap filter bank. The TEQ shortens the transmission cha ..."
Cited by 10 (5 self)
The traditional discrete multitone equalizer is a cascade of a time domain equalizer (TEQ) as a single finite impulse response filter, a multicarrier demodulator as a fast Fourier transform (FFT), and a frequency domain equalizer (FEQ) as a one-tap filter bank. The TEQ shortens the transmission channel impulse response (CIR) to mitigate intersymbol interference (ISI). Maximum Bit Rate (MBR) and Minimum ISI (Min-ISI) methods achieve higher data rates at the TEQ output than previously published methods. As an alternative to the traditional equalizer, the per-tone equalizer (PTE) moves the TEQ into the FEQ and customizes a multitap FEQ for each tone. In this paper, we propose a time domain TEQ filter bank (TEQ-FB) and a single TEQ that achieve better data rates at the FEQ output than the MBR, Min-ISI, and least-squares PTE methods with standard CIRs, transmit filters, and receive filters. The contributions of this paper are: (1) a model for the signal-to-noise ratio (SNR) at the FFT output that includes ISI, near-end crosstalk, white Gaussian noise, analog-to-digital converter quantization noise, and the digital noise floor; (2) a data-rate-optimal time domain per-tone TEQ filter bank and the upper bound on bit rate performance it achieves; and (3) a data-rate-maximizing single-TEQ design algorithm.
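The data rate being maximized can be sketched with the standard SNR-gap approximation, which loads b_i = log2(1 + SNR_i/Γ) bits on tone i. The per-tone SNR values and nominal gap below are illustrative; the paper's contribution is the SNR model and the TEQ designs that feed this formula, not the formula itself:

```python
import math

def dmt_bits_per_symbol(snrs_db, gap_db=9.8):
    """Total bits per DMT symbol under the SNR-gap approximation:
    b_i = log2(1 + SNR_i / Gamma), summed over the tones."""
    gap = 10.0 ** (gap_db / 10.0)
    return sum(math.log2(1.0 + 10.0 ** (s / 10.0) / gap) for s in snrs_db)

# illustrative per-tone SNRs in dB; a higher SNR on a tone loads more bits
rate = dmt_bits_per_symbol([30.0, 25.0, 20.0])
```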
Fractional programming: the sum-of-ratios case
Optimization Methods and Software, 2003
"... One of the most difficult fractional programs encountered so far is the sumofratios problem. Contrary to earlier expectations it is much more removed from convex programming than other multiratio problems analyzed before. It really should be viewed in the context of global optimization. It proves ..."
Cited by 10 (1 self)
One of the most difficult fractional programs encountered so far is the sum-of-ratios problem. Contrary to earlier expectations, it is much further removed from convex programming than other multi-ratio problems analyzed before, and it really should be viewed in the context of global optimization. It proves to be essentially NP-complete in spite of its special structure under the usual assumptions on numerators and denominators. The paper provides a recent survey of applications, theoretical results, and various algorithmic approaches for this challenging problem.
Homology Search with Fragmented Nucleic Acid Sequence Patterns
"... Abstract. The comprehensive annotation of noncoding RNAs in newly sequenced genomes is still a largely unsolved problem because many functional RNAs exhibit not only poorly conserved sequences but also large variability in structure. In many cases, such as Y RNAs, vault RNAs, or telomerase RNAs, se ..."
Cited by 10 (4 self)
Abstract. The comprehensive annotation of non-coding RNAs in newly sequenced genomes is still a largely unsolved problem because many functional RNAs exhibit not only poorly conserved sequences but also large variability in structure. In many cases, such as Y RNAs, vault RNAs, or telomerase RNAs, sequences differ by large insertions or deletions and have only a few small sequence patterns in common. Here we present fragrep2, a purely sequence-based approach to detect such patterns in complete genomes. A fragrep2 pattern consists of an ordered list of position-specific weight matrices (PWMs) describing short, approximately conserved sequence elements that are separated by intervals of non-conserved regions of bounded length. The program uses a fractional programming approach to align the PWMs to genomic DNA in order to allow for a bounded number of insertions and deletions in the patterns. These patterns are then combined into significant combinations of PWMs. At this step, a subset of PWMs may be deleted, i.e., have no
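The basic PWM-matching step can be illustrated without the fractional-programming alignment: score each window of a sequence against a position weight matrix and keep the best offset. The matrix values and sequence below are made up; fragrep2 additionally allows bounded insertions and deletions, which this sketch does not:

```python
def pwm_score(pwm, window):
    """Score of a DNA window against a position weight matrix: the sum of
    the per-position weights for the observed bases."""
    idx = {"A": 0, "C": 1, "G": 2, "T": 3}
    return sum(col[idx[base]] for col, base in zip(pwm, window))

def best_match(pwm, seq):
    """Slide the PWM along the sequence; return the best-scoring offset."""
    k = len(pwm)
    return max(range(len(seq) - k + 1),
               key=lambda i: pwm_score(pwm, seq[i:i + k]))

# toy 3-column matrix (weights over A, C, G, T) strongly preferring "GAT"
pwm = [[-1.0, -1.0, 2.0, -1.0],   # position 1 prefers G
       [2.0, -1.0, -1.0, -1.0],   # position 2 prefers A
       [-1.0, -1.0, -1.0, 2.0]]   # position 3 prefers T
hit = best_match(pwm, "CCGATCC")  # "GAT" starts at offset 2
```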
Continuous ratio optimization via convex relaxation with applications to multi-view 3D reconstruction
In: CVPR, 2009
"... We introduce a convex relaxation framework to optimally minimize continuous surface ratios. The key idea is to minimize the continuous surface ratio by solving a sequence of convex optimization problems. We show that such minimal ratios are superior to traditionally used minimal surface formulations ..."
Cited by 9 (0 self)
We introduce a convex relaxation framework to optimally minimize continuous surface ratios. The key idea is to minimize the continuous surface ratio by solving a sequence of convex optimization problems. We show that such minimal ratios are superior to traditionally used minimal surface formulations in that they do not suffer from a shrinking bias and no longer require the choice of a regularity parameter. The absence of a shrinking bias in the minimal ratio model is proven analytically. Furthermore, we demonstrate that continuous ratio optimization can be applied to derive a new algorithm for reconstructing three-dimensional silhouette-consistent objects from multiple views. Experimental results confirm that our approach accurately reconstructs deep concavities even without the specification of tuning parameters.
Fast Algorithms for L∞ Problems in Multiview Geometry
"... Many problems in multiview geometry, when posed as minimization of the maximum reprojection error across observations, can be solved optimally in polynomial time. We show that these problems are instances of a convexconcave generalized fractional program. We survey the major solution methods for s ..."
Cited by 8 (1 self)
Many problems in multiview geometry, when posed as minimization of the maximum reprojection error across observations, can be solved optimally in polynomial time. We show that these problems are instances of a convex-concave generalized fractional program. We survey the major solution methods for problems of this form and present them in a unified framework centered around a single parametric optimization problem. We propose two new algorithms and show that the algorithm proposed by Olsson et al. [21] is a special case of a classical algorithm for generalized fractional programming. The performance of all the algorithms is compared on a variety of datasets, and the algorithm proposed by Gugat [12] stands out as a clear winner. An open-source MATLAB toolbox that implements all the algorithms presented here is made available.
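The parametric core shared by these L∞ methods is bisection on the maximum-error level γ: a level γ is feasible iff the γ-sublevel sets of all residuals have a common point. A one-dimensional sketch with f_i(x) = |x − a_i| standing in for reprojection errors (toy data; in multiview geometry the feasibility test is a small convex program rather than an interval intersection):

```python
def minimax_bisect(points, lo, hi, eps=1e-9):
    """Minimize max_i |x - a_i| by bisection on the error level gamma:
    gamma is feasible iff the intervals [a_i - gamma, a_i + gamma]
    have a nonempty intersection."""
    while hi - lo > eps:
        gamma = 0.5 * (lo + hi)
        left = max(a - gamma for a in points)    # intersection lower end
        right = min(a + gamma for a in points)   # intersection upper end
        if left <= right:
            hi = gamma     # feasible: some x attains max error <= gamma
        else:
            lo = gamma     # infeasible: the level must be raised
    return 0.5 * (lo + hi)

# toy residual centers; the optimum is the midrange, with max error 2.5
err = minimax_bisect([0.0, 1.0, 5.0], 0.0, 10.0)
```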