Results 1–9 of 9
Strict L∞ Isotonic Regression
"... Given a realvalued function f with weights w on a finite DAG G = (V, E), an isotonic regression of (f, w) is an orderpreserving realvalued function on V which minimizes the regression error among all such functions. When the regression error is defined via the L ∞ norm typically there is not a un ..."
Abstract

Cited by 3 (3 self)
Given a real-valued function f with weights w on a finite DAG G = (V, E), an isotonic regression of (f, w) is an order-preserving real-valued function on V which minimizes the regression error among all such functions. When the regression error is defined via the L∞ norm, there is typically no unique isotonic regression, unlike the behavior for the Lp norms, 1 < p < ∞. Here a partial ordering is imposed on isotonic regressions, one that refines the notion of minimizing the largest regression errors. This order results in a unique minimal L∞ isotonic regression, called the strict L∞ isotonic regression. Further, strict L∞ isotonic regression is the limit, as p goes to infinity, of Lp isotonic regression. Algorithms are given showing that if G has n vertices, then for linear or tree orderings pool adjacent violators (PAV) yields the strict isotonic regression in Θ(n log n) time, and for arbitrary DAGs it can be determined in time proportional to the time required to generate the transitive closure. Several algorithms for generating non-strict L∞ isotonic regressions have previously appeared in the literature. We examine their behavior as mappings from weighted functions over G to isotonic functions over G, showing that the fastest algorithms are not monotonic mappings, and no previously studied algorithm preserves level set trimming. In contrast, the strict L∞ isotonic regression, and Lp regression for all 1 < p < ∞, is monotonic and preserves level set trimming.
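For readers unfamiliar with pool adjacent violators, here is a minimal sketch of PAV on a linear order using weighted L2 means as block values. The strict L∞ regression above uses different block statistics, so this is background rather than the paper's algorithm; the function name and interface are illustrative.

```python
def pav_l2(y, w):
    """Weighted L2 isotonic regression on a linear order via pool
    adjacent violators: merge adjacent blocks whenever their means
    violate the nondecreasing order."""
    blocks = []  # each block: [total weight, weighted sum, size]
    for yi, wi in zip(y, w):
        blocks.append([wi, wi * yi, 1])
        # Pool while the previous block's mean exceeds the new one's.
        while len(blocks) > 1 and \
                blocks[-2][1] / blocks[-2][0] > blocks[-1][1] / blocks[-1][0]:
            wt, ws, sz = blocks.pop()
            blocks[-1][0] += wt
            blocks[-1][1] += ws
            blocks[-1][2] += sz
    out = []
    for wt, ws, sz in blocks:
        out.extend([ws / wt] * sz)  # block value = weighted mean
    return out
```

For example, `pav_l2([1, 3, 2, 4], [1, 1, 1, 1])` pools the violating pair 3, 2 into a block valued at their mean 2.5. Each element is pooled at most once, so the loop is linear-time overall.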
Lipschitz unimodal and isotonic regression on paths and trees
, 2008
"... Let M = (V, A) be a planar graph, let γ ≥ 0 be a real parameter, and t: V → R a height function. A γLipschitz unimodal regression (γLUR) of t is a function s: V → R such that s has a unique local minimum, s(u) − s(v)  ≤ γ for each {u, v} ∈ A, and ‖s − t‖2 = ∑ v∈V (s(v) − t(v))2 is minimized. ..."
Abstract

Cited by 2 (1 self)
Let M = (V, A) be a planar graph, let γ ≥ 0 be a real parameter, and t: V → R a height function. A γ-Lipschitz unimodal regression (γ-LUR) of t is a function s: V → R such that s has a unique local minimum, |s(u) − s(v)| ≤ γ for each {u, v} ∈ A, and ‖s − t‖₂² = ∑_{v∈V} (s(v) − t(v))² is minimized. Here, a local minimum of s is a vertex v such that s(u) > s(v) for every neighbor u of v. For a directed planar graph, s: V → R is the γ-Lipschitz isotonic regression (γ-LIR) of t if s(u) ≤ s(v) ≤ s(u) + γ for each directed edge (u, v) and ‖s − t‖₂² is minimized. These problems arise, for example, in topological simplification of a height function. We present near-linear-time algorithms for the LUR and LIR problems for two special cases where M is a path or a tree.
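To make the constraint set concrete, here is a small checker for the γ-LIR feasibility condition on a path, with vertices listed in edge order. This is a sketch for intuition only; the function name is ours, not the paper's.

```python
def is_gamma_lir_feasible(s, gamma):
    """Check the gamma-LIR constraints on a path: for each directed
    edge (v_i, v_{i+1}), require s(v_i) <= s(v_{i+1}) <= s(v_i) + gamma."""
    return all(s[i] <= s[i + 1] <= s[i] + gamma
               for i in range(len(s) - 1))
```

The regression itself must additionally minimize the squared distance ‖s − t‖₂² over all feasible s, which the paper does in near-linear time for paths and trees.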
Algorithms for L∞ Isotonic Regression
, 2009
"... This paper gives algorithms for determining L ∞ weighted isotonic regressions satisfying order constraints given by a DAG with n vertices and m edges. Throughout, topological sorting plays an important role. A modification to an algorithm of Kaufman and Tamir gives an algorithm taking Θ(m log n) tim ..."
Abstract

Cited by 1 (1 self)
This paper gives algorithms for determining L∞ weighted isotonic regressions satisfying order constraints given by a DAG with n vertices and m edges. Throughout, topological sorting plays an important role. A modification to an algorithm of Kaufman and Tamir gives an algorithm taking Θ(m log n) time for the general case, improving upon theirs when the graph is sparse. When the regression values are restricted to a set S, scaling can be used to find an optimal regression in Θ(m log |S|) time. The prefix isotonic regression problem is used as an intermediate step in finding isotonic regressions for some specific orders. For rooted trees the prefix isotonic regression problem is solved in Θ(n log n) time, allowing one to find the unimodal regression of a linear order in the same time bound. When the vertices are points in d-dimensional space ordered by domination, the prefix isotonic problem can be solved, and hence the isotonic regression determined, in Θ(n log^d n) time.
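For intuition about L∞ isotonic regression, the unweighted linear-order case has a well-known closed form: the midpoint of the running prefix maximum and suffix minimum. This is far simpler than the weighted DAG algorithms above and is shown only as a baseline; the function name is ours.

```python
def linf_isotonic_linear(f):
    """Unweighted L-infinity isotonic regression on a linear order:
    g(i) = (max_{j<=i} f(j) + min_{j>=i} f(j)) / 2."""
    n = len(f)
    pre = [f[0]] * n   # prefix maxima
    for i in range(1, n):
        pre[i] = max(pre[i - 1], f[i])
    suf = [f[-1]] * n  # suffix minima
    for i in range(n - 2, -1, -1):
        suf[i] = min(suf[i + 1], f[i])
    return [(pre[i] + suf[i]) / 2 for i in range(n)]
```

The output is nondecreasing, and its maximum error is half the largest violation between an out-of-order pair, which is a lower bound for any isotonic fit.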
Isotonic Median Regression via Scaling
"... This paper gives algorithms for determining isotonic median regressions (i.e., isotonic regression using the L1 metric) satisfying order constraints given by various ordered sets. For a rooted tree the regression can be found in Θ(n log n) time, while for a star it can be found in Θ(n) time, where n ..."
Abstract
This paper gives algorithms for determining isotonic median regressions (i.e., isotonic regression using the L1 metric) satisfying order constraints given by various ordered sets. For a rooted tree the regression can be found in Θ(n log n) time, while for a star it can be found in Θ(n) time, where n is the number of vertices. For bivariate data, when the set is a grid the regression can be found in Θ(n log n) time, while for general sets it can be found in Θ(n log² n) time. For vertices in d-dimensional index space, d ≥ 3, the regression can be found in Θ(n² log^{2d−1} n) time for general placement. When there are multiple data values per point, with N total values, the regression for tree and bivariate grid orders can be determined in Θ(n log n + N log log N) time. Most of the algorithms are based on a scaling approach which exploits the fact that L1 regression values can always be chosen to be data values.
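As a concrete instance of median-based block values, here is pool adjacent violators with medians for unweighted L1 isotonic regression on a linear order. This is a simple quadratic-style sketch, not the paper's scaling algorithm, and the names are ours.

```python
import statistics

def pav_l1(y):
    """Unweighted L1 isotonic regression on a linear order: pool
    adjacent violators, using the median as each block's value."""
    blocks = []  # each block keeps its raw data values
    for v in y:
        blocks.append([v])
        # Pool while the previous block's median exceeds the new one's.
        while len(blocks) > 1 and \
                statistics.median(blocks[-2]) > statistics.median(blocks[-1]):
            last = blocks.pop()
            blocks[-1].extend(last)
    out = []
    for b in blocks:
        m = statistics.median(b)
        out.extend([m] * len(b))
    return out
```

Note that `statistics.median` returns the midpoint for even-sized blocks; any value in the median interval is optimal, consistent with the abstract's observation that L1 regression values can always be chosen to be data values.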
An Approach to Computing Multidimensional Isotonic Regressions
"... This paper gives an approach for determining isotonic regressions for data at points in multidimensional space, with the ordering given by domination. Recent algorithmic advances for 2dimensional isotonic regressions have made them useful for significantly larger data sets, and here we provide an a ..."
Abstract
This paper gives an approach for determining isotonic regressions for data at points in multidimensional space, with the ordering given by domination. Recent algorithmic advances for 2-dimensional isotonic regressions have made them useful for significantly larger data sets, and here we provide an advance for dimensions 3 and larger. Given a set V of n d-dimensional points, it is shown that an isotonic regression on V can be determined in Θ̃(n²), Θ̃(n³), and Θ̃(n) time for the L1, L2, and L∞ metrics, respectively. This improves upon previous results by a factor of Θ̃(n). The core of the approach is in extending the regression to a set of points V′ ⊃ V where the domination ordering on V′ can be represented with relatively few edges.
Optimal Reduced Isotonic Regression
In Proceedings Interface 2012: Future of Statistical Computing
"... Isotonic regression is a shapeconstrained nonparametric regression in which the ordinate is a nondecreasing function of the abscissa. The regression outcome is an increasing step function. For an initial set of n points, the number of steps in the isotonic regression, m, may be as large as n. As a ..."
Abstract
Isotonic regression is a shape-constrained nonparametric regression in which the ordinate is a nondecreasing function of the abscissa. The regression outcome is an increasing step function. For an initial set of n points, the number of steps in the isotonic regression, m, may be as large as n. As a result, the full isotonic regression has been criticized as overfitting the data or making the representation too complicated. So-called "reduced" isotonic regression constrains the outcome to be a specified number of steps, b. The fastest previous algorithm for determining an optimal reduced isotonic regression takes Θ(n + bm²) time for the L2 metric. However, researchers have found this to be too slow and have instead used approximations. In this paper, we reduce the time for the exact solution to Θ(n + bm log m). Our approach is based on a new algorithm for finding an optimal b-step approximation of isotonic data. This algorithm takes Θ(n log n) time for the L1 and L2 metrics.
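The b-step approximation subproblem can be illustrated with a straightforward dynamic program: for nondecreasing input, an optimal b-step L2 fit partitions the sequence into b contiguous segments, each valued at its mean. This O(n²b) sketch is only a baseline, far slower than the paper's Θ(n + bm log m) bound; the function name is ours.

```python
def best_b_step_sse(y, b):
    """Minimum total squared error of a b-step approximation of the
    nondecreasing sequence y, via dynamic programming over segment ends."""
    n = len(y)
    ps = [0.0] * (n + 1)   # prefix sums
    ps2 = [0.0] * (n + 1)  # prefix sums of squares
    for i, v in enumerate(y):
        ps[i + 1] = ps[i] + v
        ps2[i + 1] = ps2[i] + v * v

    def sse(i, j):  # squared error of y[i:j] fit by its mean
        s, s2, m = ps[j] - ps[i], ps2[j] - ps2[i], j - i
        return s2 - s * s / m

    INF = float('inf')
    # dp[k][j] = best cost covering y[:j] with exactly k steps
    dp = [[INF] * (n + 1) for _ in range(b + 1)]
    dp[0][0] = 0.0
    for k in range(1, b + 1):
        for j in range(k, n + 1):
            dp[k][j] = min(dp[k - 1][i] + sse(i, j) for i in range(k - 1, j))
    return dp[b][n]
```

For nondecreasing y the segment means are automatically nondecreasing, so the resulting step function is itself isotonic.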
École Doctorale: MATISSE, presented by
, 2013
"... Je tiens tout d’abord à remercier mes encadrants. Après avoir grandemment contribué à me faire venir enseigner à l’Université, Eric MatznerLøber m’a fait confiance pour ce travail de recherche; je tiens à lui témoigner toute ma reconnaissance et mes remerciements à son égard dépassent très largemen ..."
Abstract
First of all, I would like to thank my advisors. After contributing greatly to bringing me to teach at the University, Eric Matzner-Løber trusted me with this research work; I wish to express all my gratitude to him, and my thanks extend far beyond the professional sphere. Arnaud Guyader agreed without hesitation to help me at the very moment when great difficulties lay ahead. I thank him immensely for his attentiveness and his availability; this manuscript owes him a great deal. Finally, let us render unto Caesar what is Caesar's: the idea of the method presented here came straight from the prodigious mind of Nicolas Hengartner. By inviting me twice to Los Alamos in recent years, he gave me the privilege of working alongside him. I am now certain of at least one thing: genius and enthusiasm go together! I warmly thank Cécile Durot and Sylvain Sardy for the interest they showed in my work by agreeing to review this thesis. I am also very grateful to Christophe Abraham and Gérard Biau for taking time out of their precious schedules to serve on the jury. I would also like to thank Marie de Tayrac for providing me with the medical data enabling
Isotonic Regression via Partitioning
Appears in Algorithmica 66 (2013), pp. 93–112.
"... Algorithms are given for determining weighted isotonic regressions satisfying order constraints specified via a directed acyclic graph (DAG). For the L1 metric a partitioning approach is used which exploits the fact that L1 regression values can always be chosen to be data values. Extending this app ..."
Abstract
Algorithms are given for determining weighted isotonic regressions satisfying order constraints specified via a directed acyclic graph (DAG). For the L1 metric a partitioning approach is used which exploits the fact that L1 regression values can always be chosen to be data values. Extending this approach, algorithms for binary-valued L1 isotonic regression are used to find Lp isotonic regressions for 1 < p < ∞. Algorithms are given for trees, 2-dimensional and multidimensional orderings, and arbitrary DAGs. Algorithms are also given for Lp isotonic regression with constrained data and weight values, L1 regression with unweighted data, and L1 regression for DAGs where there are multiple data values at the vertices.
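The binary-valued building block mentioned above is especially simple on a path: with only two allowed values lo < hi and a nondecreasing constraint, the solution is lo up to some switch point and hi afterward, so one pass over candidate switch points suffices. Here is a hedged sketch of this special case; the general DAG case in the paper is more involved, and the names are ours.

```python
def binary_l1_isotonic_path(y, lo, hi, w=None):
    """Weighted L1 isotonic regression on a path when values are
    restricted to {lo, hi}, lo < hi: pick the switch point k so that
    the first k vertices get lo and the rest get hi, minimizing cost."""
    n = len(y)
    w = w if w is not None else [1.0] * n
    cost_lo = [w[i] * abs(y[i] - lo) for i in range(n)]
    cost_hi = [w[i] * abs(y[i] - hi) for i in range(n)]
    suf = [0.0] * (n + 1)  # suffix sums of cost_hi
    for i in range(n - 1, -1, -1):
        suf[i] = suf[i + 1] + cost_hi[i]
    best_cost, best_k, pre = float('inf'), 0, 0.0
    for k in range(n + 1):
        if pre + suf[k] < best_cost:  # cost of switching after position k
            best_cost, best_k = pre + suf[k], k
        if k < n:
            pre += cost_lo[k]
    return [lo] * best_k + [hi] * (n - best_k)
```

Running it repeatedly with different {lo, hi} pairs is the flavor of the partitioning idea: each binary subproblem splits the vertex set into a "low" part and a "high" part.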
Weighted L∞ Isotonic Regression
"... Algorithms are given for determining weighted L ∞ isotonic regressions satisfying order constraints given by a directed acyclic graph (dag) withnvertices andmedges. An algorithm is given takingΘ(mlogn) time for the general case. However, it relies on parametric search, so a practical approach is int ..."
Abstract
Algorithms are given for determining weighted L∞ isotonic regressions satisfying order constraints given by a directed acyclic graph (DAG) with n vertices and m edges. An algorithm is given taking Θ(m log n) time for the general case. However, it relies on parametric search, so a practical approach is introduced, based on calculating prefix solutions. While not as fast in the general case, for linear and tree orderings prefix algorithms are used to determine isotonic and unimodal regressions in Θ(n log n) time. Algorithms are also given for determining isotonic regressions when the values are constrained to a specified set of values, such as the integers, and for situations where there are significantly fewer different weights, or fewer different values, than vertices. L∞ isotonic regressions are not unique, so we examine properties of the regressions an algorithm produces, in addition to the time it takes. In this aspect the prefix approach is superior to the parametric search approach.