Results 1 - 4 of 4
Mathematical Programming Algorithms for Regression-Based Nonlinear Filtering in R^N
, IEEE Transactions on Signal Processing, 1999
Abstract

Cited by 13 (2 self)
This paper is concerned with regression under a "sum" of partial order constraints. Examples include locally monotonic, piecewise monotonic, run-length-constrained, and unimodal and oligomodal regression. These are of interest not only in nonlinear filtering but also in density estimation and chromatographic analysis. It is shown that under a least absolute error criterion, these problems can be transformed into appropriate finite problems, which can then be efficiently solved via dynamic programming techniques. Although the result does not carry over to least squares regression, hybrid programming algorithms can be developed to solve least squares counterparts of certain problems in the class. Index Terms: Dynamic programming, locally monotonic, monotone regression, nonlinear filtering, oligomodal, piecewise monotonic, regression under order constraints, run-length constrained, unimodal.
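The abstract's core idea, that under a least absolute error criterion the problem reduces to a finite one solvable by dynamic programming, can be illustrated on the simplest member of the class: monotone (isotonic) regression. The sketch below is a hedged reconstruction, not the paper's algorithm; the function name and the choice to restrict candidate output levels to the input's own values are assumptions made for illustration.

```python
def l1_isotonic_dp(y):
    """L1-optimal non-decreasing fit to y via DP over a finite level set.

    Illustrative sketch: candidate output values are restricted to the
    (finite) set of input values, which suffices under an L1 criterion.
    """
    levels = sorted(set(y))             # finite candidate output values
    n, m = len(y), len(levels)
    # cost[j]: best cost of a non-decreasing prefix ending at levels[j]
    cost = [abs(y[0] - l) for l in levels]
    parents = []
    for i in range(1, n):
        pref, arg = float("inf"), -1
        row, new_cost = [], []
        for j in range(m):
            if cost[j] < pref:          # running prefix minimum over k <= j
                pref, arg = cost[j], j
            row.append(arg)
            new_cost.append(pref + abs(y[i] - levels[j]))
        parents.append(row)
        cost = new_cost
    j = min(range(m), key=cost.__getitem__)
    out = [levels[j]]
    for row in reversed(parents):       # backtrack through prefix-min argmins
        j = row[j]
        out.append(levels[j])
    return out[::-1]
```

The prefix-minimum trick makes each DP stage linear in the number of levels, so the whole fit costs O(N·m) rather than the naive O(N·m²).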
Fast Digital Locally Monotonic Regression
, 1997
Abstract

Cited by 6 (1 self)
Locally monotonic regression is the optimal counterpart of iterated median filtering. In [1], Restrepo and Bovik developed an elegant mathematical framework in which they studied locally monotonic regressions in R^N. The drawback is that the complexity of their algorithms is exponential in N. In this paper, we consider digital locally monotonic regressions, in which the output symbols are drawn from a finite alphabet, and, by making a connection to Viterbi decoding, provide a fast O(|A|^2 αN) algorithm that computes any such regression, where |A| is the size of the digital output alphabet, α stands for lomo-degree, and N is sample size. This is linear in N, and it renders the technique applicable in practice. I. Introduction. Local monotonicity is a property that appears in the study of the set of root signals of the median filter [2], [3], [4], [5], [6], [7], [8]; it constrains the roughness of a signal by limiting the rate at which the signal undergoes changes of trend (inc ...
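The Viterbi connection can be sketched in a few lines, assuming one standard characterization of lomo-degree α: any two opposite-direction changes of trend must be at least α−1 samples apart. Trellis states record the current symbol together with the direction and (capped) age of the most recent change. The function name and this particular state encoding are the sketch's own choices, not necessarily the paper's trellis.

```python
def digital_lomo_viterbi(y, alphabet, alpha, dist=lambda u, v: abs(u - v)):
    """Best lomo-alpha fit over a finite alphabet, Viterbi-style sketch."""
    CAP = alpha - 1                      # required gap between opposite changes
    # state: (value, sign of last change, steps since it, capped at CAP);
    # sign 0 with gap CAP encodes "no change yet" (any first change allowed)
    V = {(a, 0, CAP): (dist(y[0], a), [a]) for a in alphabet}
    for n in range(1, len(y)):
        NV = {}
        for (a, s, g), (c, path) in V.items():
            for b in alphabet:
                if b == a:
                    ns = (a, s, min(g + 1, CAP))
                else:
                    t = 1 if b > a else -1
                    if s == -t and g < CAP:
                        continue         # opposite-direction change too soon
                    ns = (b, t, 1)
                nc = c + dist(y[n], b)
                if ns not in NV or nc < NV[ns][0]:
                    NV[ns] = (nc, path + [b])
        V = NV
    cost, path = min(V.values())
    return path, cost
```

There are O(|A|·α) states and |A| transitions per state per sample, matching the O(|A|^2 αN) complexity quoted in the abstract (storing full paths per state is a simplification; a production version would backtrack through parent pointers).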
The Viterbi Optimal Runlength-Constrained Approximation Nonlinear Filter
, 1995
Abstract

Cited by 6 (3 self)
Simple nonlinear filters are often used to enforce "hard" syntactic constraints while remaining close to the observation data; e.g., in the binary case it is common practice to employ iterations of a suitable median, or a one-pass recursive median, open-close, or clos-open filter to impose a minimum symbol run-length constraint while remaining "faithful" to the observation. Unfortunately, these filters are, in general, suboptimal. Motivated by this observation, we pose the following optimization: Given a finite-alphabet sequence of finite extent, y = {y(n)}_{n=0}^{N-1}, find a sequence x̂ = {x̂(n)}_{n=0}^{N-1} which minimizes d(x, y) = Σ_{n=0}^{N-1} d_n(y(n), x(n)) subject to: x is piecewise constant of plateau run-length M. We show how a suitable reformulation of the problem naturally leads to a simple and efficient Viterbi-type optimal algorithmic solution. We call the resulting nonlinear input-output operator the Viterbi Optimal Runlength-Constrained Approximation ...
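The Viterbi-type solution described above lends itself to a direct sketch: trellis states pair the current symbol with the (capped) length of its current run, and a symbol change is permitted only once the run has reached M. The function name, the state encoding, and the choice to enforce the constraint on boundary runs as well are this sketch's assumptions; the paper's own trellis may differ in such details.

```python
def viterbi_runlength(y, alphabet, M, dist=lambda u, v: abs(u - v)):
    """Closest run-length-M-constrained approximation of y (sketch)."""
    # state: (current symbol, current run length capped at M)
    V = {(a, 1): (dist(y[0], a), [a]) for a in alphabet}
    for n in range(1, len(y)):
        NV = {}
        for (a, r), (c, path) in V.items():
            for b in alphabet:
                if b == a:
                    ns = (a, min(r + 1, M))     # extend the current plateau
                elif r >= M:
                    ns = (b, 1)                 # switch only after M samples
                else:
                    continue
                nc = c + dist(y[n], b)
                if ns not in NV or nc < NV[ns][0]:
                    NV[ns] = (nc, path + [b])
        V = NV
    # accept only terminal states whose final run also satisfies the constraint
    cost, path = min(v for (a, r), v in V.items() if r >= M)
    return path, cost
```

On a binary input with an isolated spike, the filter flattens the spike rather than widening it when flattening is cheaper, which is exactly the optimality the iterated-median heuristics lack.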
EDITOR: R. Amor
, 2006
Abstract
SUMMARY: Building operators are confronted with large volumes of continuous data from multiple environmental sensors which require interpretation. The ABSTRACTOR system under development summarises historical data for interpretation and building performance assessment. The ABSTRACTOR algorithm converts time series data into a set of linear trends, which achieves data compression and facilitates the identification of significant events on concurrent data streams. It uses a temporal expert system based on associational reasoning and applies three consecutive processes: filtering, which is used to remove noise; interval identification, to generate from the filtered data temporal intervals characterised by a common direction of change (i.e., increasing, decreasing or steady); and interpretation, which performs summarisation and assists building performance assessments. Using the temporal intervals, interpretation involves differentiating between events which are environmentally insignificant and events which are environmentally significant. Inherent in this process are rules to represent these events. These rules support temporal reasoning and encapsulate knowledge to differentiate between events.
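The interval-identification step can be illustrated with a toy sketch: label each consecutive step of the (already filtered) series as increasing, decreasing, or steady, then merge consecutive steps with the same label into intervals. The function name, the `eps` tolerance, and the (start, end, label) output form are this sketch's assumptions, not ABSTRACTOR's actual interface.

```python
def trend_intervals(values, eps=0.0):
    """Segment a filtered series into increasing/decreasing/steady intervals."""
    def label(d):
        # steps within +/- eps of zero count as steady
        return "increasing" if d > eps else "decreasing" if d < -eps else "steady"

    intervals, start, cur = [], 0, None
    for i in range(1, len(values)):
        lab = label(values[i] - values[i - 1])
        if lab != cur:
            if cur is not None:
                intervals.append((start, i - 1, cur))
            start, cur = i - 1, lab     # new interval begins at previous sample
    if cur is not None:
        intervals.append((start, len(values) - 1, cur))
    return intervals
```

Each interval here is a candidate linear trend; a fuller version would also record the slope so that the interpretation stage can distinguish significant from insignificant changes.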