Results 1–10 of 52
Wavelet-based statistical signal processing using hidden Markov models
IEEE Transactions on Signal Processing, 1998
Abstract

Cited by 325 (52 self)
Abstract—Wavelet-based statistical signal processing techniques such as denoising and detection typically model the wavelet coefficients as independent or jointly Gaussian. These models are unrealistic for many real-world signals. In this paper, we develop a new framework for statistical signal processing based on wavelet-domain hidden Markov models (HMMs) that concisely models the statistical dependencies and non-Gaussian statistics encountered in real-world signals. Wavelet-domain HMMs are designed with the intrinsic properties of the wavelet transform in mind and provide powerful, yet tractable, probabilistic signal models. Efficient expectation-maximization algorithms are developed for fitting the HMMs to observational signal data. The new framework is suitable for a wide range of applications, including signal estimation, detection, classification, prediction, and even synthesis. To demonstrate the utility of wavelet-domain HMMs, we develop novel algorithms for signal denoising, classification, and detection. Index Terms—Hidden Markov model, probabilistic graph, wavelets.
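A core building block of such wavelet-domain models is a two-state, zero-mean Gaussian mixture on each coefficient: a low-variance "small" state and a high-variance "large" state. The sketch below fits only this independent-mixture special case by EM, ignoring the tree dependencies between scales that the paper's full HMMs capture; all parameter choices and names are illustrative.

```python
import numpy as np

def fit_two_state_mixture(w, n_iter=50):
    """EM for a two-state zero-mean Gaussian mixture of wavelet
    coefficients (independent-mixture special case of a wavelet-domain
    HMM; no inter-scale tree dependencies are modeled here)."""
    # Initialize mixing weights and per-state variances from the data.
    p = np.array([0.5, 0.5])
    var = np.array([0.5, 2.0]) * np.var(w)
    for _ in range(n_iter):
        # E-step: posterior probability of each state per coefficient.
        lik = np.exp(-w[:, None] ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)
        resp = p * lik
        resp /= resp.sum(axis=1, keepdims=True)
        # M-step: update mixing weights and per-state variances.
        p = resp.mean(axis=0)
        var = (resp * w[:, None] ** 2).sum(axis=0) / resp.sum(axis=0)
    return p, var

rng = np.random.default_rng(0)
# Synthetic "wavelet coefficients": mostly small, occasionally large.
large = rng.random(5000) < 0.2
w = np.where(large, rng.normal(0, 3.0, 5000), rng.normal(0, 0.3, 5000))
p, var = fit_two_state_mixture(w)
```

With the coefficients generated above, the fitted weights and variances should roughly recover the 0.8/0.2 mix and the two generating variances.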
Efficient multiscale regularization with applications to the computation of optical flow
IEEE Trans. Image Process., 1994
Abstract

Cited by 98 (33 self)
Abstract—A new approach to regularization methods for image processing is introduced and developed using as a vehicle the problem of computing dense optical flow fields in an image sequence. Standard formulations of this problem require the computationally intensive solution of an elliptic partial differential equation that arises from the often used “smoothness constraint” type of regularization. The interpretation of the smoothness constraint as a “fractal prior” is utilized to motivate regularization based on a recently introduced class of multiscale stochastic models. The solution of the new problem formulation is computed with an efficient multiscale algorithm. Experiments on several image sequences demonstrate the substantial computational savings that can be achieved due to the fact that the algorithm is noniterative and in fact has a per-pixel computational complexity that is independent of image size. The new approach also has a number of other important advantages. Specifically, multiresolution flow field estimates are available, allowing great flexibility in dealing with the tradeoff between resolution and accuracy. Multiscale error covariance information is also available, which is of considerable use in assessing the accuracy of the estimates. In particular, these error statistics can be used as the basis for a rational procedure for determining the spatially-varying optimal reconstruction resolution. Furthermore, if there are compelling reasons to insist upon a standard smoothness constraint, our algorithm provides an excellent initialization for the iterative algorithms associated with the smoothness-constraint problem formulation. Finally, the usefulness of our approach should extend to a wide variety of ill-posed inverse problems in which variational techniques seeking a “smooth” solution are generally used.
Information-Theoretic Analysis of Neural Coding
J. Comp. Neuroscience, 1998
Abstract

Cited by 57 (13 self)
We describe an approach to analyzing single- and multi-unit (ensemble) discharge patterns based on information-theoretic distance measures and on empirical theories derived from work in universal signal processing. In this approach, we quantify the difference between response patterns, be they time-varying or not, using information-theoretic distance measures. We apply these techniques to single- and multiple-unit processing of sound amplitude and sound location. These examples illustrate that neurons can simultaneously represent at least two kinds of information with different levels of fidelity. The fidelity can persist through a transient and a subsequent steady-state response, indicating that it is possible for an evolving neural code to represent information with constant fidelity.
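The central quantity in such an analysis is an information-theoretic distance between two empirical response distributions. The sketch below computes a plain Kullback-Leibler distance between binned spike-count histograms; the histograms and bin choices are hypothetical, and the paper's own estimators (drawn from universal signal processing) are more elaborate than this direct plug-in estimate.

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """Kullback-Leibler distance D(p || q), in bits, between two
    discrete response-pattern distributions (e.g. binned spike
    counts). A small eps avoids log(0) for empty bins."""
    p = np.asarray(p, float) + eps
    q = np.asarray(q, float) + eps
    p /= p.sum()
    q /= q.sum()
    return float(np.sum(p * np.log2(p / q)))

# Hypothetical spike-count histograms for two stimulus conditions.
resp_a = [40, 30, 20, 10]
resp_b = [10, 20, 30, 40]
d_ab = kl_divergence(resp_a, resp_b)   # distinct responses: positive
d_aa = kl_divergence(resp_a, resp_a)   # identical responses: zero
```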
Image Processing with Multiscale Stochastic Models
1993
Abstract

Cited by 29 (3 self)
In this thesis, we develop image processing algorithms and applications for a particular class of multiscale stochastic models. First, we provide background on the model class, including a discussion of its relationship to wavelet transforms and the details of a two-sweep algorithm for estimation. A multiscale model for the error process associated with this algorithm is derived. Next, we illustrate how the multiscale models can be used in the context of regularizing ill-posed inverse problems and demonstrate the substantial computational savings that such an approach offers. Several novel features of the approach are developed, including a technique for choosing the optimal resolution at which to recover the object of interest. Next, we show that this class of models contains other widely used classes of statistical models, including 1-D Markov processes and 2-D Markov random fields, and we propose a class of multiscale models for approximately representing Gaussian Markov random fields...
Features Extraction and Temporal Segmentation of Acoustic Signals
1998
Abstract

Cited by 26 (0 self)
This paper deals with temporal segmentation of acoustic signals and feature extraction. Segmentation and feature extraction are intended as a first step toward sound-signal representation, coding, transformation, indexing, and multimedia applications. Three interdependent segmentation schemes are defined, corresponding to different levels of signal attributes. A complete segmentation and feature-extraction system is presented, with applications and results on various examples. The paper covers a definition of the segmentation schemes and a description of the techniques used for each of them.
The Marginalized Likelihood Ratio Test For Detecting Abrupt Changes
IEEE Trans. Automatic Control, 1996
Abstract

Cited by 21 (4 self)
The generalized likelihood ratio (GLR) test is a widely used method for detecting abrupt changes in linear systems and signals. In this paper, the marginalized likelihood ratio (MLR) test is introduced to eliminate three shortcomings of GLR while preserving its applicability and generality. First, the need for a user-chosen threshold is eliminated in MLR. Second, the noise levels need not be known exactly and may even change over time, which makes MLR robust. Finally, a very efficient exact implementation with linear-in-time complexity for batchwise data processing is developed; this compares favorably with the quadratic-in-time complexity of the exact GLR.
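For orientation, the GLR baseline that MLR improves upon can be sketched in a few lines for the simplest case: a single jump in the mean of Gaussian white noise with known variance. The unknown jump size is replaced by its MLE (the post-change sample mean), and the statistic is maximized over candidate change times. This is only the textbook GLR; the paper's MLR marginalizes the jump parameter instead of maximizing it.

```python
import numpy as np

def glr_change_point(y, sigma=1.0):
    """GLR scan for one jump in the mean of Gaussian white noise.
    For each candidate change time k, the jump MLE is the post-change
    sample mean, giving 2*logLR = m * mean^2 / sigma^2 with m = n - k.
    Returns the maximizing change time and the statistic there."""
    n = len(y)
    best_k, best_stat = 0, -np.inf
    for k in range(1, n - 1):
        post = y[k:]
        stat = len(post) * post.mean() ** 2 / sigma ** 2
        if stat > best_stat:
            best_k, best_stat = k, stat
    return best_k, best_stat

rng = np.random.default_rng(1)
# Mean jumps from 0 to 2 at time 100.
y = np.concatenate([rng.normal(0, 1, 100), rng.normal(2, 1, 100)])
k_hat, stat = glr_change_point(y)
```

Note that declaring a detection still requires comparing `stat` to a user-chosen threshold, which is precisely the first shortcoming the MLR test removes.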
Knowledge discovery from heterogeneous dynamic systems using changepoint correlations
In Proceedings of the 2005 SIAM International Data Mining Conference, 2005
Abstract

Cited by 17 (1 self)
Most of the stream-mining techniques presented so far have primarily paid attention to discovering association rules by direct comparison between time-series data sets. However, their utility is very limited for heterogeneous systems, where time series of various types (discrete, continuous, oscillatory, noisy, etc.) act dynamically in a strongly correlated manner. In this paper, we introduce a new nonlinear transformation, singular spectrum transformation (SST), to address the problem of discovering causal relationships from a set of time series. SST transforms a time series into a probability density function that represents the likelihood of observing a particular change. For an automobile data set, we demonstrate that SST enables us to discover a hidden and useful dependency between variables.
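The essence of SST is a sliding comparison of past and future signal subspaces extracted by singular value decomposition of Hankel (trajectory) matrices. The sketch below scores each time point by the misalignment of the leading left-singular subspaces before and after it; the window sizes `w` and subspace dimension `m` are illustrative choices, not the paper's notation, and the normalization into a density is omitted.

```python
import numpy as np

def sst_score(x, w=20, m=3):
    """Singular spectrum transformation change score. At time t,
    Hankel matrices are built from the samples before and after t;
    the score is one minus the largest singular value of the product
    of their leading m left-singular subspaces, i.e. a distance
    between past and future signal subspaces."""
    n = len(x)
    scores = np.zeros(n)

    def hankel(seg):
        cols = len(seg) - w + 1
        return np.stack([seg[i:i + w] for i in range(cols)], axis=1)

    for t in range(2 * w, n - 2 * w):
        past = hankel(x[t - 2 * w:t])
        future = hankel(x[t:t + 2 * w])
        Up = np.linalg.svd(past, full_matrices=False)[0][:, :m]
        Uf = np.linalg.svd(future, full_matrices=False)[0][:, :m]
        s = np.linalg.svd(Up.T @ Uf, compute_uv=False)
        scores[t] = 1.0 - s[0]   # 0 when subspaces share a direction
    return scores

rng = np.random.default_rng(2)
# Oscillation frequency changes at t = 300 in a noisy sinusoid.
x = np.concatenate([np.sin(0.2 * np.arange(300)),
                    np.sin(0.8 * np.arange(300))])
x += 0.05 * rng.normal(size=600)
scores = sst_score(x)
```

The score stays near zero within each stationary regime and peaks around the frequency change, which is the kind of amplitude-insensitive change signature that makes SST usable across heterogeneous time series.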
Kernel changepoint analysis
In Proc. Neural Info. Proc. Systems, 2008
Abstract

Cited by 17 (1 self)
We introduce a kernel-based method for changepoint analysis within a sequence of temporal observations. Changepoint analysis of an unlabelled sample of observations consists in, first, testing whether a change in distribution occurs within the sample and, second, if a change occurs, estimating the changepoint instant after which the observations switch from one distribution to another. We propose a test statistic based upon the maximum kernel Fisher discriminant ratio as a measure of homogeneity between segments. We derive its limiting distribution under the null hypothesis (no change occurs) and establish its consistency under the alternative hypothesis (a change occurs). This allows us to build a statistical hypothesis-testing procedure for detecting the presence of a changepoint, with a prescribed false-alarm probability and a detection probability tending to one in the large-sample setting. If a change actually occurs, the test statistic also yields an estimator of the changepoint location. Promising experimental results in temporal segmentation of mental tasks from BCI data and in pop-song indexing are presented.
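The scan-over-splits structure of such kernel changepoint methods is easy to illustrate. The sketch below substitutes a simpler kernel homogeneity measure, the (biased) squared maximum mean discrepancy, for the paper's kernel Fisher discriminant ratio; the kernel bandwidth and minimum segment length are arbitrary illustrative choices.

```python
import numpy as np

def rbf_gram(a, b, gamma=0.5):
    """RBF kernel Gram matrix between two 1-D sample arrays."""
    return np.exp(-gamma * (a[:, None] - b[None, :]) ** 2)

def mmd2(a, b):
    """Biased estimate of squared maximum mean discrepancy: a kernel
    two-sample homogeneity measure, used here in place of the kernel
    Fisher discriminant ratio of the paper."""
    return (rbf_gram(a, a).mean() + rbf_gram(b, b).mean()
            - 2 * rbf_gram(a, b).mean())

def scan_change_point(x, min_seg=20):
    """Estimate the changepoint as the split maximizing MMD^2
    between the two resulting segments."""
    stats = {k: mmd2(x[:k], x[k:])
             for k in range(min_seg, len(x) - min_seg)}
    return max(stats, key=stats.get)

rng = np.random.default_rng(3)
# Mean shift from 0 to 3 at time 150.
x = np.concatenate([rng.normal(0, 1, 150), rng.normal(3, 1, 150)])
k_hat = scan_change_point(x)
```

Calibrating the resulting maximum against a null distribution, as the paper does for its statistic, is what turns this location estimator into a test with a prescribed false-alarm probability.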
Estimating network loss rates using active tomography
Abstract

Cited by 13 (5 self)
Active network tomography refers to an interesting class of large-scale inverse problems that arise in estimating the quality-of-service parameters of computer and communications networks. This article focuses on estimation of the loss rates of the internal links of a network using end-to-end measurements at nodes located on the periphery. A class of flexible experiments for actively probing the network is introduced, and conditions under which all of the link-level information is estimable are obtained. Maximum likelihood estimation using the EM algorithm, the structure of the algorithm, and the properties of the maximum likelihood estimators are investigated. This includes simulation studies using ns (the network simulator) to obtain realistic network traffic. The optimal design of probing experiments is also studied. Finally, the application of the results to network monitoring is briefly illustrated.
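Why link-level loss rates are estimable from end-to-end probes at all is easiest to see on the smallest multicast tree: a root link feeding two leaf links. With pᵢ the probability a probe reaches leaf i and p₁₂ the probability it reaches both, the per-link pass rates follow from a standard tomography identity, sketched below on simulated probes (the general-tree EM of the article is not attempted here; the tree, rates, and sample size are illustrative).

```python
import numpy as np

def two_leaf_loss_mle(x1, x2):
    """Closed-form pass-rate estimates for the simplest multicast
    tree: root link 0 feeding leaf links 1 and 2. With p_i the
    fraction of probes reaching leaf i and p_12 the fraction reaching
    both: alpha_0 = p_1*p_2/p_12, alpha_1 = p_12/p_2,
    alpha_2 = p_12/p_1 (since p_i = a_0*a_i and p_12 = a_0*a_1*a_2)."""
    p1, p2 = x1.mean(), x2.mean()
    p12 = (x1 & x2).mean()
    return p1 * p2 / p12, p12 / p2, p12 / p1

rng = np.random.default_rng(4)
n = 200_000
a0, a1, a2 = 0.95, 0.90, 0.80          # true per-link pass rates
root = rng.random(n) < a0              # probe survives the root link
x1 = root & (rng.random(n) < a1)       # ... and leaf link 1
x2 = root & (rng.random(n) < a2)       # ... and leaf link 2
est = two_leaf_loss_mle(x1, x2)
```

The key point, which generalizes to the article's setting, is that the correlation between losses seen at sibling receivers identifies the shared upstream link.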