Results 1–10 of 1,864
Detection of Abrupt Changes: Theory and Application
http://people.irisa.fr/michele.basseville/kniga/
, 1993
A Blind Source Separation Technique Using Second Order Statistics
, 1997
Abstract

Cited by 333 (9 self)
Separation of sources consists in recovering a set of signals of which only instantaneous linear mixtures are observed. In many situations, no a priori information on the mixing matrix is available: the linear mixture should be `blindly' processed. This typically occurs in narrowband array processing applications when the array manifold is unknown or distorted. This paper introduces a new source separation technique exploiting the time coherence of the source signals. In contrast to other previously reported techniques, the proposed approach relies only on stationary second-order statistics, being based on a joint diagonalization of a set of covariance matrices. Asymptotic performance analysis of this method is carried out; some numerical simulations are provided to illustrate the effectiveness of the proposed method. I. Introduction. In many situations of practical interest, one has to process multidimensional observations of the form x(t) = y(t) + n(t) = As(t) + n(t), (1) i.e., x...
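The second-order idea in this abstract can be illustrated with a simplified single-lag variant of the method (the AMUSE-style special case of the joint diagonalization): whiten the observations with the zero-lag covariance, then diagonalize one symmetrized lagged covariance of the whitened data. A minimal NumPy sketch, with invented sinusoidal sources standing in for the unknown signals (all signal parameters here are illustrative, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Two temporally coherent sources: sinusoids at distinct frequencies,
# so their autocorrelations at a given lag differ.
t = np.arange(2000)
S = np.vstack([np.sin(0.3 * t), np.sin(0.05 * t + 1.0)])
A = rng.normal(size=(2, 2))            # unknown mixing matrix
X = A @ S                              # observed instantaneous mixtures
X = X - X.mean(axis=1, keepdims=True)

# Step 1: whiten using the zero-lag covariance R(0) = E[x x^T].
R0 = X @ X.T / X.shape[1]
d, E = np.linalg.eigh(R0)
W = E @ np.diag(1.0 / np.sqrt(d)) @ E.T
Z = W @ X

# Step 2: diagonalize a symmetrized lagged covariance R(tau) of the
# whitened data; its eigenvectors give the rotation separating sources
# with distinct autocorrelations at lag tau.
tau = 10
Rt = Z[:, :-tau] @ Z[:, tau:].T / (Z.shape[1] - tau)
Rt = (Rt + Rt.T) / 2
_, U = np.linalg.eigh(Rt)
S_hat = U.T @ Z                        # recovered sources (up to order/sign)

# Each recovered source should correlate strongly with one true source.
C = np.abs(np.corrcoef(np.vstack([S, S_hat]))[:2, 2:])
print(np.round(C, 2))
```

The full method in the paper jointly diagonalizes several lagged covariances, which is more robust than any single lag; the sketch above only shows why distinct source autocorrelations make the rotation identifiable.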
Distributed Algorithmic Mechanism Design: Recent Results and Future Directions
, 2002
Abstract

Cited by 288 (22 self)
Distributed Algorithmic Mechanism Design (DAMD) combines theoretical computer science’s traditional focus on computational tractability with its more recent interest in incentive compatibility and distributed computing. The Internet’s decentralized nature, in which distributed computation and autonomous agents prevail, makes DAMD a very natural approach for many Internet problems. This paper first outlines the basics of DAMD and then reviews previous DAMD results on multicast cost sharing and interdomain routing. The remainder of the paper describes several promising research directions and poses some specific open problems.
The Social Cost of Cheap Pseudonyms
 Journal of Economics and Management Strategy
, 2000
Abstract

Cited by 270 (11 self)
We consider the problems of societal norms for cooperation and reputation when it is possible to obtain "cheap pseudonyms", something which is becoming quite common in a wide variety of interactions on the Internet. This introduces opportunities to misbehave without paying reputational consequences. A large degree of cooperation can still emerge, through a convention in which newcomers "pay their dues" by accepting poor treatment from players who have established positive reputations. One might hope for an open society where newcomers are treated well, but there is an inherent social cost in making the spread of reputations optional. We prove that no equilibrium can sustain significantly more cooperation than the dues-paying equilibrium in a repeated random matching game with a large number of players in which players have finite lives and the ability to change their identities, and there is a small but nonvanishing probability of mistakes. Although one could remove the ineffici...
Hidden Markov processes
 IEEE Trans. Inform. Theory
, 2002
Abstract

Cited by 258 (5 self)
Abstract—An overview of statistical and information-theoretic aspects of hidden Markov processes (HMPs) is presented. An HMP is a discrete-time finite-state homogeneous Markov chain observed through a discrete-time memoryless invariant channel. In recent years, the work of Baum and Petrie on finite-state finite-alphabet HMPs was expanded to HMPs with finite as well as continuous state spaces and a general alphabet. In particular, statistical properties and ergodic theorems for relative entropy densities of HMPs were developed. Consistency and asymptotic normality of the maximum-likelihood (ML) parameter estimator were proved under some mild conditions. Similar results were established for switching autoregressive processes. These processes generalize HMPs. New algorithms were developed for estimating the state, parameter, and order of an HMP, for universal coding and classification of HMPs, and for universal decoding of hidden Markov channels. These and other related topics are reviewed in this paper. Index Terms—Baum–Petrie algorithm, entropy ergodic theorems, finite-state channels, hidden Markov models, identifiability, Kalman filter, maximum-likelihood (ML) estimation, order estimation, recursive parameter estimation, switching autoregressive processes, Ziv inequality.
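The basic computation for the finite-state, finite-alphabet setting of Baum and Petrie that this survey starts from is the forward recursion, which evaluates the likelihood of an observation sequence in O(T·K²) time. A minimal sketch with an invented two-state chain observed through a binary memoryless channel (all parameter values are illustrative):

```python
import numpy as np

# A two-state HMP: hidden chain with transition matrix P and initial
# distribution pi, observed through a memoryless channel with emission
# probabilities B[state, symbol] over a binary alphabet.
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])
B = np.array([[0.8, 0.2],
              [0.3, 0.7]])
pi = np.array([0.5, 0.5])

def forward_loglik(obs):
    """Log-likelihood of an observation sequence via the scaled
    (numerically stable) forward recursion."""
    alpha = pi * B[:, obs[0]]
    c = alpha.sum()
    loglik = np.log(c)
    alpha = alpha / c
    for y in obs[1:]:
        alpha = (alpha @ P) * B[:, y]   # predict, then weight by emission
        c = alpha.sum()
        loglik += np.log(c)
        alpha = alpha / c               # rescale to avoid underflow
    return loglik

obs = [0, 0, 1, 1, 1, 0]
print(forward_loglik(obs))
```

The recursion marginalizes over all 2^T hidden paths; for a short sequence this can be verified against brute-force enumeration.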
Convergence of a stochastic approximation version of the EM algorithm
, 1997
Abstract

Cited by 152 (15 self)
The Expectation Maximization (EM) algorithm is a powerful computational technique for locating maxima of functions...
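The deterministic EM alternation that the stochastic-approximation (SAEM) version builds on can be sketched on a toy problem. This is plain EM for a two-component 1D Gaussian mixture with known unit variances and equal weights — an illustration of the E-step/M-step structure, not the SAEM scheme of the paper itself:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic data from a two-component mixture with means -2 and 3.
x = np.concatenate([rng.normal(-2, 1, 500), rng.normal(3, 1, 500)])

mu = np.array([-1.0, 1.0])          # initial guess for the two means
for _ in range(50):
    # E-step: responsibility of each component for each point
    # (equal weights and unit variances assumed, so only the
    # Gaussian kernels matter).
    dens = np.exp(-0.5 * (x[:, None] - mu[None, :]) ** 2)
    r = dens / dens.sum(axis=1, keepdims=True)
    # M-step: closed-form maximizer — responsibility-weighted means.
    mu = (r * x[:, None]).sum(axis=0) / r.sum(axis=0)

print(np.round(np.sort(mu), 2))
```

SAEM replaces the exact E-step expectation with a stochastic approximation built from simulated latent variables, which is what makes convergence analysis nontrivial.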
Central limit theorem for sequential Monte Carlo methods and its application to Bayesian inference
 Ann. Statist
"... “particle filters, ” refers to a general class of iterative algorithms that performs Monte Carlo approximations of a given sequence of distributions of interest (πt). We establish in this paper a central limit theorem for the Monte Carlo estimates produced by these computational methods. This result ..."
Abstract

Cited by 146 (4 self)
“particle filters,” refers to a general class of iterative algorithms that perform Monte Carlo approximations of a given sequence of distributions of interest (πt). We establish in this paper a central limit theorem for the Monte Carlo estimates produced by these computational methods. This result holds under minimal assumptions on the distributions πt, and applies in a general framework which encompasses most of the sequential Monte Carlo methods that have been considered in the literature, including the resample-move algorithm of Gilks and Berzuini [J. R. Stat. Soc. Ser. B Stat. Methodol. 63 (2001) 127–146] and the residual resampling scheme. The corresponding asymptotic variances provide a convenient measurement of the precision of a given particle filter. We study, in particular, in some typical examples of Bayesian applications, whether and at which rate these asymptotic variances diverge in time, in order to assess the long-term reliability of the considered algorithm. 1. Introduction. Sequential Monte Carlo methods form an emerging
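The class of algorithms covered by the theorem can be illustrated by its simplest member, a bootstrap particle filter with multinomial resampling, run on an invented scalar linear-Gaussian state-space model (model and parameters are illustrative, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(2)

# Scalar model: x_t = 0.9 x_{t-1} + v_t,  y_t = x_t + w_t,  v, w ~ N(0, 1).
T, N = 100, 2000
x = np.zeros(T)
for t in range(1, T):
    x[t] = 0.9 * x[t - 1] + rng.normal()
y = x + rng.normal(size=T)

# Bootstrap particle filter: propagate through the dynamics, weight by
# the observation likelihood, then multinomial resample (one of the
# schemes whose asymptotic variance the paper characterizes).
particles = rng.normal(size=N)
est = np.zeros(T)
for t in range(T):
    if t > 0:
        particles = 0.9 * particles + rng.normal(size=N)
    w = np.exp(-0.5 * (y[t] - particles) ** 2)   # Gaussian likelihood
    w /= w.sum()
    est[t] = np.sum(w * particles)               # filtering-mean estimate
    idx = rng.choice(N, size=N, p=w)             # multinomial resampling
    particles = particles[idx]

rmse = np.sqrt(np.mean((est - x) ** 2))
print(round(rmse, 3))
```

The central limit theorem of the paper describes the N → ∞ fluctuations of estimates such as `est[t]` around the true filtering expectations, with the asymptotic variance depending on the resampling scheme used.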
Nonlinear Wavelet Methods for Recovery of Signals, Densities, and Spectra from Indirect and Noisy Data
 In Proceedings of Symposia in Applied Mathematics
, 1993
Abstract

Cited by 133 (5 self)
We describe wavelet methods for recovery of objects from noisy and incomplete data. The common themes are: (a) the new methods utilize nonlinear operations in the wavelet domain; (b) they accomplish tasks which are not possible by traditional linear/Fourier approaches to such problems. We attempt to indicate the heuristic principles, theoretical foundations, and possible application areas for these methods. Areas covered: (1) Wavelet DeNoising. (2) Wavelet Approaches to Linear Inverse Problems. (4) Wavelet Packet DeNoising. (5) Segmented MultiResolutions. (6) Nonlinear Multiresolutions. 1. Introduction. With the rapid development of computerized scientific instruments comes a wide variety of interesting problems for data analysis and signal processing. In fields ranging from Extragalactic Astronomy to Molecular Spectroscopy to Medical Imaging to Computer Vision, one must recover a signal, curve, image, spectrum, or density from incomplete, indirect, and noisy data. What can wavelets ...
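The "Wavelet DeNoising" item can be sketched with a hand-rolled orthonormal Haar transform and soft thresholding at the universal threshold. This is a toy illustration (noise level assumed known and equal to 1, piecewise-constant test signal chosen to suit the Haar basis), not the paper's full procedure:

```python
import numpy as np

rng = np.random.default_rng(3)

# Noisy piecewise-constant signal: a setting where nonlinear wavelet
# thresholding beats linear/Fourier smoothing.
n = 1024
clean = np.where(np.arange(n) < n // 2, 0.0, 4.0)
noisy = clean + rng.normal(size=n)

def haar_fwd(a, levels):
    """Orthonormal Haar DWT: returns coarse coefficients and a list of
    detail coefficients, finest first."""
    coeffs = []
    for _ in range(levels):
        s = (a[0::2] + a[1::2]) / np.sqrt(2)
        d = (a[0::2] - a[1::2]) / np.sqrt(2)
        coeffs.append(d)
        a = s
    return a, coeffs

def haar_inv(a, coeffs):
    """Exact inverse of haar_fwd."""
    for d in reversed(coeffs):
        out = np.empty(2 * a.size)
        out[0::2] = (a + d) / np.sqrt(2)
        out[1::2] = (a - d) / np.sqrt(2)
        a = out
    return a

a, coeffs = haar_fwd(noisy, levels=5)
thresh = np.sqrt(2 * np.log(n))        # universal threshold (sigma = 1)
coeffs = [np.sign(d) * np.maximum(np.abs(d) - thresh, 0) for d in coeffs]
denoised = haar_inv(a, coeffs)

rmse_den = np.sqrt(np.mean((denoised - clean) ** 2))
rmse_noisy = np.sqrt(np.mean((noisy - clean) ** 2))
print(round(rmse_den, 3), round(rmse_noisy, 3))
```

The nonlinearity is essential: thresholding kills the mostly-noise detail coefficients while keeping the few large coefficients carrying the jump, which no fixed linear filter can do without blurring the discontinuity.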