Results 1 – 10 of 6,288
Blind Signal Separation: Statistical Principles
, 2003
"... Blind signal separation (BSS) and independent component analysis (ICA) are emerging techniques of array processing and data analysis, aiming at recovering unobserved signals or `sources' from observed mixtures (typically, the output of an array of sensors), exploiting only the assumption of mutual independence between the signals. The weakness of the assumptions makes it a powerful approach but requires venturing beyond familiar second-order statistics. The objective of this paper is to review some of the approaches that have been recently developed to address this exciting problem. ..."
Cited by 529 (4 self)
Additive Logistic Regression: a Statistical View of Boosting
 Annals of Statistics
, 1998
"... Boosting (Freund & Schapire 1996, Schapire & Singer 1998) is one of the most important recent developments in classification methodology. The performance of many classification algorithms can often be dramatically improved by sequentially applying them to reweighted versions of the input data, and taking a weighted majority vote of the sequence of classifiers thereby produced. We show that this seemingly mysterious phenomenon can be understood in terms of well-known statistical principles, namely additive modeling and maximum likelihood. For the two-class problem, boosting can ..."
Cited by 1750 (25 self)
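The reweight-and-vote procedure the abstract refers to can be sketched as a minimal AdaBoost with decision stumps. This is the generic algorithm the paper reinterprets, not its additive-model derivation; data and round count are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)

# 1-D training set: label +1 inside (-1, 1), -1 outside.
# No single threshold stump separates this; boosting combines several.
x = rng.uniform(-3, 3, 400)
y = np.where(np.abs(x) < 1, 1, -1)

def best_stump(x, y, w):
    """Return (threshold, polarity, weighted_error) of the best stump."""
    best = (0.0, 1, np.inf)
    for thr in np.unique(x):
        for pol in (1, -1):
            pred = np.where(x < thr, pol, -pol)
            err = w[pred != y].sum()
            if err < best[2]:
                best = (thr, pol, err)
    return best

w = np.full(len(x), 1 / len(x))   # uniform weights to start
stumps = []
for _ in range(20):
    thr, pol, err = best_stump(x, y, w)
    err = np.clip(err, 1e-10, 1 - 1e-10)
    alpha = 0.5 * np.log((1 - err) / err)
    pred = np.where(x < thr, pol, -pol)
    w *= np.exp(-alpha * y * pred)          # reweight: upweight mistakes
    w /= w.sum()
    stumps.append((thr, pol, alpha))

# Weighted majority vote of the stump sequence.
F = sum(a * np.where(x < t, p, -p) for t, p, a in stumps)
train_acc = np.mean(np.sign(F) == y)
```

The paper's observation is that the additive score `F` is exactly a fitted additive model, and the exponential reweighting is a stagewise approximation to maximum likelihood.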
Active Learning with Statistical Models
, 1995
"... For many types of learners one can compute the statistically "optimal" way to select data. We review how these techniques have been used with feedforward neural networks [MacKay, 1992; Cohn, 1994]. We then show how the same principles may be used to select data for two alternative, statist ..."
Cited by 679 (10 self)
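One statistically "optimal" selection criterion of the kind reviewed here is to query the point where the model's predictive variance is largest. A sketch for ordinary least squares (the linear-model setting stands in for the paper's architectures; all data are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)

# Labeled pool so far (features include a bias column) and candidate queries.
X = np.hstack([np.ones((10, 1)), rng.uniform(-1, 1, (10, 1))])
candidates = np.hstack([np.ones((50, 1)), rng.uniform(-3, 3, (50, 1))])

# For OLS with noise variance s^2, predictive variance at x is
# s^2 * x^T (X^T X)^{-1} x; we rank candidates by the quadratic form.
inv = np.linalg.inv(X.T @ X)
var = np.einsum('ij,jk,ik->i', candidates, inv, candidates)
query = candidates[np.argmax(var)]      # most informative point to label

# Labeling the query shrinks variance there (Sherman-Morrison update of
# (X^T X + q q^T)^{-1}); no retraining from scratch is needed.
v = inv @ query
inv_new = inv - np.outer(v, v) / (1 + query @ v)
var_after = query @ inv_new @ query
```

Analytically, the variance at the queried point drops from v to v/(1+v), so the greedy query always reduces uncertainty where it was largest.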
The importance of being random: Statistical principles of iris recognition
 Pattern Recognition
, 2003
"... The statistical variability that is the basis of iris recognition is analysed in this paper using new large databases. The principle underlying the recognition algorithm is the failure of a test of statistical independence on iris phase structure encoded by multiscale quadrature wavelets. ..."
Cited by 193 (4 self)
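The "test of statistical independence" compares binary codes by normalized Hamming distance: unrelated codes score near 0.5, and only a genuine match fails the test by scoring far below it. A sketch with idealized independent random bits (code length and sample counts are illustrative; real iris codes have correlated bits, so their imposter distribution is wider than this):

```python
import numpy as np

rng = np.random.default_rng(3)

BITS = 2048                                    # illustrative code length
codes = rng.integers(0, 2, size=(200, BITS))   # 200 unrelated "iris codes"

# Normalized Hamming distance between every pair of distinct codes:
# the fraction of disagreeing bits.
dists = []
for i in range(len(codes)):
    for j in range(i + 1, len(codes)):
        dists.append(np.mean(codes[i] != codes[j]))
dists = np.array(dists)

# Independent bits => distances concentrate tightly around 0.5
# (std ~ 0.5 / sqrt(BITS)); a genuine match would fall far below.
mean_hd = dists.mean()
```

The sharpness of this concentration is what makes a decision threshold well below 0.5 yield astronomically small false-match probabilities.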
A Statistically Principled Approach to
"... This paper outlines a statistically principled approach to clustering one-dimensional data. Given a dataset, the idea is to fit a density function that is as simple as possible, but still compatible with the data. Simplicity is measured in terms of a standard smoothness functional. Data compatibilit ..."
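A minimal version of this idea — fit a smooth density to 1-D data and cut clusters at its local minima — can be sketched with a Gaussian kernel density estimate. The KDE stands in for the paper's smoothness-functional fit; data and bandwidth choices are illustrative:

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(4)

# Two well-separated 1-D groups.
data = np.concatenate([rng.normal(0.0, 1.0, 300),
                       rng.normal(10.0, 1.0, 300)])

kde = gaussian_kde(data)                 # smooth density estimate
grid = np.linspace(data.min(), data.max(), 500)
dens = kde(grid)

# Cut points: interior local minima of the fitted density.
interior = (dens[1:-1] < dens[:-2]) & (dens[1:-1] < dens[2:])
cuts = grid[1:-1][interior]

# Assign each point to the segment between consecutive cut points.
labels = np.searchsorted(cuts, data)
n_clusters = len(cuts) + 1
```

The smoother the fitted density, the fewer local minima survive, so the smoothness penalty directly controls the number of clusters reported.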
Statistical Principles in Image Modeling
, 2007
"... Images of natural scenes contain a rich variety of visual patterns. To learn and recognize these patterns from natural images, it is necessary to construct statistical models for these patterns. In this review article we describe three statistical principles for modeling image patterns: the sparse c ..."
The Dantzig selector: statistical estimation when p is much larger than n
, 2005
"... In many important statistical applications, the number of variables or parameters p is much larger than the number of observations n. Suppose then that we have observations y = Ax + z, where x ∈ R^p is a parameter vector of interest, A is a data matrix with possibly far fewer rows than columns, n ≪ ..."
Cited by 879 (14 self)
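The estimator the paper proposes solves min ‖x‖₁ subject to ‖Aᵀ(y − Ax)‖∞ ≤ λ, which is a linear program. A small sketch via `scipy.optimize.linprog` (problem sizes, sparsity, noise level, and λ are illustrative):

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(5)

n, p, k = 30, 60, 3                     # n observations, p variables, k-sparse truth
A = rng.standard_normal((n, p)) / np.sqrt(n)
x_true = np.zeros(p)
x_true[rng.choice(p, k, replace=False)] = [3.0, -2.0, 4.0]
y = A @ x_true + 0.01 * rng.standard_normal(n)

lam = 0.1
ATA, ATy = A.T @ A, A.T @ y
I = np.eye(p)

# Variables z = [x, t]; minimize sum(t) subject to |x_i| <= t_i and the
# Dantzig constraint |A^T (y - A x)| <= lam, componentwise.
c = np.concatenate([np.zeros(p), np.ones(p)])
A_ub = np.block([[ I,   -I],
                 [-I,   -I],
                 [ ATA, np.zeros((p, p))],
                 [-ATA, np.zeros((p, p))]])
b_ub = np.concatenate([np.zeros(p), np.zeros(p), lam + ATy, lam - ATy])
res = linprog(c, A_ub=A_ub, b_ub=b_ub,
              bounds=[(None, None)] * p + [(0, None)] * p)
x_hat = res.x[:p]
```

The constraint caps the correlation of every column of A with the residual, which is what lets the ℓ₁ objective zero out irrelevant coordinates even when p greatly exceeds n; the sizes here are kept small only so the LP solves instantly.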
Statistical Principles Of Source Separation
, 1997
"... Blind signal separation (BSS) is an emerging signal processing technique, aiming at recovering unobserved signals or `sources' from observed mixtures (typically, the output of an array of sensors), exploiting only the assumption of mutual independence between the signals. The weakness of the assumptions makes it a powerful approach but requires venturing beyond familiar second-order statistics. The objective of this paper is to review some of the approaches that have been recently developed to address this exciting problem, to show how they stem from basic principles and how they relate together. ..."
Cited by 1 (0 self)
Probabilistic Latent Semantic Analysis
 In Proc. of Uncertainty in Artificial Intelligence, UAI’99
, 1999
"... Probabilistic Latent Semantic Analysis is a novel statistical technique for the analysis of two-mode and co-occurrence data, which has applications in information retrieval and filtering, natural language processing, machine learning from text, and in related areas. Compared to standard Latent Semantic Analysis, which stems from linear algebra and performs a Singular Value Decomposition of co-occurrence tables, the proposed method is based on a mixture decomposition derived from a latent class model. This results in a more principled approach which has a solid foundation in statistics. In order ..."
Cited by 771 (9 self)
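The mixture decomposition behind pLSA — P(w|d) = Σ_z P(z|d)P(w|z) — is fit by EM rather than by an SVD. A compact numpy sketch (toy count table, topic number, and iteration count are illustrative):

```python
import numpy as np

rng = np.random.default_rng(6)

D, W, K = 8, 12, 2                       # documents, words, latent classes
N = rng.integers(0, 5, size=(D, W))      # word-count (co-occurrence) table

p_z_d = rng.dirichlet(np.ones(K), size=D)        # P(z|d), shape (D, K)
p_w_z = rng.dirichlet(np.ones(W), size=K)        # P(w|z), shape (K, W)

def loglik():
    mix = p_z_d @ p_w_z                  # P(w|d) = sum_z P(z|d) P(w|z)
    return np.sum(N * np.log(mix + 1e-12))

history = [loglik()]
for _ in range(30):
    # E-step: responsibilities P(z|d,w) proportional to P(z|d) P(w|z).
    joint = p_z_d[:, :, None] * p_w_z[None, :, :]     # (D, K, W)
    resp = joint / joint.sum(axis=1, keepdims=True)
    # M-step: re-estimate both distributions from expected counts.
    nz = N[:, None, :] * resp                          # (D, K, W)
    p_w_z = nz.sum(axis=0) + 1e-12
    p_w_z /= p_w_z.sum(axis=1, keepdims=True)
    p_z_d = nz.sum(axis=2) + 1e-12
    p_z_d /= p_z_d.sum(axis=1, keepdims=True)
    history.append(loglik())
```

Because each EM step maximizes the expected complete-data likelihood, the log-likelihood never decreases, which is the "solid foundation in statistics" the abstract contrasts with the algebraic SVD of LSA.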
Compressive sampling
, 2006
"... Conventional wisdom and common practice in acquisition and reconstruction of images from frequency data follow the basic principle of the Nyquist density sampling theory. This principle states that to reconstruct an image, the number of Fourier samples we need to acquire must match the desired res ..."
Cited by 1441 (15 self)
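The alternative the paper develops recovers a sparse object from far fewer samples than the Nyquist count by ℓ₁ minimization. A toy basis-pursuit sketch, where random Gaussian measurements stand in for Fourier samples and all sizes are illustrative:

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(7)

n, p, k = 40, 100, 5            # 40 measurements of a 5-sparse length-100 signal
A = rng.standard_normal((n, p))
x_true = np.zeros(p)
x_true[rng.choice(p, k, replace=False)] = rng.standard_normal(k)
y = A @ x_true                  # noiseless undersampled measurements

# Basis pursuit: min ||x||_1  s.t.  A x = y, posed as an LP in z = [x, t].
I = np.eye(p)
c = np.concatenate([np.zeros(p), np.ones(p)])
A_ub = np.block([[I, -I], [-I, -I]])        # encodes |x_i| <= t_i
b_ub = np.zeros(2 * p)
A_eq = np.hstack([A, np.zeros((n, p))])     # measurement consistency
res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=y,
              bounds=[(None, None)] * p + [(0, None)] * p)
x_hat = res.x[:p]
```

With measurement counts on the order of k·log(p/k) rather than p, the ℓ₁ solution typically coincides with the true sparse signal, which is the paper's departure from the Nyquist prescription.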