Results 1–10 of 730
CONDENSATION – conditional density propagation for visual tracking
International Journal of Computer Vision, 1998
Cited by 1424 (12 self)
The problem of tracking curves in dense visual clutter is challenging. Kalman filtering is inadequate because it is based on Gaussian densities which, being unimodal, cannot represent simultaneous alternative hypotheses. The Condensation algorithm uses "factored sampling", previously applied to the interpretation of static images, in which the probability distribution of possible interpretations is represented by a randomly generated set. Condensation uses learned dynamical models, together with visual observations, to propagate the random set over time. The result is highly robust tracking of agile motion. Notwithstanding the use of stochastic methods, the algorithm runs in near real-time. Contents: 1 Tracking curves in clutter; 2 Discrete-time propagation of state density; 3 Factored sampling; 4 The Condensation algorithm; 5 Stochastic dynamical models for curve motion; 6 Observation model; 7 Applying the Condensation algorithm to video streams; 8 Conclusions; A Nonline...
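As a concrete illustration of the propagation scheme this abstract describes, here is a minimal one-dimensional sketch of a Condensation-style step: resample in proportion to the weights (factored sampling), push each sample through a dynamical model, then reweight by the observation density. The random-walk dynamics, Gaussian observation model, and all constants are illustrative stand-ins, not the paper's learned curve models.

```python
import numpy as np

rng = np.random.default_rng(0)

def condensation_step(particles, weights, observation,
                      process_std=0.5, obs_std=1.0):
    """One Condensation-style iteration on a 1-D state."""
    n = len(particles)
    # 1. Factored sampling: draw particles in proportion to their weights.
    idx = rng.choice(n, size=n, p=weights)
    resampled = particles[idx]
    # 2. Predict: propagate each sample through the (toy) dynamical model.
    predicted = resampled + rng.normal(0.0, process_std, size=n)
    # 3. Reweight: evaluate the (toy Gaussian) observation density.
    w = np.exp(-0.5 * ((predicted - observation) / obs_std) ** 2)
    return predicted, w / w.sum()

# Track a 1-D state drifting toward 3.0 through a few noisy observations.
particles = rng.normal(0.0, 2.0, size=500)
weights = np.full(500, 1.0 / 500)
for z in [0.5, 1.2, 2.1, 2.8, 3.0]:
    particles, weights = condensation_step(particles, weights, z)

estimate = np.sum(weights * particles)  # posterior mean of the sample set
```

Because the posterior is carried as a weighted sample set rather than a single Gaussian, the same loop can represent several competing hypotheses at once, which is exactly what the abstract says Kalman filtering cannot do.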
Efficient and Effective Querying by Image Content
Journal of Intelligent Information Systems, 1994
Cited by 488 (13 self)
In the QBIC (Query By Image Content) project we are studying methods to query large online image databases using the images' content as the basis of the queries. Examples of the content we use include color, texture, and shape of image objects and regions. Potential applications include medical ("Give me other images that contain a tumor with a texture like this one"), photojournalism ("Give me images that have blue at the top and red at the bottom"), and many others in art, fashion, cataloging, retailing, and industry. We describe a set of novel features and similarity measures allowing query by color, texture, and shape of image objects. We demonstrate the effectiveness of the QBIC system with normalized precision and recall experiments on test databases containing over 1000 images and 1000 objects populated from commercially available photo clip art images, and of images of airplane silhouettes. We also consider the efficient indexing of these features, specifically addre...
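A toy sketch of content-based querying in the spirit of QBIC's color features: quantize each image into a joint RGB histogram and rank database images by histogram distance to the query. The L1 distance and 8-bin quantization here are simplifications for illustration; QBIC itself uses richer feature sets and a quadratic-form color distance.

```python
import numpy as np

def color_histogram(pixels, bins=8):
    """Quantize RGB pixels (values 0..255) into a normalized joint histogram."""
    q = (np.asarray(pixels) * bins) // 256          # per-channel bin index
    flat = q[:, 0] * bins * bins + q[:, 1] * bins + q[:, 2]
    hist = np.bincount(flat, minlength=bins ** 3).astype(float)
    return hist / hist.sum()

def histogram_distance(h1, h2):
    """L1 distance between normalized histograms (0 = identical content)."""
    return float(np.abs(h1 - h2).sum())

# Query: find the database image whose color content best matches the query.
red_img  = [[250, 10, 10]] * 100   # synthetic mostly-red image
blue_img = [[10, 10, 250]] * 100   # synthetic mostly-blue image
query    = [[240, 20, 20]] * 100   # reddish query image

db = {"red": color_histogram(red_img), "blue": color_histogram(blue_img)}
q = color_histogram(query)
best = min(db, key=lambda name: histogram_distance(db[name], q))
```

Since the histograms are precomputed per image, a query touches only the compact feature vectors, which is the property that makes the indexing question raised in the abstract tractable.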
Some Impossibility Theorems In Econometrics With Applications To Instrumental Variables, Dynamic Models And Cointegration
Econometrica, 1995
Cited by 202 (38 self)
General characterizations of valid confidence sets and tests in problems which involve locally almost unidentified (LAU) parameters are provided and applied to several econometric models. Two types of inference problems are studied: (1) inference about parameters which are not identifiable on certain subsets of the parameter space, and (2) inference about parameter transformations with singularities (discontinuities). When a LAU parameter or parametric function has an unbounded range, it is shown under general regularity conditions that any valid confidence set with level 1 − α for this parameter should be unbounded with probability close to 1 − α in the neighborhood of nonidentification subsets, and should also have a nonzero probability of being unbounded under any distribution compatible with the model: no valid confidence set that is bounded with probability one exists. These properties hold even if "identifying restrictions" are imposed. Similar results also ob...
Maximum score estimation of the stochastic utility model of choice
Journal of Econometrics, 1975
Cited by 180 (2 self)
This paper introduces a class of robust estimators of the parameters of a stochastic utility function. Existing maximum likelihood and regression estimation methods require the assumption of a particular distributional family for the random component of utility. In contrast, estimators of the 'maximum score' class require only weak distributional assumptions for consistency. Following presentation and proof of the basic consistency theorem, additional results are given. An algorithm for achieving maximum score estimates and some small-sample Monte Carlo tests are also described.
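The maximum score idea can be sketched directly: choose the coefficient direction that correctly signs the most observed binary choices, with no distributional assumption on the error. The simulated design, logistic errors, sample size, and unit-circle grid below are all assumptions made for illustration, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(1)

def score(beta, X, y):
    """Maximum-score objective: number of choices correctly signed by x'beta."""
    pred = (X @ beta > 0).astype(int)
    return int(np.sum(pred == y))

# Simulated binary choice y = 1[x'beta + eps > 0] with (toy) logistic errors.
n = 400
X = rng.normal(size=(n, 2))
beta_true = np.array([1.0, -1.0]) / np.sqrt(2.0)
y = (X @ beta_true + rng.logistic(size=n) > 0).astype(int)

# Grid search over unit-norm directions: the scale of beta is not identified,
# so only the direction is estimated.
angles = np.linspace(0.0, 2.0 * np.pi, 721)
candidates = np.column_stack([np.cos(angles), np.sin(angles)])
best = max(candidates, key=lambda b: score(b, X, y))
```

The step-function objective is why the estimator needs only weak distributional assumptions, and also why a derivative-free search (here a crude grid) is used instead of gradient methods.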
Recent advances in the automatic recognition of audiovisual speech
 Proceedings of the IEEE
Cited by 157 (15 self)
Abstract — Visual speech information from the speaker's mouth region has been shown to improve the noise robustness of automatic speech recognizers, thus promising to extend their usability in the human–computer interface. In this paper, we review the main components of audiovisual automatic speech recognition and present novel contributions in two main areas: first, the visual front-end design, based on a cascade of linear image transforms of an appropriate video region-of-interest, and subsequently, audiovisual speech integration. On the latter topic, we discuss new work on feature and decision fusion combination, the modeling of audiovisual speech asynchrony, and incorporating modality reliability estimates into the bimodal recognition process. We also briefly touch upon the issue of audiovisual speaker adaptation. We apply our algorithms to three multi-subject bimodal databases, ranging from small- to large-vocabulary recognition tasks, recorded in both visually controlled and challenging environments. Our experiments demonstrate that the visual modality improves automatic speech recognition over all conditions and data considered, though less so for visually challenging environments and large-vocabulary tasks. Index Terms — Audiovisual speech recognition, speechreading, visual feature extraction, audiovisual fusion, hidden Markov model, multi-stream HMM, product HMM, reliability estimation, adaptation, audiovisual databases.
Optimal motion and structure estimation
IEEE Trans. Pattern Anal. Mach. Intell., 1993
Cited by 148 (5 self)
This paper studies optimal estimation for motion and structure from point correspondences. (1) A study of the characteristics of the problem provides insight into the need for optimal estimation. (2) Methods have been developed for optimal estimation with known or unknown noise distribution. The simulations showed that the optimal estimates achieve remarkable improvement over the preliminary estimates given by the linear algorithm. (3) An approach to estimating errors in the optimized solution is presented. (4) The performance of the algorithm is compared with a theoretical lower bound, the Cramér–Rao bound. Simulations show that the actual errors have essentially reached the bound. (5) A batch least-squares technique (Levenberg–Marquardt) and a sequential least-squares technique (iterated extended Kalman filtering) are analyzed and compared. The analysis and experiments show that, in general, a batch technique performs better than a sequential technique for nonlinear problems. A recursive batch processing technique is proposed for nonlinear problems that require recursive estimation.
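The batch least-squares technique named in the abstract (Levenberg–Marquardt) fits in a few lines: damp the Gauss–Newton normal equations and adapt the damping according to whether a trial step reduces the residual. The one-parameter exponential toy model and all constants below are illustrative assumptions, not the paper's motion-and-structure formulation.

```python
import numpy as np

def levenberg_marquardt(residual_fn, jac_fn, p0, iters=50, lam=1e-3):
    """Minimal Levenberg-Marquardt batch least-squares loop."""
    p = np.asarray(p0, dtype=float)
    for _ in range(iters):
        r = residual_fn(p)
        J = jac_fn(p)
        # Damped normal equations: lam -> 0 gives Gauss-Newton,
        # large lam gives a short gradient-descent-like step.
        A = J.T @ J + lam * np.eye(len(p))
        step = np.linalg.solve(A, J.T @ r)
        p_new = p - step
        if np.sum(residual_fn(p_new) ** 2) < np.sum(r ** 2):
            p, lam = p_new, lam * 0.5   # accept step, trust the model more
        else:
            lam *= 10.0                  # reject step, increase damping
    return p

# Toy problem: fit y = exp(a * x) to noiseless data with true a = 0.7.
x = np.linspace(0.0, 2.0, 20)
y = np.exp(0.7 * x)
res = lambda p: np.exp(p[0] * x) - y
jac = lambda p: (x * np.exp(p[0] * x)).reshape(-1, 1)
a_hat = levenberg_marquardt(res, jac, p0=[0.0])[0]
```

A sequential scheme such as the iterated extended Kalman filter would instead fold in one measurement at a time, linearizing around each intermediate estimate, which is where the batch method's accuracy advantage on nonlinear problems comes from.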
Unsupervised feature selection using feature similarity
IEEE Transactions on Pattern Analysis and Machine Intelligence, 2002
Cited by 142 (3 self)
Abstract — In this article, we describe an unsupervised feature selection algorithm suitable for data sets that are large in both dimension and size. The method is based on measuring similarity between features, whereby redundancy among them is removed. This requires no search and is therefore fast. A new feature similarity measure, called the maximum information compression index, is introduced. The algorithm is generic in nature and has the capability of multi-scale representation of data sets. The superiority of the algorithm, in terms of speed and performance, is established extensively over various real-life data sets of different sizes and dimensions. It is also demonstrated how redundancy and information loss in feature selection can be quantified with an entropy measure. Index Terms — Data mining, pattern recognition, dimensionality reduction, feature clustering, multi-scale representation, entropy measures.
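The maximum information compression index admits a compact sketch: for a pair of features it is the smallest eigenvalue of their 2×2 covariance matrix, so it is zero exactly when one feature is a linear function of the other, i.e. fully redundant. The synthetic data below are assumptions made purely to demonstrate that behavior.

```python
import numpy as np

def max_information_compression_index(x, y):
    """Smallest eigenvalue of the 2x2 covariance matrix of features (x, y).

    Zero when the features are perfectly linearly dependent; larger values
    mean more information is lost by projecting the pair onto one direction.
    """
    cov = np.cov(x, y)
    return float(np.linalg.eigvalsh(cov)[0])  # eigenvalues in ascending order

rng = np.random.default_rng(0)
a = rng.normal(size=1000)
b = 2.0 * a + 1.0          # exactly linear in a: fully redundant
c = rng.normal(size=1000)  # independent of a

redundant = max_information_compression_index(a, b)
independent = max_information_compression_index(a, c)
```

Because the measure needs only pairwise covariances, features can be clustered by similarity and pruned without any combinatorial search, matching the abstract's speed claim.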
Higher order properties of GMM and generalized empirical likelihood estimators
Econometrica, 2004
Cited by 127 (5 self)
In an effort to improve the small sample properties of generalized method of moments (GMM) estimators, a number of alternative estimators have been suggested. These include empirical likelihood (EL), continuous updating, and exponential tilting estimators. We show that these estimators share a common structure, being members of a class of generalized empirical likelihood (GEL) estimators. We use this structure to compare their higher order asymptotic properties. We find that GEL has no asymptotic bias due to correlation of the moment functions with their Jacobian, eliminating an important source of bias for GMM in models with endogeneity. We also find that EL has no asymptotic bias from estimating the optimal weight matrix, eliminating a further important source of bias for GMM in panel data models. We give bias corrected GMM and GEL estimators. We also show that bias corrected EL inherits the higher order property of maximum likelihood, that it is higher order asymptotically efficient relative to the other bias corrected estimators.
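The common structure the abstract refers to can be stated compactly: a GEL estimator solves a saddle-point problem over the moment functions, and each named estimator corresponds to a particular concave carrier function ρ. This is a sketch of the standard GEL formulation, not notation taken from the abstract itself:

```latex
\hat\beta_{\mathrm{GEL}}
  = \arg\min_{\beta}\; \sup_{\lambda}\; \sum_{i=1}^{n} \rho\!\big(\lambda' g(z_i,\beta)\big),
\qquad
\rho(v) =
\begin{cases}
\ln(1-v) & \text{empirical likelihood (EL)},\\
-e^{v} & \text{exponential tilting (ET)},\\
-v - v^{2}/2 & \text{continuous updating (CUE)}.
\end{cases}
```

Viewing all three as choices of ρ in one objective is what lets their higher-order bias terms be compared on a common footing.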
Convergence of a stochastic approximation version of the EM algorithm
1997
Cited by 122 (12 self)
The Expectation-Maximization (EM) algorithm is a powerful computational technique for locating maxima of functions...
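The stochastic approximation version replaces the exact E-step with simulated latent data and a Robbins–Monro smoothing of the sufficient statistics. A minimal sketch for a two-component Gaussian mixture follows; the known unit variances, equal weights, step sizes gamma_k = 1/k, and synthetic data are all simplifying assumptions for illustration.

```python
import math
import random

random.seed(0)

def saem_mixture(data, iters=200):
    """SAEM for a two-component Gaussian mixture with unit variances and
    equal weights; only the two means are estimated."""
    mu = [-1.0, 1.0]            # initial means
    s = [1.0, 1.0, -1.0, 1.0]   # smoothed stats: [n0, n1, sum0, sum1]
    for k in range(1, iters + 1):
        # S-step: simulate component labels from the current posterior,
        # accumulating per-component counts and sums.
        n, sm = [0.0, 0.0], [0.0, 0.0]
        for y in data:
            p0 = math.exp(-0.5 * (y - mu[0]) ** 2)
            p1 = math.exp(-0.5 * (y - mu[1]) ** 2)
            z = 0 if random.random() < p0 / (p0 + p1) else 1
            n[z] += 1.0
            sm[z] += y
        # SA-step: Robbins-Monro smoothing of the simulated statistics.
        g = 1.0 / k
        s[0] += g * (n[0] - s[0])
        s[1] += g * (n[1] - s[1])
        s[2] += g * (sm[0] - s[2])
        s[3] += g * (sm[1] - s[3])
        # M-step: component means from the smoothed statistics.
        mu = [s[2] / s[0], s[3] / s[1]]
    return mu

# Synthetic data from an equal mixture of N(-2, 1) and N(2, 1).
data = [random.gauss(-2.0, 1.0) for _ in range(150)] + \
       [random.gauss(2.0, 1.0) for _ in range(150)]
mu_hat = sorted(saem_mixture(data))
```

The decreasing step size averages out the simulation noise over iterations, which is what lets the stochastic scheme inherit convergence behavior from the exact EM it approximates.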