Results 1-10 of 14
Cell Population Tracking and Lineage Construction with Spatiotemporal Context
, 2009
"... Automated visualtracking of cell populations in vitro using timelapse phase contrast microscopy enables quantitative, systematic and highthroughput measurements of cell behaviors. These measurements include the spatiotemporal quantification of cell migration, mitosis, apoptosis, and the reconstru ..."
Abstract

Cited by 41 (10 self)
Automated visual tracking of cell populations in vitro using time-lapse phase contrast microscopy enables quantitative, systematic and high-throughput measurements of cell behaviors. These measurements include the spatiotemporal quantification of cell migration, mitosis, apoptosis, and the reconstruction of cell lineages. The combination of the low signal-to-noise ratio of phase contrast microscopy images, high and varying densities of the cell cultures, topological complexities of cell shapes, and wide range of cell behaviors poses many challenges to existing tracking techniques. This paper presents a fully-automated multi-target tracking system that can efficiently cope with these challenges while simultaneously tracking and analyzing thousands of cells observed using time-lapse phase contrast microscopy. The system combines bottom-up and top-down image analysis by integrating multiple collaborative modules, which exploit a fast geometric active contour tracker in conjunction with adaptive interacting multiple models (IMM) motion filtering and spatiotemporal trajectory optimization. The system, which was tested using a variety of cell populations, achieved tracking accuracy in the range of 86.9%-92.5%.
Learning Probabilistic Networks
 The Knowledge Engineering Review
, 1998
"... A probabilistic network is a graphical model that encodes probabilistic relationships between variables of interest. Such a model records qualitative influences between variables in addition to the numerical parameters of the probability distribution. As such it provides an ideal form for combini ..."
Abstract

Cited by 36 (1 self)
A probabilistic network is a graphical model that encodes probabilistic relationships between variables of interest. Such a model records qualitative influences between variables in addition to the numerical parameters of the probability distribution. As such, it provides an ideal form for combining data with prior knowledge, which might be limited solely to experience of the influences between some of the variables of interest. In this paper, we first show how data can be used to revise initial estimates of the parameters of a model. We then progress to showing how the structure of the model can be revised as data are obtained. Techniques for learning with incomplete data are also covered.
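The parameter revision the abstract describes is, for discrete nodes, a conjugate Bayesian update: observed counts are added to the pseudo-counts of a Dirichlet prior. A minimal sketch, with a hypothetical three-state variable and made-up counts (none of these numbers come from the paper):

```python
# Conjugate Bayesian update of one discrete node's parameters.
# The posterior Dirichlet is obtained by adding observed state
# counts to the prior pseudo-counts; its mean gives revised estimates.
alpha_prior = [2.0, 2.0, 2.0]   # prior pseudo-counts (assumed for illustration)
counts = [30, 10, 5]            # observed state frequencies (made up)

alpha_post = [a + n for a, n in zip(alpha_prior, counts)]
total = sum(alpha_post)
mean_post = [a / total for a in alpha_post]   # revised parameter estimates
```

The same update applies independently to each conditional distribution in the network when the data are complete.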
On adaptive decision rules and decision parameter adaptation for automatic speech recognition
 Proc. IEEE
, 2000
"... Recent advances in automatic speech recognition are accomplished by designing a plugin maximum a posteriori decision rule such that the forms of the acoustic and language model distributions are specified and the parameters of the assumed distributions are estimated from a collection of speech and ..."
Abstract

Cited by 27 (4 self)
Recent advances in automatic speech recognition are accomplished by designing a plug-in maximum a posteriori decision rule such that the forms of the acoustic and language model distributions are specified and the parameters of the assumed distributions are estimated from a collection of speech and language training corpora. Maximum-likelihood point estimation is by far the most prevailing training method. However, due to the problems of unknown speech distributions, sparse training data, high spectral and temporal variabilities in speech, and possible mismatch between training and testing conditions, a dynamic training strategy is needed. To cope with the changing speakers and speaking conditions in real operational conditions for high-performance speech recognition, such paradigms incorporate a small amount of speaker- and environment-specific adaptation data into the training process. Bayesian adaptive learning is an optimal way to combine ...
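The Bayesian adaptive learning the abstract alludes to can be illustrated by the classic MAP adaptation of a Gaussian mean, which interpolates a prior (speaker-independent) mean with the sample mean of the adaptation data. A minimal sketch; the function name and the prior weight `tau` are illustrative choices, not taken from the paper:

```python
def map_adapt_mean(mu0, data, tau=10.0):
    """MAP estimate of a Gaussian mean with a Gaussian prior centered at mu0.

    tau acts as an equivalent sample size for the prior: with little
    adaptation data the estimate stays near mu0; with much data it
    approaches the sample mean.
    """
    n = len(data)
    return (tau * mu0 + sum(data)) / (tau + n)
```

With no adaptation data the estimate equals the prior mean, which is why such schemes degrade gracefully for unseen speakers.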
Bayesian Adaptive Learning of the Parameters of Hidden Markov Model for Speech Recognition
"... In this paper a theoretical framework for Bayesian adaptive learning of discrete HMM and semicontinuous one with Gaussian mixture state observation densities is presented. Corresponding to the wellknown BaumWelch and segmental kmeans algorithms respectively for HMM training, formulations of MAP ..."
Abstract

Cited by 26 (4 self)
In this paper, a theoretical framework for Bayesian adaptive learning of discrete HMMs and semi-continuous HMMs with Gaussian mixture state observation densities is presented. Corresponding to the well-known Baum-Welch and segmental k-means algorithms for HMM training, formulations of MAP (maximum a posteriori) and segmental MAP estimation of HMM parameters are developed. Furthermore, a computationally efficient method of segmental quasi-Bayes estimation for semi-continuous HMMs is also presented. The important issue of prior density estimation is discussed, and a simplified method of moment estimation is given. The method proposed in this paper is applicable to problems in HMM training for speech recognition such as sequential or batch training, model adaptation, and parameter smoothing.
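For the discrete-HMM case, the MAP estimate of a state's emission (or transition) probabilities under a Dirichlet prior is the posterior mode, which smooths raw counts with prior pseudo-counts. A minimal sketch, assuming counts and Dirichlet parameters are given (the function name is illustrative):

```python
def map_multinomial(counts, alphas):
    """Posterior mode of multinomial parameters under a Dirichlet prior.

    This is the form a MAP re-estimate of one discrete HMM state's
    output distribution takes: each probability is (c_k + alpha_k - 1)
    divided by (sum of counts + sum of alphas - K).
    Requires alpha_k >= 1 so the mode exists.
    """
    K = len(counts)
    denom = sum(counts) + sum(alphas) - K
    return [(c + a - 1) / denom for c, a in zip(counts, alphas)]
```

With all `alphas` equal to 1 the estimate reduces to maximum likelihood, which shows how the prior only matters when training data are sparse.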
Nonparametric Bayes methods using predictive updating
, 1998
"... Approximate nonparametric Bayes estimates calculated under a Dirichlet process prior are readily obtained in a wide range of models using a simple recursive algorithm. This chapter develops the recursion using elementary facts about nonparametric predictive distributions, and applies it to an interv ..."
Abstract

Cited by 3 (1 self)
Approximate nonparametric Bayes estimates calculated under a Dirichlet process prior are readily obtained in a wide range of models using a simple recursive algorithm. This chapter develops the recursion using elementary facts about nonparametric predictive distributions, and applies it to an interval censoring problem and to a Markov chain mixture model. S-Plus code is provided.

1 Introduction
Sampling models that enforce relatively weak assumptions are naturally favored in many applications, but it is well known that the corresponding posterior computations can become very intensive when a Dirichlet process encodes prior uncertainty in the weakly specified part of the model. In all but the most simple models, posterior calculations involve a mixture of Dirichlet processes. As evidenced by companion chapters, advances in Markov chain Monte Carlo (MCMC) provide critical methodology for enabling these calculations, and have opened up a wide range of interesting applications to Dirichle...
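The recursive algorithm in question is Newton's predictive recursion: each new observation mixes the current estimate of the mixing distribution with its Bayesian update, using a decaying weight. A minimal sketch on a discrete grid of mixing-parameter values, with a Gaussian kernel; the grid, kernel, and weight sequence are illustrative choices rather than the chapter's S-Plus implementation:

```python
import math

def gauss(x, mu, sd=1.0):
    # Gaussian kernel density k(x | theta = mu).
    return math.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))

def predictive_recursion(data, grid):
    """One-pass recursive estimate of a mixing distribution on a grid.

    f_i = (1 - w_i) * f_{i-1} + w_i * posterior(f_{i-1} | x_i),
    with weights w_i decaying like 1/(i+1).
    """
    f = [1.0 / len(grid)] * len(grid)   # uniform initial guess f_0
    for i, x in enumerate(data, start=1):
        w = 1.0 / (i + 1)
        lik = [gauss(x, th) for th in grid]
        norm = sum(l * p for l, p in zip(lik, f))
        f = [(1 - w) * p + w * l * p / norm for p, l in zip(f, lik)]
    return f
```

Because each update is a convex combination of two probability vectors, the estimate stays normalized, and the whole computation is a single pass over the data rather than an MCMC run.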
Optimization of Inspection and Maintenance Decisions for Infrastructure Facilities under Performance Model Uncertainty: A Quasi-Bayes Approach
, 2006
"... We present an optimization model to find joint inspection and maintenance policies for infrastructure facilities under performance model uncertainty. The objective in the formulation is to minimize the total expected social cost of managing facilities over a finite planning horizon. As in recent opt ..."
Abstract

Cited by 2 (0 self)
We present an optimization model to find joint inspection and maintenance policies for infrastructure facilities under performance model uncertainty. The objective in the formulation is to minimize the total expected social cost of managing facilities over a finite planning horizon. As in recent optimization models, performance model uncertainty is accounted for by representing facility deterioration as a mixture of known models taken from a finite set. The mixture proportions are assumed to be continuous random variables, with probability densities that are updated over time. In this paper, we relax the assumptions of fixed and error-free inspections. We present a parametric study to analyze the effect of initial performance model uncertainty and bias on the expected total cost of managing a facility. The main observation is that reducing the initial variance in model uncertainty may be more important than reducing the initial bias. Our study also shows that cost savings can result from relaxing the constraint of a fixed inspection schedule.
Online EM and Quasi-Bayes or: How I Learned to Stop Worrying and Love Stochastic Approximation
, 2003
"... We accept this thesis as conforming ..."
Monitoring Message Streams: Algorithmic Methods for Automatic Processing of Messages
, 2003
"... The problem of monitoring message streams is decomposed into problems in compression, representation, matching, learning and data fusion. A coordinated approach is anticipated to yield incremental improvements in each of the several components. Exploration of the space of possible combinations of ..."
Abstract

Cited by 1 (1 self)
The problem of monitoring message streams is decomposed into problems in compression, representation, matching, learning and data fusion. A coordinated approach is anticipated to yield incremental improvements in each of the several components. Exploration of the space of possible combinations of approaches, and of fusion of multiple methods for each of the component problems may yield order of magnitude improvements in overall performance. Initial investigations are reported, with pointers to more detailed presentations.
Why can't José read? The problem of learning semantic associations in a robot environment
 In Human Language Technology Conference Workshop on Learning Word Meaning from Non-Linguistic Data
, 2003
"... We study the problem of learning to recognise objects in the context of autonomous agents. ..."
Abstract
We study the problem of learning to recognise objects in the context of autonomous agents.
In T. G. Dietterich, S. Becker, Z. Ghahramani, eds., Advances in Neural Information Processing Systems 14. MIT Press, Cambridge, MA, 2002. (In Press)
, 2002
"... The Temporal Coding Hypothesis of Miller and colleagues [7] suggests that animals integrate related temporal patterns of stimuli into single memory representations. We formalize this concept using quasiBayes estimation to update the parameters of a constrained hidden Markov model. This approach ..."
Abstract
The Temporal Coding Hypothesis of Miller and colleagues [7] suggests that animals integrate related temporal patterns of stimuli into single memory representations. We formalize this concept using quasi-Bayes estimation to update the parameters of a constrained hidden Markov model. This approach allows us to account for some surprising temporal effects in the second order conditioning experiments of Miller et al. [1, 2, 3], which other models are unable to explain.