Results 1–10 of 121,516
Bias-Variance Techniques for Monte Carlo Optimization: Cross-validation for the CE Method
, 2008
Abstract: "... In this paper, we examine the CE method in the broad context of Monte Carlo Optimization (MCO) [Ermoliev and Norkin, 1998; Robert and Casella, 2004] and Parametric Learning (PL), a type of machine ..."
Robust Monte Carlo Localization for Mobile Robots
, 2001
Cited by 826 (88 self)
Abstract: "... Mobile robot localization is the problem of determining a robot's pose from sensor data. This article presents a family of probabilistic localization algorithms known as Monte Carlo Localization (MCL). MCL algorithms represent a robot's belief by a set of weighted hypotheses (samples), whi ..."
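The sample-based belief update the MCL abstract describes can be sketched as a toy 1-D particle filter. Everything here (the corridor setting, the Gaussian sensor model, the function name `mcl_step`) is illustrative, not taken from the paper:

```python
import random
import math

def mcl_step(particles, motion, measurement, noise_std=0.5):
    """One Monte Carlo Localization update on a 1-D corridor.

    particles: list of hypothesized positions (the weighted sample set).
    motion: commanded displacement since the last step.
    measurement: noisy observation of position (toy sensor model).
    """
    # Motion update: move every hypothesis, adding process noise.
    moved = [p + motion + random.gauss(0.0, 0.1) for p in particles]
    # Measurement update: weight each hypothesis by the likelihood
    # of the observation under a Gaussian sensor model.
    weights = [math.exp(-0.5 * ((measurement - p) / noise_std) ** 2)
               for p in moved]
    total = sum(weights) or 1.0
    weights = [w / total for w in weights]
    # Resampling: draw a new, equally weighted sample set in
    # proportion to the weights.
    return random.choices(moved, weights=weights, k=len(moved))

random.seed(0)
particles = [random.uniform(0.0, 10.0) for _ in range(500)]
true_pos = 2.0
for _ in range(10):
    true_pos += 1.0
    particles = mcl_step(particles, 1.0, true_pos + random.gauss(0.0, 0.3))
estimate = sum(particles) / len(particles)
```

The sample set starts spread over the whole corridor and concentrates around the true position as measurements arrive, which is the behavior the abstract attributes to MCL.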
Sequential data assimilation with a nonlinear quasi-geostrophic model using Monte Carlo methods to forecast error statistics
J. Geophys. Res., 1994
Cited by 782 (22 self)
Abstract: "... A new sequential data assimilation method is discussed. It is based on forecasting the error statistics using Monte Carlo methods, a better alternative than solving the traditional and computationally extremely demanding approximate error covariance equation used in the extended Kalman filter. ..."
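The core idea in that abstract — estimating forecast error statistics from a Monte Carlo ensemble instead of integrating a covariance equation — can be sketched in a few lines. The toy scalar dynamics and the function name `forecast_error_stats` are assumptions for illustration, not the paper's model:

```python
import random
import statistics

def forecast_error_stats(ensemble, model, process_noise=0.05):
    """Propagate an ensemble of states through a (possibly nonlinear)
    model and estimate the forecast mean and error variance from the
    sample, rather than solving an error covariance equation."""
    forecast = [model(x) + random.gauss(0.0, process_noise)
                for x in ensemble]
    mean = statistics.fmean(forecast)
    var = statistics.pvariance(forecast, mean)
    return forecast, mean, var

random.seed(1)
model = lambda x: x + 0.1 * x * (1.0 - x)   # toy nonlinear dynamics
ensemble = [random.gauss(0.5, 0.1) for _ in range(200)]
ensemble, mean, var = forecast_error_stats(ensemble, model)
```

Because the statistics are sample estimates, the cost scales with the ensemble size rather than with the (much larger) dimension of a full covariance matrix — the efficiency argument the abstract makes against the extended Kalman filter.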
The Bias-Variance dilemma of the Monte Carlo method
in Artificial Neural Networks: ICANN, 2001
Cited by 4 (0 self)
Abstract: "... We investigate the setting in which Monte Carlo methods are used and draw a parallel to the formal setting of statistical inference. In particular, we find that Monte Carlo approximation gives rise to a bias-variance dilemma. We show that it is possible to construct a biased approximation scheme wi ..."
Clustering using Monte Carlo Cross-Validation
, 1996
Cited by 68 (0 self)
Abstract: "... Finding the "right" number of clusters, k, for a data set is a difficult, and often ill-posed, problem. In a probabilistic clustering context, likelihood ratios, penalized likelihoods, and Bayesian techniques are among the more popular techniques. In this paper a new cross-validated likeli ..."
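Monte Carlo cross-validation, as used in that abstract, means scoring on many random train/test splits rather than one fixed partition. A minimal generic sketch (the constant-mean "model" and the name `monte_carlo_cv` are illustrative stand-ins, not the paper's clustering criterion):

```python
import random
import statistics

def monte_carlo_cv(data, fit, score, n_splits=50, test_frac=0.3, seed=0):
    """Monte Carlo cross-validation: repeatedly draw random
    train/test splits and average the held-out score."""
    rng = random.Random(seed)
    scores = []
    for _ in range(n_splits):
        shuffled = data[:]
        rng.shuffle(shuffled)
        cut = int(len(shuffled) * (1.0 - test_frac))
        train, test = shuffled[:cut], shuffled[cut:]
        model = fit(train)
        scores.append(score(model, test))
    return statistics.fmean(scores)

# Toy use: "fit" a constant model (the training mean) and score it
# by negative squared error on the held-out points.
data = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
fit = lambda train: statistics.fmean(train)
score = lambda m, test: -statistics.fmean((x - m) ** 2 for x in test)
avg = monte_carlo_cv(data, fit, score)
```

For model selection (e.g. choosing k in clustering), one would run this once per candidate model and keep the candidate with the best average held-out score.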
SMOTE: Synthetic Minority Over-sampling Technique
Journal of Artificial Intelligence Research, 2002
Cited by 614 (28 self)
Abstract: "... An approach to the construction of classifiers from imbalanced datasets is described. A dataset is imbalanced if the classification categories are not approximately equally represented. Often real-world data sets are predominately composed of ``normal'' examples with only a small percentag ..."
"... good means of increasing the sensitivity of a classifier to the minority class. This paper shows that a combination of our method of over-sampling the minority (abnormal) class and under-sampling the majority (normal) class can achieve better classifier performance (in ROC space) than only under ..."
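The over-sampling step SMOTE proposes — synthesizing new minority examples by interpolating between a minority point and one of its nearest minority neighbors — can be sketched as follows. This is a minimal sketch (the function name `smote`, the neighbor count, and the 2-D toy data are assumptions), not the paper's full algorithm:

```python
import random
import math

def smote(minority, n_new, k=3, seed=0):
    """Minimal SMOTE sketch: for each synthetic example, pick a
    random minority point, choose one of its k nearest minority
    neighbors, and interpolate a new point between them."""
    rng = random.Random(seed)
    synthetic = []
    for _ in range(n_new):
        x = rng.choice(minority)
        # Nearest minority-class neighbors of x (excluding x itself).
        neighbors = sorted((p for p in minority if p is not x),
                           key=lambda p: math.dist(x, p))[:k]
        nb = rng.choice(neighbors)
        gap = rng.random()  # interpolation factor in [0, 1)
        synthetic.append(tuple(xi + gap * (ni - xi)
                               for xi, ni in zip(x, nb)))
    return synthetic

minority = [(0.0, 0.0), (1.0, 0.2), (0.5, 1.0), (0.2, 0.6)]
new_points = smote(minority, 5)
```

Because the new points lie on segments between existing minority examples, they broaden the minority region rather than simply replicating points, which is what the abstract credits for the sensitivity gain.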
Ensemble Methods in Machine Learning
Multiple Classifier Systems, LNCS 1857, 2000
Cited by 607 (3 self)
Abstract: "... Ensemble methods are learning algorithms that construct a set of classifiers and then classify new data points by taking a (weighted) vote of their predictions. The original ensemble method is Bayesian averaging, but more recent algorithms include error-correcting output coding, Bagging, and boostin ..."
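The weighted vote that abstract describes is simple to make concrete. A minimal sketch with made-up threshold classifiers (the rules and weights are illustrative, not from the paper):

```python
from collections import Counter

def weighted_vote(classifiers, weights, x):
    """Combine base classifiers by a weighted vote: each learner's
    prediction contributes its weight, and the label with the
    largest total wins."""
    tally = Counter()
    for clf, w in zip(classifiers, weights):
        tally[clf(x)] += w
    return tally.most_common(1)[0][0]

# Toy ensemble: three threshold rules on a scalar input.
classifiers = [lambda x: "pos" if x > 0.3 else "neg",
               lambda x: "pos" if x > 0.5 else "neg",
               lambda x: "pos" if x > 0.8 else "neg"]
weights = [1.0, 2.0, 1.0]
label = weighted_vote(classifiers, weights, 0.6)
```

Methods such as Bagging produce the base classifiers by training on resampled data with equal weights, while boosting learns both the classifiers and their weights; the combination rule above is common to both.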
Bias-Variance Tradeoffs: Novel Applications
, 2007
Synonyms: bias-variance tradeoffs, bias plus variance.
Abstract: "... Consider a given random variable F and a random variable that we can modify, F̂. We wish to use a sample of F̂ as an estimate of a sample of F. The mean squared error between such a pair of samples is a sum of four terms. The first term reflects the statistical coupling between F and F̂ and is c ..."
"... broader context and in a variety of situations. We also show, using experiments, how techniques for optimizing bias-variance tradeoffs introduced in machine learning can be applied in novel circumstances to improve the performance of a class of optimization algorithms. ..."
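The basic decomposition underlying these bias-variance entries — for an estimator of a fixed quantity, mean squared error splits as bias² + variance — can be verified by simulation. A minimal sketch, assuming a deliberately shrunk sample mean as the biased estimator (the names and the Gaussian toy setup are illustrative):

```python
import random
import statistics

def bias_variance(estimator, sample_gen, truth, n_trials=2000, seed=0):
    """Monte Carlo estimate of an estimator's bias and variance:
    draw many data sets, apply the estimator to each, and decompose
    the mean squared error as bias^2 + variance."""
    rng = random.Random(seed)
    estimates = [estimator(sample_gen(rng)) for _ in range(n_trials)]
    mean_est = statistics.fmean(estimates)
    bias = mean_est - truth
    var = statistics.pvariance(estimates, mean_est)
    mse = statistics.fmean((e - truth) ** 2 for e in estimates)
    return bias, var, mse

truth = 1.0
sample_gen = lambda rng: [rng.gauss(truth, 1.0) for _ in range(10)]
# A shrunk mean: biased toward zero, but with lower variance than
# the plain sample mean.
shrunk = lambda xs: 0.8 * statistics.fmean(xs)
bias, var, mse = bias_variance(shrunk, sample_gen, truth)
```

With the sample statistics computed this way the identity mse = bias² + variance holds exactly, which is why accepting some bias can pay off whenever it buys a larger reduction in variance — the tradeoff these entries exploit.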
An introduction to variational methods for graphical models
To appear in: M. I. Jordan (ed.), Learning in Graphical Models
Graphical models, exponential families, and variational inference
, 2008
Cited by 800 (26 self)
Abstract: "... The formalism of probabilistic graphical models provides a unifying framework for capturing complex dependencies among random variables, and building large-scale multivariate statistical models. Graphical models have become a focus of research in many statistical, computational and mathematical fiel ..."
"... all be understood in terms of exact or approximate forms of these variational representations. The variational approach provides a complementary alternative to Markov chain Monte Carlo as a general source of approximation methods for inference in large-scale statistical models. ..."