Results 1 - 10 of 355

Region Competition: Unifying Snakes, Region Growing, and Bayes/MDL for Multiband Image Segmentation
IEEE Transactions on Pattern Analysis and Machine Intelligence, 1996
"... We present a novel statistical and variational approach to image segmentation based on a new algorithm named region competition. This algorithm is derived by minimizing a generalized Bayes/MDL criterion using the variational principle. The algorithm is guaranteed to converge to a local minimum ..."
Cited by 774 (20 self)

A filtering algorithm for constraints of difference in CSPs
Proceedings of AAAI'94, the 12th (US) National Conference on Artificial Intelligence, 1994
"... Many real-life Constraint Satisfaction Problems (CSPs) involve some constraints similar to the all-different constraints. These constraints are called constraints of difference. They are defined on a subset of variables by a set of tuples for which the values occurring in the same tuple are all different. In this paper, a new filtering algorithm for these constraints is presented. It achieves the generalized arc-consistency condition for these non-binary constraints. It is based on matching theory and its complexity is low. In fact, for a constraint defined on a subset of p variables ..."
Cited by 378 (6 self)

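The matching theory the snippet refers to can be illustrated with a small sketch: a feasibility check for an all-different constraint via maximum bipartite matching between variables and values (Kuhn's augmenting-path algorithm). This is only the satisfiability half of the described approach, not the full generalized arc-consistency pruning; all names are illustrative.

```python
def all_different_feasible(domains):
    """Check whether an all-different constraint over the given variable
    domains is satisfiable, by finding a maximum matching between
    variables and values (Kuhn's augmenting-path algorithm)."""
    match = {}  # value -> variable currently assigned to it

    def try_assign(var, seen):
        # Try to give `var` a value, recursively re-assigning conflicting
        # variables along an augmenting path.
        for val in domains[var]:
            if val in seen:
                continue
            seen.add(val)
            if val not in match or try_assign(match[val], seen):
                match[val] = var
                return True
        return False

    return all(try_assign(v, set()) for v in range(len(domains)))
```

The constraint is satisfiable exactly when every variable can be matched to a distinct value; values that appear in no maximum matching are the ones a filtering algorithm would prune.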
The Learnability of Naive Bayes
In: Proceedings of Canadian Artificial Intelligence Conference, 2005
"... Naive Bayes is an efficient and effective learning algorithm, but previous results show that its representation ability is severely limited since it can only represent certain linearly separable functions in the binary domain. We give necessary and sufficient conditions on linearly separable functions ..."
Cited by 164 (0 self)

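The linear-separability limitation mentioned above can be made concrete: over binary features, the naive Bayes log-odds is an affine function of the input, so its decision boundary is a hyperplane. A minimal sketch (the probabilities are made up for illustration):

```python
import math

def nb_log_odds(p_pos, theta_pos, theta_neg, x):
    """Naive Bayes log-odds for a binary feature vector x, written
    directly as a linear function b + sum_i w_i * x_i.
    theta_pos[i] = P(x_i = 1 | class +), theta_neg[i] = P(x_i = 1 | class -)."""
    # Bias: class prior plus the contribution of every feature being 0.
    bias = math.log(p_pos / (1 - p_pos))
    bias += sum(math.log((1 - tp) / (1 - tn))
                for tp, tn in zip(theta_pos, theta_neg))
    # Per-feature weight: the change in log-odds when x_i flips 0 -> 1.
    weights = [math.log(tp / tn) - math.log((1 - tp) / (1 - tn))
               for tp, tn in zip(theta_pos, theta_neg)]
    return bias + sum(w * xi for w, xi in zip(weights, x))
```

The sign of this log-odds is the naive Bayes decision, and since the expression is affine in x, naive Bayes can only realize (certain) linearly separable Boolean functions.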
A Probabilistic Technique for Simultaneous Localization and Door State Estimation with Mobile Robots in Dynamic Environments
In IROS, 2002
"... Virtually all existing mobile robot localization techniques operate on a static map of the environment. When the environment changes (e.g., doors are opened or closed), there is an opportunity to simultaneously estimate the robot's pose and the state of the environment. The resulting estimation problem is high-dimensional, rendering current localization techniques inapplicable. This paper proposes an efficient, factored estimation algorithm for mixed discrete-continuous state estimation. Our algorithm integrates particle filters for robot localization, and conditional binary Bayes ..."
Cited by 16 (3 self)

One-Class SVMs for Document Classification
Journal of Machine Learning Research, 2001
"... We implemented versions of the SVM appropriate for one-class classification in the context of information retrieval. The experiments were conducted on the standard Reuters data set. For the SVM implementation we used both a version of Schölkopf et al. and a somewhat different version of one-class SVM ... representation. Then we compared it with one-class versions of the algorithms prototype (Rocchio), nearest neighbor, naive Bayes, and finally a natural one-class neural network classification method based on "bottleneck" compression generated filters. The SVM approach as represented by Schölkopf was superior ..."
Cited by 185 (3 self)

Glymour: Linearity Properties of Bayes Nets with Binary Variables
Uncertainty in Artificial Intelligence: Proceedings of the 17th Conference (UAI 2001), 2001
"... It is "well known" that in linear models: (1) testable constraints on the marginal distribution of observed variables distinguish certain cases in which an unobserved cause jointly influences several observed variables; (2) the technique of "instrumental variables" sometimes permits an estimation of the influence of one variable on another even when the association between the variables may be confounded by unobserved common causes; (3) the association (or conditional probability distribution of one variable given another) of two variables connected by a path or pair of paths with a single common vertex ..."
Cited by 13 (9 self)

Document Image Restoration Using Binary Morphological Filters
In Proceedings of SPIE, 1996
"... This paper discusses a method for binary morphological filter design to restore document images degraded by subtractive or additive noise, given a constraint on the size of filters. With a filter size restriction (for example 3 × 3), each pixel in the output image depends only on its 3 × 3 ..."
Cited by 11 (0 self)

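The fixed-window constraint described above (each output pixel a function of a 3 × 3 neighborhood) can be illustrated with the simplest filter of that class, a binary majority (median) filter. This is a generic baseline for salt-and-pepper style noise, not the optimized filter design from the paper:

```python
def median3x3(img):
    """Apply a 3x3 binary majority (median) filter to a binary image
    given as a list of rows of 0/1 values. Out-of-bounds neighbors are
    treated as 0 (background)."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            # Count the 1s in the 3x3 window centered at (y, x).
            ones = sum(img[y + dy][x + dx]
                       for dy in (-1, 0, 1) for dx in (-1, 0, 1)
                       if 0 <= y + dy < h and 0 <= x + dx < w)
            out[y][x] = 1 if ones >= 5 else 0  # majority of the 9-pixel window
    return out
```

An isolated noise pixel has at most one 1 in its window, so the filter removes it, while the interior of a solid region survives.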
Text Filtering by Boosting Naive Bayes Classifiers
"... Several machine learning algorithms have recently been used for text categorization and filtering. In particular, boosting methods such as AdaBoost have shown good performance applied to real text data. However, most existing boosting algorithms are based on classifiers that use binary-valued features ..."

The Restricted Variational Bayes Approximation in Bayesian Filtering
"... The Variational Bayes (VB) approach is used as a one-step approximation for Bayesian filtering. It requires the availability of moments of the free-form distributional optimizers. The latter may have intractable functional forms. In this contribution, we replace these by appropriate fixed-form distributions ..."
Cited by 3 (1 self)
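The free-form distributional optimizers the abstract mentions are the standard VB stationary conditions; in the usual notation (our paraphrase, not the paper's), for a posterior over parameters θ, φ given data x factored as q(θ)q(φ):

```latex
q(\theta) \propto \exp\!\left( \mathbb{E}_{q(\phi)}\left[ \ln p(x, \theta, \phi) \right] \right),
\qquad
q(\phi) \propto \exp\!\left( \mathbb{E}_{q(\theta)}\left[ \ln p(x, \theta, \phi) \right] \right)
```

The restricted variant described in the abstract replaces these free-form factors with fixed-form distributions whose moments are tractable.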