Results 1–10 of 193
An analysis of Bayesian classifiers
In Proceedings of the Tenth National Conference on Artificial Intelligence, 1992
"... In this paper we present an average-case analysis of the Bayesian classifier, a simple induction algorithm that fares remarkably well on many learning tasks. Our analysis assumes a monotone conjunctive target concept, and independent, noise-free Boolean attributes. We calculate the probability that t ..."
Cited by 440 (17 self)
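The setting this entry analyzes (independent, noise-free Boolean attributes, a monotone conjunctive target) can be illustrated with a minimal naive Bayesian classifier. The sketch below is illustrative only, not the paper's analysis; all function names are invented, and Laplace smoothing is an added assumption.

```python
import math

def train(examples, labels):
    """Estimate class priors and per-class attribute probabilities from
    Boolean attribute vectors, with Laplace smoothing to avoid zeros."""
    n, d = len(examples), len(examples[0])
    prior, cond = {}, {}
    for c in (0, 1):
        members = [examples[i] for i in range(n) if labels[i] == c]
        prior[c] = (len(members) + 1) / (n + 2)
        cond[c] = [(sum(x[j] for x in members) + 1) / (len(members) + 2)
                   for j in range(d)]
    return prior, cond

def classify(x, prior, cond):
    """Return the class maximizing log P(c) + sum_j log P(x_j | c),
    i.e. the naive-Bayes decision under the attribute-independence
    assumption."""
    def score(c):
        s = math.log(prior[c])
        for j, v in enumerate(x):
            p = cond[c][j]
            s += math.log(p if v else 1.0 - p)
        return s
    return max((0, 1), key=score)
```

Trained on all eight 3-bit vectors labelled by the conjunction x1 ∧ x2, this classifier reproduces every training label, matching the paper's observation that the Bayesian classifier fares well on noise-free conjunctive concepts.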
Punctuality, conjunctivity, and monotonicity
, 1993
"... In this paper we study the relationships between three properties of predicate transformers that were defined in [3]. Many of the results, proofs, and examples that follow are not new; these are included in the interest of self-containedness. 1 Definitions A predicate transformer f is called punctua ..."
, [x ⇒ y] ⇒ [f.x ⇒ f.y]. (3) 2 Conjunctivity implies monotonicity. The most obvious connection between the above concepts is the following fact. Theorem 1 [3, page 28]: Every conjunctive predicate transformer is monotonic. Proof: Let f be conjunctive. Then, for any predicates x, y, [f.x ⇒ f ...
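The proof is truncated in the snippet; it can be completed along standard lines. A sketch in the snippet's own notation, where f.x is function application, [p] reads "p holds everywhere", and conjunctivity of f means f.(x ∧ y) = f.x ∧ f.y for all predicates x, y:

```latex
% Theorem: every conjunctive predicate transformer f is monotonic.
\begin{align*}
[x \Rightarrow y]
  &\;\equiv\; [x \wedge y \,=\, x]
      && \text{predicate calculus} \\
  &\;\Rightarrow\; [f.(x \wedge y) \,=\, f.x]
      && \text{Leibniz} \\
  &\;\equiv\; [f.x \wedge f.y \,=\, f.x]
      && \text{conjunctivity of } f \\
  &\;\Rightarrow\; [f.x \Rightarrow f.y]
      && \text{predicate calculus}
\end{align*}
```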
Conjunctive Predicate Detection
, 1995
"... This paper discusses efficient detection of global predicates in a distributed program. Previous work in detection of global predicates was restricted to predicates that could be specified as a boolean formula of local predicates. Many properties in distributed systems, however, use the state of cha ..."
of channels. In this paper, we introduce the concept of a channel predicate and provide an efficient algorithm to detect any boolean formula of local and channel predicates. We define a property called monotonicity for channel predicates. Monotonicity is crucial for efficient detection of global predicates
Class Preserving Monotonic and Dual Monotonic Language Learning
, 1992
"... In the present paper strong-monotonic, monotonic, weak-monotonic as well as dual strong-monotonic, dual monotonic and dual weak-monotonic reasoning is studied. We investigate the power of all types of monotonic and dual monotonic inference from positive as well as from positive and negative data ..."
Cited by 1 (1 self)
as good as the previous one with respect to the language to be learnt. Weak-monotonicity is the analogue of cumulativity in learning theory. Dual strong-monotonic learning expresses the requirement to generate a chain of descending specializations converging in the limit to the target language. Dual
Learning Monotonic Linear Functions
 Proceedings of the 17th Annual Conference on Learning Theory, 2004
"... Abstract. Learning probabilities (p-concepts [13]) and other real-valued concepts (regression) is an important task in machine learning. For example, a doctor may need to predict the probability of getting a disease P[y|x], which depends on a number of risk factors. Generalized additive models [9] ..."
Cited by 6 (3 self)
] are a well-studied nonparametric model in the statistics literature, usually with monotonic link functions. However, no known efficient algorithms exist for learning such a general class. We show that regression graphs efficiently learn such real-valued concepts, while regression trees inefficiently
Characterizations of Class Preserving Monotonic and Dual Monotonic Language Learning
, 1994
"... The present paper deals with monotonic and dual monotonic language learning from positive as well as from positive and negative examples. The three notions of monotonicity reflect different formalizations of the requirement that the learner has to always produce better and better generalizations whe ..."
Cited by 2 (1 self)
when fed more and more data on the concept to be learnt. The three versions of dual monotonicity describe the requirement that the inference device exclusively produce specializations that fit the target language better and better. We characterize strong-monotonic, monotonic, weak-monotonic
Trading Monotonicity Demands versus Efficiency
Bull. Inf. Cybern., 1995
"... The present paper deals with the learnability of indexed families L of uniformly recursive languages from positive data. We consider the influence of three monotonicity demands, and their dual counterparts, on the efficiency of the learning process. The efficiency of learning is measured in depend ..."
Cited by 3 (1 self)
in dependence on the number of mind changes a learning algorithm is allowed to perform. The three notions of (dual) monotonicity reflect different formalizations of the requirement that the learner produce better and better generalizations (respectively, specializations) when fed more and more data on the target
Evolution with Drifting Targets
"... We consider the question of the stability of evolutionary algorithms to gradual changes, or drift, in the target concept. We define an algorithm to be resistant to drift if, for some inverse polynomial drift rate in the target function, it converges to accuracy 1 − ε with polynomial resources, and t ..."
Cited by 10 (4 self)
Inductive Constraint Logic
, 1995
"... . A novel approach to learning first order logic formulae from positive and negative examples is presented. Whereas present inductive logic programming systems employ examples as true and false ground facts (or clauses), we view examples as interpretations which are true or false for the target theo ..."
Cited by 95 (19 self)
order formulae. However, whereas classical learning techniques have concentrated on concept representations in disjunctive normal form, we will use a clausal representation, which corresponds to a conjunctive normal form where each conjunct forms a constraint on positive examples. This representation
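The "examples as interpretations" view described in this entry can be sketched concretely: a hypothesis is a set of clauses, and an interpretation (a set of true ground facts) is a positive example iff it is a model of every clause. This is an illustrative sketch only; all predicate names and the helper functions are invented.

```python
def holds(clause, interpretation):
    """A clause (body, heads) is read as: body -> head_1 v ... v head_k.
    It constrains an interpretation: whenever the whole body is true,
    at least one head literal must be true as well."""
    body, heads = clause
    return not body <= interpretation or bool(heads & interpretation)

def is_model(theory, interpretation):
    """An interpretation is a positive example of the clausal theory
    iff every clause (constraint) holds in it."""
    return all(holds(c, interpretation) for c in theory)

# A one-clause theory: bird -> flies v penguin.
theory = [(frozenset({"bird"}), frozenset({"flies", "penguin"}))]
```

Under this reading, each clause is a constraint that a positive interpretation may not violate, which is exactly why the clausal (CNF) representation fits learning from interpretations more naturally than DNF concept descriptions.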
Two Kinds of Non-Monotonic Analogical Inference
"... . This paper addresses two modi of analogical reasoning. The first modus is based on the explicit representation of the justification for the analogical inference. The second modus is based on the representation of typical instances by concept structures. The two kinds of analogical inferences rely ..."