Results 1–10 of 112
Learning Stochastic Logic Programs
, 2000
Cited by 1096 (69 self)
Stochastic Logic Programs (SLPs) have been shown to be a generalisation of Hidden Markov Models (HMMs), stochastic context-free grammars, and directed Bayes' nets. A stochastic logic program consists of a set of labelled clauses p:C where p is in the interval [0,1] and C is a first-order range-restricted definite clause. This paper summarises the syntax, distributional semantics and proof techniques for SLPs and then discusses how a standard Inductive Logic Programming (ILP) system, Progol, has been modified to support learning of SLPs. The resulting system 1) finds an SLP with uniform probability labels on each definition and near-maximal Bayes posterior probability and then 2) alters the probability labels to further increase the posterior probability. Stage 1) is implemented within CProgol4.5, which differs from previous versions of Progol by allowing user-defined evaluation functions written in Prolog. It is shown that maximising the Bayesian posterior function involves finding SLPs with short derivations of the examples. Search pruning with the Bayesian evaluation function is carried out in the same way as in previous versions of CProgol. The system is demonstrated with worked examples involving the learning of probability distributions over sequences as well as the learning of simple forms of uncertain knowledge.
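The labelled-clause representation lends itself to a small illustration. Below is a hypothetical Python sketch (the `slp` table, the predicate names, and the `sample` function are our own, not the paper's notation) of sampling a derivation from an SLP-like stochastic grammar, where each clause for a predicate is chosen with probability equal to its label:

```python
import random

# Hypothetical propositional-style SLP: each predicate maps to labelled
# clause bodies (probability, [subgoals]); labels for a predicate sum to 1.
# Here s is defined by  0.4 : s <- a, s   and   0.6 : s <- b.
slp = {
    "s": [(0.4, ["a", "s"]), (0.6, ["b"])],
}

def sample(goal, rng):
    """Sample one stochastic derivation of `goal`; symbols with no clauses
    are treated as terminals and emitted directly."""
    if goal not in slp:
        return [goal]
    r, acc = rng.random(), 0.0
    for p, body in slp[goal]:      # pick a clause with probability p
        acc += p
        if r < acc:
            out = []
            for subgoal in body:
                out.extend(sample(subgoal, rng))
            return out
    return []                      # unreachable when labels sum to 1

print(sample("s", random.Random(0)))  # ['b'] with this seed
```

Repeated sampling induces the distribution over derivations that, per the abstract, the learning procedure fits to the examples.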
Exploiting Causal Independence in Bayesian Network Inference
 Journal of Artificial Intelligence Research
, 1996
Cited by 160 (9 self)
A new method is proposed for exploiting causal independencies in exact Bayesian network inference.
Reverend Bayes on inference engines: a distributed hierarchical approach
 in Proceedings of the National Conference on Artificial Intelligence
, 1982
Cited by 101 (6 self)
This paper presents generalizations of Bayes' likelihood-ratio updating rule which facilitate an asynchronous propagation of the impacts of new beliefs and/or new evidence in hierarchically organized inference structures with multi-hypothesis variables. The computational scheme proposed specifies a set of belief parameters, communication messages and updating rules which guarantee that the diffusion of updated beliefs is accomplished in a single pass and complies with the tenets of Bayes' calculus.
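The single-variable odds-likelihood form of Bayes' rule that this scheme generalizes can be sketched as follows (function names are ours; this is the textbook case, not the paper's distributed message-passing scheme):

```python
def update_odds(prior_odds, likelihood_ratio):
    """Bayes' rule in odds form: O(H | e) = L(e) * O(H),
    where L(e) = P(e | H) / P(e | not H)."""
    return likelihood_ratio * prior_odds

def odds_to_prob(odds):
    """Convert odds back to a probability."""
    return odds / (1.0 + odds)

# Prior P(H) = 0.2 (odds 1:4); evidence with likelihood ratio 8.
posterior = odds_to_prob(update_odds(0.2 / 0.8, 8.0))
print(posterior)  # 2/3, i.e. about 0.667
```

Because the update is a single multiplication per piece of evidence, it composes naturally with the asynchronous propagation the abstract describes.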
Automated Refinement of First-Order Horn-Clause Domain Theories
 MACHINE LEARNING
, 1995
Cited by 88 (8 self)
Knowledge acquisition is a difficult, error-prone, and time-consuming task. The task of automatically improving an existing knowledge base using learning methods is addressed by the class of systems performing theory refinement. This paper presents a system, Forte (First-Order Revision of Theories from Examples), which refines first-order Horn-clause theories by integrating a variety of different revision techniques into a coherent whole. Forte uses these techniques within a hill-climbing framework, guided by a global heuristic. It identifies possible errors in the theory and calls on a library of operators to develop possible revisions. The best revision is implemented, and the process repeats until no further revisions are possible. Operators are drawn from a variety of sources, including propositional theory refinement, first-order induction, and inverse resolution. Forte is demonstrated in several domains, including logic programming and qualitative modelling.
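The outer loop described above is a standard greedy hill-climbing search. A generic sketch, with `propose_revisions` and `score` as hypothetical stand-ins for Forte's operator library and global heuristic:

```python
def refine(theory, examples, propose_revisions, score):
    """Greedy hill-climbing revision: repeatedly apply the best-scoring
    candidate revision until none improves on the current theory."""
    best = score(theory, examples)
    while True:
        candidates = [(score(t, examples), t) for t in propose_revisions(theory)]
        if not candidates:
            return theory
        top_score, top = max(candidates, key=lambda c: c[0])
        if top_score <= best:        # no revision improves: stop
            return theory
        theory, best = top, top_score

# Toy stand-in: "theories" are integers, revisions nudge by +/-1,
# and the heuristic rewards closeness to a target value.
result = refine(0, None, lambda t: [t - 1, t + 1], lambda t, e: -abs(t - 7))
print(result)  # climbs to 7
```

Like any hill climber, this terminates at a local optimum of the heuristic, which is why the choice of revision operators matters.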
Toward normative expert systems: Part I. The Pathfinder project
 Methods of Information in Medicine
, 1992
Selecting the right objective measure for association analysis
 Information Systems
Cited by 72 (6 self)
Objective measures such as support, confidence, interest factor, correlation, and entropy are often used to evaluate the interestingness of association patterns. However, in many situations, these measures may provide conflicting information about the interestingness of a pattern. Data mining practitioners also tend to apply an objective measure without realizing that there may be better alternatives available for their application. In this paper, we describe several key properties one should examine in order to select the right measure for a given application. A comparative study of these properties is made using twenty-one measures that were originally developed in diverse fields such as statistics, social science, machine learning, and data mining. We show that depending on its properties, each measure is useful for some application, but not for others. We also demonstrate two scenarios in which many existing measures become consistent with each other, namely, when support-based pruning and a technique known as table standardization are applied. Finally, we present an algorithm for selecting a small set of patterns such that domain experts can find a measure that best fits their requirements by ranking this small set of patterns.
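Three of the measures named above can be computed from a 2x2 contingency table of counts for a candidate rule A -> B. A minimal sketch (the function name and example counts are ours, for illustration only):

```python
def measures(n11, n10, n01, n00):
    """Support, confidence, and interest factor (lift) of rule A -> B from a
    2x2 contingency table: n11 = count(A and B), n10 = count(A and not B),
    n01 = count(not A and B), n00 = count(neither)."""
    n = n11 + n10 + n01 + n00
    support = n11 / n                       # P(A, B)
    confidence = n11 / (n11 + n10)          # P(B | A)
    p_a, p_b = (n11 + n10) / n, (n11 + n01) / n
    lift = support / (p_a * p_b)            # P(A, B) / (P(A) * P(B))
    return support, confidence, lift

sup, conf, lift = measures(20, 5, 10, 65)
print(sup, conf, lift)  # 0.2, 0.8, about 2.67
```

Already in this tiny example the measures rank patterns differently in general, which is the conflict the paper's property-based analysis addresses.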
Soft Computing: the Convergence of Emerging Reasoning Technologies
 Soft Computing
, 1997
Cited by 58 (8 self)
The term Soft Computing (SC) represents the combination of emerging problem-solving technologies such as Fuzzy Logic (FL), Probabilistic Reasoning (PR), Neural Networks (NNs), and Genetic Algorithms (GAs). Each of these technologies provides us with complementary reasoning and searching methods to solve complex, real-world problems. After a brief description of each of these technologies, we will analyze some of their most useful combinations, such as the use of FL to control GA and NN parameters; the application of GAs to evolve NNs (topologies or weights) or to tune FL controllers; and the implementation of FL controllers as NNs tuned by backpropagation-type algorithms.
Monotonic and Residuated Logic Programs
, 2001
Cited by 50 (10 self)
In this paper we define the rather general framework of Monotonic Logic Programs, to which the main results of (definite) logic programming validly extrapolate. Whenever defining new logic programming extensions, we can thus turn our attention to the stipulation and study of their intuitive algebraic properties within this very general setting. Then, the existence of a minimum model and of a monotonic immediate consequences operator is guaranteed, and they are related as in classical logic programming. Afterwards we study the more restricted class of residuated logic programs, which is able to capture several quite distinct logic programming semantics, namely: Generalized Annotated Logic Programs, Fuzzy Logic Programming, Hybrid Probabilistic Logic Programs, and Possibilistic Logic Programming. We provide the embedding of possibilistic logic programming.
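For the classical propositional definite case that this framework generalizes, the immediate consequences operator and its least fixpoint (the minimum model) can be sketched as follows (the representation and example program are ours):

```python
def tp(program, interp):
    """Immediate consequences operator T_P for a propositional definite
    program, given as a list of (head, [body atoms]) clauses: derive every
    head whose body is satisfied by the current interpretation."""
    return {head for head, body in program if all(b in interp for b in body)}

def minimum_model(program):
    """Iterate T_P upward from the empty interpretation to its least
    fixpoint, which is the minimum model of the program."""
    model = set()
    while True:
        nxt = tp(program, model)
        if nxt == model:
            return model
        model = nxt

# p.   q :- p.   r :- q, s.
prog = [("p", []), ("q", ["p"]), ("r", ["q", "s"])]
print(minimum_model(prog))  # {'p', 'q'}: r is not derivable without s
```

The paper's contribution is to recover this fixpoint picture when truth values come from a lattice rather than {true, false}; the two-valued sketch above is only the degenerate case.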
Fuzzy association rules: general model and applications
 IEEE Transactions on Fuzzy Systems
, 2003
Cited by 48 (15 self)
The theory of fuzzy sets has been recognized as a suitable tool to model several kinds of patterns that can hold in data. In this paper, we are concerned with the development of a general model to discover association rules among items in a (crisp) set of fuzzy transactions. This general model can be particularized in several ways; each particular instance corresponds to a certain kind of pattern and/or repository of data. We describe some applications of this scheme, paying special attention to the discovery of fuzzy association rules in relational databases. Index Terms: Association rules, data mining, fuzzy transactions, quantified sentences.
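A common way to ground "fuzzy transactions" is to record a membership degree in [0,1] for each item. A hypothetical sketch of fuzzy support using min as the t-norm (the representation, function name, and example data are ours; the paper's general model admits other instantiations):

```python
def fuzzy_support(transactions, itemset):
    """Fuzzy support of an itemset: mean over transactions of the minimum
    membership degree of the itemset's items (min as the t-norm)."""
    total = sum(min(t.get(item, 0.0) for item in itemset) for t in transactions)
    return total / len(transactions)

# Hypothetical fuzzy transactions: each maps an item to a degree in [0, 1].
db = [
    {"young": 0.9, "high_income": 0.3},
    {"young": 0.6, "high_income": 0.8},
    {"young": 0.1, "high_income": 0.9},
]
print(fuzzy_support(db, ["young", "high_income"]))  # (0.3 + 0.6 + 0.1) / 3
```

With crisp degrees (0 or 1) this reduces to ordinary association-rule support, which is why the model subsumes the classical setting.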
Combining Connectionist and Symbolic Learning to Refine Certainty-Factor Rule Bases
 Connection Science
, 1993
Cited by 30 (3 self)
This paper describes Rapture, a system for revising probabilistic knowledge bases that combines connectionist and symbolic learning methods. Rapture uses a modified version of backpropagation to refine the certainty factors of a probabilistic rule base, and it uses ID3's information-gain heuristic to add new rules. Results on refining three actual expert knowledge bases demonstrate that this combined approach generally performs better than previous methods.

1 Introduction
In complex domains, learning needs to be biased with prior knowledge in order to produce satisfactory results from limited training data. Recently, both connectionist and symbolic methods have been developed for biasing learning with prior knowledge (Shavlik and Towell, 1989; Fu, 1989; Ourston and Mooney, 1990; Pazzani and Kibler, 1992; Cohen, 1992). Most of these methods revise an imperfect knowledge base (usually obtained from a domain expert) to fit a set of empirical data. Some of these methods have been succ...
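The certainty-factor calculus that Rapture refines combines parallel evidence for a conclusion with the classic MYCIN rule. A minimal sketch for the positive-evidence case (the function name is ours):

```python
def combine_cf(cf1, cf2):
    """MYCIN-style parallel combination of two positive certainty factors
    (0 <= cf <= 1) supporting the same conclusion."""
    return cf1 + cf2 * (1.0 - cf1)

print(combine_cf(0.4, 0.5))  # 0.4 + 0.5 * 0.6 = 0.7
```

The combination is commutative and differentiable in each input, which is what makes gradient-style refinement of the kind the abstract describes (a modified backpropagation over the rule network) workable.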