Results 1 - 10 of 6,479

Inductive Logic Programming: Theory and Methods

by Stephen Muggleton, Luc De Raedt - JOURNAL OF LOGIC PROGRAMMING , 1994
"... ..."
Abstract - Cited by 533 (46 self) - Add to MetaCart
Abstract not found

Dynamic Logic

by David Harel, Dexter Kozen, Jerzy Tiuryn - Handbook of Philosophical Logic , 1984
"... ed to be true under the valuation u iff there exists an a 2 N such that the formula x = y is true under the valuation u[x=a], where u[x=a] agrees with u everywhere except x, on which it takes the value a. This definition involves a metalogical operation that produces u[x=a] from u for all possibl ..."
Abstract - Cited by 1012 (7 self)
square root of y, if it exists, would be the program x := 0 ; while x² < y do x := x + 1 (1). In DL, such programs are first-class objects on a par with formulas, complete with a collection of operators for forming compound programs inductively from a basis of primitive programs. To discuss
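Read as ordinary code, the quoted while-program is a naive upward search. A minimal Python sketch, assuming the squared loop guard restored above and returning None to make the "if it exists" caveat explicit:

  def naive_sqrt(y):
      # Count x upward until x*x reaches y; x is the integer square root of y if one exists.
      x = 0
      while x * x < y:
          x = x + 1
      return x if x * x == y else None  # no integer square root of y

  print(naive_sqrt(49))  # 7
  print(naive_sqrt(50))  # None

In DL the program itself, not this rendering of it, is the first-class object, appearing inside formulas through modal operators such as [α] and ⟨α⟩.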

Markov Logic Networks

by Matthew Richardson, Pedro Domingos - MACHINE LEARNING , 2006
"... We propose a simple approach to combining first-order logic and probabilistic graphical models in a single representation. A Markov logic network (MLN) is a first-order knowledge base with a weight attached to each formula (or clause). Together with a set of constants representing objects in the ..."
Abstract - Cited by 816 (39 self)
learned from relational databases by iteratively optimizing a pseudo-likelihood measure. Optionally, additional clauses are learned using inductive logic programming techniques. Experiments with a real-world database and knowledge base in a university domain illustrate the promise of this approach.
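As a rough illustration of "a weight attached to each formula", the sketch below scores one possible world under the log-linear form used by Markov logic: exp of the sum, over formulas, of weight times number of true groundings. The example formulas and grounding counts are invented, and the normalizing constant Z is omitted.

  import math

  # Invented weighted formulas and, for one possible world, the number of their
  # groundings that are true in that world.
  weights = {"Smokes(x) => Cancer(x)": 1.5,
             "Friends(x, y) & Smokes(x) => Smokes(y)": 1.1}
  true_groundings = {"Smokes(x) => Cancer(x)": 4,
                     "Friends(x, y) & Smokes(x) => Smokes(y)": 9}

  # Unnormalized probability of the world: exp(sum_i w_i * n_i(x)).
  # Dividing by the partition function Z (not computed here) would give P(X = x).
  score = math.exp(sum(w * true_groundings[f] for f, w in weights.items()))
  print(score)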

Learning Stochastic Logic Programs

by Stephen Muggleton , 2000
"... Stochastic Logic Programs (SLPs) have been shown to be a generalisation of Hidden Markov Models (HMMs), stochastic context-free grammars, and directed Bayes' nets. A stochastic logic program consists of a set of labelled clauses p:C where p is in the interval [0,1] and C is a first-order r ..."
Abstract - Cited by 1194 (81 self)
-order range-restricted definite clause. This paper summarises the syntax, distributional semantics and proof techniques for SLPs and then discusses how a standard Inductive Logic Programming (ILP) system, Progol, has been modified to support learning of SLPs. The resulting system 1) finds an SLP with uniform
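A labelled clause p:C can be read as a probabilistic choice among the clauses whose heads match the current goal. The sketch below samples one such clause, assuming (as its only requirement) that the labels within the group sum to one; the clauses themselves are invented.

  import random

  # Invented SLP fragment: labelled clauses p:C for one predicate, labels summing to 1.0.
  clauses = [(0.4, "s(X) :- p(X), p(X)"),
             (0.3, "s(X) :- q(X)"),
             (0.3, "s(X) :- r(X)")]

  def sample_clause(clauses):
      # Choose a clause with probability equal to its label.
      r = random.random()
      cumulative = 0.0
      for p, clause in clauses:
          cumulative += p
          if r <= cumulative:
              return clause
      return clauses[-1][1]  # guard against floating-point rounding

  print(sample_clause(clauses))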

A Bayesian method for the induction of probabilistic networks from data

by Gregory F. Cooper, Edward Herskovits - MACHINE LEARNING , 1992
"... This paper presents a Bayesian method for constructing probabilistic networks from databases. In particular, we focus on constructing Bayesian belief networks. Potential applications include computer-assisted hypothesis testing, automated scientific discovery, and automated construction of probabili ..."
Abstract - Cited by 1400 (31 self)

The Case Against Accuracy Estimation for Comparing Induction Algorithms

by Foster Provost, Tom Fawcett, Ron Kohavi - In Proceedings of the Fifteenth International Conference on Machine Learning , 1997
"... We analyze critically the use of classification accuracy to compare classifiers on natural data sets, providing a thorough investigation using ROC analysis, standard machine learning algorithms, and standard benchmark data sets. The results raise serious concerns about the use of accuracy for compar ..."
Abstract - Cited by 414 (23 self)
is preferable both for making practical choices and for drawing scientific conclusions.
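The point is easy to reproduce with invented numbers: on a skewed class distribution, a classifier that always predicts the majority class looks accurate while its ROC behaviour is no better than chance. A small self-contained sketch, with AUC computed by direct pairwise comparison:

  # Invented data: 95 negatives, 5 positives; a classifier that scores everything 0.
  labels = [0] * 95 + [1] * 5
  scores = [0.0] * 100

  accuracy = sum((s >= 0.5) == bool(y) for s, y in zip(scores, labels)) / len(labels)

  def auc(labels, scores):
      # Probability that a random positive outranks a random negative (ties count half),
      # i.e. the area under the ROC curve.
      pos = [s for s, y in zip(scores, labels) if y == 1]
      neg = [s for s, y in zip(scores, labels) if y == 0]
      wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
      return wins / (len(pos) * len(neg))

  print(accuracy)              # 0.95: looks strong
  print(auc(labels, scores))   # 0.5: ranking is no better than random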

Bottom-Up Relational Learning of Pattern Matching Rules for Information Extraction

by Mary Elaine Califf, Raymond J. Mooney, David Cohn , 2003
"... Information extraction is a form of shallow text processing that locates a specified set of relevant items in a natural-language document. Systems for this task require significant domain-specific knowledge and are time-consuming and difficult to build by hand, making them a good application for ..."
Abstract - Cited by 406 (20 self)
for machine learning. We present an algorithm, RAPIER, that uses pairs of sample documents and filled templates to induce pattern-match rules that directly extract fillers for the slots in the template. RAPIER is a bottom-up learning algorithm that incorporates techniques from several inductive logic
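A toy stand-in for the kind of rule described here: surrounding context identifies where a slot filler sits, and matching the rule against a document yields the filled slot. The rule, document and slot name below are invented for illustration, not rules induced by RAPIER.

  import re

  # Invented pattern-match rule: the context "Salary:" locates the filler for a salary slot.
  rule = re.compile(r"Salary:\s*(?P<salary>\$[\d,]+)")

  document = "Title: Software Engineer. Salary: $55,000 per year. Location: Austin."
  match = rule.search(document)
  if match:
      print({"salary": match.group("salary")})   # {'salary': '$55,000'}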

The Use of Explicit Plans to Guide Inductive Proofs

by Alan Bundy - 9TH CONFERENCE ON AUTOMATED DEDUCTION , 1988
"... We propose the use of explicit proof plans to guide the search for a proof in automatic theorem proving. By representing proof plans as the specifications of LCF-like tactics, [Gordon et al 79], and by recording these specifications in a sorted meta-logic, we are able to reason about the conjectures ..."
Abstract - Cited by 295 (40 self)
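To give "LCF-like tactic" a concrete reading: a tactic is a function that reduces a goal to a list of subgoals (or fails), and a proof plan specifies how such tactics fit together. The Python sketch below is an invented toy with goals as plain strings, not the paper's sorted meta-logic.

  # Toy tactic: split a universally quantified goal over nat into base and step cases
  # (crude string substitution, adequate for this example only).
  def induction_tactic(goal):
      if goal.startswith("forall n:nat."):
          body = goal[len("forall n:nat."):].strip()
          return [body.replace("n", "0"),                            # base case
                  f"({body}) -> ({body.replace('n', 's(n)')})"]      # step case
      raise ValueError("tactic not applicable")                      # tactic failure

  print(induction_tactic("forall n:nat. plus(n, 0) = n"))
  # ['plus(0, 0) = 0', '(plus(n, 0) = n) -> (plus(s(n), 0) = s(n))']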

A System for Induction of Oblique Decision Trees

by Sreerama K. Murthy, Simon Kasif, Steven Salzberg - Journal of Artificial Intelligence Research , 1994
"... This article describes a new system for induction of oblique decision trees. This system, OC1, combines deterministic hill-climbing with two forms of randomization to find a good oblique split (in the form of a hyperplane) at each node of a decision tree. Oblique decision tree methods are tuned espe ..."
Abstract - Cited by 292 (14 self)
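A rough sketch of that search: evaluate a candidate hyperplane w·x + b > 0 by the weighted impurity of the two sides it induces, then hill-climb one coefficient at a time with random perturbations. The data, the Gini impurity and the perturbation scheme below are simplified stand-ins, not the actual OC1 procedure.

  import random

  def gini(labels):
      # Gini impurity of a list of class labels.
      n = len(labels)
      if n == 0:
          return 0.0
      return 1.0 - sum((labels.count(c) / n) ** 2 for c in set(labels))

  def split_impurity(w, b, X, y):
      # Weighted impurity of the two sides of the hyperplane w.x + b > 0.
      side = [sum(wj * xj for wj, xj in zip(w, xi)) + b > 0 for xi in X]
      left = [yi for s, yi in zip(side, y) if s]
      right = [yi for s, yi in zip(side, y) if not s]
      return (len(left) * gini(left) + len(right) * gini(right)) / len(y)

  def hill_climb(X, y, steps=200):
      # Randomized coordinate hill-climbing over the hyperplane coefficients.
      d = len(X[0])
      w, b = [random.uniform(-1, 1) for _ in range(d)], 0.0
      best = split_impurity(w, b, X, y)
      for _ in range(steps):
          i = random.randrange(d + 1)                 # a coefficient, or the bias term
          cand_w, cand_b = list(w), b
          if i < d:
              cand_w[i] += random.uniform(-0.5, 0.5)
          else:
              cand_b += random.uniform(-0.5, 0.5)
          score = split_impurity(cand_w, cand_b, X, y)
          if score < best:                            # keep the move only if impurity drops
              w, b, best = cand_w, cand_b, score
      return w, b, best

  # Invented data: two classes separated by the oblique boundary x1 + x2 = 1.
  X = [(0.1, 0.2), (0.3, 0.3), (0.2, 0.6), (0.9, 0.8), (0.7, 0.6), (0.8, 0.4)]
  y = [0, 0, 0, 1, 1, 1]
  print(hill_climb(X, y))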

Modal Languages And Bounded Fragments Of Predicate Logic

by Hajnal Andréka, Johan Van Benthem, István Németi , 1996
"... Model Theory. These are non-empty families I of partial isomorphisms between models M and N , closed under taking restrictions to smaller domains, and satisfying the usual Back-and-Forth properties for extension with objects on either side -- restricted to apply only to partial isomorphisms of size ..."
Abstract - Cited by 273 (12 self)
are preserved under partial isomorphism, by a simple induction. More precisely, one proves, for any assignment A and any partial isomorphism I ∈ I which is defined on the A-values for all variables x1, ..., xk, that M, A |= φ iff N, I∘A |= φ. The crucial step in the induction is the quantifier case
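For orientation only, the display below sketches in LaTeX the general shape of the existential step in such a preservation induction; the exact bookkeeping (which restriction of the partial isomorphism the Forth property is applied to) depends on the bounded fragment in question and is not shown here.

  % Schematic existential step, assuming the claim already holds for \psi:
  %   if M, A \models \exists y\,\psi,
  %   - choose a witness b with M, A[y := b] \models \psi;
  %   - by the Forth property, extend (a suitable restriction of) I to some J
  %     in the family with b in its domain;
  %   - the induction hypothesis gives N, (J \circ A)[y := J(b)] \models \psi,
  %     hence N, I \circ A \models \exists y\,\psi.
  % The converse direction is symmetric, using the Back property.
  \[
    M, A \models \exists y\,\psi \;\Longleftrightarrow\; N, I \circ A \models \exists y\,\psi .
  \]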