Results 1-10 of 37
Least Common Subsumers and Most Specific Concepts in a Description Logic with Existential Restrictions and Terminological Cycles
, 2003
"... Computing least common subsumers (Ics) and most specific concepts (msc) are inference tasks that can support the bottomup construction of knowledge bases in description logics. In description logics with existential restrictions, the most specific concept need not exist if one restricts the attenti ..."
Abstract

Cited by 97 (20 self)
 Add to MetaCart
Computing least common subsumers (lcs) and most specific concepts (msc) are inference tasks that can support the bottom-up construction of knowledge bases in description logics. In description logics with existential restrictions, the most specific concept need not exist if one restricts the attention to concept descriptions or acyclic TBoxes. In this paper, we extend the notions lcs and msc to cyclic TBoxes. For the description logic EL (which allows for conjunctions, existential restrictions, and the top concept), we show that the lcs and msc always exist and can be computed in polynomial time if we interpret cyclic definitions with greatest fixpoint semantics.
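For plain EL concept descriptions (the acyclic case, not the paper's cyclic-TBox extension), the lcs can be computed as the product of the corresponding description trees. A minimal illustrative sketch, assuming a description is represented as a pair of concept names and role successors:

```python
# Illustrative sketch: lcs of two EL concept descriptions via the product
# of their description trees. The representation is an assumption:
# a description is (frozenset of concept names, list of (role, sub-description)).

def lcs(c, d):
    """Least common subsumer of two EL descriptions (product construction)."""
    names_c, succs_c = c
    names_d, succs_d = d
    # Only concept names shared by both descriptions survive in the lcs.
    names = names_c & names_d
    # One existential restriction per pair of same-role successors.
    succs = [(r1, lcs(s1, s2))
             for (r1, s1) in succs_c
             for (r2, s2) in succs_d
             if r1 == r2]
    return (names, succs)

# Example: lcs of  A ⊓ ∃r.(B ⊓ C)  and  A ⊓ B ⊓ ∃r.B  is  A ⊓ ∃r.B
x = (frozenset({"A"}), [("r", (frozenset({"B", "C"}), []))])
y = (frozenset({"A", "B"}), [("r", (frozenset({"B"}), []))])
print(lcs(x, y))
```

Note the quadratic blow-up in the successor lists: iterating cyclic definitions this way is why the paper's gfp-semantics treatment is needed for termination guarantees on cyclic TBoxes.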
Comparing Concepts in Differentiated Ontologies
 Proceedings of the Twelfth Workshop on Knowledge Acquisition, Modeling and Management (KAW'99)
, 1999
"... Concepts in differentiated ontologies inherit definitional structure from concepts in shared ontologies. Shared, inherited structure provides a common ground that supports measures of “description compatibility. ” These algorithms are the primary contribution of this paper. The descriptioncompatibi ..."
Abstract

Cited by 45 (2 self)
 Add to MetaCart
(Show Context)
Concepts in differentiated ontologies inherit definitional structure from concepts in shared ontologies. Shared, inherited structure provides a common ground that supports measures of “description compatibility.” These algorithms are the primary contribution of this paper. The description-compatibility measures compare concepts to predict semantic compatibility, the probability that an instance of a recommendation will satisfy a request. The description-compatibility measures cross a spectrum regarding their knowledge of the semantics of roles in concept definitions. Some of the measures identify and analyze correspondences among elements of the definitions, and are thus a form of analogical reasoning. We use simulations to evaluate the description-compatibility measures in detail. Description compatibility can be used to rank alternative query translations, and to guide search for capabilities across communities that subscribe to differentiated ontologies.
CLASSIC Learning
 In Proceedings of the Seventh Annual ACM Conference on Computational Learning Theory
, 1991
"... . Description logics, also called terminological logics, are commonly used in knowledgebased systems to describe objects and their relationships. We investigate the learnability of a typical description logic, Classic, and show that Classic sentences are learnable in polynomial time in the exact lea ..."
Abstract

Cited by 32 (1 self)
 Add to MetaCart
Description logics, also called terminological logics, are commonly used in knowledge-based systems to describe objects and their relationships. We investigate the learnability of a typical description logic, Classic, and show that Classic sentences are learnable in polynomial time in the exact learning model using equivalence queries and membership queries (which are, in essence, "subsumption queries"; we show a prediction hardness result for the more traditional membership queries that convey information about specific individuals). We show that membership queries alone are insufficient for polynomial-time learning of Classic sentences. Combined with earlier negative results (Cohen & Hirsh, 1994a) showing that, given standard complexity-theoretic assumptions, equivalence queries alone are insufficient (or random examples alone in the PAC setting are insufficient), this shows that both sources of information are necessary for efficient learning in that neither type alone is sufficie...
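The exact-learning protocol the abstract refers to can be sketched generically. This is a hedged, Angluin-style skeleton, not the paper's Classic algorithm, and it shows only the equivalence-query half; the toy set-learning target and the `refine` strategy are illustrative assumptions:

```python
# Generic exact-learning loop (assumption: an Angluin-style skeleton, not the
# paper's algorithm). equivalence(h) returns a counterexample or None.

def exact_learn(initial_hypothesis, refine, equivalence):
    """Ask equivalence queries; refine the hypothesis on each counterexample."""
    h = initial_hypothesis
    while True:
        counterexample = equivalence(h)
        if counterexample is None:
            return h  # hypothesis is exactly correct
        h = refine(h, counterexample)

# Toy target: the set {0, 1, 2}. Hypotheses are sets; refinement just adds
# the (always positive, in this toy) counterexample.
target = {0, 1, 2}

def equivalence(h):
    diff = target ^ h  # symmetric difference: where h disagrees with target
    return min(diff) if diff else None

print(exact_learn(set(), lambda h, x: h | {x}, equivalence))
```

In the paper's setting the hypotheses are Classic sentences and membership ("subsumption") queries are additionally needed to refine them efficiently; the toy above sidesteps that with a trivially refinable concept class.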
Learning From Texts – A Terminological Metareasoning Perspective
 Connectionist, Statistical and Symbolic Approaches to Learning for Natural Language Processing
, 1995
"... We introduce a methodology for concept learning from texts that relies upon secondorder reasoning about statements expressed in a (firstorder) terminological representation language. This metareasoning approach allows for qualitybased evaluation and selection of alternative concept hypotheses. App ..."
Abstract

Cited by 26 (21 self)
 Add to MetaCart
We introduce a methodology for concept learning from texts that relies upon second-order reasoning about statements expressed in a (first-order) terminological representation language. This metareasoning approach allows for quality-based evaluation and selection of alternative concept hypotheses. Appeared in: S. Wermter, E. Riloff, G. Scheler (Eds.), Connectionist, Statistical and Symbolic Approaches to Learning for Natural Language Processing, Berlin etc.: Springer, 1996, pp. 453-468 (LNAI 1040). Udo Hahn, Manfred Klenner & Klemens Schnattinger, Computational Linguistics Group, Freiburg University, Europaplatz 1, D-79085 Freiburg, Germany, {hahn,klenner,schnattinger}@coling.uni-freiburg.de
Learnability of Description Logic Programs
 Inductive Logic Programming
, 2002
"... CarinALN is an interesting new rule learning bias for ILP. ..."
Abstract

Cited by 25 (0 self)
 Add to MetaCart
Carin-ALN is an interesting new rule learning bias for ILP.
Learning From a Consistently Ignorant Teacher
, 1994
"... One view of computational learning theory is that of a learner acquiring the knowledge of a teacher. We introduce a formal model of learning capturing the idea that teachers may have gaps in their knowledge. In particular, we consider learning from a teacher who labels examples "+" (a p ..."
Abstract

Cited by 24 (8 self)
 Add to MetaCart
One view of computational learning theory is that of a learner acquiring the knowledge of a teacher. We introduce a formal model of learning capturing the idea that teachers may have gaps in their knowledge. In particular, we consider learning from a teacher who labels examples "+" (a positive instance of the concept being learned), "−" (a negative instance of the concept being learned), and "?" (an instance with unknown classification), in such a way that knowledge of the concept class and all the positive and negative examples is not sufficient to determine the labelling of any of the examples labelled with "?". The goal of the learner is not to compensate for the ignorance of the teacher by attempting to infer "+" or "−" labels for the examples labelled with "?", but is rather to learn (an approximation to) the ternary labelling presented by the teacher. Thus, the goal of the learner is still to acquire the knowledge of the teacher, but now the learner must also ...
Learning with feature description logics
 Proceedings of the 12th International Conference on Inductive Logic Programming
, 2002
"... Abstract. We present a paradigm for efficient learning and inference with relational data using propositional means. The paradigm utilizes description logics and concepts graphs in the service of learning relational models using efficient propositional learning algorithms. We introduce a Feature Des ..."
Abstract

Cited by 21 (7 self)
 Add to MetaCart
We present a paradigm for efficient learning and inference with relational data using propositional means. The paradigm utilizes description logics and concept graphs in the service of learning relational models using efficient propositional learning algorithms. We introduce a Feature Description Logic (FDL), a relational (frame-based) language that supports efficient inference, along with a generation function that uses inference with descriptions in the FDL to produce features suitable for use by learning algorithms. These are used within a learning framework that is shown to learn relational representations efficiently and accurately in terms of the FDL descriptions. The paradigm was designed to support learning in domains that are relational but where the amount of data and the size of the representation learned are very large; we exemplify it here, for clarity, on the classical ILP task of learning family relations. This paradigm provides a natural solution to the problem of learning and representing relational data; it extends and unifies several lines of work in KRR and Machine Learning in ways that provide hope for a coherent usage of learning and reasoning methods in large-scale intelligent inference.
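The core idea of producing propositional features from relational data can be illustrated with a toy feature generator. This is an illustrative assumption, far simpler than the paper's FDL-based generation function, which uses logical inference over descriptions:

```python
# Toy propositionalization sketch (assumption: not the paper's FDL).
# Relational facts become boolean features keyed by predicate and the
# argument position at which the individual occurs.

facts = {("parent", "ann", "bob"), ("parent", "bob", "carl"),
         ("female", "ann")}

def features(individual, facts):
    """Boolean features describing where an individual appears in the facts."""
    f = {}
    for fact in facts:
        pred, *args = fact
        if individual in args:
            f[f"{pred}@{args.index(individual)}"] = True
    return f

print(sorted(features("ann", facts)))  # feature names, sorted for stable output
```

A propositional learner (e.g. a linear classifier) can then consume these feature vectors directly, which is the efficiency argument the abstract makes.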
Towards Learning in CARIN-ALN
, 2000
"... . In this paper we investigate a new language for learning, which combines two wellknown representation formalisms, Description Logics and Horn Clause Logics. Our goal is to study the feasability of learning in such a hybrid description  horn clause language, namely CARINALN [LR98b], in the p ..."
Abstract

Cited by 21 (0 self)
 Add to MetaCart
In this paper we investigate a new language for learning, which combines two well-known representation formalisms, Description Logics and Horn Clause Logics. Our goal is to study the feasibility of learning in such a hybrid description logic / Horn clause language, namely CARIN-ALN [LR98b], in the presence of hybrid background knowledge, including a Horn clause and a terminological component. After setting out our learning framework, we present algorithms for testing example coverage and subsumption between two hypotheses, based on the existential entailment algorithm studied in [LR98b]. While the hybrid language is more expressive than Horn clause logics alone, the complexity of these two steps for CARIN-ALN remains bounded by their respective complexity in Horn clause logics.
Learning logic programs with structured background knowledge (Extended Abstract)
"... The polynomial PAC  learnability of nonrecursive Horn clauses is studied, based on a characterization of the least general generalization of a set of positive examples in terms of products and homomorphisms. This approach is used to show that nonrecursive Horn clauses are polynomially PAC  learna ..."
Abstract

Cited by 21 (5 self)
 Add to MetaCart
The polynomial PAC-learnability of nonrecursive Horn clauses is studied, based on a characterization of the least general generalization of a set of positive examples in terms of products and homomorphisms. This approach is used to show that nonrecursive Horn clauses are polynomially PAC-learnable if there is a single binary background predicate and the ground facts in the background knowledge form a forest. If the ground facts in the background knowledge form a disjoint union of cycles then the situation is different, as the shortest consistent hypothesis may have exponential length. In this case polynomial PAC-learnability holds if a different concept representation is used. We also consider the learnability of multiple clauses in some restricted cases.
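The least general generalization the abstract builds on can be sketched in its classical, Plotkin-style anti-unification form (an assumption for illustration: the paper characterizes lgg via products and homomorphisms, which coincides with this view on atoms):

```python
# Illustrative sketch of Plotkin-style lgg (anti-unification) of two atoms.
# Term representation is an assumption: strings are constants/variables,
# tuples are (functor, arg1, ..., argN).

def lgg(t1, t2, table):
    """Anti-unify two terms; `table` keeps variable choices consistent,
    so the same pair of differing subterms always maps to the same variable."""
    if t1 == t2:
        return t1
    if (isinstance(t1, tuple) and isinstance(t2, tuple)
            and t1[0] == t2[0] and len(t1) == len(t2)):
        # Same functor and arity: generalize argument-wise.
        return (t1[0],) + tuple(lgg(a, b, table)
                                for a, b in zip(t1[1:], t2[1:]))
    # Differing terms are replaced by a fresh, consistently reused variable.
    return table.setdefault((t1, t2), f"X{len(table)}")

# lgg of parent(ann, bob) and parent(ann, carl) is parent(ann, X0):
print(lgg(("parent", "ann", "bob"), ("parent", "ann", "carl"), {}))
```

The variable table is what makes the result *least* general: repeated pairs of differing subterms share one variable rather than each getting a fresh one.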
Terminological Meta-Reasoning by Reification and Multiple Contexts
 In EPIA'95 – Proc. 7th Portuguese Conf. on Artificial Intelligence
, 1995
"... We introduce a model and a system architecture for secondorder reasoning about statements expressed in a (firstorder) terminological representation language. This metareasoning approach is based on the reification of firstorder propositions and the mediation between second and firstorder expre ..."
Abstract

Cited by 13 (13 self)
 Add to MetaCart
We introduce a model and a system architecture for second-order reasoning about statements expressed in a (first-order) terminological representation language. This metareasoning approach is based on the reification of first-order propositions and the mediation between second- and first-order expressions via translation rules operating on multiple contexts. An application to a concept learning task in a text understanding environment shows how different degrees of credibility can be assigned to alternative concept hypotheses on the basis of such a schema for terminological metareasoning. Appeared in: C. Pinto-Ferreira, N.J. Mamede (Eds.), EPIA'95 – Progress in Artificial Intelligence. Proceedings of the 7th Portuguese Conference on Artificial Intelligence, Funchal, Madeira Island, Portugal, October 1995. Berlin etc.: Springer, 1995, pp. 1-16 (LNAI 990).