Results 1-10 of 68
Computing Least Common Subsumers in Description Logics with Existential Restrictions
, 1999
Abstract

Cited by 95 (24 self)
Computing the least common subsumer (lcs) is an inference task that can be used to support the "bottom-up" construction of knowledge bases for KR systems based on description logics. Previous work on how to compute the lcs has concentrated on description logics that allow for universal value restrictions, but not for existential restrictions. The main new contribution of this paper is the treatment of description logics with existential restrictions. Our approach for computing the lcs is based on an appropriate representation of concept descriptions by certain trees, and a characterization of subsumption by homomorphisms between these trees. The lcs operation then corresponds to the product operation on trees.
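The product-of-trees construction mentioned in this abstract can be sketched concretely. The following minimal Python fragment is a hypothetical illustration, not the paper's actual algorithm or notation: a concept description is encoded as a tree whose node label is a set of concept names and whose role-labeled edges stand for existential restrictions; the lcs is then the product tree, keeping shared concept names and pairing successors along shared roles.

```python
def lcs(tree1, tree2):
    """Product of two description trees.

    A tree is (labels, children), where labels is a frozenset of concept
    names and children maps each role name to a list of subtrees
    (one per existential restriction).  Illustrative encoding only.
    """
    labels1, children1 = tree1
    labels2, children2 = tree2
    labels = labels1 & labels2                  # keep shared concept names
    children = {}
    for role in children1.keys() & children2.keys():  # shared roles only
        # the product pairs every r-successor of tree1 with every one of tree2
        children[role] = [lcs(s1, s2)
                          for s1 in children1[role]
                          for s2 in children2[role]]
    return (labels, children)

# Father = Male AND Person AND (exists child. Person)
# Mother = Female AND Person AND (exists child. Person)
father = (frozenset({"Male", "Person"}),
          {"child": [(frozenset({"Person"}), {})]})
mother = (frozenset({"Female", "Person"}),
          {"child": [(frozenset({"Person"}), {})]})
print(lcs(father, mother))
```

On this toy input, the product keeps only "Person" at the root and one child-successor labeled "Person", i.e. the lcs corresponds to Person AND (exists child. Person).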
Towards a Precise Definition of the OMG/MDA Framework
, 2001
Abstract

Cited by 70 (13 self)
KEYWORDS: Models; Metamodels; Conceptual graphs; UML; MOF; MDA. There is currently a paradigm shift in the field of information systems development.
A Scalable Distributed Parallel Breadth-First Search Algorithm on BlueGene/L
 In SC ’05: Proceedings of the 2005 ACM/IEEE conference on Supercomputing
, 2005
Abstract

Cited by 42 (2 self)
Many emerging large-scale data science applications require searching large graphs distributed across multiple memories and processors. This paper presents a distributed breadth-first search (BFS) scheme that scales for random graphs with up to three billion vertices and 30 billion edges. Scalability was tested on IBM BlueGene/L with 32,768 nodes at the Lawrence Livermore National Laboratory. Scalability was obtained through a series of optimizations, in particular, those that ensure scalable use of memory. We use 2D (edge) partitioning of the graph instead of conventional 1D (vertex) partitioning to reduce communication overhead. For Poisson random graphs, we show that the expected size of the messages is scalable for both 2D and 1D partitionings. Finally, we have developed efficient collective communication functions for the 3D torus architecture of BlueGene/L that also take advantage of the structure in the problem. The performance and characteristics of the algorithm are measured and reported.
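The 2D (edge) partitioning idea the abstract contrasts with 1D (vertex) partitioning can be illustrated with a toy sketch. In the hypothetical Python fragment below (illustrative only, not the paper's code), each edge of a graph over n vertices is assigned to a processor on an R x C grid by the block row of its source and the block column of its target; during a BFS expansion step, a processor then only needs to exchange frontier data within its grid row and column, i.e. with O(R + C) partners rather than with all R*C processors.

```python
from collections import defaultdict

# Toy 2D edge-partitioning sketch; grid shape and vertex count are assumptions.
R, C = 2, 3      # processor grid: R rows x C columns
n = 12           # vertices numbered 0 .. n-1

def owner(u, v):
    """Grid position (i, j) of the processor that stores edge (u, v)."""
    i = u * R // n   # block row determined by the source vertex
    j = v * C // n   # block column determined by the target vertex
    return (i, j)

# Distribute a handful of edges over the grid and show the buckets.
edges = [(0, 5), (3, 7), (8, 2), (6, 1), (11, 11)]
buckets = defaultdict(list)
for u, v in edges:
    buckets[owner(u, v)].append((u, v))
for proc in sorted(buckets):
    print(proc, buckets[proc])
```

Note that all edges leaving a given vertex land in a single grid row, so scattering a frontier vertex to the processors that own its edges is a row-local operation; gathering newly discovered vertices back to their owners is column-local.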
Nested Graphs: A Graph-based Knowledge Representation Model with FOL Semantics
 Proceedings of the 6th International Conference on Knowledge Representation (KR'98
, 1998
Abstract

Cited by 30 (8 self)
We present a graph-based KR model issued from Sowa's conceptual graphs but studied and developed with a specific approach. Formal objects are kinds of labelled graphs, which may be simple graphs or nested graphs. The fundamental notion for doing reasoning, called projection (or subsumption), is a kind of labelled graph morphism. Thus, we propose a graphical KR model, where "graphical" is used in the sense of [Sch91], i.e. a model that "uses graph-theoretic notions in an essential and nontrivial way". Indeed, morphism, which is the fundamental notion for any structure, is at the core of our theory. We define two first-order logic semantics, which correspond to different intuitive semantics, and prove in both cases that projection is sound and complete with respect to deduction. This paper is almost identical to the paper that appeared in the KR'98 proceedings; it provides minor corrections.
Extensions of Simple Conceptual Graphs: the Complexity of Rules and Constraints
 Journal of Artificial Intelligence Research
, 2002
Abstract

Cited by 27 (1 self)
Simple conceptual graphs are considered as the kernel of most knowledge representation formalisms built upon Sowa's model. Reasoning in this model can be expressed by a graph homomorphism called projection, whose semantics is usually given in terms of positive, conjunctive, existential FOL. We present here a family of extensions of this model, based on rules and constraints, keeping graph homomorphism as the basic operation. We focus on the formal definitions of the different models obtained, including their operational semantics and relationships with FOL, and we analyze the decidability and complexity of the associated problems (consistency and deduction). As soon as rules are involved in reasoning, these problems are not decidable, but we exhibit a condition under which they fall in the polynomial hierarchy. These results extend and complete those already published by the authors. Moreover, we systematically study the complexity of some particular cases obtained by restricting the form of constraints and/or rules.
RELIEF: Combining expressiveness and rapidity into a single system
, 1998
Abstract

Cited by 23 (3 self)
This paper constitutes a proposal for an efficient and effective logical information retrieval system. Following a relational indexing approach, which is in our opinion a necessity to cope with emerging applications such as those based on multimedia, we use the conceptual graphs formalism as our indexing language. This choice allows for relational indexing support and captures all the useful properties of the logical information retrieval model, in a workable system. First-order logic and standard information retrieval techniques are combined together, to the same effect: obtaining an expressive system, able to accurately handle complex documents, improve retrieval effectiveness, and achieve good time performance. Experiments on an image test collection, within a system available on the Web, provide an illustration of the role that logic may have in the future development of information retrieval systems.
Knowledge Representation and Reasonings Based on Graph Homomorphism
 In Proc. ICCS’00, volume 1867 of LNAI
, 2000
Abstract

Cited by 21 (5 self)
The main conceptual contribution in this paper is to present an approach to knowledge representation and reasoning based on labeled graphs and labeled graph homomorphism. Strengths and weaknesses of this graph-based approach are discussed. The main technical contributions are the following. Fundamental results about the kernel of this approach, the so-called simple graphs model, are synthesized. It is then shown that the basic deduction problem on simple graphs is essentially the same problem as conjunctive query containment in databases and constraint satisfaction; polynomial parsimonious transformations between these problems are exhibited. Grounded on the simple graphs model, a knowledge representation and reasoning model that can deal with facts, production rules, transformation rules, and constraints is presented, as an illustration of the graph-based approach.
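The basic deduction operation referred to across these abstracts, projection, can be sketched as a label-preserving homomorphism test with backtracking. The following Python fragment is a hypothetical illustration with made-up encodings: it uses plain label equality, whereas real CG projection compares labels against a support order (type hierarchy), and it makes no claim about the papers' actual data structures.

```python
def projects(g_labels, g_edges, h_labels, h_edges):
    """Is there a label-preserving homomorphism from graph G into graph H?

    g_labels: dict mapping G-node -> label; g_edges: set of (u, v, role)
    triples; same conventions for H.  Backtracking search over partial maps.
    """
    g_nodes = list(g_labels)

    def extend(mapping):
        if len(mapping) == len(g_nodes):
            return True                      # every G-node is mapped
        u = g_nodes[len(mapping)]            # next G-node to map
        for x, label in h_labels.items():
            if label != g_labels[u]:
                continue                     # labels must match (equality here)
            mapping[u] = x
            # every G-edge with both ends mapped must exist in H
            ok = all((mapping[a], mapping[b], r) in h_edges
                     for (a, b, r) in g_edges
                     if a in mapping and b in mapping)
            if ok and extend(mapping):
                return True
            del mapping[u]                   # backtrack
        return False

    return extend({})

# Query: some Person with a child who is a Person; target graph H has one.
g_labels = {"x": "Person", "y": "Person"}
g_edges = {("x", "y", "child")}
h_labels = {1: "Male", 2: "Person", 3: "Person"}
h_edges = {(2, 3, "child")}
print(projects(g_labels, g_edges, h_labels, h_edges))
```

Seen through the equivalence the abstract states, G plays the role of a conjunctive query (or CSP instance) and H the role of the database (or constraint domain): the homomorphism exists exactly when the query is entailed.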
Positive Nested Conceptual Graphs
 Proc. 5th Int’l Conf. on Conceptual Structures (ICCS 97), Springer Verlag, LNAI 1257
Abstract

Cited by 20 (3 self)
This paper deals with positive (i.e. without negation) nested conceptual graphs (NCGs). We first give a general framework (graphs of graphs provided with a morphism) for defining classes of NCGs. Then we define a new class of NCGs, typed NCGs, and we show that known kinds of NCGs can be described very simply as classes of the general framework. All NCG models considered generalize the simple CG model in the sense that they involve objects which are generalizations of simple CGs, and reasoning on these objects is based on a graph operation (projection) which is a generalization of that used for simple CGs. Furthermore, the general framework introduced allows one to consider all these models as slight variations of a unique notion. This study has been initiated by applications we are currently involved in.
Approximating Most Specific Concepts in Description Logics with Existential Restrictions
, 2001
Abstract

Cited by 16 (0 self)
Computing the most specific concept (msc) is an inference task that allows one to abstract from individuals defined in description logic (DL) knowledge bases. For DLs that allow for existential restrictions or number restrictions, however, the msc need not exist unless one allows for cyclic concepts interpreted with the greatest fixed-point semantics. Since such concepts cannot be handled by current DL systems, we propose to approximate the msc. We show that for the DL ALE, which has concept conjunction, a restricted form of negation, existential restrictions, and value restrictions as constructors, approximations of the msc always exist and can effectively be computed.
Acquisition And Structuring Of An Ontology Within Conceptual Graphs
 University of Maryland, College Park, MD
, 1994
Abstract

Cited by 15 (2 self)
The elicitation of the ontology (i.e. the objects of a domain) is a key issue of conceptual modelling and therefore of knowledge acquisition. Conceptual Graph Theory provides a knowledge representation formalism to be used in knowledge-based systems with an explicit "type lattice" to account for the ontology. Since knowledge is non-formal in most AI applications, it has to be normalized to ensure that the formal exploitation of its representation conforms to its meaning in the domain. Noting the intensional nature of types, which reflect the essences of the objects they denote, this normalization relies on a commitment to type definitions by necessary and sufficient conditions at the knowledge level. Our claim is that the taxonomic structure that accounts for the intensional nature of the ontology can be nothing but a tree, precluding tangled taxonomies. From this starting point, we derive methodological principles to constrain the acquisition of the "type tree", thus...