Results 1–10 of 31
The Model Evolution Calculus
, 2003
"... The DPLL procedure is the basis of some of the most successful propositional satisfiability solvers to date. Although originally devised as a proofprocedure for firstorder logic, it has been used almost exclusively for propositional logic so far because of its highly inefficient treatment of quanti ..."
Abstract

Cited by 87 (14 self)
The DPLL procedure is the basis of some of the most successful propositional satisfiability solvers to date. Although originally devised as a proof procedure for first-order logic, it has been used almost exclusively for propositional logic so far because of its highly inefficient treatment of quantifiers, based on instantiation into ground formulas. The recent FDPLL calculus by Baumgartner was the first successful attempt to lift the procedure to the first-order level without resorting to ground instantiations. FDPLL lifts to the first-order case the core of the DPLL procedure, the splitting rule, but ignores other aspects of the procedure that, although not necessary for completeness, are crucial for its effectiveness in practice. In this paper, we present a new calculus loosely based on FDPLL that lifts these aspects as well. In addition to being a more faithful lifting of the DPLL procedure, the new calculus contains a more systematic treatment of universal literals, one of FDPLL's optimizations, and so has the potential of leading to much faster implementations.
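The propositional core that the paper lifts to the first-order level can be sketched as follows. This is a standard textbook DPLL (unit propagation plus the splitting rule), not code from the paper; the clause encoding (nonzero ints, with -v as the negation of v) is an illustrative convention:

```python
def dpll(clauses, assignment=None):
    """Propositional DPLL: unit propagation plus the splitting rule.
    Clauses are lists of nonzero ints; -v denotes the negation of variable v.
    Returns a satisfying assignment {var: bool} or None if unsatisfiable."""
    if assignment is None:
        assignment = {}
    # Simplify: drop satisfied clauses, delete falsified literals.
    simplified = []
    for clause in clauses:
        if any(assignment.get(abs(l)) == (l > 0) for l in clause):
            continue  # clause already satisfied under the assignment
        rest = [l for l in clause if abs(l) not in assignment]
        if not rest:
            return None  # empty clause: conflict on this branch
        simplified.append(rest)
    if not simplified:
        return assignment  # every clause satisfied
    # Unit propagation: a one-literal clause forces its literal's value.
    for clause in simplified:
        if len(clause) == 1:
            l = clause[0]
            return dpll(simplified, {**assignment, abs(l): l > 0})
    # Splitting rule: branch on both truth values of some variable.
    v = abs(simplified[0][0])
    return (dpll(simplified, {**assignment, v: True})
            or dpll(simplified, {**assignment, v: False}))
```

The "highly inefficient treatment of quantifiers" the abstract mentions refers to running this propositional procedure on ground instances of a first-order formula; FDPLL and the Model Evolution calculus instead lift the splitting rule itself to first-order literals.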
Relating Defeasible and Normal Logic Programming through Transformation Properties
, 2001
"... This paper relates the Defeasible Logic Programming (DeLP ) framework and its semantics SEM DeLP to classical logic programming frameworks. In DeLP we distinguish between two different sorts of rules: strict and defeasible rules. Negative literals (A) in these rules are considered to represent cl ..."
Abstract

Cited by 75 (28 self)
This paper relates the Defeasible Logic Programming (DeLP) framework and its semantics SEM DeLP to classical logic programming frameworks. In DeLP we distinguish between two different sorts of rules: strict and defeasible rules. Negative literals (¬A) in these rules are considered to represent classical negation. In contrast to this, in normal logic programming (NLP), there is only one kind of rule, but the meaning of negative literals (not A) is different: they represent a kind of negation as failure, and thereby introduce defeasibility. Various semantics have been defined for NLP, notably the well-founded semantics WFS and the stable semantics Stable. In this paper we consider the transformation properties for NLP introduced by Brass and Dix and suitably adjusted for the DeLP framework. We show which transformation properties are satisfied, thereby identifying aspects in which NLP and DeLP differ. We contend that the transformation rules presented in this paper can be ...
Adaptive Information Extraction: Core Technologies For Information Agents
, 2003
"... This paper gives a state of the art overview about machine learning approaches for information extraction from documents based on finite state techniques and relational learning methods related to inductive logic programming. ..."
Abstract

Cited by 51 (2 self)
This paper gives a state-of-the-art overview of machine learning approaches for information extraction from documents, based on finite-state techniques and on relational learning methods related to inductive logic programming.
Qualitative Velocity and Ball Interception
, 2002
"... In many approaches for qualitative spatial reasoning, navigation of an agent in a more or less static environment is considered (e.g. in the doublecross calculus [13]). However, in general, the environment is dynamic, which means that both the agent itself and also other objects and agents in th ..."
Abstract

Cited by 47 (7 self)
In many approaches for qualitative spatial reasoning, navigation of an agent in a more or less static environment is considered (e.g. in the double-cross calculus [13]). However, in general, the environment is dynamic, which means that both the agent itself and also other objects and agents in the environment may move. Thus, in order to perform spatial reasoning, not only (qualitative) distance and orientation information is needed (as e.g. in [1]), but also information about the (relative) velocity of objects (see e.g. [2]). Therefore, we will introduce concepts for qualitative and relative velocity: (quick) to left, neutral, (quick) to right. We investigate the usefulness of this approach in a case study, namely ball interception of simulated soccer agents in the RoboCup [11]. We compare a numerical approach where the interception point is computed exactly, a strategy based on reinforcement learning, a method with qualitative velocities developed in this paper, and the naïve method where the agent simply goes directly to the current ball position.
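The "numerical approach" used as a benchmark amounts to solving a small kinematics equation. The following is a generic sketch under the assumption of constant ball velocity and constant agent speed; the function and its parameters are illustrative, not taken from the paper:

```python
import math

def interception_point(agent, agent_speed, ball, ball_vel):
    """Earliest point where an agent moving at constant speed can meet a
    ball moving with constant velocity: solve |ball + v*t - agent| = s*t
    for the smallest t >= 0. Returns an (x, y) point, or None if the
    ball is unreachable."""
    rx, ry = ball[0] - agent[0], ball[1] - agent[1]
    vx, vy = ball_vel
    # Squaring both sides yields the quadratic a*t^2 + b*t + c = 0.
    a = vx * vx + vy * vy - agent_speed ** 2
    b = 2 * (rx * vx + ry * vy)
    c = rx * rx + ry * ry
    if abs(a) < 1e-12:  # agent speed equals ball speed: linear case
        if abs(b) < 1e-12:
            return None
        t = -c / b
    else:
        disc = b * b - 4 * a * c
        if disc < 0:
            return None  # agent can never close the gap
        roots = [(-b - math.sqrt(disc)) / (2 * a),
                 (-b + math.sqrt(disc)) / (2 * a)]
        valid = [t for t in roots if t >= 0]
        if not valid:
            return None
        t = min(valid)  # earliest feasible meeting time
    if t < 0:
        return None
    return (ball[0] + vx * t, ball[1] + vy * t)
```

The qualitative method of the paper replaces this exact computation with coarse velocity categories, trading precision for robustness against noisy perception.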
Automated Deduction Techniques for the Management Of Personalized Documents
"... This work is about a "realworld" application of automated deduction. The application is the management of documents (such as mathematical textbooks) as they occur in a readily available tool. In this "Slicing Information Technology tool", documents are decomposed ("sliced") into small units. A part ..."
Abstract

Cited by 47 (10 self)
This work is about a "real-world" application of automated deduction. The application is the management of documents (such as mathematical textbooks) as they occur in a readily available tool. In this "Slicing Information Technology tool", documents are decomposed ("sliced") into small units. A particular application task is to assemble a new document from such units in a selective way, based on the user's current interest and knowledge. It is ...
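The selective-assembly task can be pictured as computing a dependency closure over the sliced units. This is a hypothetical toy sketch with an explicit prerequisite map; the actual tool uses automated deduction to derive which units a reader needs, which this version only approximates:

```python
def select_units(goals, requires):
    """Collect the goal units plus the transitive closure of their
    prerequisites, so the assembled document is self-contained.
    `requires` maps a unit id to the unit ids it depends on."""
    selected, stack = set(), list(goals)
    while stack:
        unit = stack.pop()
        if unit in selected:
            continue
        selected.add(unit)
        stack.extend(requires.get(unit, ()))
    return selected
```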
From the Specification of Multiagent Systems by Statecharts to their Formal Analysis by Model Checking
, 2001
"... A formalism for the specification of multiagent systems should be expressive and illustrative enough to model not only the behavior of one single agent, but also the collaboration among several agents and the influences caused by external events from the environment. For this, state machines [25] ..."
Abstract

Cited by 45 (5 self)
A formalism for the specification of multiagent systems should be expressive and illustrative enough to model not only the behavior of one single agent, but also the collaboration among several agents and the influences caused by external events from the environment. For this, state machines [25] seem to provide an adequate means. Furthermore, it should be easily possible to obtain an implementation for each agent automatically from this specification. Last but not least, it is desirable to be able to check whether the multiagent system satisfies some interesting properties. Therefore, the formalism should also allow for the verification or formal analysis of multiagent systems, e.g. by model checking [6]. In this paper, a framework is introduced which allows us to express declarative aspects of multiagent systems by means of (classical) propositional logic and procedural aspects of these systems by means of state machines (statecharts). Nowadays statecharts are a well-accepted means to specify the dynamic behavior of software systems; they are a part of the Unified Modeling Language (UML). We describe in a rigorously formal manner how the specification of spatial knowledge and robot interaction and its verification by model checking can be done, integrating different methods from the field of artificial intelligence such as qualitative (spatial) reasoning and the situation calculus. As example application domain, we will consider robotic soccer; see also [24, 31], which present predecessor work towards a formal logic-based approach to agent engineering.
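The verification step can be illustrated by explicit-state reachability checking over a statechart's transition relation. This is a generic sketch of the idea, not the model checker used in the paper; states, transitions, and the invariant are all placeholders:

```python
from collections import deque

def reachable_violation(initial, transitions, invariant):
    """Minimal explicit-state model checking: breadth-first search over
    the transition relation, returning a reachable state that violates
    the propositional invariant, or None if the invariant holds in all
    reachable states."""
    seen, queue = {initial}, deque([initial])
    while queue:
        state = queue.popleft()
        if not invariant(state):
            return state  # counterexample state
        for nxt in transitions(state):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return None
```

Industrial model checkers add temporal-logic properties and symbolic state representations, but the safety-checking core is this reachability search.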
Conceptual Modelling and Web Site Generation using Graph Technology
, 2001
"... Starting with a conceptual model when designing a web site is the state of the art. A conceptual model helps to grasp and structure the problem domain and is the first step towards a formal representation of the web site, provided that the chosen technology has a formal foundation. Applying the exte ..."
Abstract

Cited by 40 (2 self)
Starting with a conceptual model when designing a web site is the state of the art. A conceptual model helps to grasp and structure the problem domain and is the first step towards a formal representation of the web site, provided that the chosen technology has a formal foundation. Applying the extended entity-relationship-driven EER/GRAL approach to specifying graph classes, we show that graph technology can be utilised to ensure a coherent and consistent usage of a conceptual model and its instances for defining and generating an arbitrarily complex web site. During this process, graphs are used as repository structures in conformance with the conceptual model, allowing for descriptive graph queries to define the contents of the web pages. Along with the application of XSL (Extensible Stylesheet Language) as a means to foster separation of content and layout, this approach ensures a permanently consistent web site. Some examples are given.
A First-Order Logic Davis-Putnam-Logemann-Loveland Procedure
"... The DavisPutnamLogemannLoveland procedure (DPLL) was introduced in the early ..."
Abstract

Cited by 38 (6 self)
The Davis-Putnam-Logemann-Loveland procedure (DPLL) was introduced in the early ...