Results 1-10 of 134
Logical foundations of object-oriented and frame-based languages
JOURNAL OF THE ACM, 1995
Abstract

Cited by 824 (61 self)
We propose a novel formalism, called Frame Logic (abbr., F-logic), that accounts in a clean and declarative fashion for most of the structural aspects of object-oriented and frame-based languages. These features include object identity, complex objects, inheritance, polymorphic types, query methods, encapsulation, and others. In a sense, F-logic stands in the same relationship to the object-oriented paradigm as classical predicate calculus stands to relational programming. F-logic has a model-theoretic semantics and a sound and complete resolution-based proof theory. A small number of fundamental concepts that come from object-oriented programming have direct representation in F-logic; other, secondary aspects of this paradigm are easily modeled as well. The paper also discusses semantic issues pertaining to programming with a deductive object-oriented language based on a subset of F-logic.
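The structural features the abstract enumerates (object identity, inheritance of attributes, query methods) can be approximated in a few lines. The frame store below is a hypothetical toy illustration in Python, not F-logic syntax and not the paper's formalism.

```python
# Toy frame store: objects with identity, attributes, and inheritance,
# loosely mirroring the structural features that F-logic formalizes.
class Frame:
    def __init__(self, oid, isa=None, **slots):
        self.oid = oid            # object identity
        self.isa = isa            # parent frame (inheritance link)
        self.slots = dict(slots)  # attribute values

    def get(self, name):
        """Attribute lookup with inheritance: walk up the isa chain."""
        frame = self
        while frame is not None:
            if name in frame.slots:
                return frame.slots[name]
            frame = frame.isa
        raise KeyError(name)

person = Frame("person", species="human")
alice = Frame("alice", isa=person, age=30)

print(alice.get("age"))      # own slot -> 30
print(alice.get("species"))  # inherited from person -> human
```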
Abduction in Logic Programming
Abstract

Cited by 548 (74 self)
Abduction in Logic Programming started in the late 1980s and early 1990s as an attempt to extend logic programming into a framework suitable for a variety of problems in Artificial Intelligence and other areas of Computer Science. This paper aims to chart the main developments of the field over the last ten years and to take a critical view of these developments from several perspectives: logical, epistemological, computational, and suitability to application. The paper attempts to expose some of the challenges and prospects for the further development of the field.
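The core idea of abduction, finding hypotheses that would explain an observation, can be sketched propositionally. The rules and abducible atoms below are invented for illustration; real abductive logic programming also handles variables, negation, and integrity constraints.

```python
# Minimal propositional abduction: given Horn rules and a set of abducible
# atoms, collect the sets of abducibles that explain an observation.
RULES = {
    "grass_wet": [["rained"], ["sprinkler_on"]],   # grass_wet <- rained | sprinkler_on
    "shoes_wet": [["grass_wet", "walked_on_grass"]],
}
ABDUCIBLES = {"rained", "sprinkler_on", "walked_on_grass"}

def explain(goal):
    """Return all sets of abducibles that entail `goal` via backward chaining."""
    if goal in ABDUCIBLES:
        return [{goal}]
    explanations = []
    for body in RULES.get(goal, []):
        partial = [set()]
        for atom in body:
            partial = [e | sub for e in partial for sub in explain(atom)]
        explanations.extend(partial)
    return explanations

print(explain("shoes_wet"))  # two alternative explanations
```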
Solving Shape-Analysis Problems in Languages with Destructive Updating
POPL '96, 1996
Abstract

Cited by 301 (20 self)
This paper concerns the static analysis of programs that perform destructive updating on heap-allocated storage. We give an algorithm that conservatively solves this problem by using a finite shape-graph to approximate the possible “shapes” that heap-allocated structures in a program can take on. In contrast with previous work, our method is even accurate for certain programs that update cyclic data structures. For example, our method can determine that when the input to a program that searches a list and splices in a new element is a possibly circular list, the output is a possibly circular list.
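The basic move of abstracting an unbounded heap into a finite shape graph can be illustrated very simply: merge all cells not directly pointed to by a program variable into one summary node. This sketch is a drastic simplification of the paper's analysis, with invented node names.

```python
# Simplified shape abstraction: concrete heap cells of a singly linked list
# are mapped to a finite shape graph by collapsing every cell not pointed to
# by a program variable into a single "summary" node.
def shape_graph(heap, variables):
    """heap: {node: successor or None}; variables: {var: node}.
    Returns the abstract edge set over named nodes plus a summary node."""
    pointed = set(variables.values())

    def name(n):
        if n in pointed:
            return n
        return "summary" if n is not None else None

    return {(name(node), name(succ)) for node, succ in heap.items()}

# Concrete list: x -> n1 -> n2 -> n3 -> None, variable x points at n1.
heap = {"n1": "n2", "n2": "n3", "n3": None}
edges = shape_graph(heap, {"x": "n1"})
print(sorted(edges, key=str))
```

However long the list grows, the abstract graph stays the same three edges, which is what makes the analysis terminate.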
Query Caching and Optimization in Distributed Mediator Systems
In Proc. of ACM SIGMOD Conf. on Management of Data, 1996
Abstract

Cited by 189 (12 self)
Query processing and optimization in mediator systems that access distributed non-proprietary sources pose many novel problems. Cost-based query optimization is hard because the mediator does not have access to source statistics, and furthermore it may not be easy to model the source's performance. At the same time, querying remote sources may be very expensive because of high connection overhead, long computation time, financial charges, and temporary unavailability. We propose a cost-based optimization technique that caches statistics of actual calls to the sources and consequently estimates the cost of the possible execution plans based on the statistics cache. We investigate issues pertaining to the design of the statistics cache and experimentally analyze various trade-offs. We also present a query result caching mechanism that allows us to effectively use results of prior queries when the source is not readily available. We employ the novel invariants mechanism, which s...
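The statistics-cache idea can be sketched as follows: record the observed cost of each call to a source, estimate plan costs from the cached averages, and fall back to a default estimate for sources never yet called. Source names, costs, and the default value are invented for illustration, not from the paper.

```python
# Sketch: cache per-source call statistics, then estimate candidate plan
# costs from the cache (average observed cost, or a default when unknown).
from collections import defaultdict

class StatsCache:
    DEFAULT_COST = 100.0  # assumed cost for a source with no recorded calls

    def __init__(self):
        self.samples = defaultdict(list)

    def record(self, source, elapsed):
        self.samples[source].append(elapsed)

    def estimate(self, source):
        obs = self.samples[source]
        return sum(obs) / len(obs) if obs else self.DEFAULT_COST

cache = StatsCache()
cache.record("flights_db", 2.0)   # two actual calls observed
cache.record("flights_db", 4.0)

plans = {"use_flights_db": ["flights_db"], "use_web_source": ["web_source"]}
costs = {p: sum(cache.estimate(s) for s in srcs) for p, srcs in plans.items()}
best = min(costs, key=costs.get)
print(best, costs)
```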
A Survey of Research on Deductive Database Systems
JOURNAL OF LOGIC PROGRAMMING, 1993
Abstract

Cited by 111 (7 self)
The area of deductive databases has matured in recent years, and it now seems appropriate to reflect upon what has been achieved and what the future holds. In this paper, we provide an overview of the area and briefly describe a number of projects that have led to implemented systems.
Weighted pushdown systems and their application to interprocedural dataflow analysis
Sci. of Comp. Prog., 2003
Abstract

Cited by 109 (32 self)
Recently, pushdown systems (PDSs) have been extended to weighted PDSs, in which each transition is labeled with a value, and the goal is to determine the meet-over-all-paths value (for paths that meet a certain criterion). This paper shows how weighted PDSs yield new algorithms for certain classes of interprocedural dataflow-analysis problems.
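The "meet-over-all-paths" value the abstract refers to can be illustrated on an ordinary weighted graph under the min-plus semiring, where meet is min and path extension is +. This is only a simplified, non-pushdown analogue: the weighted-PDS setting additionally models procedure call/return structure via a pushdown stack. Node names and weights are invented.

```python
# Meet-over-all-paths on a plain weighted graph (min-plus semiring):
# meet = min over paths, extend = + along a path. Computed with Dijkstra,
# which is valid because min-plus weights here are non-negative.
import heapq

def moap(edges, source):
    """Least path value from `source` to every reachable node."""
    graph = {}
    for u, v, w in edges:
        graph.setdefault(u, []).append((v, w))
    best, frontier = {}, [(0, source)]
    while frontier:
        d, u = heapq.heappop(frontier)
        if u in best:
            continue
        best[u] = d
        for v, w in graph.get(u, []):
            if v not in best:
                heapq.heappush(frontier, (d + w, v))
    return best

edges = [("entry", "a", 1), ("entry", "b", 4), ("a", "b", 1), ("b", "exit", 2)]
values = moap(edges, "entry")
print(values)  # node "b" gets min(4, 1 + 1) = 2: the meet over both paths
```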
Parameter learning of logic programs for symbolic-statistical modeling
Journal of Artificial Intelligence Research, 2001
Abstract

Cited by 100 (20 self)
We propose a logical/mathematical framework for statistical parameter learning of parameterized logic programs, i.e. definite clause programs containing probabilistic facts with a parameterized distribution. It extends the traditional least Herbrand model semantics in logic programming to distribution semantics, a possible world semantics with a probability distribution which is unconditionally applicable to arbitrary logic programs including ones for HMMs, PCFGs and Bayesian networks. We also propose a new EM algorithm, the graphical EM algorithm, that runs for a class of parameterized logic programs representing sequential decision processes where each decision is exclusive and independent. It runs on a new data structure called support graphs describing the logical relationship between observations and their explanations, and learns parameters by computing inside and outside probability generalized for logic programs. The complexity analysis shows that when combined with OLDT search for all explanations for observations, the graphical EM algorithm, despite its generality, has the same time complexity as existing EM algorithms, i.e. the Baum-Welch algorithm for HMMs, the Inside-Outside algorithm for PCFGs, and the one for singly connected Bayesian networks, which have been developed independently in each research field. Learning experiments with PCFGs using two corpora of moderate size indicate that the graphical EM algorithm can significantly outperform the Inside-Outside algorithm.
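The flavor of EM-based parameter learning over hidden choices can be shown on the classic two-coin Bernoulli mixture: each session's coin is a hidden, exclusive choice, analogous to a probabilistic fact whose truth is unobserved. This is a generic EM toy, not the paper's graphical EM algorithm; data and initial parameters are invented.

```python
# Generic EM for a two-component Bernoulli mixture: sessions of coin flips
# where the identity of the coin used per session is hidden.
from math import comb

def em_two_coins(heads, flips, p=(0.6, 0.4), iters=50):
    """heads[i] = number of heads in session i of `flips` tosses."""
    pA, pB = p
    for _ in range(iters):
        # E-step: responsibility of coin A for each session
        countsA = countsB = totalA = totalB = 0.0
        for h in heads:
            likeA = comb(flips, h) * pA**h * (1 - pA) ** (flips - h)
            likeB = comb(flips, h) * pB**h * (1 - pB) ** (flips - h)
            rA = likeA / (likeA + likeB)
            countsA += rA * h;       totalA += rA * flips
            countsB += (1 - rA) * h; totalB += (1 - rA) * flips
        # M-step: re-estimate each coin's head probability
        pA, pB = countsA / totalA, countsB / totalB
    return pA, pB

# Three heads-heavy sessions and two tails-heavy ones
pA, pB = em_two_coins([9, 8, 1, 2, 9], flips=10)
print(round(pA, 2), round(pB, 2))
```

The E-step plays the role that support graphs and inside/outside probabilities play in the paper: it distributes credit for each observation across its possible hidden explanations.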
Strategy generation and evaluation for metagame playing
1993
Abstract

Cited by 44 (0 self)
MetaGame Playing (Metagame) is a new paradigm for research in game playing in which we design programs to take in the rules of unknown games and play those games without human assistance. Strong performance in this new paradigm is evidence that the program, instead of its human designer, has performed the analysis of each specific game. Metagame is a concrete Metagame research problem based around the class of symmetric chess-like games. The class includes the games of chess, draughts, noughts and crosses, Chinese chess, and Shogi. An implemented game generator produces new games in this class, some of which are objects of interest in their own right. An accompanying program plays Metagame: it takes as input the rules of a specific game and analyses those rules to construct for that game an efficient representation and an evaluation function, both for use with a generic search engine. The strategic analysis performed by the program relates a set of general knowledge sources to the details of the particular game. Among other properties, this analysis determines the relative value of the different pieces in a given game.