Results 1–10 of 71
Locating Features in Source Code
, 2003
Abstract

Cited by 166 (2 self)
Understanding the implementation of a certain feature of a system requires identifying the computational units of the system that contribute to this feature. In many cases, the mapping of features to the source code is poorly documented. In this paper, we present a semi-automatic technique that reconstructs the mapping for features that are triggered by the user and exhibit an observable behavior. The mapping is in general not injective; that is, a computational unit may contribute to several features. Our technique makes it possible to distinguish between general and specific computational units with respect to a given set of features. For a set of features, it also identifies jointly and distinctly required computational units.
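The general/specific distinction the abstract describes reduces to set operations over per-feature execution traces. A minimal sketch, with hypothetical feature names and unit names standing in for real trace data:

```python
# Hypothetical traces: for each user-triggered feature, the set of
# computational units observed during its execution.
traces = {
    "save":  {"io.write", "ui.dialog", "core.init"},
    "load":  {"io.read", "ui.dialog", "core.init"},
    "print": {"io.write", "core.init"},
}

# Units exercised by every feature are "general" with respect to this
# feature set; units exercised by only one feature are "specific" to it.
general = set.intersection(*traces.values())
specific = {
    f: units - set().union(*(traces[g] for g in traces if g != f))
    for f, units in traces.items()
}
print("general:", sorted(general))
print("specific:", {f: sorted(s) for f, s in specific.items()})
```

Here `core.init` comes out as general, while `io.read` is specific to `load`; the other features touch no unit that some other feature does not also touch.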
Reengineering class hierarchies using concept analysis
 In ACM Trans. Programming Languages and Systems
, 1998
Abstract

Cited by 109 (7 self)
A new method is presented for analyzing and reengineering class hierarchies. In our approach, a class hierarchy is processed along with a set of applications that use it, and a fine-grained analysis of the access and subtype relationships between objects, variables and class members is performed. The result of this analysis is again a class hierarchy, which is guaranteed to be behaviorally equivalent to the original hierarchy, but in which each object only contains the members that are required. Our method is semantically well-founded in concept analysis: the new class hierarchy is a minimal and maximally factorized concept lattice that reflects the access and subtype relationships between variables, objects and class members. The method is primarily intended as a tool for finding imperfections in the design of class hierarchies, and can be used as the basis for tools that largely automate the process of reengineering such hierarchies. The method can also be used as a space-optimizing source-to-source transformation that removes redundant fields from objects. A prototype implementation for Java has been constructed, and used to conduct several case studies. Our results demonstrate that the method can provide valuable insights into the usage of the class hierarchy in a specific context, and lead to useful restructuring proposals.
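The concept-analysis core of this approach starts from a binary relation between accessors and class members and enumerates its formal concepts (maximal extent/intent pairs). A naive sketch, with a hypothetical access relation in place of the paper's program analysis:

```python
from itertools import combinations

# Hypothetical access relation: which variables access which class members.
access = {
    "v1": {"name", "age"},
    "v2": {"name"},
    "v3": {"name", "age", "salary"},
}

def concepts(context):
    """Enumerate formal concepts (extent, intent) of a binary context by
    closing every subset of objects (exponential; fine for illustration)."""
    objs = list(context)
    found = set()
    for r in range(len(objs) + 1):
        for subset in combinations(objs, r):
            # intent: members shared by all objects in the subset
            if subset:
                intent = set.intersection(*(context[o] for o in subset))
            else:
                intent = set.union(*context.values())
            # extent: all objects having every member in the intent
            extent = frozenset(o for o in context if intent <= context[o])
            found.add((extent, frozenset(intent)))
    return found

for extent, intent in sorted(concepts(access), key=lambda c: len(c[0])):
    print(sorted(extent), sorted(intent))
```

Each concept corresponds to a candidate class holding exactly the members its extent requires; the paper's method additionally folds in subtype constraints and factorizes the lattice, which this sketch omits.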
Domains for Computation in Mathematics, Physics and Exact Real Arithmetic
 Bulletin of Symbolic Logic
, 1997
Abstract

Cited by 48 (10 self)
We present a survey of the recent applications of continuous domains for providing simple computational models for classical spaces in mathematics including the real line, countably based locally compact spaces, complete separable metric spaces, separable Banach spaces and spaces of probability distributions. It is shown how these models have a logical and effective presentation and how they are used to give a computational framework in several areas in mathematics and physics. These include fractal geometry, where new results on existence and uniqueness of attractors and invariant distributions have been obtained, measure and integration theory, where a generalization of the Riemann theory of integration has been developed, and real arithmetic, where a feasible setting for exact computer arithmetic has been formulated. We give a number of algorithms for computation in the theory of iterated function systems with applications in statistical physics and in the period-doubling route to chaos.
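The domain-theoretic view of a real number as a chain of shrinking rational intervals can be sketched directly; this is only an illustrative bisection enclosure of one number, not any of the survey's algorithms:

```python
from fractions import Fraction

def sqrt2_intervals():
    """Yield nested rational intervals [lo, hi] converging to sqrt(2):
    each interval refines the previous one, as in the interval domain."""
    lo, hi = Fraction(1), Fraction(2)
    while True:
        yield lo, hi
        mid = (lo + hi) / 2
        if mid * mid <= 2:
            lo = mid      # sqrt(2) lies in the upper half
        else:
            hi = mid      # sqrt(2) lies in the lower half

gen = sqrt2_intervals()
for _ in range(20):
    lo, hi = next(gen)
print(float(lo), float(hi))
```

Every yielded interval provably contains sqrt(2), and the widths halve at each step, so the sequence of intervals is an increasing chain in the interval domain whose limit is the real number itself.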
A Parametric Approach to Deductive Databases with Uncertainty
, 1997
Abstract

Cited by 44 (6 self)
Numerous frameworks have been proposed in recent years for deductive databases with uncertainty. These frameworks differ in (i) their underlying notion of uncertainty, (ii) the way in which uncertainties are manipulated, and (iii) the way in which uncertainty is associated with the facts and rules of a program. On the basis of (iii), these frameworks can be classified into implication-based (IB) and annotation-based (AB) frameworks. In this paper, we develop a generic framework called the parametric framework as a unifying umbrella for IB frameworks. We develop the declarative, fixpoint, and proof-theoretic semantics of programs in the parametric framework and show their equivalence. Using this framework as a basis, we study the query optimization problem of containment of conjunctive queries in this framework, and establish necessary and sufficient conditions for containment for several classes of parametric conjunctive queries. Our results yield tools for use in query optimization for large classes of query programs in IB deductive databases with uncertainty.
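In an IB framework, a rule carries its own certainty, which is combined with the certainties of its body atoms, and alternative derivations of the same fact are merged. A toy one-step evaluation, with min/max as one particular choice of the framework's pluggable parameters (the predicate names and values are made up):

```python
# "Parameters" of the framework: how to conjoin a rule with its body,
# and how to disjoin alternative derivations. min/max is one instance.
conj = min
disj = max

facts = {"p(a)": 0.9, "p(b)": 0.6}
rules = [("q(X)", ["p(X)"], 0.8)]   # (head, body, rule certainty)

derived = {}
for head, body, rc in rules:
    for x in ("a", "b"):            # naive grounding over known constants
        h = head.replace("X", x)
        bvals = [facts[b.replace("X", x)] for b in body]
        cert = conj([rc] + bvals)   # certainty of this derivation
        derived[h] = disj((derived.get(h, 0.0), cert))
print(derived)
```

Swapping `conj`/`disj` for, say, product and probabilistic sum instantiates a different IB framework under the same umbrella, which is the point of parameterizing them.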
Pattern Structures and Their Projections
, 2001
Abstract

Cited by 39 (11 self)
Pattern structures consist of objects with descriptions (called patterns) that allow a semilattice operation on them. Pattern structures arise naturally from ordered data, e.g., from labeled graphs ordered by graph morphisms. It is shown that pattern structures can be reduced to formal contexts; however, processing the former is often more efficient and intuitive than processing the latter. Concepts, implications, plausible hypotheses, and classifications are defined for data given by pattern structures. Since computation in pattern structures may be intractable, approximations of patterns by means of projections are introduced.
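A small concrete instance of the two ingredients the abstract names, using interval patterns (not the labeled-graph patterns of the paper): the semilattice meet of two descriptions, and a projection that coarsens them.

```python
import math

# Patterns as tuples of numeric intervals; the semilattice meet is the
# componentwise convex hull (the shared "similarity" of two descriptions).
def meet(p, q):
    return tuple((min(a, c), max(b, d)) for (a, b), (c, d) in zip(p, q))

# A projection coarsens patterns (here: rounding endpoints to integers),
# trading precision for tractability, as the abstract suggests.
def project(p):
    return tuple((math.floor(a), math.ceil(b)) for a, b in p)

p1 = ((1.2, 1.6), (4.0, 5.0))
p2 = ((1.4, 2.1), (3.5, 4.5))
print(meet(p1, p2))
print(project(meet(p1, p2)))
```

`meet` is idempotent, commutative, and associative, which is exactly what makes the set of patterns a semilattice; `project` maps it onto a smaller semilattice where computations are cheaper.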
Closed Set Based Discovery of Small Covers for Association Rules
 PROC. 15ÈMES JOURNÉES BASES DE DONNÉES AVANCÉES, BDA
, 1999
Abstract

Cited by 33 (4 self)
In this paper, we address the problem of the usefulness of the set of discovered association rules. This problem is important since real-life databases most often yield several thousand rules with high confidence. We propose new algorithms based on Galois closed sets to reduce the extraction to small covers, or bases, for exact and approximate rules. Once frequent closed itemsets, which constitute a generating set for both frequent itemsets and association rules, have been discovered, no additional database pass is needed to derive these bases. Experiments conducted on real-life databases show that these algorithms are efficient and valuable in practice.
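The Galois closure at the heart of this approach is easy to state: the closure of an itemset is the set of items common to all transactions containing it, and an itemset is closed when it equals its closure. A brute-force sketch over toy transactions (not the paper's algorithms, which avoid enumerating all candidates):

```python
from itertools import combinations

transactions = [
    {"bread", "milk"},
    {"bread", "milk", "butter"},
    {"bread", "butter"},
    {"milk"},
]

def closure(itemset, db):
    """Galois closure: items shared by every transaction containing itemset."""
    covering = [t for t in db if itemset <= t]
    return frozenset.intersection(*map(frozenset, covering)) if covering else frozenset()

def closed_itemsets(db, minsup):
    """Return {closed frequent itemset: support} by exhaustive search."""
    items = set().union(*db)
    closed = {}
    for r in range(1, len(items) + 1):
        for cand in combinations(sorted(items), r):
            s = frozenset(cand)
            sup = sum(1 for t in db if s <= t)
            if sup >= minsup and closure(s, db) == s:
                closed[s] = sup
    return closed

print(closed_itemsets(transactions, minsup=2))
```

Here `{butter}` is frequent but not closed (its closure is `{bread, butter}`), so it is represented by its closure; this pruning is what makes the closed sets a compact generating set for all frequent itemsets and rules.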
A Fixed-Point Approach to Stable Matchings and Some Applications
, 2001
Abstract

Cited by 30 (5 self)
We describe a fixed-point based approach to the theory of bipartite stable matchings. In doing so, we provide a common framework that links together seemingly distant results, like the stable marriage theorem of Gale and Shapley [11], the Mendelsohn-Dulmage theorem [21], the Kundu-Lawler theorem [19], Tarski's fixed point theorem [32], the Cantor-Bernstein theorem, Pym's linking theorem [22, 23] or the monochromatic path theorem of Sands et al. [29]. In this framework, we formulate a matroid generalization of the stable marriage theorem and study the lattice structure of generalized stable matchings. Based on the theory of lattice polyhedra and blocking polyhedra, we extend results of Vande Vate [33] and Rothblum [28] on the bipartite stable matching polytope.
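The Gale-Shapley theorem that anchors this framework has a short constructive proof: the deferred-acceptance procedure, whose rounds can be read as iterating a monotone operator until a fixed point (a stable matching) is reached. A minimal sketch with made-up preference lists:

```python
def gale_shapley(men_prefs, women_prefs):
    """Deferred acceptance: men propose in preference order, women hold
    their best offer so far. Terminates at a stable matching."""
    free = list(men_prefs)                 # men still without a partner
    next_choice = {m: 0 for m in men_prefs}
    engaged = {}                           # woman -> man currently held
    rank = {w: {m: i for i, m in enumerate(ps)} for w, ps in women_prefs.items()}
    while free:
        m = free.pop()
        w = men_prefs[m][next_choice[m]]   # m's best not-yet-tried woman
        next_choice[m] += 1
        if w not in engaged:
            engaged[w] = m
        elif rank[w][m] < rank[w][engaged[w]]:
            free.append(engaged[w])        # w trades up; old partner is free
            engaged[w] = m
        else:
            free.append(m)                 # w rejects m
    return engaged

men = {"A": ["x", "y"], "B": ["y", "x"]}
women = {"x": ["B", "A"], "y": ["A", "B"]}
print(gale_shapley(men, women))
```

The connection to Tarski's theorem, and the matroid and polyhedral generalizations, go well beyond this sketch; it only illustrates the base case the paper starts from.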
Fibring Non-Truth-Functional Logics: Completeness Preservation
 Journal of Logic, Language and Information
, 2000
Abstract

Cited by 26 (20 self)
Fibring has been shown to be useful for combining logics endowed with truth-functional semantics. One wonders if fibring can be extended in order to cope with logics endowed with non-truth-functional semantics as, for example, paraconsistent logics. The first main contribution of the paper is a positive answer to this question. Furthermore, it is shown that this extended notion of fibring preserves completeness under certain reasonable conditions. This completeness transfer result, the second main contribution of the paper, generalizes the one established by Zanardo et al. and is obtained using a new technique exploiting the properties of the metalogic where the (possibly non-truth-functional) valuations are defined. The modal paraconsistent logic of da Costa and Carnielli is obtained by fibring and its completeness is so established.
Similarity metrics: A formal unification of cardinal and non-cardinal similarity measures
 PROCEEDINGS OF THE SECOND INTERNATIONAL CONFERENCE ON CASE-BASED REASONING
, 1997
Abstract

Cited by 17 (4 self)
In [9] we introduced a formal framework for constructing ordinal similarity measures, and suggested how this might also be applied to cardinal measures. In this paper we will place this approach in a more general framework, called similarity metrics. In this framework, ordinal similarity metrics (where comparison returns a boolean value) can be combined with cardinal metrics (returning a numeric value) and, indeed, with metrics returning values of other types, to produce new metrics.
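The key move the abstract describes, treating metrics returning booleans and metrics returning numbers uniformly and combining them into metrics over richer types, can be sketched with ordinary functions; the metric names and the pairing combinator here are illustrative, not the paper's definitions:

```python
def ordinal_eq(a, b):
    """An ordinal metric: comparison returns a boolean value."""
    return a == b

def cardinal_gap(a, b):
    """A cardinal metric: comparison returns a numeric value."""
    return abs(a - b)

def pair_metric(m1, m2):
    """Combine two metrics into one over pairs; the combined result is a
    tuple, i.e. a value in the product of the two result orders."""
    return lambda x, y: (m1(x[0], y[0]), m2(x[1], y[1]))

# Compare (color, size) cases with one ordinal and one cardinal component.
sim = pair_metric(ordinal_eq, cardinal_gap)
print(sim(("red", 3), ("red", 7)))
```

The point of the unification is that `pair_metric` never inspects what type its components return, so boolean, numeric, and other-valued metrics compose freely.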