Results 11-20 of 32
Efficient constraints on possible worlds for reasoning about necessity
 University of Pennsylvania. Submitted
, 1997
Abstract

Cited by 4 (3 self)
Modal logics offer natural, declarative representations for describing both the modular structure of logical specifications and the attitudes and behaviors of agents. The results of this paper further the goal of building practical, efficient reasoning systems using modal logics. The key problem in modal deduction is reasoning about the world in a model (or scope in a proof) at which an inference rule is applied, a potentially hard problem. This paper investigates the use of partial-order mechanisms to maintain constraints on the application of modal rules in proof search in restricted languages. The main result is a simple, incremental polynomial-time algorithm to correctly order rules in proof trees for combinations of K, K4, T and S4 necessity operators governed by a variety of interactions, assuming an encoding of negation using a scoped constant ⊥. This contrasts with previous equational unification methods, which have exponential performance in general because they simply guess among possible intercalations of modal operators. The new, fast algorithm is appropriate for use in a wide variety of applications of modal logic, from planning to logic programming. Content area: Reasoning Techniques (deduction, efficiency and complexity).
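The abstract does not reproduce the paper's specialized ordering algorithm, but the general idea of a partial-order mechanism, maintaining precedence constraints on rule applications and emitting a consistent total order, can be illustrated with a plain topological sort. This is only an illustrative sketch; the function name `order_rules` and the rule labels are hypothetical, not from the paper.

```python
from collections import defaultdict, deque

def order_rules(rules, constraints):
    """Linearize rule applications subject to precedence constraints.

    `constraints` is a list of pairs (a, b) meaning rule `a` must be
    applied before rule `b`. Returns one consistent total order, or
    None when the constraints are cyclic (no valid ordering exists).
    """
    succ = defaultdict(list)
    indeg = {r: 0 for r in rules}
    for a, b in constraints:
        succ[a].append(b)
        indeg[b] += 1
    queue = deque(r for r in rules if indeg[r] == 0)
    order = []
    while queue:
        r = queue.popleft()
        order.append(r)
        for s in succ[r]:
            indeg[s] -= 1
            if indeg[s] == 0:
                queue.append(s)
    # If some rules were never released, the constraints contain a cycle.
    return order if len(order) == len(rules) else None
```

Checking all constraints incrementally as rules are added, rather than re-sorting from scratch, is what makes the paper's polynomial-time result practical for proof search.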
An Adaptive and Efficient Algorithm for Detecting Approximately Duplicate Database Records
, 2000
Abstract

Cited by 4 (1 self)
The integration of information is an important area of research in databases. By combining multiple information sources, a more complete and more accurate view of the world is attained, and additional knowledge gained. This is a nontrivial task, however. Often there are many sources that contain information about a certain kind of entity, and some will contain records concerning the same real-world entity. Furthermore, one source may not have the exact information that another source contains. Some of the information may differ, due to data entry errors for example, or may be missing altogether. Thus, one problem in integrating information sources is to identify possibly different designators of the same entity. Data cleansing is the process of purging databases of inaccurate or inconsistent data. The data is typically manipulated into a form that is useful for other tasks, such as data mining. This paper addresses the data cleansing problem of detecting database records that are approximate duplicates, but not exact duplicates. An efficient algorithm is presented which combines three key ideas. First, the Smith-Waterman algorithm for computing the minimum edit distance is used as a domain-independent method to recognize pairs of approximate duplicates.
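The Smith-Waterman dynamic program the abstract mentions scores the best local alignment between two strings; normalizing that score gives a domain-independent similarity measure for record matching. A minimal sketch follows; the scoring weights, the `likely_duplicates` wrapper, and the 0.8 threshold are illustrative assumptions, not taken from the paper.

```python
def smith_waterman(a, b, match=2, mismatch=-1, gap=-1):
    """Score of the best local alignment between strings a and b."""
    rows = [[0] * (len(b) + 1) for _ in range(len(a) + 1)]
    best = 0
    for i in range(1, len(a) + 1):
        for j in range(1, len(b) + 1):
            sub = match if a[i - 1] == b[j - 1] else mismatch
            rows[i][j] = max(0,
                             rows[i - 1][j - 1] + sub,  # match / substitution
                             rows[i - 1][j] + gap,      # gap in b
                             rows[i][j - 1] + gap)      # gap in a
            best = max(best, rows[i][j])
    return best

def likely_duplicates(r1, r2, threshold=0.8):
    """Flag two records as approximate duplicates when the alignment
    score is high relative to the best score the shorter record allows."""
    top = 2 * min(len(r1), len(r2))  # score if every character matched
    return top > 0 and smith_waterman(r1, r2) / top >= threshold
```

Because the recurrence clamps scores at zero, the alignment is local: a shared substring ("42 Oak St") scores well even when the records differ elsewhere, which is exactly what approximate-duplicate detection needs.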
A genetic algorithm approach to optimal topological design of all terminal networks
 Intelligent Engineering Systems Through Artificial Neural Networks
, 1995
Abstract

Cited by 4 (1 self)
In the design of communication networks, one of the fundamental considerations is the reliability and availability of communication paths between all terminals. Together, these form the network system reliability. The other important aspect is the layout of paths that minimizes cost while meeting a reliability criterion. In this paper, a new heuristic search algorithm based on Genetic Algorithms (GA) is presented to optimize the design of large-scale network topologies subject to a reliability constraint. The search works with an improved Monte Carlo simulation technique to estimate the system reliability of a network topology.
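The paper's improved Monte Carlo technique is not detailed in the abstract, but the basic estimator it improves on is easy to state: sample edge failures independently and count the fraction of samples in which all terminals remain connected. A plain sketch, using union-find for the connectivity check; the function name and parameters are hypothetical.

```python
import random

def all_terminal_reliability(n, edges, p_up, trials=20000, seed=1):
    """Estimate the probability that a network on nodes 0..n-1 stays
    connected when each edge independently works with probability p_up."""
    rng = random.Random(seed)
    connected = 0
    for _ in range(trials):
        parent = list(range(n))
        def find(x):
            while parent[x] != x:
                parent[x] = parent[parent[x]]  # path halving
                x = parent[x]
            return x
        comps = n
        for u, v in edges:
            if rng.random() < p_up:            # edge survived this trial
                ru, rv = find(u), find(v)
                if ru != rv:
                    parent[ru] = rv
                    comps -= 1
        if comps == 1:                         # all terminals reachable
            connected += 1
    return connected / trials
```

For a 4-node ring with edge availability 0.9, the network survives iff at most one edge fails, giving an exact reliability of 0.9^4 + 4(0.9^3)(0.1) ≈ 0.948, which the estimator approaches as trials grow. A GA would call such an estimator in its fitness function for each candidate topology.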
Materialized view selection in xml databases
 In DASFAA
, 2009
Abstract

Cited by 2 (0 self)
Abstract. Materialized views, an RDBMS silver bullet, demonstrate their efficacy in many applications, especially as a data warehousing/decision support system tool. The key to using materialized views efficiently is view selection. Though studied for over thirty years in the RDBMS setting, the selection is hard to make in the context of XML databases, where both the semi-structured data and the expressiveness of XML query languages add challenges to the view selection problem. We start our discussion on producing minimal XML views (in terms of size) as candidates for a given workload (a query set). To facilitate intuitive view selection, we present a view graph (called vcube) to structurally maintain all generated views. By basing our selection on the vcube for materialization, we propose two view selection strategies, targeting space optimization and a space-time trade-off, respectively. We built our implementation on top of Berkeley DB XML, demonstrating that significant performance improvement could be obtained using our proposed approaches.
Efficient Propagators for Global Constraints
Abstract

Cited by 2 (0 self)
I hereby declare that I am the sole author of this thesis. This is a true copy of the thesis, including any required final revisions, as accepted by my examiners. I understand that my thesis may be made electronically available to the public. Claude-Guy Quimper. We study in this thesis three well-known global constraints. The AllDifferent constraint restricts a set of variables to be assigned distinct values. The global cardinality constraint (GCC) ensures that a value v is assigned to at least l_v variables and to at most u_v variables among a set of given variables, where l_v and u_v are non-negative integers such that l_v ≤ u_v. The InterDistance constraint ensures that all variables among a set of variables x_1, ..., x_n are pairwise at distance at least p, i.e. |x_i − x_j| ≥ p for all i ≠ j. The AllDifferent constraint, the GCC, and the InterDistance constraint are widely used in scheduling problems. For instance, in scheduling problems where tasks with unit processing time compete for a single
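The thesis concerns propagators, which prune domains during search; the constraints themselves, as satisfaction conditions on a complete assignment, can be sketched directly. These checker functions are illustrative assumptions, not the thesis's propagation algorithms.

```python
from collections import Counter

def satisfies_gcc(assignment, bounds):
    """Global cardinality constraint: each value v must occur between
    bounds[v][0] and bounds[v][1] times in the assignment (values
    absent from `bounds` are unconstrained)."""
    counts = Counter(assignment)
    return all(lo <= counts[v] <= hi for v, (lo, hi) in bounds.items())

def satisfies_inter_distance(values, p):
    """InterDistance constraint: |x_i - x_j| >= p for all i != j.
    After sorting, it suffices to check consecutive pairs."""
    xs = sorted(values)
    return all(b - a >= p for a, b in zip(xs, xs[1:]))
```

Note how the three constraints nest: AllDifferent over integers is InterDistance with p = 1, and it is also the GCC where every value has bounds (0, 1); this is why propagation techniques for one often transfer to the others.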
Testing the Equivalence of Regular Languages
, 2009
Abstract

Cited by 2 (0 self)
The minimal deterministic finite automaton is generally used to decide the equality of regular languages. Antimirov and Mosses proposed a rewrite system for deciding the equivalence of regular expressions, of which Almeida et al. presented an improved variant. Hopcroft and Karp proposed an almost linear algorithm for testing the equivalence of two deterministic finite automata that avoids minimisation. In this paper we improve the best-case running time, present an extension of this algorithm to non-deterministic finite automata, and establish a relationship between this algorithm and the one proposed by Almeida et al. We also present some experimental comparative results. All these algorithms are closely related to the recent coalgebraic approach to automata proposed by Rutten.
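The Hopcroft-Karp idea the paper builds on merges states that must be equivalent using union-find, and reports inequivalence only when a merged pair disagrees on acceptance. A minimal sketch under assumed data structures: `delta` is a transition table shared by both automata, and the function name is hypothetical.

```python
def dfa_equivalent(start1, start2, accepting, delta, alphabet):
    """Test whether two DFA start states accept the same language,
    without minimising either automaton. Union-find over assumed-
    equivalent state pairs keeps the running time almost linear."""
    parent = {}
    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x
    stack = [(start1, start2)]
    while stack:
        p, q = stack.pop()
        if (p in accepting) != (q in accepting):
            return False                   # a forced pair disagrees
        rp, rq = find(p), find(q)
        if rp == rq:
            continue                       # pair already merged
        parent[rp] = rq                    # assume p ~ q ...
        for a in alphabet:                 # ... so successors must agree
            stack.append((delta[p][a], delta[q][a]))
    return True
```

For example, a 2-state "even number of a's" DFA and a redundant 3-state machine for the same language are reported equivalent, even though neither is minimal.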
HI-CFG: Construction by Binary Analysis, and Application to Attack Polymorphism
Abstract

Cited by 1 (1 self)
Abstract. Security analysis often requires understanding both the control- and data-flow structure of a binary. We introduce a new program representation, a hybrid information- and control-flow graph (HI-CFG), and give algorithms to infer it from an instruction-level trace. As an application, we consider the task of generalizing an attack against a program whose inputs undergo complex transformations before reaching a vulnerability. We apply the HI-CFG to find the parts of the program that implement each transformation, and then generate new attack inputs under a user-specified combination of transformations. Structural knowledge allows our approach to scale to applications that are infeasible with monolithic symbolic execution. Such attack polymorphism shows the insufficiency of any filter that does not support all the same transformations as the vulnerable application. In case studies, we show this attack capability against a PDF viewer and a word processor.
Reflections on complexity of ML type reconstruction
, 1997
Abstract

Cited by 1 (0 self)
This is a collection of some more or less chaotic remarks on the ML type system, definitely not sufficient to fill a research paper of reasonable quality, but perhaps interesting enough to be written down as a note. At the beginning the idea was to investigate the complexity of type reconstruction and typability in bounded-order fragments of ML. Unexpectedly the problem turned out to be hard, and finally I obtained only partial results. I do not feel like spending more time on this topic, so the text is not polished; the proofs, if included at all, are only sketched and of rather poor mathematical quality. I believe, however, that some remarks, especially those of a "philosophical" nature, shed some light on the ML type system and may be of some value to the reader interested in the interaction between theory and practice of ML type reconstruction. 1 Introduction The ML type system was developed by Robin Milner in the late seventies [26, 3], but was influenced by much ol...