Results 1–10 of 28
Introducing the ITP tool: a tutorial
Journal of Universal Computer Science, 2006
Abstract

Cited by 15 (6 self)
We present a tutorial of the ITP tool, a rewriting-based theorem prover that can be used to prove inductive properties of membership equational specifications. We also introduce membership equational logic as a formal language particularly adequate for specifying and verifying semantic data structures, such as ordered lists, binary search trees, priority queues, and powerlists. The ITP tool is a Maude program that makes extensive use of the reflective capabilities of this system. In fact, rewriting-based proof simplification steps are directly executed by the powerful underlying Maude rewriting engine. The ITP tool is currently available as a web-based application that includes a module editor, a formula editor, and a command editor. These editors allow users to create and modify their specifications, to formalize properties about them, and to guide their proofs by filling in and submitting web forms. Key Words: inductive theorem proving, semantic data structures, membership equational logic, ITP
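As an illustration of the kind of inductive property over semantic data structures that ITP is designed to prove, here is a plain Python sketch (not ITP or Maude syntax; the function names are my own). ITP would prove the general statement "insertion into an ordered list yields an ordered list" by induction; the code below can only check instances.

```python
# Illustrative sketch only: the property ITP would prove by induction,
# checked here on concrete instances in ordinary Python.

def insert(x, xs):
    """Insert x into an ordered list, preserving order. The recursion
    mirrors an equational definition: empty-list case and head-comparison case."""
    if not xs:
        return [x]
    if x <= xs[0]:
        return [x] + xs
    return [xs[0]] + insert(x, xs[1:])

def is_ordered(xs):
    """The invariant: each element is <= its successor."""
    return all(a <= b for a, b in zip(xs, xs[1:]))

# The inductive claim "is_ordered(xs) implies is_ordered(insert(x, xs))",
# tested on instances rather than proved:
assert insert(3, [1, 2, 5, 7]) == [1, 2, 3, 5, 7]
assert is_ordered(insert(3, [1, 2, 5, 7]))
```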
Composition with Target Constraints
, 2010
Abstract

Cited by 11 (2 self)
It is known that the composition of schema mappings, each specified by source-to-target tgds (st-tgds), can be specified by a second-order tgd (SO tgd). We consider the question of what happens when target constraints are allowed. Specifically, we consider the question of specifying the composition of standard schema mappings (those specified by st-tgds, target egds, and a weakly acyclic set of target tgds). We show that SO tgds, even with the assistance of arbitrary source constraints and target constraints, cannot in general specify the composition of two standard schema mappings. Therefore, we introduce source-to-target second-order dependencies (st-SO dependencies), which are similar to SO tgds but allow equations in the conclusion. We show that st-SO dependencies (along with target egds and target tgds) are sufficient to express the composition of every finite sequence of standard schema mappings.
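For orientation, generic examples of the dependency classes involved (my own illustrations, not drawn from the paper) can be written as follows:

```latex
% An st-tgd: every source fact must be reproduced in the target,
% possibly via an existentially quantified intermediate value y.
\forall x\,\forall z\;\bigl( E(x,z) \rightarrow \exists y\,( F(x,y) \wedge F(y,z) ) \bigr)

% An SO tgd replaces the existential by an existentially quantified
% Skolem function f over the whole formula:
\exists f\;\forall x\,\forall z\;\bigl( E(x,z) \rightarrow F(x,f(x,z)) \wedge F(f(x,z),z) \bigr)

% An st-SO dependency may additionally use equations in the conclusion,
% e.g. a conjunct of the form f(x) = g(x), which plain SO tgds disallow.
```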
TIOA User Guide and Reference Manual
, 2005
Abstract

Cited by 10 (2 self)
TIOA is a simple formal language for modeling distributed systems with timing as collections of interacting state machines, called timed input/output automata. The TIOA Toolkit supports a range of validation methods, including simulation and machine-checked proofs. This user guide and reference manual includes a tutorial on the use of timed input/output automata and the TIOA language to model timed systems. It also includes a complete definition of the TIOA language.
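The automaton model can be approximated in a few lines of ordinary code. The sketch below is plain Python, not TIOA syntax, and the class, period bound, and action names are my own invention; it only shows the two kinds of steps a timed I/O automaton takes: time-passage steps constrained by a deadline, and discrete (input/output) steps.

```python
# Toy sketch of a timed automaton: a periodic sender whose "tick" output
# must occur exactly when its clock reaches the period.

class PeriodicSender:
    def __init__(self, period):
        self.period = period
        self.clock = 0.0   # continuously evolving state variable
        self.ticks = 0

    def advance(self, dt):
        """Time-passage step: time may pass only while the deadline
        (clock <= period) is not violated, a stopping condition in the
        spirit of TIOA trajectories."""
        assert self.clock + dt <= self.period, "deadline would be missed"
        self.clock += dt

    def tick(self):
        """Discrete output step: enabled exactly at the deadline."""
        assert self.clock == self.period
        self.clock = 0.0
        self.ticks += 1

a = PeriodicSender(period=2.0)
a.advance(1.5)
a.advance(0.5)
a.tick()
assert a.ticks == 1 and a.clock == 0.0
```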
Integration in valued fields
in Algebraic Geometry and Number Theory, Progr. Math. 253, Birkhäuser, 2006
Abstract

Cited by 6 (2 self)
We develop a theory of integration over valued fields of residue characteristic zero. In particular, we obtain new and base-field-independent foundations for integration over local fields of large residue characteristic, extending results of Denef, Loeser, and Cluckers. The method depends on an analysis of definable sets up to definable bijections. We obtain a precise description of the Grothendieck semigroup of such sets in terms of related groups over the residue field and value group. This yields new invariants of all definable bijections, as well as invariants of measure-preserving bijections.
Supervaluationism and Its Logics
Abstract

Cited by 5 (0 self)
If we adopt a supervaluational semantics for vagueness, what sort of logic results? As it turns out, the answer depends crucially on how the standard notion of validity as truth preservation is recast. There are several ways of doing this within a supervaluational framework, the main alternative being between ‘global’ construals (e.g. an argument is valid if and only if it preserves truth-under-all-precisifications) and ‘local’ construals (an argument is valid if and only if, under all precisifications, it preserves truth). The former alternative is by far the more popular, but I argue in favour of the latter, for (i) it does not suffer from a number of serious objections, and (ii) it makes it possible to restore global validity as a defined notion.

Supervaluationism is a mixed bag. It is sometimes described as the ‘standard’ theory of vagueness, at least in so far as vagueness is construed as a semantic phenomenon, but exactly what that standard theory amounts to is far from clear. In fact, it is pretty clear that there is not just one supervaluational semantics out there; there are lots of such semantics, and although it is true that they all exploit the same insight, their relative differences are by no means immaterial. For one thing, a lot depends on how exactly supervaluations are constructed, that is, on how exactly we come to establish the truth-value of a given statement. (And when I say that a lot depends on this I mean to say that different explanations may give rise to different philosophical worries, or justify different reactions.) Secondly, and equally importantly, a lot depends on how a given supervaluationary machinery is brought into play when it comes to explaining the logic of the language, that is, not the notion of truth, or ‘supertruth’, as it applies to individual statements, but the notion of validity, or ‘supervalidity’, as it applies to whole arguments. (I am thinking for instance of how different explanations may bear on the question of whether, or to what extent, vagueness involves a departure from classical logic.) Here I want to focus on this second part of the story. However, since the notion of validity depends on the notion of truth (or so one may argue), I also want to comment briefly on the first.
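The global/local contrast can be made concrete with a toy propositional model. In the sketch below (my own illustration, not from the paper), precisifications are classical valuations of two atoms; global validity preserves supertruth, local validity preserves truth at each precisification, and the two notions visibly come apart on a contingent premise.

```python
# Toy contrast between global and local supervaluational validity.
from itertools import product

atoms = ["p", "q"]
# A precisification is a classical assignment of truth values to the atoms.
precisifications = [dict(zip(atoms, vals))
                    for vals in product([False, True], repeat=len(atoms))]

def supertrue(formula):
    """Supertruth: true under every precisification."""
    return all(formula(v) for v in precisifications)

def globally_valid(premises, conclusion):
    """Global: if all premises are supertrue, the conclusion is supertrue."""
    return (not all(supertrue(f) for f in premises)) or supertrue(conclusion)

def locally_valid(premises, conclusion):
    """Local: under each precisification, true premises force a true conclusion."""
    return all((not all(f(v) for f in premises)) or conclusion(v)
               for v in precisifications)

p = lambda v: v["p"]
not_p = lambda v: not v["p"]

# A contingent p is supertrue under no construal, so global validity holds
# vacuously for any conclusion, while local validity correctly fails:
assert globally_valid([p], not_p)       # vacuously globally valid
assert not locally_valid([p], not_p)    # locally invalid
assert locally_valid([p], p) and globally_valid([p], p)
```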
A Note on the Testability of Ramsey’s Class
Abstract

Cited by 2 (2 self)
In property testing, the goal is to distinguish between objects that satisfy some desirable property and objects that are far from satisfying it, after examining only a small, random sample of the object in question. Although much of the literature has focused on properties of graphs, very recently several strong results on hypergraphs have appeared. We revisit a logical result obtained by Alon et al. [1] in the light of these recent results. The main result is the testability of all properties (of relational structures) expressible in sentences of Ramsey’s class.
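To fix intuitions about the property-testing paradigm itself (this is a generic illustration, not the hypergraph machinery of the paper), here is a tester for sortedness of a list that reads only randomly sampled positions: it always accepts sorted inputs, and with enough trials it rejects, with high probability, inputs that are far from sorted.

```python
# Generic property-testing sketch: distinguish sorted lists from lists
# far from sorted by inspecting random pairs of positions only.
import random

def sortedness_tester(xs, trials=200):
    n = len(xs)
    for _ in range(trials):
        i, j = sorted(random.sample(range(n), 2))   # random pair, i < j
        if xs[i] > xs[j]:
            return False   # a witness pair: certainly not sorted
    return True            # no violation seen: probably close to sorted

random.seed(0)
assert sortedness_tester(list(range(100)))            # sorted: always accepted
assert not sortedness_tester(list(range(100))[::-1])  # reversed: rejected
```

One-sided error is typical of such testers: a "reject" answer comes with a concrete witness, while "accept" is only probabilistic.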
On the Brightness of the Thomson Lamp. A Prolegomenon to Quantum Recursion Theory
, 2009
Abstract

Cited by 1 (1 self)
Some physical aspects related to the limit operations of the Thomson lamp are discussed. Regardless of the formally unbounded and even infinite number of “steps” involved, the physical limit has an operational meaning in agreement with the Abel sums of infinite series. The formal analogies to accelerated (hyper)computers and the recursion-theoretic diagonal methods are discussed. As quantum information is not bound by the mutually exclusive states of classical bits, it allows a consistent representation of fixed-point states of the diagonal operator. In an effort to reconstruct the self-contradictory feature of diagonalization, a generalized diagonal method allowing no quantum fixed points is proposed.
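The Abel-summation reading of the lamp can be made concrete. Identifying the lamp's alternating on/off states with Grandi's series 1 - 1 + 1 - ..., the Abel sum is the limit, as x approaches 1 from below, of sum_n (-1)^n x^n = 1/(1+x), i.e. 1/2. The following sketch (my own illustration) approximates that limit numerically:

```python
# Abel summation of Grandi's series 1 - 1 + 1 - ..., which models the
# lamp's state sequence. The Abel sum is lim_{x->1^-} 1/(1+x) = 1/2.
def abel_partial(x, terms=10_000):
    """Partial sum of sum_n (-1)^n x^n for 0 < x < 1."""
    return sum((-1) ** n * x ** n for n in range(terms))

for x in (0.9, 0.99, 0.999):
    print(x, abel_partial(x))   # values approach 0.5 as x -> 1
```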
Is the Network Turing-Complete?
Abstract

Cited by 1 (0 self)
Ensuring correct network behavior is hard. This is the case even for simple networks, and adding middleboxes only complicates this task. In this paper, we demonstrate a fundamental property of networks. Namely, we show a way of using a network to emulate the Rule 110 cellular automaton. We do so using just a set of network devices with simple features such as packet matching, header rewriting, and round-robin load balancing. Therefore, we show that a network can emulate any Turing machine. This ultimately means that analyzing dynamic network behavior can be as hard as analyzing an arbitrary program. Analyzing a network containing middleboxes is already understood to be hard. Our result shows that using even only statically configured switches can make the problem intractable.
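Rule 110 itself is easy to state. The sketch below (an ordinary software implementation, not the paper's network encoding) computes one step of the automaton: each cell's next state is the bit of the number 110 indexed by its three-cell neighborhood.

```python
# Rule 110, the elementary cellular automaton the paper's network emulates.
# 110 = 0b01101110: neighborhood (left, self, right) read as a 3-bit index
# selects the next state, e.g. (1,1,0) -> 1, (1,0,0) -> 0.
RULE = 110

def step(cells):
    """One synchronous update with wrapped (circular) boundary."""
    n = len(cells)
    return [
        (RULE >> ((cells[(i - 1) % n] << 2)
                  | (cells[i] << 1)
                  | cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

row = [0] * 15 + [1]   # single live cell
for _ in range(5):
    print("".join(".#"[c] for c in row))
    row = step(row)
```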
Constraint Acquisition: You Can Chase but You Cannot Find
Abstract
We identify established tableaux techniques as an invaluable tool for semantic knowledge acquisition in the design process of relational databases. Sample databases allow users and designers to judge, justify, convey, and test their understanding of the semantics of the future database. In the case of integrity constraints, such sample data can provide considerable assistance for deciding whether or not a constraint captures desirable information about the database. Since constraints can be particularly difficult to grasp in practice, sample databases offer a convenient tool to confirm or reject the usefulness of potential candidate constraints. We pinpoint the Chase and analytical tableaux as two tableaux techniques that are able to automatically generate sample databases for large classes of integrity constraints. The Chase can be used for generating sample data that allows us to reject candidate constraints, whereas analytical tableaux enable us to find all minimal sample databases, which allow us to either accept or reject a candidate constraint.
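The chase idea can be sketched in a few lines. The code below is an illustrative toy, not the paper's procedure; the particular tgd and relation names are my own. It repeatedly fires a single dependency whose premise holds but whose conclusion is missing, inventing labeled nulls for existential variables, until the sample database satisfies the constraint.

```python
# Minimal chase sketch: chasing with the tgd  R(x, y) -> exists z. S(y, z).
from itertools import count

fresh = count()   # source of labeled nulls n0, n1, ...

def chase_edge_tgd(R, S):
    """Extend S until every second component of R appears as a first
    component of S, adding fresh labeled nulls for the existential z."""
    R, S = set(R), set(S)
    changed = True
    while changed:
        changed = False
        for (x, y) in sorted(R):
            if not any(a == y for (a, b) in S):
                S.add((y, f"n{next(fresh)}"))   # fresh labeled null
                changed = True
    return S

sample = chase_edge_tgd(R={("a", "b"), ("b", "c")}, S=set())
# The chased sample database now satisfies the tgd:
assert all(any(a == y for (a, _) in sample) for (_, y) in [("a", "b"), ("b", "c")])
```

Termination is immediate here; in general it is guaranteed only for restricted classes such as weakly acyclic sets of tgds.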
Predicate logic
Abstract
Question 1. Describe this discipline/subdiscipline, and some of its more recent developments. Predicate logic is a subdiscipline of logic that had its roots in the last quarter of the nineteenth century, though it had to wait until the second decade of the twentieth century for a solid foundation. Like any other logic, it is concerned with the validity of arguments, though not of just any kind: its interest lies in reasoning about what is universally true. As such, predicate logic is especially suited to reasoning about mathematical statements and can be considered a generalization of Aristotelian syllogisms. Predicate logic goes beyond syllogisms by introducing predicates with arbitrary numbers of arguments, and quantifiers that allow one to refer either to all or to some of the elements of the universe under consideration. It has a proof theory, which consists of a set of rules that describe how to mechanically derive sentences from a given set of premises (such derivations are called “proofs”), as well as a model theory that assigns meaning to the sentences with respect to structures, so that a given sentence is either true or false in a given structure. Predicate logic is sound, meaning that every sentence that can be derived from a set of premises is true in every structure in which those premises are true.
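The model-theoretic side of this description is easy to demonstrate. In the sketch below (the structure and the predicate are my own toy example, not from the text), sentences are evaluated in a finite structure, so each one comes out definitely true or definitely false, and the order of quantifiers matters.

```python
# Evaluating predicate-logic sentences in a finite structure.
# Structure: universe {0, ..., 4}; binary predicate Less interpreted as <.
universe = range(5)
Less = lambda a, b: a < b

# "forall x exists y. Less(x, y)": false here, since the quantifiers range
# only over the finite universe and 4 has nothing above it.
every_has_larger = all(any(Less(x, y) for y in universe) for x in universe)

# "exists x forall y. not Less(x, y)": true, since 4 is a maximal element.
has_max = any(all(not Less(x, y) for y in universe) for x in universe)

assert every_has_larger is False
assert has_max is True
```

The same sentence evaluated in a different structure (say, all integers) can get a different truth value, which is exactly the structure-relativity the abstract describes.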