Results 1-10 of 37
Introducing the ITP tool: a tutorial
 Journal of Universal Computer Science
, 2006
"... Abstract: We present a tutorial of the ITP tool, a rewritingbased theorem prover that can be used to prove inductive properties of membership equational specifications. We also introduce membership equational logic as a formal language particularly adequate for specifying and verifying semantic dat ..."
Abstract

Cited by 16 (6 self)
Abstract: We present a tutorial of the ITP tool, a rewriting-based theorem prover that can be used to prove inductive properties of membership equational specifications. We also introduce membership equational logic as a formal language particularly adequate for specifying and verifying semantic data structures, such as ordered lists, binary search trees, priority queues, and powerlists. The ITP tool is a Maude program that makes extensive use of the reflective capabilities of this system. In fact, rewriting-based proof simplification steps are directly executed by the powerful underlying Maude rewriting engine. The ITP tool is currently available as a web-based application that includes a module editor, a formula editor, and a command editor. These editors allow users to create and modify their specifications, to formalize properties about them, and to guide their proofs by filling and submitting web forms. Key Words: inductive theorem proving, semantic data structures, membership equational logic, ITP
Composition with Target Constraints
, 2010
"... It is known that the composition of schema mappings, each specified by sourcetotarget tgds (sttgds), can be specified by a secondorder tgd (SO tgd). We consider the question of what happens when target constraints are allowed. Specifically, we consider the question of specifying the composition ..."
Abstract

Cited by 12 (2 self)
It is known that the composition of schema mappings, each specified by source-to-target tgds (s-t tgds), can be specified by a second-order tgd (SO tgd). We consider the question of what happens when target constraints are allowed. Specifically, we consider the question of specifying the composition of standard schema mappings (those specified by s-t tgds, target egds, and a weakly acyclic set of target tgds). We show that SO tgds, even with the assistance of arbitrary source constraints and target constraints, cannot in general specify the composition of two standard schema mappings. Therefore, we introduce source-to-target second-order dependencies (st-SO dependencies), which are similar to SO tgds but allow equations in the conclusion. We show that st-SO dependencies (along with target egds and target tgds) are sufficient to express the composition of every finite sequence of standard schema mappings.
TIOA User Guide and Reference Manual
, 2005
"... TIOA is a simple formal language for modeling distributed systems with timing as collections of interacting state machines, called timed input/output automata. The TIOA Toolkit supports a range of validation methods, including simulation and machinechecked proofs. This user guide and reference manu ..."
Abstract

Cited by 12 (2 self)
TIOA is a simple formal language for modeling distributed systems with timing as collections of interacting state machines, called timed input/output automata. The TIOA Toolkit supports a range of validation methods, including simulation and machine-checked proofs. This user guide and reference manual includes a tutorial on the use of timed input/output automata and the TIOA language to model timed systems. It also includes a complete definition of the TIOA language.
Integration in valued fields
 in Algebraic Geometry and Number Theory, Progr. Math. 253, Birkhäuser
, 2006
"... Abstract. We develop a theory of integration over valued fields of residue characteristic zero. In particular we obtain new and basefield independent foundations for integration over local fields of large residue characteristic, extending results of Denef,Loeser, Cluckers. The method depends on an ..."
Abstract

Cited by 8 (2 self)
Abstract. We develop a theory of integration over valued fields of residue characteristic zero. In particular, we obtain new and base-field-independent foundations for integration over local fields of large residue characteristic, extending results of Denef, Loeser, and Cluckers. The method depends on an analysis of definable sets up to definable bijections. We obtain a precise description of the Grothendieck semigroup of such sets in terms of related groups over the residue field and value group. This yields new invariants of all definable bijections, as well as invariants of measure-preserving bijections.
Supervaluationism and Its Logics
"... If we adopt a supervaluational semantics for vagueness, what sort of logic results? As it turns out, the answer depends crucially on how the standard notion of validity as truth preservation is recast. There are several ways of doing this within a supervaluational framework, the main alternative bei ..."
Abstract

Cited by 5 (0 self)
If we adopt a supervaluational semantics for vagueness, what sort of logic results? As it turns out, the answer depends crucially on how the standard notion of validity as truth preservation is recast. There are several ways of doing this within a supervaluational framework, the main alternative being between 'global' construals (e.g. an argument is valid if and only if it preserves truth-under-all-precisifications) and 'local' construals (an argument is valid if and only if, under all precisifications, it preserves truth). The former alternative is by far the more popular, but I argue in favour of the latter, for (i) it does not suffer from a number of serious objections, and (ii) it makes it possible to restore global validity as a defined notion. Supervaluationism is a mixed bag. It is sometimes described as the 'standard' theory of vagueness, at least in so far as vagueness is construed as a semantic phenomenon, but exactly what that standard theory amounts to is far from clear. In fact, it is pretty clear that there is not just one supervaluational semantics out there; there are lots of such semantics, and although it is true that they all exploit the same insight, their relative differences are by no means immaterial. For one thing, a lot depends on how exactly supervaluations are constructed, that is, on how exactly we come to establish the truth-value of a given statement. (And when I say that a lot depends on this, I mean that different explanations may give rise to different philosophical worries, or justify different reactions.) Secondly, and equally importantly, a lot depends on how a given supervaluationary machinery is brought into play when it comes to explaining the logic of the language, that is, not the notion of truth, or 'supertruth', as it applies to individual statements, but the notion of validity, or 'supervalidity', as it applies to whole arguments. (I am thinking, for instance, of how different explanations may bear on the question of whether, or to what extent, vagueness involves a departure from classical logic.) Here I want to focus on this second part of the story. However, since the notion of validity depends on the notion of truth, or so one may argue, I also want to comment briefly on the first.
A Note on the Testability of Ramsey’s Class
"... In property testing, the goal is to distinguish between objects that satisfy some desirable property and objects that are far from satisfying it, after examining only a small, random sample of the object in question. Although much of the literature has focused on properties of graphs, very recentl ..."
Abstract

Cited by 2 (2 self)
In property testing, the goal is to distinguish between objects that satisfy some desirable property and objects that are far from satisfying it, after examining only a small, random sample of the object in question. Although much of the literature has focused on properties of graphs, very recently several strong results on hypergraphs have appeared. We revisit a logical result obtained by Alon et al. [1] in the light of these recent results. The main result is the testability of all properties (of relational structures) expressible in sentences of Ramsey’s class.
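As a concrete illustration of the property-testing setting described above (not of the paper's own construction), here is a minimal sketch of a classic sortedness spot-checker: it queries only a small number of random positions, always accepts sorted lists, and rejects lists that are far from sorted with high probability. The function name and parameters are illustrative assumptions.

```python
import random

def sortedness_tester(xs, eps=0.1, trials=None):
    """Accept if xs is sorted; reject (w.h.p.) if xs is far from sorted.

    For each sampled index i, binary-search for xs[i], breaking ties by
    index. In a sorted list the search must land exactly on position i;
    if it ever misses, the list cannot be sorted."""
    n = len(xs)
    if n < 2:
        return True
    trials = trials or int(2 / eps)
    for _ in range(trials):
        i = random.randrange(n)
        lo, hi = 0, n
        while lo < hi:
            mid = (lo + hi) // 2
            if mid == i:
                break  # search reached the sampled position: consistent
            if xs[mid] < xs[i] or (xs[mid] == xs[i] and mid < i):
                lo = mid + 1
            else:
                hi = mid
        else:
            return False  # search missed position i: not sorted
    return True
```

The query count is independent of the input length, which is the defining feature of a property tester: correctness is traded for sublinear (here, logarithmic per sample) access to the object.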
On the Brightness of the Thomson Lamp. A Prolegomenon to Quantum Recursion Theory
, 2009
"... Some physical aspects related to the limit operations of the Thomson lamp are discussed. Regardless of the formally unbounded and even infinite number of “steps” involved, the physical limit has an operational meaning in agreement with the Abel sums of infinite series. The formal analogies to accele ..."
Abstract

Cited by 1 (1 self)
Some physical aspects related to the limit operations of the Thomson lamp are discussed. Regardless of the formally unbounded and even infinite number of "steps" involved, the physical limit has an operational meaning in agreement with the Abel sums of infinite series. The formal analogies to accelerated (hyper-)computers and the recursion-theoretic diagonal methods are discussed. As quantum information is not bound by the mutually exclusive states of classical bits, it allows a consistent representation of fixed-point states of the diagonal operator. In an effort to reconstruct the self-contradictory feature of diagonalization, a generalized diagonal method allowing no quantum fixed points is proposed.
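The Abel sum mentioned in this abstract assigns the value 1/2 to the lamp's on/off alternation, i.e. to Grandi's series 1 - 1 + 1 - 1 + ...; a quick numerical sketch of the idea (the helper name `abel_partial` is hypothetical, not from the paper):

```python
# Abel summation: evaluate the power series sum of a(n) * x**n for
# x slightly below 1; the Abel sum is the limit as x -> 1 from below.
def abel_partial(a, x, terms=10000):
    """Truncated Abel mean of the series with coefficients a(n)."""
    return sum(a(n) * x**n for n in range(terms))

grandi = lambda n: (-1) ** n  # coefficients of 1 - 1 + 1 - 1 + ...

# The Abel means approach 1/2 as x approaches 1 from below,
# matching the closed form 1 / (1 + x) of the power series.
vals = [abel_partial(grandi, x) for x in (0.9, 0.99, 0.999)]
```

This is the sense in which a divergent on/off sequence can still carry an operationally meaningful limit value.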
Is the Network Turing-Complete?
"... Abstract—Ensuring correct network behavior is hard. This is the case even for simple networks, and adding middleboxes only complicates this task. In this paper, we demonstrate a fundamental property of networks. Namely, we show a way of using a network to emulate the Rule 110 cellular automaton. We ..."
Abstract

Cited by 1 (0 self)
Abstract: Ensuring correct network behavior is hard. This is the case even for simple networks, and adding middleboxes only complicates this task. In this paper, we demonstrate a fundamental property of networks. Namely, we show a way of using a network to emulate the Rule 110 cellular automaton. We do so using just a set of network devices with simple features such as packet matching, header rewriting, and round-robin load-balancing. Therefore, we show that a network can emulate any Turing machine. This ultimately means that analyzing dynamic network behavior can be as hard as analyzing an arbitrary program. Analyzing a network containing middleboxes is already understood to be hard. Our result shows that even using only statically configured switches can make the problem intractable.
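The Rule 110 automaton the abstract invokes is itself tiny, which is what makes the reduction striking: emulating the update rule below with packet matching and header rewriting suffices for Turing-completeness. A minimal simulation sketch (the names `RULE` and `step` and the zero-boundary convention are illustrative, not taken from the paper):

```python
# Rule 110: each cell's next state depends on the 3-cell neighborhood
# (left, self, right); the rule number's binary digits give the output
# for each of the 8 possible neighborhoods.
RULE = 110  # 0b01101110

def step(cells):
    """One synchronous update of a finite row (cells outside are 0)."""
    n = len(cells)
    out = []
    for i in range(n):
        left = cells[i - 1] if i > 0 else 0
        right = cells[i + 1] if i < n - 1 else 0
        pattern = (left << 2) | (cells[i] << 1) | right
        out.append((RULE >> pattern) & 1)
    return out

# A single live cell grows a characteristic triangular pattern leftward.
row = [0] * 15 + [1]
for _ in range(5):
    row = step(row)
```

Each network device in the paper's construction only needs to compute one such table lookup per "cell", which is why simple stateless features are enough.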
Syntactic characterizations of polynomial time optimization classes
 Chicago Journal of Theoretical Computer Science
, 2008
"... The characterization of important complexity classes by logical descriptions has been an important and prolific area of Descriptive complexity. However, the central focus of the research has been the study of classes like P, NP, L and NL, corresponding to decision problems (e.g. the characterization ..."
Abstract

Cited by 1 (0 self)
The characterization of important complexity classes by logical descriptions has been an important and prolific area of descriptive complexity. However, the central focus of the research has been the study of classes like P, NP, L, and NL, corresponding to decision problems (e.g. the characterization of NP by Fagin [Fag74] and of P by Grädel [E. 91]). In contrast, optimization problems have received much less attention. Optimization problems corresponding to the NP class have been characterized in terms of logic expressions by Papadimitriou and Yannakakis, Panconesi and Ranjan, Kolaitis and Thakur, Khanna et al., and by Zimand. In this paper, we attempt to characterize the optimization versions of P via expressions in second-order logic, many of them using universal Horn formulae with successor relations. These results nicely complement those of Kolaitis and Thakur [KT94] for polynomially bounded NP-optimization problems. The polynomially bounded versions of maximization and minimization problems are treated first, and then the maximization problems in the "not necessarily polynomially bounded" class.
Possible physical universes
, 2006
"... The purpose of this paper is to discuss the various types of physical universe which could exist according to modern mathematical physics. The paper begins with an introduction that approaches the question from the viewpoint of ontic structural realism. Section 2 takes the case of the ‘multiverse ’ ..."
Abstract
 Add to MetaCart
The purpose of this paper is to discuss the various types of physical universe which could exist according to modern mathematical physics. The paper begins with an introduction that approaches the question from the viewpoint of ontic structural realism. Section 2 takes the case of the 'multiverse' of spatially homogeneous universes, and analyses the famous Collins-Hawking argument, which purports to show that our own universe is a very special member of this collection. Section 3 considers the multiverse of all solutions to the Einstein field equations, and continues the discussion of whether the notions of special and typical can be defined within such a collection.