Results 1–7 of 7
Enumerable sets and quasireducibility
 Annals of Pure and Applied Logic
, 1998
Cited by 3 (2 self)
Abstract. We consider the computably enumerable sets under the relation of Q-reducibility. We first give several results comparing the upper semilattice of c.e. Q-degrees, ⟨R_Q, ≤_Q⟩, under this reducibility with the more familiar structure of the c.e. Turing degrees. In our final section, we use coding methods to show that the elementary theory of ⟨R_Q, ≤_Q⟩ is undecidable.

1 Introduction. Classical recursion theory first arose in order to study the inherent difficulty of mathematical problems. By far the most deeply studied notion relating the difficulty of one problem to another has been that given by Turing reducibility. The reason for this is that Turing reducibility seems to give the most general means of obtaining finite information about one object given finite information about another; hence, as the limiting case of using information, it is the most natural object of study for purely theoretical investigations of relative computability and definability. Nevertheless, for specific problems, particularly those arising in the study of algebraic structures, other reducibilities are actually the correct ones to consider. These reducibilities are usually less general, or "stronger", since they arise by putting limits of some kind on what sort of information can be used in a relative solution of one problem given another. For example, weak truth table (wtt) reducibility imposes the additional condition that the amount of information used in a relative computation can be bounded in advance by a computable function. In the case of (computably presentable) infinite dimensional vector spaces, it turns out that the inherent difficulty of constructing bases for subspaces coincides exactly with the relation of wtt reducibility, rather than Turing reducibility. A similar situation arises in combinatorial group theory, where so-called quasi-reducibility, or Q-reducibility, turns out to be a more useful means of comparing word problems than ordinary T-reducibility.
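For reference, the two stronger reducibilities mentioned in the abstract have the following standard definitions (stated in the usual recursion-theoretic notation, with W_e the e-th c.e. set and Φ_e the e-th Turing functional; the abstract itself does not spell them out):

```latex
% Q-reducibility: membership questions about A are answered by
% positive (subset) conditions on B.
A \le_Q B \;\iff\; (\exists\, \text{computable } f)(\forall x)\,
  \bigl[\, x \in A \iff W_{f(x)} \subseteq B \,\bigr]

% Weak truth-table reducibility: Turing reducibility in which the
% use of the oracle computation is bounded in advance by a
% computable function g.
A \le_{wtt} B \;\iff\; (\exists e)(\exists\, \text{computable } g)\,
  \bigl[\, A = \Phi_e^B \text{ and } \mathrm{use}(\Phi_e^B; x) \le g(x)
  \text{ for all } x \,\bigr]
```

Both reducibilities imply Turing reducibility, which is why the abstract describes them as "stronger" (less general) notions.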
The ∀∃theory of R(≤, ∨, ∧) is undecidable
 Trans. Amer. Math. Soc
, 2004
Cited by 2 (0 self)
Abstract. The three quantifier theory of (R, ≤T), the recursively enumerable degrees under Turing reducibility, was proven undecidable by Lempp, Nies and Slaman (1998). The two quantifier theory includes the lattice embedding problem and its decidability is a longstanding open question. A negative solution to this problem seems out of reach of the standard methods of interpretation of theories because the language is relational. We prove the undecidability of a fragment of the theory of R that lies between the two and three quantifier theories with ≤T but includes function symbols. Theorem. The two quantifier theory of (R, ≤, ∨, ∧), the r.e. degrees with Turing reducibility, supremum and infimum (taken to be any total function extending the infimum relation on R), is undecidable. The same result holds for various lattices of ideals of R which are natural extensions of R preserving join and infimum when it exists.
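For orientation, the two quantifier (∀∃) fragment at issue consists of sentences of the following shape in the enriched language with ≤, ∨, ∧ (a standard formulation, not quoted from the paper):

```latex
% A two-quantifier (\forall\exists) sentence: a single block of
% universal quantifiers followed by a single block of existential
% quantifiers, with \varphi quantifier-free in \le, \vee, \wedge.
(\forall x_1 \cdots \forall x_n)(\exists y_1 \cdots \exists y_m)\;
  \varphi(x_1,\dots,x_n,\, y_1,\dots,y_m)
```

The presence of the function symbols ∨ and ∧ is what lets nested terms smuggle extra expressive power into this fragment, which is why the result does not settle the purely relational two quantifier theory of (R, ≤T).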
Abstract
Cited by 1 (0 self)
We present an explicit measurement in the Fourier basis that solves an important case of the Hidden Subgroup Problem, including the case to which Graph Isomorphism reduces. This entangled measurement uses k = log₂|G| registers, and each of the 2^k subsets of the registers contributes some information.
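As background, the Hidden Subgroup Problem referred to here is standardly stated as follows (this formulation is general background, not taken from the abstract):

```latex
% Hidden Subgroup Problem: given a finite group G and oracle access
% to a function f : G \to S that is constant on each left coset of
% an unknown subgroup H \le G and distinct on distinct cosets, i.e.
f(x) = f(y) \;\iff\; xH = yH,
% determine (a generating set for) H.
```

Graph Isomorphism reduces to the case where G is the symmetric group S_n, which is the "important case" the abstract alludes to.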
Computational Processes, Observers and Turing Incompleteness
Cited by 1 (1 self)
We propose a formal definition of Wolfram's notion of computational process based on iterated transducers together with a weak observer, a model of computation that captures some aspects of physics-like computation. These processes admit a natural classification into decidable, intermediate and complete, where intermediate processes correspond to recursively enumerable sets of intermediate degree in the classical setting. It is shown that a standard finite injury priority argument will not suffice to establish the existence of an intermediate computational process.
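For context, the "standard finite injury priority argument" in the classical setting is the Friedberg–Muchnik construction, which builds c.e. sets A and B meeting the following requirements (the standard textbook form, given here only as background):

```latex
% Friedberg–Muchnik requirements: for every index e,
R_{2e}:\quad A \ne \Phi_e^{B}
R_{2e+1}:\quad B \ne \Phi_e^{A}
% Meeting all R_i makes deg(A) and deg(B) Turing incomparable,
% hence both are intermediate: strictly between 0 and 0'.
```

The abstract's point is that this construction does not transfer to their transducer-based model, so intermediate computational processes, if they exist, require a different argument.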
Lattice Embeddings below a Nonlow Recursively Enumerable Degree
 Israel J. Math
, 1996
We introduce techniques that allow us to embed below an arbitrary nonlow₂ recursively enumerable degree any lattice currently known to be embeddable into the recursively enumerable degrees.

1 Introduction. One of the most basic and important questions concerning the structure of the upper semilattice R of recursively enumerable degrees is the embedding question: what (finite) lattices can be embedded as lattices into R? This question has a long and rich history. After the proof of the density theorem by Sacks [31], Shoenfield [32] made a conjecture, one consequence of which would be that no lattice embeddings into R were possible. Lachlan [21] and Yates [40] independently refuted Shoenfield's conjecture by proving that the 4-element Boolean algebra could be embedded into R (even preserving 0). Using a little lattice representation theory, this result was subsequently extended by Lachlan, Lerman, and Thomason [38], [36], who proved that all countable distributive lattices could be embedded (pre...
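The Lachlan–Yates result mentioned above can be stated concretely as follows (a standard rendering of the theorem, in my notation):

```latex
% There exist c.e. degrees a, b that are Turing incomparable and
% form an exact pair over 0:
a \,|\, b, \qquad a \wedge b = \mathbf{0}
% so that the four degrees
\{\, \mathbf{0},\ a,\ b,\ a \vee b \,\}
% are a copy of the 4-element Boolean algebra (the "diamond")
% inside R, with least element 0 preserved.
```

The subtle part is the infimum: joins always exist in R, but forcing a ∧ b = 0 requires a priority construction, which is why Shoenfield's conjecture was plausible before this refutation.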
Universality, Turing Incompleteness and Observers
The development of the mathematical theory of computability was motivated in large part by the foundational crisis in mathematics. D. Hilbert suggested an antidote to all the foundational problems that were discovered in the late 19th century: his proposal, in essence, was to formalize mathematics and construct a finite set of axioms that is strong enough to prove all proper theorems, but no more. Thus a proof of consistency and a proof of completeness were required. These proofs should be carried out only by strictly finitary means so as to be beyond any reasonable criticism. As Hilbert pointed out [19], to carry out this project one needs to develop a better understanding of proofs as objects of mathematical discourse: "To reach our goal, we must make the proofs as such the object of our investigation; we are thus compelled to a sort of proof theory which studies operations with the proofs themselves." Furthermore, Hilbert hoped to find a single, mechanical procedure that would, at least in principle, provide correct answers to all well-defined questions