Results 1–10 of 135
Fault-tolerant quantum computation
 In Proc. 37th FOCS
, 1996
Abstract

Cited by 201 (4 self)
It has recently been realized that use of the properties of quantum mechanics might speed up certain computations dramatically. Interest in quantum computation has since been growing. One of the main difficulties in realizing quantum computation is that decoherence tends to destroy the information in a superposition of states in a quantum computer, making long computations impossible. A further difficulty is that inaccuracies in quantum state transformations throughout the computation accumulate, rendering long computations unreliable. However, these obstacles may not be as formidable as originally believed. For any quantum computation with t gates, we show how to build a polynomial size quantum circuit that tolerates O(1/log^c t) amounts of inaccuracy and decoherence per gate, for some constant c; the previous bound was O(1/t). We do this by showing that operations can be performed on quantum data encoded by quantum error-correcting codes without decoding this data.
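As a purely classical analogue of the abstract's key idea of operating on encoded data without decoding it, the sketch below uses the 3-bit repetition code: the logical NOT is applied transversally (bit by bit), and a majority vote corrects any single bit flip. This is an illustration only; the paper's actual construction uses quantum error-correcting codes.

```python
import random

def encode(bit):
    # 3-bit repetition code: logical 0 -> 000, logical 1 -> 111.
    return [bit] * 3

def logical_not(codeword):
    # Transversal logical NOT: flip each physical bit independently,
    # so the operation never requires decoding the logical bit first.
    return [1 - b for b in codeword]

def noisy(codeword, p, rng):
    # Independent bit-flip noise with probability p per physical bit.
    return [1 - b if rng.random() < p else b for b in codeword]

def correct(codeword):
    # Majority vote recovers the logical bit if at most one bit flipped.
    return [int(sum(codeword) >= 2)] * 3

rng = random.Random(0)
word = encode(0)
word = logical_not(word)        # logical bit is now 1
word = noisy(word, 0.05, rng)   # perhaps one physical flip
word = correct(word)
assert word == [1, 1, 1]
```

The quantum case is harder because errors are continuous and states cannot be copied, but the structure (encode, operate transversally, correct without decoding) is the same.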
Reliable quantum computers
 Proceedings of the Royal Society A: Mathematical, Physical and Engineering Sciences
, 1998
Abstract

Cited by 123 (3 self)
The new field of quantum error correction has developed spectacularly since its origin less than two years ago. Encoded quantum information can be protected from errors that arise due to uncontrolled interactions with the environment. Recovery from errors can work effectively even if occasional mistakes occur during the recovery procedure. Furthermore, encoded quantum information can be processed without serious propagation of errors. Hence, an arbitrarily long quantum computation can be performed reliably, provided that the average probability of error per quantum gate is less than a certain critical value, the accuracy threshold. A quantum computer storing about 10^6 qubits, with a probability of error per quantum gate of order 10^-6, would be a formidable factoring engine. Even a smaller, less-accurate quantum computer would be able to perform many useful tasks. This paper is based on a talk presented at the ITP Conference on Quantum Coherence
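The accuracy threshold described above is often illustrated with the later textbook estimate for concatenated codes, p_L = p_th (p/p_th)^(2^k) after k levels of concatenation. The sketch below uses that standard formula with illustrative numbers; neither the threshold value nor the parameters are taken from the paper.

```python
def logical_error_rate(p, p_th, levels):
    """Standard concatenated-code estimate: p_L = p_th * (p / p_th)**(2**levels).

    p      -- physical error probability per gate (must be below threshold)
    p_th   -- accuracy threshold (1e-3 below is an illustrative value)
    levels -- number of levels of code concatenation
    """
    return p_th * (p / p_th) ** (2 ** levels)

# With p ten times below an assumed threshold of 1e-3, two levels of
# concatenation already push the logical error rate far below the
# physical one: 1e-3 * (0.1)**4 = 1e-7.
p_L = logical_error_rate(p=1e-4, p_th=1e-3, levels=2)
assert p_L < 1e-6
```

The doubly exponential suppression in the number of levels is what makes arbitrarily long computations possible at polylogarithmic overhead once the physical rate is below threshold.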
A Theory of Quantum Error-Correcting Codes
 Phys. Rev. A
, 1996
Abstract

Cited by 74 (7 self)
Quantum Error Correction will be necessary for preserving coherent states against noise and other unwanted interactions in quantum computation and communication. We develop a general theory of quantum error correction based on encoding states into larger Hilbert spaces subject to known interactions. We obtain necessary and sufficient conditions for the perfect recovery of an encoded state after its degradation by an interaction. The conditions depend only on the behavior of the logical states. We use them to give a recovery-operator-independent definition of error-correcting codes. We relate this definition to four others: the existence of a left inverse of the interaction, an explicit representation of the error syndrome using tensor products, perfect recovery of the completely entangled state, and an information-theoretic identity. Two notions of fidelity and error for imperfect recovery are introduced, one for pure and the other for entangled states. The latter is more appropriate w...
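The necessary and sufficient conditions mentioned in this abstract are, in now-standard notation (a later reformulation; the symbols below are not the paper's own), as follows:

```latex
% A code with orthonormal logical states |c_i> corrects a set of
% errors {E_a} if and only if
\langle c_i \,|\, E_a^\dagger E_b \,|\, c_j \rangle
  \;=\; \Lambda_{ab}\, \delta_{ij},
% where the Hermitian matrix \Lambda_{ab} is independent of the logical
% indices i, j: errors must neither reveal nor distort the encoded
% information, only map the code space to distinguishable subspaces.
```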
Algorithmic Theories Of Everything
, 2000
Abstract

Cited by 31 (15 self)
The probability distribution P from which the history of our universe is sampled represents a theory of everything or TOE. We assume P is formally describable. Since most (uncountably many) distributions are not, this imposes a strong inductive bias. We show that P(x) is small for any universe x lacking a short description, and study the spectrum of TOEs spanned by two Ps, one reflecting the most compact constructive descriptions, the other the fastest way of computing everything. The former derives from generalizations of traditional computability, Solomonoff’s algorithmic probability, Kolmogorov complexity, and objects more random than Chaitin’s Omega, the latter from Levin’s universal search and a natural resource-oriented postulate: the cumulative prior probability of all x incomputable within time t by this optimal algorithm should be 1/t. Between both Ps we find a universal cumulatively enumerable measure that dominates traditional enumerable measures; any such CEM must assign low probability to any universe lacking a short enumerating program. We derive P-specific consequences for evolving observers, inductive reasoning, quantum physics, philosophy, and the expected duration of our universe.
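A toy illustration of the description-length priors discussed above (not Schmidhuber's actual constructions): assign each binary program p the weight 2^-len(p). Over a prefix-free program set, the Kraft inequality guarantees these weights sum to at most 1, and shorter descriptions dominate, which is the inductive bias the abstract refers to.

```python
from fractions import Fraction

def prefix_prior(programs):
    """Toy description-length prior: weight 2**-len(p) per binary program p.
    For a prefix-free set (no program is a prefix of another), the weights
    sum to at most 1 by the Kraft inequality."""
    return {p: Fraction(1, 2 ** len(p)) for p in programs}

# A complete prefix-free code over {0,1}.
weights = prefix_prior(["0", "10", "110", "111"])
assert sum(weights.values()) == 1
assert weights["0"] > weights["110"]  # shorter description => higher prior
```

The actual measures in the paper are defined over all (semi)computable descriptions and are not computable themselves; the toy only shows why short programs carry most of the probability mass.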
Polynomial Simulations of Decohered Quantum Computers
 37th Annual Symposium on Foundations of Computer Science
, 1996
Abstract

Cited by 26 (3 self)
Recently it has become clear that a key issue in quantum computation is understanding how interaction with the environment, or "decoherence", affects the computational power of quantum computers. We adopt the standard physical method of describing systems which are interwound with their environment by "density matrices", and within this framework define a model of decoherence in quantum computation.
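A minimal density-matrix picture of decoherence can be sketched with a single-qubit phase-flip (dephasing) channel; this is an illustrative channel, not the specific noise model defined in the paper.

```python
def dephase(rho, p):
    """One step of a single-qubit dephasing channel on a 2x2 density matrix:
        rho -> (1 - p) * rho + p * Z rho Z
    Since Z rho Z keeps the diagonal and negates the off-diagonal entries,
    the coherences shrink by a factor (1 - 2p) while populations are fixed."""
    a, b = rho[0]
    c, d = rho[1]
    return [[a, (1 - 2 * p) * b],
            [(1 - 2 * p) * c, d]]

# Equal superposition |+><+|: fully coherent to start.
rho = [[0.5, 0.5], [0.5, 0.5]]
for _ in range(10):
    rho = dephase(rho, 0.1)
assert rho[0][0] == 0.5        # populations untouched
assert abs(rho[0][1]) < 0.1    # coherences decay toward zero
```

The decay of off-diagonal terms is exactly the loss of superposition that limits computational power in such models.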
Evolution in quantum causal histories
Abstract

Cited by 23 (3 self)
We provide a precise definition and analysis of quantum causal histories (QCH). A QCH consists of a discrete, locally finite, causal pre-spacetime with matrix algebras encoding the quantum structure at each event. The evolution of quantum states and observables is described by completely positive maps between the algebras at causally related events. We show that this local description of evolution is sufficient and that unitary evolution can be recovered wherever it should actually be expected. This formalism may describe a quantum cosmology without an assumption of global hyperbolicity; it is thus more general than the Wheeler-DeWitt approach. The structure of a QCH is also closely related to quantum information theory and algebraic quantum field theory on a causal set.
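Completely positive maps of the kind used between QCH algebras are commonly written in Kraus form, rho -> sum_a K_a rho K_a†. The sketch below applies a one-qubit amplitude-damping channel and checks the trace-preservation condition sum_a K_a† K_a = I; the channel and parameters are illustrative, not taken from the QCH framework.

```python
import math

def mul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def dag(A):
    # Conjugate transpose; entries here are real, so just transpose.
    return [[A[j][i] for j in range(2)] for i in range(2)]

def add(A, B):
    return [[A[i][j] + B[i][j] for j in range(2)] for i in range(2)]

def cp_map(rho, kraus):
    """Completely positive, trace-preserving map in Kraus form:
    rho -> sum_a K_a rho K_a^dagger."""
    out = [[0.0, 0.0], [0.0, 0.0]]
    for K in kraus:
        out = add(out, mul(mul(K, rho), dag(K)))
    return out

g = 0.3  # damping strength (illustrative parameter)
K0 = [[1.0, 0.0], [0.0, math.sqrt(1 - g)]]
K1 = [[0.0, math.sqrt(g)], [0.0, 0.0]]

# Trace preservation: sum_a K_a^dagger K_a = identity.
ident = add(mul(dag(K0), K0), mul(dag(K1), K1))
assert all(abs(ident[i][j] - (1.0 if i == j else 0.0)) < 1e-12
           for i in range(2) for j in range(2))

# Excited state |1><1| loses population g to the ground state.
rho = cp_map([[0.0, 0.0], [0.0, 1.0]], [K0, K1])
assert abs(rho[0][0] - g) < 1e-12
```

In a QCH the maps act between (generally different) matrix algebras at causally related events, but the CP and normalization structure is the same.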
Church's thesis meets the N-body problem
, 1999
Abstract

Cited by 23 (0 self)
THIS IS A REVISION-IN-PROGRESS! NOT QUITE FINAL YET! "Church's thesis" is at the foundation of computer science. It is pointed out that with any particular set of physical laws, Church's thesis need not merely be postulated; in fact, it may be decidable. Trying to do so is valuable. In Newton's laws of physics with point masses, we outline a proof that Church's thesis is false. But with certain more realistic laws of motion, incorporating some relativistic effects, the Extended Church's thesis is true. Along the way we prove a useful theorem: a wide class of ordinary differential equations may be integrated with "polynomial slowdown." Warning: we cannot give careful definitions and caveats in this abstract, and interpreting our results is difficult. Keywords: Newtonian N-body problem, Church's thesis, computability, numerical methods for ordinary differential equations. Contents: 1 Background; 2 Introduction. Our results and their interpretation; 2.1 First way to interpret the...
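The polynomial-slowdown theorem concerns the asymptotic cost of simulating ODE dynamics; the sketch below only illustrates the underlying operation, step-by-step numerical integration, using a classical fourth-order Runge-Kutta step on a harmonic oscillator. It is not the paper's construction.

```python
def rk4_step(f, y, t, h):
    """One classical fourth-order Runge-Kutta step for y' = f(t, y),
    where y is a list of state components."""
    k1 = f(t, y)
    k2 = f(t + h / 2, [yi + h / 2 * ki for yi, ki in zip(y, k1)])
    k3 = f(t + h / 2, [yi + h / 2 * ki for yi, ki in zip(y, k2)])
    k4 = f(t + h, [yi + h * ki for yi, ki in zip(y, k3)])
    return [yi + h / 6 * (a + 2 * b + 2 * c + d)
            for yi, a, b, c, d in zip(y, k1, k2, k3, k4)]

# Harmonic oscillator x'' = -x written as a first-order system (x, v).
f = lambda t, y: [y[1], -y[0]]
y, t, h = [1.0, 0.0], 0.0, 0.01
for _ in range(628):        # integrate to t ~ 2*pi, one full period
    y = rk4_step(f, y, t, h)
    t += h
assert abs(y[0] - 1.0) < 1e-3  # state returns near the starting point
```

For the N-body problem itself the right-hand side is the gravitational acceleration, and the paper's contribution is bounding how the cost of such integration scales with the required precision.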
Understanding Deutsch’s probability in a deterministic multiverse
 Studies in History and Philosophy of Modern Physics 35B
, 2004
Abstract

Cited by 23 (2 self)
Difficulties over probability have often been considered fatal to the Everett interpretation of quantum mechanics. Here I argue that the Everettian can have everything she needs from ‘probability’ without recourse to indeterminism, ignorance, primitive identity over time or subjective uncertainty: all she needs is a particular rationality principle. The decision-theoretic approach recently developed by Deutsch and Wallace claims to provide just such a principle. But, according to Wallace, decision theory is itself applicable only if the correct attitude to a future Everettian measurement outcome is subjective uncertainty. I argue that subjective uncertainty is not to be had, but I offer an alternative interpretation that enables the Everettian to live without uncertainty: we can justify Everettian decision theory on the basis that an Everettian should care about all her future branches. The probabilities appearing in the decision-theoretic representation theorem can then be interpreted as the degrees to which the rational agent cares about each future branch. This reinterpretation, however, reduces the intuitive plausibility of one of the Deutsch-Wallace axioms (Measurement Neutrality).
THE SEMICLASSICAL APPROXIMATION TO QUANTUM GRAVITY
, 1993
Abstract

Cited by 22 (5 self)
A detailed review is given of the semiclassical approximation to quantum gravity in the canonical framework. This includes in particular the derivation of the functional Schrödinger equation and a discussion of semiclassical time as well as the derivation of quantum gravitational correction terms to the Schrödinger equation. These terms are used to calculate energy shifts for fields in de Sitter space and non-unitary contributions in black hole evaporation. Emphasis is also put on the relevance of decoherence and correlations in semiclassical gravity. The back reaction of non-gravitational quantum fields onto the semiclassical background and the emergence of a Berry connection on superspace are also discussed in this framework.
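The derivation of the functional Schrödinger equation mentioned above is commonly sketched via a Born-Oppenheimer-type WKB expansion in powers of the Planck mass; the notation below is illustrative and not taken from this review:

```latex
% WKB ansatz for the full wave functional (gravity h, matter \phi):
\Psi[h, \phi] \;\approx\; \exp\!\big( i\, m_P^2\, S_0[h] \big)\, \psi[h, \phi]
% At order m_P^2 one obtains the Hamilton--Jacobi equation for the
% gravitational background S_0; at the next order the matter wave
% functional obeys a functional Schroedinger equation with respect to a
% semiclassical time defined along that background:
i \hbar \,\frac{\partial \psi}{\partial t}
  \;\equiv\; i \hbar\, \nabla S_0 \cdot \nabla \psi
  \;=\; H_{\mathrm{m}}\, \psi
% The quantum gravitational correction terms discussed in the abstract
% appear at the following order in the expansion.
```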