Results 1–10 of 44
Nuclear magnetic resonance spectroscopy: An experimentally accessible paradigm for quantum computing
In Proceedings of the Fourth Workshop on Physics and Computation, New England Complex Systems Institute, 1996
"... We present experimental results which demonstrate that nuclear magnetic resonance spectroscopy is capable of efficiently emulating many of the capabilities of quantum computers, including unitary evolution and coherent superpositions, but without attendant wavefunction collapse. This emulation is m ..."
Abstract

Cited by 55 (7 self)
 Add to MetaCart
We present experimental results which demonstrate that nuclear magnetic resonance spectroscopy is capable of efficiently emulating many of the capabilities of quantum computers, including unitary evolution and coherent superpositions, but without attendant wavefunction collapse. This emulation is made possible by two facts. The first is that the spin-active nuclei in each molecule of a liquid sample are largely isolated from the spins in all other molecules, so that each molecule is effectively an independent quantum computer. The second is the existence of a manifold of statistical spin states, called pseudopure states, whose transformation properties are identical to those of true pure states. These facts enable us to operate on coherent superpositions over the spins in each molecule using full quantum parallelism, and to combine the results into deterministic macroscopic observables via thermodynamic averaging. We call a device based on these principles an ensemble quantum computer. Our results show that it is indeed possible to prepare a pseudopure state
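The pseudopure-state construction mentioned in this abstract has a compact standard form; the sketch below follows the usual textbook convention and is not taken from the paper itself:

```latex
% A pseudopure state of n spins: a maximally mixed background plus a
% small pure-state excess of weight \epsilon.
\rho_{\mathrm{pp}} \;=\; \frac{1-\epsilon}{2^{n}}\, I \;+\; \epsilon\, |\psi\rangle\langle\psi|
% Because the identity is invariant under conjugation, any unitary U gives
U \rho_{\mathrm{pp}} U^{\dagger} \;=\; \frac{1-\epsilon}{2^{n}}\, I \;+\; \epsilon\, U|\psi\rangle\langle\psi|U^{\dagger},
% so the traceless excess transforms exactly like the true pure state
% |\psi\rangle, which is what makes ensemble quantum computing possible.
```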
Algorithmic Theories Of Everything
2000
"... The probability distribution P from which the history of our universe is sampled represents a theory of everything or TOE. We assume P is formally describable. Since most (uncountably many) distributions are not, this imposes a strong inductive bias. We show that P(x) is small for any universe x lac ..."
Abstract

Cited by 31 (15 self)
 Add to MetaCart
The probability distribution P from which the history of our universe is sampled represents a theory of everything or TOE. We assume P is formally describable. Since most (uncountably many) distributions are not, this imposes a strong inductive bias. We show that P(x) is small for any universe x lacking a short description, and study the spectrum of TOEs spanned by two Ps, one reflecting the most compact constructive descriptions, the other the fastest way of computing everything. The former derives from generalizations of traditional computability, Solomonoff’s algorithmic probability, Kolmogorov complexity, and objects more random than Chaitin’s Omega, the latter from Levin’s universal search and a natural resource-oriented postulate: the cumulative prior probability of all x incomputable within time t by this optimal algorithm should be 1/t. Between both Ps we find a universal cumulatively enumerable measure that dominates traditional enumerable measures; any such CEM must assign low probability to any universe lacking a short enumerating program. We derive P-specific consequences for evolving observers, inductive reasoning, quantum physics, philosophy, and the expected duration of our universe.
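The resource-oriented postulate quoted in this abstract can be written out explicitly; in the sketch below, the set C(t) is notation introduced here (not the paper's) for the universes already computed within time t by the optimal algorithm:

```latex
% Postulate: the cumulative prior probability of all x still
% incomputable within time t by the optimal algorithm is 1/t:
\sum_{x \,\notin\, C(t)} P(x) \;=\; \frac{1}{t}
% Consequence: any individual universe x whose computation requires
% more than time t can carry prior probability at most 1/t, so
% slow-to-compute histories are exponentially suppressed as t grows.
```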
Experimental quantum teleportation
Nature, 1997
"... Quantum entanglement lies at the heart of new proposals for quantum communication and computation. Here we describe the recent experimental realization of quantum teleportation. ..."
Abstract

Cited by 28 (1 self)
 Add to MetaCart
Quantum entanglement lies at the heart of new proposals for quantum communication and computation. Here we describe the recent experimental realization of quantum teleportation.
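The single-qubit teleportation protocol this abstract refers to can be simulated classically with a small state-vector model; the three-qubit layout, gate helpers, and parameter values below are illustrative choices, not details from the paper:

```python
import numpy as np

# Single-qubit gates.
I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

def kron3(a, b, c):
    """Operator on qubits 0,1,2 (qubit 0 = most significant bit)."""
    return np.kron(a, np.kron(b, c))

def cnot(state, ctrl, targ, n=3):
    """Apply CNOT(ctrl -> targ) to an n-qubit state vector."""
    out = np.zeros_like(state)
    for i in range(2 ** n):
        bits = [(i >> (n - 1 - q)) & 1 for q in range(n)]
        if bits[ctrl]:
            bits[targ] ^= 1
        j = sum(b << (n - 1 - q) for q, b in enumerate(bits))
        out[j] = state[i]
    return out

def teleport(psi, rng):
    """Teleport the single-qubit state psi from qubit 0 to qubit 2."""
    bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)  # (|00>+|11>)/sqrt(2)
    state = np.kron(psi, bell)
    # Alice's Bell-basis measurement: CNOT(0->1), then H on qubit 0.
    state = cnot(state, 0, 1)
    state = kron3(H, I2, I2) @ state
    # Measure qubits 0 and 1 by sampling a basis state, then projecting.
    probs = np.abs(state) ** 2
    outcome = rng.choice(8, p=probs)
    m0, m1 = (outcome >> 2) & 1, (outcome >> 1) & 1
    keep = np.array([((i >> 2) & 1) == m0 and ((i >> 1) & 1) == m1
                     for i in range(8)])
    state = np.where(keep, state, 0)
    state /= np.linalg.norm(state)
    # Bob's classically controlled corrections on qubit 2: X^m1 then Z^m0.
    if m1:
        state = kron3(I2, I2, X) @ state
    if m0:
        state = kron3(I2, I2, Z) @ state
    # Read off qubit 2's amplitudes from the surviving branch.
    base = (m0 << 2) | (m1 << 1)
    return np.array([state[base], state[base + 1]])

rng = np.random.default_rng(0)
psi = np.array([0.6, 0.8j])      # arbitrary normalized input state
out = teleport(psi, rng)
assert np.allclose(out, psi)     # qubit 2 now carries the input state
```

Whichever of the four measurement outcomes occurs, the conditional X/Z corrections leave qubit 2 in exactly the input state, which is the content of the protocol.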
Information and Computation: Classical and Quantum Aspects
Reviews of Modern Physics, 2001
"... Quantum theory has found a new field of applications in the realm of information and computation during the recent years. This paper reviews how quantum physics allows information coding in classically unexpected and subtle nonlocal ways, as well as information processing with an efficiency largely ..."
Abstract

Cited by 23 (2 self)
 Add to MetaCart
Quantum theory has found a new field of applications in the realm of information and computation in recent years. This paper reviews how quantum physics allows information coding in classically unexpected and subtle nonlocal ways, as well as information processing with an efficiency largely surpassing that of present and foreseeable classical computers. Some outstanding aspects of classical and quantum information theory will be addressed here. Quantum teleportation, dense coding, and quantum cryptography are discussed as a few samples of the impact of quanta in the transmission of information. Quantum logic gates and quantum algorithms are also discussed as instances of the improvement in information processing by a quantum computer. Finally, we provide some examples of current experimental
Building quantum wires: the long and the short of it
In Proc. International Symposium on Computer Architecture (ISCA 2003), 2003
"... As quantum computing moves closer to reality the need for basic architectural studies becomes more pressing. Quantum wires, which transport quantum data, will be a fundamental component in all anticipated silicon quantum architectures. In this paper, we introduce a quantum wire architecture based up ..."
Abstract

Cited by 21 (8 self)
 Add to MetaCart
As quantum computing moves closer to reality, the need for basic architectural studies becomes more pressing. Quantum wires, which transport quantum data, will be a fundamental component in all anticipated silicon quantum architectures. In this paper, we introduce a quantum wire architecture based upon quantum teleportation. We compare this teleportation channel with the traditional approach to transporting quantum data, which we refer to as the swapping channel. We characterize the latency and bandwidth of these two alternatives in a device-independent way and describe how the advanced architecture of the teleportation channel overcomes a basic limit to the maximum communication distance of the swapping channel. In addition, we discover a fundamental tension between the scale of quantum effects and the scale of the classical logic needed to control them. This “pitch-matching” problem imposes constraints on minimum wire lengths and wire intersections, which in turn imply a sparsely connected architecture of coarse-grained quantum computational elements. This is in direct contrast to the “sea of gates” architectures presently assumed by most quantum computing studies.
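The latency contrast the abstract draws between the two channels can be illustrated with a toy model; every parameter below (t_swap, t_local, t_classical_per_cell) is a hypothetical placeholder for illustration, not a figure from the paper:

```python
def swap_channel_latency(distance_cells, t_swap=1.0):
    """Swapping channel: the qubit moves one cell per SWAP operation,
    so latency grows linearly with distance (hypothetical unit time
    per swap)."""
    return distance_cells * t_swap

def teleport_channel_latency(distance_cells, t_local=3.0,
                             t_classical_per_cell=0.1):
    """Teleportation channel: with an EPR pair pre-distributed along
    the wire, only a local Bell measurement, a classical signal, and a
    correction sit on the critical path; per-cell cost is the (much
    faster) classical signal propagation."""
    return t_local + distance_cells * t_classical_per_cell

# Beyond some crossover distance the teleportation channel wins,
# despite its fixed local overhead.
for d in (1, 10, 100):
    print(d, swap_channel_latency(d), teleport_channel_latency(d))
```

The point of the sketch is the scaling shape, not the numbers: linear in distance at quantum-operation speed for swapping, versus a fixed local cost plus classical-speed propagation for teleportation.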
Do we really understand quantum mechanics? Strange correlations, paradoxes, and theorems
Am. J. Phys., 2001
"... This article presents a general discussion of several aspects of our present understanding of quantum mechanics. The emphasis is put on the very special correlations that this theory makes possible: they are forbidden by very general arguments based on realism and local causality. In fact, these cor ..."
Abstract

Cited by 20 (1 self)
 Add to MetaCart
This article presents a general discussion of several aspects of our present understanding of quantum mechanics. The emphasis is put on the very special correlations that this theory makes possible: they are forbidden by very general arguments based on realism and local causality. In fact, these correlations are completely impossible in any circumstance, except the very special situations designed by physicists especially to observe these purely quantum effects. Another general point that is emphasized is the necessity for the theory to predict the emergence of a single result in a single realization of an experiment. For this purpose, orthodox quantum mechanics introduces a special postulate: the reduction of the state vector, which comes in addition to the Schrödinger evolution postulate. Nevertheless, the presence in parallel of two evolution processes of the same object (the state vector) may be a potential source for conflicts; various attitudes that are possible
Decoherence, Einselection and the Existential Interpretation (The Rough Guide)
Phil. Trans. R. Soc. Lond. A, 1998
"... The roles of decoherence and environmentinduced superselection in the emergence of the classical from the quantum substrate are described. The stability of correlations between the einselected quantum pointer states and the environment allows them to exist almost as objectively as classical states ..."
Abstract

Cited by 19 (0 self)
 Add to MetaCart
The roles of decoherence and environment-induced superselection in the emergence of the classical from the quantum substrate are described. The stability of correlations between the einselected quantum pointer states and the environment allows them to exist almost as objectively as classical states were once thought to exist: there are ways of finding out the pointer state of the system that use the redundancy of its correlations with the environment, and that leave einselected states essentially unperturbed. This relatively objective existence of certain quantum states facilitates an operational definition of probabilities in the quantum setting. Moreover, once there are states that ‘exist’ and can be ‘found out’, a ‘collapse’ in the traditional sense is no longer necessary—in effect, it has already happened. The role of the preferred states in the processing and storage of information is emphasized. The existential interpretation based on the relatively objective existence of stable correlations between the einselected states of observers’ memory and in the outside universe is formulated and discussed.
The New AI: General & Sound & Relevant for Physics
2003
"... Most traditional artificial intelligence (AI) systems of the past 50 years are either very limited, or based on heuristics, or both. The new millennium, however, has brought substantial progress in the field of theoretically optimal and practically feasible algorithms for prediction, search, inducti ..."
Abstract

Cited by 15 (9 self)
 Add to MetaCart
Most traditional artificial intelligence (AI) systems of the past 50 years are either very limited, or based on heuristics, or both. The new millennium, however, has brought substantial progress in the field of theoretically optimal and practically feasible algorithms for prediction, search, inductive inference based on Occam's razor, problem solving, decision making, and reinforcement learning in environments of a very general type. Since inductive inference is at the heart of all inductive sciences, some of the results are relevant not only for AI and computer science but also for physics, provoking nontraditional predictions based on Zuse's thesis of the computer-generated universe.
The Ion Trap Quantum Information Processor
Appl. Phys. B, 1997
"... An introductory review of the linear ion trap is given, with particular regard to its use for quantum information processing. The discussion aims to bring together ideas from information theory and experimental ion trapping, to provide a resource to workers unfamiliar with one or the other of these ..."
Abstract

Cited by 15 (1 self)
 Add to MetaCart
An introductory review of the linear ion trap is given, with particular regard to its use for quantum information processing. The discussion aims to bring together ideas from information theory and experimental ion trapping, to provide a resource to workers unfamiliar with one or the other of these subjects. It is shown that information theory provides valuable concepts for the experimental use of ion traps, especially error correction, and conversely the ion trap provides a valuable link between information theory and physics, with attendant physical insights. Example parameters are given for the case of calcium ions. Passive stabilisation will allow about 200 computing operations on 10 ions; with error correction this can be greatly extended.