Results 1 - 10 of 257
Decoherence, einselection, and the quantum origins of the classical
 Reviews of Modern Physics 75, 715, 2003. Available online at http://arxiv.org/abs/quant-ph/0105127
Abstract

Cited by 54 (1 self)
The manner in which states of some quantum systems become effectively classical is of great significance for the foundations of quantum physics, as well as for problems of practical interest such as quantum engineering. In the past two decades it has become increasingly clear that many (perhaps all) of the symptoms of classicality can be induced in quantum systems by their environments. Thus decoherence is caused by the interaction in which the environment in effect monitors certain observables of the system, destroying coherence between the pointer states corresponding to their eigenvalues. This leads to environment-induced superselection or einselection, a quantum process associated with selective loss of information. Einselected pointer states are stable. They can retain correlations with the rest of the universe in spite of the environment. Einselection enforces classicality by imposing an effective ban on the vast majority of the Hilbert space, eliminating especially the flagrantly nonlocal "Schrödinger-cat states." The classical structure of phase space emerges from the quantum Hilbert space in the appropriate macroscopic limit. Combination of einselection with dynamics leads to the idealizations of a point and of a classical trajectory. In measurements, einselection replaces quantum entanglement between the apparatus and the measured system with the classical correlation. Only the preferred pointer observable of the apparatus can store information
Decoherence, the measurement problem, and interpretations of quantum mechanics
2003
Abstract

Cited by 45 (2 self)
Environment-induced decoherence and superselection have been a subject of intensive research over the past two decades. Yet, their implications for the foundational problems of quantum mechanics, most notably the quantum measurement problem, have remained a matter of great controversy. This paper is intended to clarify key features of the decoherence program, including its more recent results, and to investigate their implications for foundational issues, not only concerning the measurement problem but also with respect to the main interpretive approaches of
Semiclassical Fourier Transform for Quantum Computation
 Phys. Rev. Lett., 1996
Abstract

Cited by 42 (0 self)
Shor’s algorithms for factorization and discrete logarithms on a quantum computer employ Fourier transforms preceding a final measurement. It is shown that such a Fourier transform can be carried out in a semiclassical way in which a “classical” (macroscopic) signal resulting from the measurement of one bit (embodied in a two-state quantum system) is employed to determine the type of measurement carried out on the next bit, and so forth. In this way the two-bit gates in the Fourier transform can all be replaced by a smaller number of one-bit gates controlled by classical signals. Success in simplifying the Fourier transform suggests that it may be worthwhile looking for other ways of using semiclassical methods in quantum computing.

Recently Shor [1, 2] has shown that a quantum computer [3], if it could be built, would be capable of solving certain problems, such as factoring long numbers, much more rapidly than is possible using currently available algorithms on a conventional computer. This has stimulated a lot of interest in the subject [4, 5, 6], and various proposals have been made for actually constructing such a computer [7, 8, 9, 10, 11]. The basic idea is that bits representing
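The classical-feedback bookkeeping this abstract describes can be sketched as a toy simulation. This is not the paper's construction itself: it assumes the unknown phase is an exact n-bit binary fraction, so every single-qubit measurement outcome is deterministic and the quantum state never needs to be represented explicitly; the function name `semiclassical_iqft` is hypothetical.

```python
def semiclassical_iqft(phi, n):
    """Recover the n bits of phi = 0.b1 b2 ... bn (an exact binary
    fraction) one measurement at a time.  Qubit k carries the phase
    2**(k-1) * phi; each earlier result feeds forward as a purely
    classical phase correction to the next measurement, replacing
    the two-bit controlled-phase gates of the full quantum Fourier
    transform with classically controlled one-bit gates."""
    bits = []                          # b_n, b_{n-1}, ... as they are found
    for k in range(n, 0, -1):          # least significant bit first
        theta = (2 ** (k - 1)) * phi % 1.0
        # classical feedback: subtract the contribution of the bits
        # already measured (b_{k+1} contributes 1/4, b_{k+2} 1/8, ...)
        corr = sum(b / 2 ** (j + 2) for j, b in enumerate(reversed(bits)))
        theta = (theta - corr) % 1.0
        # after the correction theta is exactly 0 or 1/2, so measuring
        # this qubit in the Hadamard basis yields b_k with certainty
        bit = round(theta * 2) % 2
        bits.append(bit)
    bits.reverse()                     # b_1 ... b_n
    return bits
```

For example, phi = 5/8 = 0.101 with n = 3 yields the bits [1, 0, 1]; at no point is a two-qubit gate needed, only measurements conditioned on earlier classical signals.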
Do we really understand quantum mechanics? Strange correlations, paradoxes, and theorems
 Am. J. Phys., 2001
Abstract

Cited by 23 (1 self)
This article presents a general discussion of several aspects of our present understanding of quantum mechanics. The emphasis is put on the very special correlations that this theory makes possible: they are forbidden by very general arguments based on realism and local causality. In fact, these correlations are completely impossible in any circumstance, except the very special situations designed by physicists especially to observe these purely quantum effects. Another general point that is emphasized is the necessity for the theory to predict the emergence of a single result in a single realization of an experiment. For this purpose, orthodox quantum mechanics introduces a special postulate: the reduction of the state vector, which comes in addition to the Schrödinger evolution postulate. Nevertheless, the presence in parallel of two evolution processes of the same object (the state vector) may be a potential source for conflicts; various attitudes that are possible
A Rosetta stone for quantum mechanics with an introduction to quantum computation
2002
Abstract

Cited by 22 (9 self)
Abstract. The purpose of these lecture notes is to provide readers, who have some mathematical background but little or no exposure to quantum mechanics and quantum computation, with enough material to begin reading
Decoherence, Einselection and the Existential Interpretation (The Rough Guide)
 Phil. Trans. R. Soc. Lond. A, 1998
Abstract

Cited by 21 (0 self)
The roles of decoherence and environment-induced superselection in the emergence of the classical from the quantum substrate are described. The stability of correlations between the einselected quantum pointer states and the environment allows them to exist almost as objectively as classical states were once thought to exist: there are ways of finding out what is the pointer state of the system which use redundancy of its correlations with the environment, and which leave einselected states essentially unperturbed. This relatively objective existence of certain quantum states facilitates operational definition of probabilities in the quantum setting. Moreover, once there are states that ‘exist’ and can be ‘found out’, a ‘collapse’ in the traditional sense is no longer necessary—in effect, it has already happened. The role of the preferred states in the processing and storage of information is emphasized. The existential interpretation based on the relatively objective existence of stable correlations between the einselected states of observers’ memory and in the outside universe is formulated and discussed.
Quantum Mereotopology
 Annals of Mathematics and Artificial Intelligence, 2000
Abstract

Cited by 19 (8 self)
While mereotopology (the theory of boundaries, contact, and separation built up on a mereological foundation) has found fruitful applications in the realm of qualitative spatial reasoning, it faces problems when its methods are extended to deal with those kinds of spatial and non-spatial reasoning which involve a factor of granularity. This is because granularity cannot easily be represented within a mereology-based framework. We sketch how this problem can be solved by means of a theory of coarse-grained partitions, drawing on methods developed for the manipulation of partitions in the spatial realm and applying these to a range of partitions of non-spatial sorts. We then show how these same methods can be extended to apply to finite sequences of partitions evolving over time, or to what we shall call coarse- and fine-grained histories.
Keywords: mereotopology, granularity, ontology, partitions, histories
1. Introduction
As a result of a series of important contribut...
Choice of consistent family, and quantum incompatibility
 Phys, 1998
Abstract

Cited by 15 (5 self)
In consistent history quantum theory, a description of the time development of a quantum system requires choosing a framework or consistent family, and then calculating probabilities for the different histories which it contains. It is argued that the framework is chosen by the physicist constructing a description of a quantum system on the basis of questions he wishes to address, in a manner analogous to choosing a coarse graining of the phase space in classical statistical mechanics. The choice of framework is not determined by some law of nature, though it is limited by quantum incompatibility, a concept which is discussed using a two-dimensional Hilbert space (spin-half particle). Thus certain questions of physical interest can only be addressed using frameworks in which they make (quantum mechanical) sense. The physicist’s choice does not influence reality, nor does the presence of choices render the theory subjective. On the contrary, predictions of the theory can, in principle, be verified by experimental measurements. These considerations are used to address various criticisms and possible misunderstandings of the consistent history approach, including its predictive power, whether it requires a new logic, whether it can be interpreted realistically, the nature of “quasiclassicality”, and the possibility of “contrary” inferences.
Research on hidden variable theories: a review of recent progresses
2007
Abstract

Cited by 12 (0 self)
Quantum Mechanics (QM) is one of the pillars of modern physics: an impressive number of experiments have confirmed this theory, and many technological applications are based on it. Nevertheless, a century after its development, various aspects concerning its very foundations still remain to be clarified. Among them are the transition from a microscopic probabilistic world to a macroscopic deterministic one, and quantum non-locality. A possible way out of these problems would be if QM represented a statistical approximation of an unknown deterministic theory. This review presents the most recent progress in the studies related to Hidden Variable Theories (HVT), from both an experimental and a theoretical point of view, giving greater emphasis to results with a direct experimental application. In more detail, the first part of the review is a historical introduction to this problem: the Einstein-Podolsky-Rosen argument and the first discussions about
Causal inference using the algorithmic Markov condition
2008
Abstract

Cited by 12 (11 self)
Inferring the causal structure that links n observables is usually based upon detecting statistical dependences and choosing simple graphs that make the joint measure Markovian. Here we argue why causal inference is also possible when only single observations are present. We develop a theory of how to generate causal graphs explaining similarities between single objects. To this end, we replace the notion of conditional stochastic independence in the causal Markov condition with the vanishing of conditional algorithmic mutual information and describe the corresponding causal inference rules. We explain why a consistent reformulation of causal inference in terms of algorithmic complexity implies a new inference principle that also takes into account the complexity of conditional probability densities, making it possible to select among Markov equivalent causal graphs. This insight provides a theoretical foundation of a heuristic principle proposed in earlier work. We also discuss how to replace Kolmogorov complexity with decidable complexity criteria. This can be seen as an algorithmic analog of replacing the empirically undecidable question of statistical independence with practical independence tests that are based on implicit or explicit assumptions on the underlying distribution.
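The idea of replacing undecidable Kolmogorov complexity with a decidable criterion can be illustrated with a crude compression-based proxy. This is a sketch under an assumption of this note, not the paper's actual procedure: it approximates C(x) by a zlib-compressed length and estimates algorithmic mutual information as C(x) + C(y) - C(xy); the helper names `C` and `algorithmic_mi` are hypothetical.

```python
import zlib

def C(x: bytes) -> int:
    """Crude computable upper bound on Kolmogorov complexity:
    the length of the zlib-compressed string."""
    return len(zlib.compress(x, 9))

def algorithmic_mi(x: bytes, y: bytes) -> int:
    """Compression-based estimate of algorithmic mutual information,
    I(x : y) ~ C(x) + C(y) - C(xy).  A large value suggests shared
    structure between the two single objects, which (by the algorithmic
    Markov condition) calls for some causal explanation."""
    return C(x) + C(y) - C(x + y)

# two objects produced by the same generating pattern vs. an unrelated one
a = b"the quick brown fox jumps over the lazy dog " * 50
b = bytes(range(256)) * 10
```

With these inputs, `algorithmic_mi(a, a)` is large (concatenating an object with itself barely increases its compressed length), while `algorithmic_mi(a, b)` stays near zero, mirroring how conditional independence tests separate related from unrelated observables.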