Results 1–10 of 61
Decoherence, einselection, and the quantum origins of the classical
 Reviews of Modern Physics 75, 715. Available online at http://arxiv.org/abs/quant-ph/0105127
, 2003
"... The manner in which states of some quantum systems become effectively classical is of great significance for the foundations of quantum physics, as well as for problems of practical interest such as quantum engineering. In the past two decades it has become increasingly clear that many (perhaps all) ..."
Abstract

Cited by 46 (1 self)
The manner in which states of some quantum systems become effectively classical is of great significance for the foundations of quantum physics, as well as for problems of practical interest such as quantum engineering. In the past two decades it has become increasingly clear that many (perhaps all) of the symptoms of classicality can be induced in quantum systems by their environments. Thus decoherence is caused by the interaction in which the environment in effect monitors certain observables of the system, destroying coherence between the pointer states corresponding to their eigenvalues. This leads to environment-induced superselection, or einselection, a quantum process associated with selective loss of information. Einselected pointer states are stable. They can retain correlations with the rest of the universe in spite of the environment. Einselection enforces classicality by imposing an effective ban on the vast majority of the Hilbert space, eliminating especially the flagrantly nonlocal "Schrödinger-cat states." The classical structure of phase space emerges from the quantum Hilbert space in the appropriate macroscopic limit. Combination of einselection with dynamics leads to the idealizations of a point and of a classical trajectory. In measurements, einselection replaces quantum entanglement between the apparatus and the measured system with the classical correlation. Only the preferred pointer observable of the apparatus can store information
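The monitoring mechanism described in this abstract can be sketched numerically: in the pointer basis, decoherence suppresses the off-diagonal coherences of the density matrix while leaving the pointer-state populations intact. A minimal toy model, in which the exponential damping with decoherence time tau_d is a phenomenological assumption and not taken from the paper:

```python
import math

def decohere(rho, t, tau_d):
    """Toy einselection model: the environment 'monitors' the pointer
    observable, so off-diagonal coherences of the 2x2 density matrix rho
    decay as exp(-t/tau_d) while the pointer-state populations on the
    diagonal are untouched. tau_d is an assumed phenomenological
    decoherence time, not a quantity from the paper."""
    damp = math.exp(-t / tau_d)
    return [[rho[0][0], rho[0][1] * damp],
            [rho[1][0] * damp, rho[1][1]]]

# A "Schrödinger-cat" superposition (|0> + |1>)/sqrt(2):
cat = [[0.5, 0.5],
       [0.5, 0.5]]

after = decohere(cat, t=10.0, tau_d=1.0)
# Populations survive; coherences are suppressed by a factor e^-10,
# leaving an effectively classical mixture of the two pointer states.
```

The surviving diagonal illustrates why einselected pointer states can retain classical correlations "in spite of the environment" while superpositions of them cannot.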
Introductory lectures on quantum cosmology
 Quantum cosmology and baby universes, World Scientific (Singapore)
, 1991
"... ABSTRACT: We describe the modern approach to quantum cosmology, as initiated by Hartle and Hawking, Linde, Vilenkin and others. The primary aim is to explain how one determines the consequences for the late universe of a given quantum theory of cosmological initial or boundary conditions. An extensi ..."
Abstract

Cited by 24 (0 self)
We describe the modern approach to quantum cosmology, as initiated by Hartle and Hawking, Linde, Vilenkin and others. The primary aim is to explain how one determines the consequences for the late universe of a given quantum theory of cosmological initial or boundary conditions. An extensive list of references is included, together with a guide to the literature.
The semiclassical approximation to quantum gravity
, 1993
"... A detailed review is given of the semiclassical approximation to quantum gravity in the canonical framework. This includes in particular the derivation of the functional Schrödinger equation and a discussion of semiclassical time as well as the derivation of quantum gravitational correction terms to ..."
Abstract

Cited by 22 (5 self)
A detailed review is given of the semiclassical approximation to quantum gravity in the canonical framework. This includes in particular the derivation of the functional Schrödinger equation and a discussion of semiclassical time, as well as the derivation of quantum-gravitational correction terms to the Schrödinger equation. These terms are used to calculate energy shifts for fields in de Sitter space and non-unitary contributions in black hole evaporation. Emphasis is also put on the relevance of decoherence and correlations in semiclassical gravity. The back reaction of non-gravitational quantum fields onto the semiclassical background and the emergence of a Berry connection on superspace are also discussed in this framework.
Energy Aware Computing Through Probabilistic Switching: A Study of Limits
 IEEE Transactions on Computers
, 2005
"... The mathematical technique of randomization yielding probabilistic algorithms is shown, for the first time, through a physical interpretation based on statistical thermodynamics, to be a basis for energy savings in computing. Concretely, at the fundamental limit, it is shown that the energy needed ..."
Abstract

Cited by 18 (4 self)
The mathematical technique of randomization yielding probabilistic algorithms is shown, for the first time, through a physical interpretation based on statistical thermodynamics, to be a basis for energy savings in computing. Concretely, at the fundamental limit, it is shown that the energy needed to compute a single probabilistic bit, or PBIT, is proportional to the probability p of computing the PBIT accurately. This result is established through the introduction of an idealized switch for computing a PBIT, from which a network of switches can be constructed. Interesting examples of such networks, including AND, OR and NOT gates (or, as functions, boolean conjunction, disjunction and negation respectively), are constructed and the potential for energy savings through randomization is established. To quantify these savings, novel measures of "technology-independent" energy complexity are introduced; these parallel conventional machine-independent measures of computational complexity, such as the algorithm's running time. Networks of switches can be shown to be equivalent to Turing machines and to boolean circuits, both of which are widely known and well-understood models of computation. These savings are realized using a novel way of representing a PBIT in the physical domain through a group of classical microstates. A measurement, and thus detection, of a microstate yields the value of the PBIT. While the eventual goal of this work is to lead to the physical realization of these theoretical constructs through the innovation of randomized (CMOS-based) devices, the current goal is to rigorously establish the potential for energy savings through probabilistic computing at a fundamental physical level, based on the canonical thermodynamic models of idealized monoa...
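The relation between switching energy and the success probability p can be illustrated with a toy calculation. In this line of work the bound is often quoted as E(p) = kT ln(2p) for 1/2 < p ≤ 1; treat that exact form, and the numbers below, as an assumption for illustration rather than the paper's own statement:

```python
import math

K_BOLTZMANN = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0                   # assumed room temperature, K

def switching_energy(p):
    """Assumed fundamental bound for one probabilistic switching step:
    E(p) = kT ln(2p) joules, where p (1/2 < p <= 1) is the probability
    that the PBIT is computed correctly. p = 1 recovers the classical
    deterministic limit kT ln 2; any p < 1 costs strictly less."""
    return K_BOLTZMANN * T * math.log(2.0 * p)

deterministic = switching_energy(1.0)   # kT ln 2 per deterministic bit
probabilistic = switching_energy(0.9)   # kT ln 1.8 per PBIT at p = 0.9
savings = deterministic - probabilistic  # energy saved per switching step
```

Under this assumed form, the savings per switch are modest in absolute terms but accrue over every switching event in a network of gates, which is why the abstract frames the result as a fundamental limit rather than a device-level figure.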
Time Symmetry and Asymmetry in Quantum Mechanics and Quantum Cosmology in Physical Origins of Time Asymmetry
, 1994
"... The disparity between the time symmetry of the fundamental laws of physics and the time asymmetries of the observed universe has been a subject of fascination for physicists since the late 19th century. 1 The following general time asymmetries are observed in this universe: ..."
Abstract

Cited by 16 (3 self)
The disparity between the time symmetry of the fundamental laws of physics and the time asymmetries of the observed universe has been a subject of fascination for physicists since the late 19th century. The following general time asymmetries are observed in this universe:
Bluff your way in the second law of thermodynamics
 Stud. Hist. Phil. Mod. Phys.
, 2001
"... The aim of this article is to analyse the relation between the second law of thermodynamics and the socalled arrow of time. For this purpose, a number of different aspects in this arrow of time are distinguished, in particular those of time(a)symmetry and of (ir)reversibility. Next I review versio ..."
Abstract

Cited by 13 (2 self)
The aim of this article is to analyse the relation between the second law of thermodynamics and the so-called arrow of time. For this purpose, a number of different aspects of this arrow of time are distinguished, in particular those of time-(a)symmetry and of (ir)reversibility. Next I review versions of the second law in the work of Carnot, Clausius, Kelvin, Planck, Gibbs, Carathéodory, and Lieb and Yngvason, and investigate their connection with these aspects of the arrow of time. It is shown that this connection varies a great deal across these formulations of the second law. According to the famous formulation by Planck, the second law expresses the irreversibility of natural processes. But in many other formulations irreversibility, or even time-asymmetry, plays no role. I therefore argue for the view that the second law has nothing to do with the arrow of time.
Chaos, quantum-classical correspondence, and the algorithmic arrow of time, Physica Scripta T76
, 1998
"... The environment – external or internal degrees of freedom coupled to the system – can, in effect, monitor some of its observables. As a result, the eigenstates of these observables decohere and behave like classical states: Continuous destruction of superpositions leads to environmentinduced supers ..."
Abstract

Cited by 12 (2 self)
The environment – external or internal degrees of freedom coupled to the system – can, in effect, monitor some of its observables. As a result, the eigenstates of these observables decohere and behave like classical states: continuous destruction of superpositions leads to environment-induced superselection (einselection). Here I investigate it in the context of quantum chaos (i.e., quantum dynamics of systems which are classically chaotic). I show that the evolution of a chaotic macroscopic (but, ultimately, quantum) system is not just difficult to predict (requiring accuracy exponentially increasing with time) but quickly ceases to be deterministic in principle, as a result of the Heisenberg indeterminacy (which limits the resolution available in the initial conditions). This happens after a time t_ħ which is only logarithmic in the Planck constant. A definitely macroscopic, if somewhat outrageous, example is afforded by various components of the solar system which are chaotic, with Lyapunov timescales ranging from a bit more than a month (Hyperion) to millions of years (the planetary system as a whole). On the timescale t_ħ the initial minimum-uncertainty wavepackets corresponding to celestial bodies would be smeared over distances
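The logarithmic dependence of t_ħ on the Planck constant can be made concrete with a back-of-the-envelope estimate. The form t_ħ ≈ λ⁻¹ ln(S/ħ), with λ⁻¹ the Lyapunov time and S a characteristic action, is an assumed simplification of the paper's result, and the Hyperion numbers below are illustrative only:

```python
import math

HBAR = 1.054571817e-34  # reduced Planck constant, J*s

def predictability_horizon(lyapunov_time_s, action_j_s):
    """Rough estimate of the time t_hbar after which Heisenberg
    indeterminacy in the initial conditions dominates a chaotic system's
    evolution. Assumed back-of-the-envelope form (not taken verbatim
    from the paper): t_hbar ~ lambda^-1 * ln(S / hbar), where lambda^-1
    is the Lyapunov time and S a characteristic action of the motion."""
    return lyapunov_time_s * math.log(action_j_s / HBAR)

# Hyperion: Lyapunov time a bit over a month (~40 days is assumed here),
# with a crude characteristic action of order 1e40 J*s (illustrative).
lyapunov_time = 40 * 86400.0
t_hbar = predictability_horizon(lyapunov_time, 1e40)
years = t_hbar / (365.25 * 86400.0)
# The logarithm compresses ~74 orders of magnitude into a factor of
# order 170, so t_hbar comes out at mere decades even for a moon.
```

This is the striking point of the abstract: because t_ħ grows only logarithmically as ħ → 0, even flagrantly macroscopic chaotic bodies lose deterministic predictability on astronomically short timescales.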
Thermodynamics and Garbage Collection
 In ACM SIGPLAN Notices
, 1994
"... INTRODUCTION Computer scientists should have a knowledge of abstract statistical thermodynamics. First, computer systems are dynamical systems, much like physical systems, and therefore an important first step in their characterization is in finding properties and parameters that are constant over ..."
Abstract

Cited by 11 (0 self)
INTRODUCTION: Computer scientists should have a knowledge of abstract statistical thermodynamics. First, computer systems are dynamical systems, much like physical systems, and therefore an important first step in their characterization is finding properties and parameters that are constant over time (i.e., constants of motion). Second, statistical thermodynamics successfully reduces macroscopic properties of a system to the statistical behavior of large numbers of microscopic processes. As computer systems become large assemblages of small components, an explanation of their macroscopic behavior may also be obtained as the aggregate statistical behavior of their component parts. If not, the elegance of the statistical-thermodynamical approach can at least provide inspiration for new classes of models. Third, the components of computer systems are approaching the same size as the microscopic pr
Arrow of time in a recollapsing quantum Universe, Phys. Rev. D51
, 1995
"... We show that the WheelerDeWitt equation with a consistent boundary condition is only compatible with an arrow of time that formally reverses in a recollapsing universe. Consistency of these opposite arrows is facilitated by quantum effects in the region of the classical turning point. Since gravita ..."
Abstract

Cited by 9 (4 self)
We show that the Wheeler-DeWitt equation with a consistent boundary condition is only compatible with an arrow of time that formally reverses in a recollapsing universe. Consistency of these opposite arrows is facilitated by quantum effects in the region of the classical turning point. Since gravitational time dilation diverges at horizons, collapsing matter must then start re-expanding “anticausally” (controlled by the reversed arrow) before horizons or singularities can form. We also discuss the meaning of the time-asymmetric expression used in the definition of “consistent histories”. We finally emphasize that there is no mass inflation nor any information-loss paradox in this scenario. In conventional statistical physics, the thermodynamical arrow of time is described by assuming the initial entropy to be extremely small compared
Integrable classical and quantum gravity
 Commun. Math. Phys
, 1998
"... In these lectures we report recent work on the exact quantization of dimensionally reduced gravity [1, 2, 3, 4]. Assuming the presence of commuting Killing symmetries which effectively eliminate the dependence on all but two spacetime coordinates allows us to cast the models into the form of 2d non ..."
Abstract

Cited by 8 (1 self)
In these lectures we report recent work on the exact quantization of dimensionally reduced gravity [1, 2, 3, 4]. Assuming the presence of commuting Killing symmetries, which effectively eliminate the dependence on all but two spacetime coordinates, allows us to cast the models into the form of 2d nonlinear (G/H) coset-space σ-models coupled