Results 1–10 of 203
Decoherence, einselection, and the quantum origins of the classical
 Reviews of Modern Physics 75, 715. Available online at http://arxiv.org/abs/quant-ph/0105127
, 2003
Abstract

Cited by 113 (1 self)
The manner in which states of some quantum systems become effectively classical is of great significance for the foundations of quantum physics, as well as for problems of practical interest such as quantum engineering. In the past two decades it has become increasingly clear that many (perhaps all) of the symptoms of classicality can be induced in quantum systems by their environments. Thus decoherence is caused by the interaction in which the environment in effect monitors certain observables of the system, destroying coherence between the pointer states corresponding to their eigenvalues. This leads to environment-induced superselection, or einselection, a quantum process associated with selective loss of information. Einselected pointer states are stable. They can retain correlations with the rest of the universe in spite of the environment. Einselection enforces classicality by imposing an effective ban on the vast majority of the Hilbert space, eliminating especially the flagrantly nonlocal "Schrödinger-cat states." The classical structure of phase space emerges from the quantum Hilbert space in the appropriate macroscopic limit. Combination of einselection with dynamics leads to the idealizations of a point and of a classical trajectory. In measurements, einselection replaces quantum entanglement between the apparatus and the measured system with the classical correlation. Only the preferred pointer observable of the apparatus can store information ...
Evolution of biological complexity
 Proceedings of the National Academy of Sciences
, 2000
Abstract

Cited by 93 (23 self)
In order to make a case for or against a trend in the evolution of complexity in biological evolution, complexity needs to be both rigorously defined and measurable. A recent information-theoretic (but intuitively evident) definition identifies genomic complexity with the amount of information a sequence stores about its environment. We investigate the evolution of genomic complexity in populations of digital organisms and monitor in detail the evolutionary transitions that increase complexity. We show that because natural selection forces genomes to behave as a natural “Maxwell Demon”, within a fixed environment genomic complexity is forced to increase. Darwinian evolution is a simple yet powerful process that requires only a population of reproducing organisms in which each offspring has the potential for a heritable variation from its parent. This principle governs evolution in the natural world, and has gracefully produced organisms of vast complexity. Still, whether or not complexity increases through evolution has become a contentious issue. Gould [1], for example, argues that any recognizable trend can be explained by the “drunkard’s walk” model, where “progress” is due simply to a fixed boundary condition. McShea [2] investigates trends in the evolution of certain types of structural and functional complexity, and finds some evidence of a trend but nothing conclusive. In fact, he concludes that “Something may be increasing. But is it complexity?” Bennett [3], on the ...
Hypercomputation: computing more than the Turing machine
, 2002
Abstract

Cited by 42 (5 self)
In this report I provide an introduction to the burgeoning field of hypercomputation – the study of machines that can compute more than Turing machines. I take an extensive survey of many of the key concepts in the field, tying together the disparate ideas and presenting them in a structure which allows comparisons of the many approaches and results. To this I add several new results and draw out some interesting consequences of hypercomputation for several different disciplines. I begin with a succinct introduction to the classical theory of computation and its place amongst some of the negative results of the 20th century. I then explain how the Church-Turing Thesis is commonly misunderstood and present new theses which better describe the possible limits on computability. Following this, I introduce ten different hypermachines (including three of my own) and discuss in some depth the manners in which they attain their power and the physical plausibility of each method. I then compare the powers of the different models using a device from recursion theory. Finally, I examine the implications of hypercomputation for mathematics, physics, computer science and philosophy. Perhaps the most important of these implications is that the negative mathematical results of Gödel, Turing and Chaitin are each dependent upon the nature of physics. This both weakens these results and provides strong links between mathematics and physics. I conclude that hypercomputation is of serious academic interest within many disciplines, opening new possibilities that were previously ignored because of long-held misconceptions about the limits of computation.
Algorithmic Theories Of Everything
, 2000
Abstract

Cited by 35 (15 self)
The probability distribution P from which the history of our universe is sampled represents a theory of everything or TOE. We assume P is formally describable. Since most (uncountably many) distributions are not, this imposes a strong inductive bias. We show that P(x) is small for any universe x lacking a short description, and study the spectrum of TOEs spanned by two Ps, one reflecting the most compact constructive descriptions, the other the fastest way of computing everything. The former derives from generalizations of traditional computability, Solomonoff’s algorithmic probability, Kolmogorov complexity, and objects more random than Chaitin’s Omega, the latter from Levin’s universal search and a natural resource-oriented postulate: the cumulative prior probability of all x incomputable within time t by this optimal algorithm should be 1/t. Between both Ps we find a universal cumulatively enumerable measure that dominates traditional enumerable measures; any such CEM must assign low probability to any universe lacking a short enumerating program. We derive P-specific consequences for evolving observers, inductive reasoning, quantum physics, philosophy, and the expected duration of our universe.
Strategy and War: The Strategic Theory of John Boyd. Strategy and History Series
, 2007
Abstract

Cited by 33 (0 self)
The cover illustration depicts the comprehensive rendering of the OODA loop which ...
Worlds in the Everett Interpretation
 Studies in the History and Philosophy of Modern Physics
, 2002
Abstract

Cited by 26 (5 self)
This is a discussion of how we can understand the worldview given to us by the Everett interpretation of quantum mechanics, and in particular the rôle played by the concept of ‘world’. The view presented is that we are entitled to use ‘many-worlds’ terminology even if the theory does not specify the worlds in the formalism; this is defended by means of an extensive analogy with the concept of an ‘instant’ or moment of time in relativity, with the lack of a preferred foliation of spacetime being compared with the lack of a preferred basis in quantum theory. Implications for identity of worlds over time, and for relativistic quantum mechanics, are discussed.
A Rosetta stone for quantum mechanics with an introduction to quantum computation
, 2002
Abstract

Cited by 24 (10 self)
The purpose of these lecture notes is to provide readers, who have some mathematical background but little or no exposure to quantum mechanics and quantum computation, with enough material to begin reading ...
A quantum computer only needs one universe
 In Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics, chapter 8
, 2003
Abstract

Cited by 23 (0 self)
The nature of quantum computation is discussed. It is argued that, in terms of the amount of information manipulated in a given time, quantum and classical computation are equally efficient. Quantum superposition does not permit quantum computers to “perform many computations simultaneously” except in a highly qualified and to some extent misleading sense. Quantum computation is therefore not well described by interpretations of quantum mechanics which invoke the concept of vast numbers of parallel universes. Rather, entanglement makes available types of computation process which, while not exponentially larger than classical ones, are unavailable to classical systems. The essence of quantum computation is that it uses entanglement to generate and manipulate a physical representation of the correlations between logical entities, without the need to completely represent the logical entities themselves.
Is Information Meaningful Data?
 Philosophy and Phenomenological Research
Abstract

Cited by 20 (4 self)
There is no consensus yet on the definition of semantic information. This paper contributes to the current debate by criticising and revising the Standard Definition of semantic Information (SDI) as meaningful data, in favour of the Dretske-Grice approach: meaningful and well-formed data constitute semantic information only if they also qualify as contingently truthful. After a brief introduction, SDI is criticised for providing necessary but insufficient conditions for the definition of semantic information. SDI is incorrect because truth-values do not supervene on semantic information, and misinformation (that is, false semantic information) is not a type of semantic information, but pseudo-information, that is, not semantic information at all. This is shown by arguing that none of the reasons for interpreting misinformation as a type of semantic information is convincing, whilst there are compelling reasons to treat it as pseudo-information. As a consequence, SDI is revised to include a necessary truth-condition. The last section summarises the main results of the paper and indicates some interesting areas of application of the revised definition.
The neglected pillar of material computation
 PHYSICA D
, 2008
Abstract

Cited by 20 (7 self)
Many novel forms of computational material have been suggested, from using slime moulds to solve graph searching problems, to using packaging foam to solve differential equations. I argue that attempting to force such novel approaches into the conventional Universal Turing computational framework will provide neither insights into theoretical questions of computation, nor more powerful computational machines. Instead, we should be investigating matter from the perspective of its natural computational capabilities. I also argue that we should investigate non-biological substrates, since these are less complex in that they have not been tuned by evolution to have their particular properties. Only then will we understand both aspects of computation (logical and physical) required to understand the computation occurring in biological systems.