Results 1–10 of 122
Evolution of biological complexity
 Proceedings of the National Academy of Sciences
, 2000
Abstract

Cited by 65 (17 self)
In order to make a case for or against a trend in the evolution of complexity in biological evolution, complexity needs to be both rigorously defined and measurable. A recent information-theoretic (but intuitively evident) definition identifies genomic complexity with the amount of information a sequence stores about its environment. We investigate the evolution of genomic complexity in populations of digital organisms and monitor in detail the evolutionary transitions that increase complexity. We show that because natural selection forces genomes to behave as a natural “Maxwell Demon”, within a fixed environment genomic complexity is forced to increase. Darwinian evolution is a simple yet powerful process that requires only a population of reproducing organisms in which each offspring has the potential for a heritable variation from its parent. This principle governs evolution in the natural world, and has gracefully produced organisms of vast complexity. Still, whether or not complexity increases through evolution has become a contentious issue. Gould [1], for example, argues that any recognizable trend can be explained by the “drunkard’s walk” model, where “progress” is due simply to a fixed boundary condition. McShea [2] investigates trends in the evolution of certain types of structural and functional complexity, and finds some evidence of a trend but nothing conclusive. In fact, he concludes that “Something may be increasing. But is it complexity?” Bennett [3], on the
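The information-theoretic definition this abstract invokes (genomic complexity as the information a sequence stores about its environment) can be illustrated with a small mutual-information calculation. This is a toy sketch, not the paper's method; the function names and the binary "genome"/"environment" data below are invented for illustration:

```python
import math
from collections import Counter

def entropy(seq):
    """Shannon entropy (in bits) of a sequence of symbols."""
    counts = Counter(seq)
    n = len(seq)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def mutual_information(xs, ys):
    """I(X;Y) = H(X) + H(Y) - H(X,Y) over paired samples."""
    return entropy(xs) + entropy(ys) - entropy(list(zip(xs, ys)))

# Toy example: a "genome" bit that perfectly tracks an environment bit
# shares 1 bit of information with it; an uncorrelated bit shares 0.
env    = [0, 1, 0, 1, 0, 1, 0, 1]
genome = [0, 1, 0, 1, 0, 1, 0, 1]
print(mutual_information(genome, env))  # 1.0 bit
```

On this definition, a genome's complexity rises exactly when its sequence becomes more predictive of (correlated with) the environment it inhabits.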
Decoherence, einselection, and the quantum origins of the classical
 Reviews of Modern Physics 75, 715. Available online at http://arxiv.org/abs/quant-ph/0105127
, 2003
Abstract

Cited by 48 (1 self)
The manner in which states of some quantum systems become effectively classical is of great significance for the foundations of quantum physics, as well as for problems of practical interest such as quantum engineering. In the past two decades it has become increasingly clear that many (perhaps all) of the symptoms of classicality can be induced in quantum systems by their environments. Thus decoherence is caused by the interaction in which the environment in effect monitors certain observables of the system, destroying coherence between the pointer states corresponding to their eigenvalues. This leads to environment-induced superselection, or einselection, a quantum process associated with selective loss of information. Einselected pointer states are stable. They can retain correlations with the rest of the universe in spite of the environment. Einselection enforces classicality by imposing an effective ban on the vast majority of the Hilbert space, eliminating especially the flagrantly nonlocal “Schrödinger-cat states.” The classical structure of phase space emerges from the quantum Hilbert space in the appropriate macroscopic limit. Combination of einselection with dynamics leads to the idealizations of a point and of a classical trajectory. In measurements, einselection replaces quantum entanglement between the apparatus and the measured system with the classical correlation. Only the preferred pointer observable of the apparatus can store information
Hypercomputation: computing more than the Turing machine
, 2002
Abstract

Cited by 32 (5 self)
In this report I provide an introduction to the burgeoning field of hypercomputation – the study of machines that can compute more than Turing machines. I undertake an extensive survey of many of the key concepts in the field, tying together the disparate ideas and presenting them in a structure which allows comparisons of the many approaches and results. To this I add several new results and draw out some interesting consequences of hypercomputation for several different disciplines. I begin with a succinct introduction to the classical theory of computation and its place amongst some of the negative results of the 20th century. I then explain how the Church-Turing Thesis is commonly misunderstood and present new theses which better describe the possible limits on computability. Following this, I introduce ten different hypermachines (including three of my own) and discuss in some depth the manners in which they attain their power and the physical plausibility of each method. I then compare the powers of the different models using a device from recursion theory. Finally, I examine the implications of hypercomputation for mathematics, physics, computer science and philosophy. Perhaps the most important of these implications is that the negative mathematical results of Gödel, Turing and Chaitin are each dependent upon the nature of physics. This both weakens these results and provides strong links between mathematics and physics. I conclude that hypercomputation is of serious academic interest within many disciplines, opening new possibilities that were previously ignored because of long-held misconceptions about the limits of computation.
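The "negative results of the 20th century" that hypercomputation seeks to circumvent center on Turing's proof that no machine can decide halting. The diagonal argument behind it can be sketched directly; the function names below are illustrative, not from the report:

```python
def counterexample(halts):
    """Given a purported total halting decider halts(f) -> bool,
    build a program d that does the opposite of whatever is predicted."""
    def d():
        if halts(d):
            while True:   # decider claims d halts: loop forever instead
                pass
        return None       # decider claims d loops: halt immediately
    return d

# Any total decider is refuted by its own counterexample. For instance,
# a decider that always answers "never halts" is contradicted because
# d then halts at once:
says_never_halts = lambda f: False
d = counterexample(says_never_halts)
d()  # returns immediately, i.e. halts, contrary to the decider's answer
```

A hypermachine, by definition, is any model that escapes this diagonalization, e.g. by performing infinitely many steps in finite time or by consulting a non-computable oracle.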
Algorithmic Theories Of Everything
, 2000
Abstract

Cited by 32 (15 self)
The probability distribution P from which the history of our universe is sampled represents a theory of everything or TOE. We assume P is formally describable. Since most (uncountably many) distributions are not, this imposes a strong inductive bias. We show that P(x) is small for any universe x lacking a short description, and study the spectrum of TOEs spanned by two Ps, one reflecting the most compact constructive descriptions, the other the fastest way of computing everything. The former derives from generalizations of traditional computability, Solomonoff’s algorithmic probability, Kolmogorov complexity, and objects more random than Chaitin’s Omega, the latter from Levin’s universal search and a natural resource-oriented postulate: the cumulative prior probability of all x incomputable within time t by this optimal algorithm should be 1/t. Between both Ps we find a universal cumulatively enumerable measure that dominates traditional enumerable measures; any such CEM must assign low probability to any universe lacking a short enumerating program. We derive P-specific consequences for evolving observers, inductive reasoning, quantum physics, philosophy, and the expected duration of our universe.
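The "fastest way of computing everything" rests on a Levin-style dovetailing schedule in which, during phase t, each program p of length l(p) receives 2^(t - l(p)) steps, so shorter programs get exponentially more time. A toy sketch of that schedule, under the assumption that programs are just labeled placeholders (the names and the phase cap are made up for illustration):

```python
def levin_phases(programs, max_phase):
    """Yield (phase, program, step_budget) for a Levin-style dovetailing
    schedule: in phase t, a program of length l gets 2**(t - l) steps,
    skipping programs whose budget has not yet reached one full step."""
    for t in range(1, max_phase + 1):
        for name, length in programs:
            budget = 2 ** (t - length)
            if budget >= 1:
                yield (t, name, budget)

# A short (length-1) program is scheduled from phase 1 and its budget
# doubles each phase; a length-3 program first runs in phase 3.
progs = [("p_short", 1), ("p_long", 3)]
for row in levin_phases(progs, 3):
    print(row)
# (1, 'p_short', 1)
# (2, 'p_short', 2)
# (3, 'p_short', 4)
# (3, 'p_long', 1)
```

Summed over all phases, the time charged to a program p before it finishes is on the order of 2^l(p) · time(p), which is what makes the induced "speed prior" penalize slow-to-compute universes.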
A Rosetta stone for quantum mechanics with an introduction to quantum computation
, 2002
Abstract

Cited by 21 (9 self)
The purpose of these lecture notes is to provide readers, who have some mathematical background but little or no exposure to quantum mechanics and quantum computation, with enough material to begin reading
Transcending the Limits of Turing Computability
, 1998
Abstract

Cited by 18 (7 self)
Hypercomputation or super-Turing computation is a “computation” that transcends the limit imposed by Turing’s model of computability. The field still faces some basic questions: technical (can we mathematically and/or physically build a hypercomputer?), cognitive (can hypercomputers realize the AI dream?), and philosophical (is thinking more than computing?). The aim of this paper is to address the question: can we mathematically build a hypercomputer? We will discuss the solutions of the Infinite Merchant Problem, a decision problem equivalent to the Halting Problem, based on results obtained in [9, 2]. The accent will be on the new computational technique and results rather than formal proofs.
Worlds in the Everett Interpretation
 Studies in the History and Philosophy of Modern Physics
, 2002
Abstract

Cited by 17 (4 self)
This is a discussion of how we can understand the world-view given to us by the Everett interpretation of quantum mechanics, and in particular the rôle played by the concept of ‘world’. The view presented is that we are entitled to use ‘many-worlds’ terminology even if the theory does not specify the worlds in the formalism; this is defended by means of an extensive analogy with the concept of an ‘instant’ or moment of time in relativity, with the lack of a preferred foliation of spacetime being compared with the lack of a preferred basis in quantum theory. Implications for identity of worlds over time, and for relativistic quantum mechanics, are discussed.
Time-Symmetrized Counterfactuals in Quantum Theory, Tel-Aviv University preprint TAUP
, 1997
Abstract

Cited by 15 (2 self)
Counterfactuals in quantum theory are briefly reviewed and it is argued that they are very different from counterfactuals considered in the general philosophical literature. The issue of time symmetry of quantum counterfactuals is considered and a novel time-symmetric definition of quantum counterfactuals is proposed. This definition is applied to analyzing several controversies related to quantum counterfactuals. There are very many philosophical discussions of the concept of counterfactuals, and especially of the time’s arrow in counterfactuals. There is also a considerable literature on counterfactuals in quantum theory. In order to be a helpful tool in quantum theory, counterfactuals have to be rigorously defined. Unfortunately, the concept of counterfactuals is vague, and this leads to several controversies. I, however, believe that since quantum counterfactuals appear in a much narrower context than in general discussions of counterfactuals, they can be defined unambiguously. I will briefly review counterfactuals in quantum theory and will propose a rigorous definition which can clarify several issues, in particular those related to the time-symmetry of quantum counterfactuals.
The New AI: General & Sound & Relevant for Physics
, 2003
Abstract

Cited by 15 (9 self)
Most traditional artificial intelligence (AI) systems of the past 50 years are either very limited, or based on heuristics, or both. The new millennium, however, has brought substantial progress in the field of theoretically optimal and practically feasible algorithms for prediction, search, inductive inference based on Occam’s razor, problem solving, decision making, and reinforcement learning in environments of a very general type. Since inductive inference is at the heart of all inductive sciences, some of the results are relevant not only for AI and computer science but also for physics, provoking nontraditional predictions based on Zuse’s thesis of the computer-generated universe.
A quantum computer only needs one universe
 In Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics, chapter 8
, 2003
Abstract

Cited by 14 (0 self)
The nature of quantum computation is discussed. It is argued that, in terms of the amount of information manipulated in a given time, quantum and classical computation are equally efficient. Quantum superposition does not permit quantum computers to “perform many computations simultaneously” except in a highly qualified and to some extent misleading sense. Quantum computation is therefore not well described by interpretations of quantum mechanics which invoke the concept of vast numbers of parallel universes. Rather, entanglement makes available types of computation process which, while not exponentially larger than classical ones, are unavailable to classical systems. The essence of quantum computation is that it uses entanglement to generate and manipulate a physical representation of the correlations between logical entities, without the need to completely represent the logical entities themselves.