Results 1–10 of 246
Flux compactification
Abstract

Cited by 53 (2 self)
We review recent work in which compactifications of string and M theory are constructed in which all scalar fields (moduli) are massive and supersymmetry is broken with a small positive cosmological constant, features needed to reproduce real-world physics. We explain how this work implies that there is a “landscape” of string/M theory vacua, perhaps containing many candidates …
Algorithmic Theories Of Everything
, 2000
Abstract

Cited by 32 (15 self)
The probability distribution P from which the history of our universe is sampled represents a theory of everything or TOE. We assume P is formally describable. Since most (uncountably many) distributions are not, this imposes a strong inductive bias. We show that P(x) is small for any universe x lacking a short description, and study the spectrum of TOEs spanned by two Ps, one reflecting the most compact constructive descriptions, the other the fastest way of computing everything. The former derives from generalizations of traditional computability, Solomonoff’s algorithmic probability, Kolmogorov complexity, and objects more random than Chaitin’s Omega; the latter from Levin’s universal search and a natural resource-oriented postulate: the cumulative prior probability of all x incomputable within time t by this optimal algorithm should be 1/t. Between both Ps we find a universal cumulatively enumerable measure that dominates traditional enumerable measures; any such CEM must assign low probability to any universe lacking a short enumerating program. We derive P-specific consequences for evolving observers, inductive reasoning, quantum physics, philosophy, and the expected duration of our universe.
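The trade-off this abstract describes between description length and computation time can be made concrete with Levin's Kt complexity, Kt(x) = min over programs p printing x of ℓ(p) + log₂ time(p), which yields a speed-prior-style weight 2^(−Kt(x)). The sketch below is only an illustration of that arithmetic, not code from the paper; the candidate programs (length in bits, runtime in steps) are hypothetical.

```python
import math

# Hypothetical candidate programs that all print the same string x,
# given as (program length in bits, runtime in steps).
candidates = [(10, 1024), (14, 16), (20, 2)]

def kt(cands):
    """Levin's Kt: minimise length + log2(runtime) over candidate programs."""
    return min(length + math.log2(steps) for length, steps in cands)

k = kt(candidates)   # (14, 16) wins: 14 + log2(16) = 18
weight = 2.0 ** -k   # speed-prior-style weight 2^-Kt(x)
print(k, weight)     # 18.0 and 2^-18
```

Note how a short but slow program (10 bits, 1024 steps) loses to a slightly longer, much faster one: the time penalty is what distinguishes this prior from a plain Kolmogorov-complexity-based one.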
The Collective Stance in Modeling Expertise in Individuals and Organizations
, 1994
Abstract

Cited by 28 (20 self)
This paper is concerned with modeling the nature of expertise and its role in society in relation to research on expert systems and enterprise models. It argues for the adoption of a collective stance in which the human species is viewed as a single organism recursively partitioned in space and time into suborganisms that are similar to the whole. These parts include societies, organizations, groups, individuals, roles, and neurological functions. Notions of expertise arise because the organism adapts as a whole through adaptation of its interacting parts. The phenomena of expertise correspond to those leading to distribution of tasks and functional differentiation of the parts. The mechanism is one of positive feedback from parts of the organism allocating resources for action to other parts on the basis of those latter parts’ past performance of similar activities. Distribution and differentiation follow if performance is rewarded, and low performers of tasks, being excluded by the f...
Relativistic Computers and the Turing Barrier
, 2006
Abstract

Cited by 22 (10 self)
We examine the current status of the physical version of the Church-Turing Thesis (PhCT for short) in view of the latest developments in spacetime theory. This also amounts to investigating the status of hypercomputation in view of the latest results on spacetime. We agree with Deutsch et al. [17] that PhCT is not only a conjecture of mathematics but rather a conjecture of a combination of theoretical physics, mathematics and, in some sense, cosmology. Since the idea of computability is intimately connected with the nature of Time, the relevance of spacetime theory seems unquestionable. We will see that recent developments in spacetime theory show that temporal developments may exhibit features that traditionally seemed impossible or absurd. We will see that recent results point in the direction that the possibility of artificial systems computing non-Turing-computable functions may be consistent with spacetime theory. All of these results trigger new open questions and new research directions for spacetime theory, cosmology, and computability.
Existential Risks: Analyzing Human Extinction Scenarios
 Journal of Evolution and Technology
Abstract

Cited by 22 (7 self)
Because of accelerating technological progress, humankind may be rapidly approaching a critical phase in its career. In addition to well-known threats such as nuclear holocaust, the prospects of radically transforming technologies like nanotech systems and machine intelligence present us with unprecedented opportunities and risks. Our future, and whether we will have a future at all, may well be determined by how we deal with these challenges. In the case of radically transforming technologies, a better understanding of the transition dynamics from a human to a “posthuman” society is needed. Of particular importance is to know where the pitfalls are: the ways in which things could go terminally wrong. While we have had long exposure to various personal, local, and endurable global hazards, this paper analyzes a recently emerging …
Towards applying computational complexity to foundations of physics
 Notes of Mathematical Seminars of St. Petersburg Department of Steklov Institute of Mathematics
, 2004
Abstract

Cited by 18 (17 self)
In one of his early papers, D. Grigoriev analyzed the decidability and computational complexity of different physical theories, motivated by the hope that such an analysis would help physicists. In this paper, we survey several similar ideas that may be of help to physicists. We hope that further research may lead to useful physical applications.
Cosmological Constant: The Weight of the Vacuum
 Phys. Rept
, 2003
Abstract

Cited by 17 (1 self)
Recent cosmological observations suggest the existence of a positive cosmological constant Λ with the magnitude Λ(Gℏ/c³) ≈ 10⁻¹²³. This review discusses several aspects of the cosmological constant from both the cosmological (sections 1–6) and field-theoretical (sections 7–11) perspectives. The first section introduces the key issues related to the cosmological constant and provides a brief historical overview. This is followed by a summary of the kinematics and dynamics of the standard Friedmann model of the universe, paying special attention to features involving the cosmological constant. Section 3 reviews the observational evidence for the cosmological constant, especially the supernova results, constraints from the age of the universe, and a few others. Theoretical models (quintessence, tachyonic scalar field, ...) with an evolving cosmological ‘constant’ are described from different perspectives in the next section. Constraints on dark energy from structure formation and from CMBR anisotropies are discussed in the next two sections. The latter part of the review (sections 7–11) concentrates on more conceptual and fundamental aspects of the cosmological constant. Section 7 provides some alternative interpretations of the cosmological constant which could have a bearing on a possible solution to the problem. Several relaxation mechanisms have been suggested in the literature to reduce the cosmological constant to the currently observed value, and some of these attempts are described in section 8. Section 9 gives a brief description of the geometrical structure of the de Sitter spacetime, and the thermodynamics of the de Sitter universe is taken up in section 10. The last section deals with the role of string theory in the cosmological constant problem.
A dying universe: the long-term fate and evolution of astrophysical objects
 Rev. Mod. Phys
, 1997
Abstract

Cited by 15 (0 self)
This paper outlines astrophysical issues related to the long-term fate of the universe. We consider the evolution of planets, stars, stellar populations, galaxies, and the universe itself over time scales which greatly exceed the current age of the universe. Our discussion starts with new stellar evolution calculations which follow the future evolution of the low-mass (M-type) stars that dominate the stellar mass function. We derive scaling relations which describe how the range of stellar masses and lifetimes depends on forthcoming increases in metallicity. We then proceed to determine the ultimate mass distribution of stellar remnants, i.e., the neutron stars, white dwarfs, and brown dwarfs remaining at the end of stellar evolution; this aggregate of remnants defines the “final stellar mass function”. At times exceeding ∼1–10 trillion years, the supply of interstellar gas will be exhausted, yet star formation will continue at a highly attenuated level via collisions between brown dwarfs. This process tails off as the galaxy gradually depletes its stars by ejecting the majority and driving a minority toward eventual accretion onto massive black holes. As the galaxy disperses, stellar remnants provide a mechanism for converting the halo dark matter into radiative energy. Posited weakly interacting massive particles are accreted by white dwarfs, where they subsequently annihilate with each other. Thermalization of the decay products keeps the old white dwarfs much warmer …
Categorizing different approaches to the cosmological constant problem
Abstract

Cited by 14 (1 self)
We have found that proposals addressing the old cosmological constant problem come in various categories. The aim of this paper is to identify as many different, credible mechanisms as possible and to provide them with a code for future reference. We find that they can all be classified into five different schemes, of which we indicate the advantages and drawbacks. In addition, we add a new approach based on a symmetry principle mapping real to …