Dark Energy from Structure: A Status Report
 GEN. REL. GRAV., DARK ENERGY SPECIAL ISSUE, 2007
Cited by 58 (9 self)
The effective evolution of an inhomogeneous universe model in any theory of gravitation may be described in terms of spatially averaged variables. In Einstein’s theory, restricting attention to scalar variables, this evolution can be modeled by solutions of a set of Friedmann equations for an effective volume scale factor, with matter and backreaction source terms. The latter can be represented by an effective scalar field (‘morphon field’) modeling Dark Energy. The present work provides an overview of the Dark Energy debate in connection with the impact of inhomogeneities, and formulates strategies for a comprehensive quantitative evaluation of backreaction effects both in theoretical and observational cosmology. We recall the basic steps of a description of backreaction effects in relativistic cosmology that lead to refurnishing the standard cosmological equations, but also lay down a number of challenges and unresolved issues in connection with their observational interpretation. The present status of this subject is intermediate: we have a good qualitative understanding of backreaction effects pointing to a global instability of the standard …
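As a reminder of the structure the abstract refers to, the spatially averaged equations for irrotational dust (the Buchert equations) take the following form, where D is the averaging domain and a_D the effective volume scale factor:

```latex
% Averaged Hamiltonian constraint and averaged Raychaudhuri equation
% for irrotational dust on a domain D (cosmological constant set to zero):
3\left(\frac{\dot a_D}{a_D}\right)^{2}
  = 8\pi G \,\langle\varrho\rangle_D
  - \frac{1}{2}\,\langle\mathcal{R}\rangle_D
  - \frac{1}{2}\,Q_D ,
\qquad
3\,\frac{\ddot a_D}{a_D}
  = -\,4\pi G \,\langle\varrho\rangle_D + Q_D ,
% with the kinematical backreaction built from expansion and shear fluctuations:
\qquad
Q_D = \frac{2}{3}\left(\langle\theta^{2}\rangle_D - \langle\theta\rangle_D^{2}\right)
  - 2\,\langle\sigma^{2}\rangle_D .
```

These have the Friedmann form with an extra source Q_D; a positive Q_D acts like an acceleration term, which is how the ‘morphon’ mimics Dark Energy.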
Cosmological Non-Linearities as an Effective Fluid
 JCAP
Cited by 23 (5 self)
The universe is smooth on large scales but very inhomogeneous on small scales. Why is the spacetime on large scales modeled to a good approximation by the Friedmann equations? Are we sure that small-scale non-linearities do not induce a large backreaction? Related to this, what is the effective theory that describes the universe on large scales? In this paper we make progress in addressing these questions. We show that the effective theory for the long-wavelength universe behaves as a viscous fluid coupled to gravity: integrating out short-wavelength perturbations renormalizes the homogeneous background and introduces dissipative dynamics into the evolution of long-wavelength perturbations. The effective fluid has small perturbations and is characterized by a few parameters like an equation of state, a sound speed and a viscosity parameter. These parameters can be matched to numerical simulations or fitted from observations. We find that the backreaction of small-scale non-linearities is very small, being suppressed by the large hierarchy between the scale of non-linearities and the horizon scale. The effective pressure of the fluid is always positive and much too small to significantly affect the background evolution. Moreover, we prove that virialized scales decouple completely from the large-scale dynamics, at all orders in the post-Newtonian expansion. We propose that our effective theory be used to formulate a well-defined and controlled alternative to conventional perturbation theory, and we discuss possible observational applications. Finally, our way of reformulating results in second-order perturbation theory in terms of a long-wavelength effective fluid provides the opportunity to understand nonlinear effects in a simple and physically intuitive way.
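The viscous-fluid description can be sketched with the standard stress tensor of a dissipative fluid; matching the effective pressure p, sound speed c_s², and viscosities η, ζ to simulations is the content of the paper, while the tensor below is just the generic form such a fluid takes:

```latex
% Schematic stress tensor of a viscous fluid: isotropic pressure plus
% shear-viscosity (eta) and bulk-viscosity (zeta) dissipative terms,
% with c_s^2 = \partial p / \partial \rho the sound speed.
\tau_{ij} = p\,\delta_{ij}
  - \eta\left(\partial_i v_j + \partial_j v_i
      - \tfrac{2}{3}\,\delta_{ij}\,\partial_k v^k\right)
  - \zeta\,\delta_{ij}\,\partial_k v^k .
```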
Confronting Lemaitre-Tolman-Bondi models with Observational Cosmology
, 802
Cited by 22 (0 self)
Abstract. The possibility that we live in a special place in the universe, close to the centre of a large void, seems an appealing alternative to the prevailing interpretation of the acceleration of the universe in terms of a ΛCDM model with a dominant dark energy component. In this paper we confront the asymptotically flat Lemaitre-Tolman-Bondi (LTB) models with a series of observations, from Type Ia Supernovae to Cosmic Microwave Background and Baryon Acoustic Oscillations data. We propose two concrete LTB models describing a local void in which the only arbitrary functions are the radial dependence of the matter density ΩM and the Hubble expansion rate H. We find that all observations can be accommodated within 1 sigma, for our models with 4 or 5 independent parameters. The best fit models have a χ² very close to that of the ΛCDM model. A general Fortran program for comparing LTB models with cosmological observations, that has been used to make the parameter scan in this paper, is made public, and can be downloaded at …
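For reference, the LTB line element underlying these models, in synchronous comoving coordinates, is:

```latex
% Lemaitre-Tolman-Bondi metric: R(r,t) is the areal radius, E(r) a free
% curvature function, and R' = \partial R / \partial r; radial inhomogeneity
% enters through the r-dependence of R and E.
ds^2 = -\,dt^2 + \frac{R'^2(r,t)}{1 + 2E(r)}\,dr^2 + R^2(r,t)\,d\Omega^2 .
```

The two free functions mentioned in the abstract, ΩM(r) and H(r), are reparameterizations of the freedom in R and E.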
On the curvature of the present day Universe
, 2008
Cited by 9 (5 self)
Abstract. We discuss the effect of curvature and matter inhomogeneities on the averaged scalar curvature of the present-day Universe. Motivated by studies of averaged inhomogeneous cosmologies, we consider the question of whether it is sensible to assume that curvature averages out on some scale of homogeneity, as implied by the standard concordance model of cosmology, or whether the averaged scalar curvature can be largely negative today, as required for an explanation of Dark Energy from inhomogeneities. We confront both conjectures with a detailed analysis of the kinematical backreaction term and estimate its strength for a multi-scale inhomogeneous matter and curvature distribution. Our main result is a formula for the spatially averaged scalar curvature involving quantities that are all measurable on regional (i.e. up to 100 Mpc) scales. We propose strategies to quantitatively evaluate the formula, and pinpoint the assumptions implied by the conjecture of a small or zero averaged curvature. We reach the conclusion that the standard concordance model needs fine-tuning in the sense of an assumed equipartition law for curvature in order to reconcile it with the estimated properties of the averaged physical space, whereas a negative averaged curvature is favoured, independent of the prior on the value of the cosmological constant.
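The coupling between the kinematical backreaction Q_D and the averaged scalar curvature that drives this analysis is the standard integrability condition of the averaged dust equations:

```latex
% Integrability condition for irrotational dust: backreaction Q_D and the
% averaged scalar curvature <R>_D cannot evolve independently, so a
% non-trivial Q_D generically forces the curvature away from the constant
% a_D^{-2} Friedmann behaviour.
\partial_t\!\left(a_D^{6}\, Q_D\right)
  + a_D^{4}\,\partial_t\!\left(a_D^{2}\,\langle\mathcal{R}\rangle_D\right) = 0 .
```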
Ricci flow deformation of cosmological initial data sets
, 2008
Cited by 4 (1 self)
Ricci flow deformation of cosmological initial data sets in general relativity is a technique for generating families of initial data sets which could potentially allow one to interpolate between distinct spacetimes. This idea has been around since the appearance of the Ricci flow on the scene, but it has been difficult to turn it into a sound mathematical procedure. In this expository talk we illustrate how Perelman’s recent results in Ricci flow theory can considerably improve this situation. From a physical point of view this analysis can be related to the issue of finding a constant-curvature template spacetime for the inhomogeneous Universe, relevant to the interpretation of observational data and, hence, bears relevance to the dark energy and dark matter debates. These techniques provide control on curvature fluctuations (intrinsic backreaction terms) in their relation to the averaged matter distribution.
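The deformation in question is the Ricci flow on the spatial metric of an initial data slice:

```latex
% Ricci flow: a one-parameter deformation of the spatial metric g_{ij}
% driven by its own Ricci curvature; beta is the (unphysical) flow
% parameter, and the flow smooths curvature inhomogeneities.
\frac{\partial g_{ij}}{\partial \beta} = -\,2\,R_{ij} .
```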
Testing backreaction effects with observations
 arXiv:0808.1161
Cited by 3 (0 self)
Abstract. In order to quantitatively test the ability of averaged inhomogeneous cosmologies to correctly describe observations of the large-scale properties of the Universe, we introduce a smoothed template metric corresponding to a constant spatial curvature model at any time, but with an evolving curvature parameter. This metric is used to compute quantities along an approximate effective lightcone of the averaged model of the Universe. As opposed to the standard Friedmann model, we parameterize this template metric by exact scaling properties of an averaged inhomogeneous cosmology, and we also motivate this form of the metric by results on a geometrical smoothing of inhomogeneous cosmological hypersurfaces. We test our hypothesis for the template metric against supernova data and the position of the CMB peaks, and infer the goodness-of-fit and parameter uncertainties. We find that averaged inhomogeneous models can reproduce the observations without requiring an additional Dark Energy component (but they still need volume acceleration), and that current data do not disfavour our main assumption on the effective lightcone structure. We also show that the experimental uncertainties on the angular diameter distance and the Hubble …
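A template metric of the kind described takes an FLRW-like form in which the curvature parameter is promoted to a function of time; schematically:

```latex
% Template metric sketch: same algebraic form as a constant-curvature
% Friedmann metric at each instant, but with an evolving curvature
% parameter kappa_D(t) tied to the averaged model, and a_D(t) the
% effective volume scale factor.
ds^2 = -\,dt^2 + a_D^2(t)\left[\frac{dr^2}{1 - \kappa_D(t)\,r^2}
  + r^2\,d\Omega^2\right].
```

Unlike a genuine Friedmann solution, κ_D(t) here is not constrained to scale as a constant k; its evolution is fixed by the scaling properties of the averaged cosmology.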
An Attempt To Explain Dark Energy In Terms Of Statistical Anisotropy
, 2010
An attempt to explain dark energy is made in terms of a modelling error introduced by the simplifications used to derive the Friedmann equations of cosmology. A demonstration is given that small, statistically fluctuating anisotropic terms in the Robertson-Walker line element could effect an unexpected additional expansion. A modelling error might therefore account for the cosmological constant, and anisotropies explain the anomalous acceleration in the expansion of the Cosmos. Such an interpretation has the advantage of not requiring any new fundamental physics. Assuming a flat Friedmann cosmology and then adding statistical anisotropies leads to the need for a locally hyperbolic geometry rather than the original flat one. Hyperbolic geometry is more expansive and open than flat geometry in some sense, and so the tendency towards openness is suggestive of dark energy. Unfortunately (and for several reasons necessarily) an actual acceleration term does not appear in this local geometry in spite of this feature.
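A minimal illustration of the kind of perturbed line element involved (a hypothetical parameterization, not necessarily the one used in the paper) is a flat Robertson-Walker metric with small zero-mean fluctuations in each spatial direction:

```latex
% Hypothetical sketch: epsilon_i are small, statistically fluctuating,
% zero-mean anisotropy terms perturbing the three spatial directions
% of a flat Robertson-Walker background with scale factor a(t).
ds^2 = -\,dt^2 + a^2(t)\left[(1+\epsilon_1)\,dx^2
  + (1+\epsilon_2)\,dy^2 + (1+\epsilon_3)\,dz^2\right],
\qquad \langle\epsilon_i\rangle = 0 .
```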
Contents
, 2005
Abstract. We consider a system of particles experiencing diffusion and mean field interaction, and study its behaviour when the number of particles goes to infinity. We derive non-asymptotic large deviation bounds measuring the concentration of the empirical measure of the paths of the particles around its limit. The method is based on a coupling argument, strong integrability estimates on the paths in Hölder norm, and some general concentration result for the empirical measure of identically distributed independent paths.
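The bounds in question have the schematic shape of a non-asymptotic concentration inequality (this is the generic form, not the paper's precise statement):

```latex
% Schematic concentration bound: the empirical measure L^N of the N
% particle paths stays within distance epsilon of its mean-field limit P,
% up to a probability that is exponentially small in N; d is a metric on
% path measures and I(epsilon) > 0 a rate function.
\mathbb{P}\!\left(d\!\left(L^N, P\right) > \epsilon\right)
  \le C(\epsilon)\, e^{-N\, I(\epsilon)} .
```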