Results 11–20 of 161
Towards the QFT on Curved Spacetime Limit of QGR. I: A General Scheme
Abstract

Cited by 19 (3 self)
In this article and the companion paper [1] we address the question of how one might obtain the semiclassical limit of ordinary matter quantum fields (QFT) propagating on curved spacetimes (CST) from full-fledged Quantum General Relativity (QGR), starting from first principles. We stress that we do not claim to have a satisfactory answer to this question; rather, our intention is to ignite a discussion by displaying the problems that have to be solved when carrying out such a program. In the first paper of this series of two we propose a general scheme of logical steps that one has to take in order to arrive at such a limit. We discuss the technical and conceptual problems that arise in doing so and how they can be solved in principle. As is to be expected, completely new issues arise due to the fact that QGR is a background-independent theory. For instance, fundamentally the notion of a photon involves not only the Maxwell quantum field but also the metric operator – in a sense, there is no photon vacuum state but a “photon vacuum operator”! Such problems have, to the best of our knowledge, not been discussed in the literature before; we are facing squarely one aspect of the deep conceptual difference between a ...
Bayesian Decision Theory, the Maximum Local Mass Estimate, and Color Constancy
 In Proceedings: Fifth International Conference on Computer Vision, pp. 210–217 (IEEE Computer ...
, 1995
Abstract

Cited by 18 (4 self)
Vision algorithms are often developed in a Bayesian framework. Two estimators are commonly used: maximum a posteriori (MAP) and minimum mean squared error (MMSE). We argue that neither is appropriate for perception problems. The MAP estimator makes insufficient use of structure in the posterior probability. The squared-error penalty of the MMSE estimator does not reflect typical penalties. We describe a new ...
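The contrast between the MAP and MMSE estimators, and the local-mass idea the paper argues for, can be sketched on a toy discretized posterior (the bimodal posterior and the window width below are illustrative assumptions, not values from the paper):

```python
# Toy discretized posterior over a scalar parameter x on [0, 1]:
# a narrow spike at x = 0.2 and a broad lobe around x = 0.7 (both hypothetical).
xs = [i / 100.0 for i in range(101)]
post = [5.0 if i == 20 else 1.0 if 56 <= i <= 84 else 0.01 for i in range(101)]
z = sum(post)
post = [p / z for p in post]                       # normalize

# MAP: the single highest-probability grid point (ignores mass around it).
x_map = xs[max(range(len(xs)), key=lambda i: post[i])]

# MMSE: the posterior mean (minimizes expected squared error).
x_mmse = sum(x * p for x, p in zip(xs, post))

# A local-mass estimate: maximize the posterior mass inside a window,
# rewarding modes that are both probable and robust to small errors.
h = 10  # window half-width in grid cells (0.1 in x units, assumed)
def local_mass(i):
    return sum(post[max(0, i - h):i + h + 1])
x_mlm = xs[max(range(len(xs)), key=local_mass)]
```

The MAP estimate locks onto the improbable-but-sharp spike, the MMSE estimate falls between the two modes, and the local-mass estimate lands inside the broad lobe.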
Bistatic synthetic aperture radar imaging for arbitrary flight trajectories
 IEEE Trans. Image Process
Abstract

Cited by 18 (12 self)
Abstract—In this paper, we present an analytic, filtered-backprojection (FBP) type inversion method for bistatic synthetic aperture radar (BISAR). We consider a BISAR system where a scene of interest is illuminated by electromagnetic waves that are transmitted, at known times, from positions along an arbitrary, but known, flight trajectory and the scattered waves are measured from positions along a different flight trajectory which is also arbitrary, but known. We assume a single-scattering model for the radar data, and we assume that the ground topography is known but not necessarily flat. We use microlocal analysis to develop the FBP-type reconstruction method. We analyze the computational complexity of the numerical implementation of the method and present numerical simulations to demonstrate its performance. Index Terms—Bistatic, filtered backprojection, microlocal analysis, radar, synthetic aperture imaging.
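The backprojection idea underlying such methods can be sketched in a minimal unfiltered form: simulate impulse returns at the bistatic delay for each aperture position, then have every image pixel sum the data at its own delay. The straight trajectories, sampling parameters, and single point scatterer below are invented for illustration, and the filtering step of the actual FBP method is omitted.

```python
from math import hypot

c = 1.0                                    # wave speed (normalized units)
dt = 0.05                                  # fast-time sample spacing
n_t = 600                                  # fast-time samples per trace
apertures = [-5.0 + 0.5 * k for k in range(21)]
tx = [(s, -10.0) for s in apertures]       # transmitter trajectory (assumed)
rx = [(s + 3.0, 10.0) for s in apertures]  # receiver trajectory, a different track
target = (1.0, 2.0)                        # point scatterer on flat topography

def delay(T, R, p):
    # Bistatic travel time: transmitter -> scatterer -> receiver.
    return (hypot(T[0] - p[0], T[1] - p[1]) +
            hypot(R[0] - p[0], R[1] - p[1])) / c

# Single-scattering data: an impulse at the bistatic delay for each aperture.
data = []
for T, R in zip(tx, rx):
    trace = [0.0] * n_t
    trace[int(round(delay(T, R, target) / dt))] = 1.0
    data.append(trace)

# Backprojection: each pixel sums the data sampled at its own bistatic delay;
# contributions add coherently only at the true scatterer location.
grid = [(0.5 * i, 0.5 * j) for i in range(-8, 9) for j in range(-8, 9)]
image = []
for p in grid:
    acc = 0.0
    for (T, R), trace in zip(zip(tx, rx), data):
        k = int(round(delay(T, R, p) / dt))
        if 0 <= k < n_t:
            acc += trace[k]
    image.append(acc)

best = grid[image.index(max(image))]       # brightest pixel
```

The brightest pixel coincides with the scatterer, since only there do all aperture positions contribute in phase.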
Synthetic aperture hitchhiker imaging
 IEEE Transactions on Image Processing
, 2008
Abstract

Cited by 18 (13 self)
Abstract—We introduce a novel synthetic-aperture imaging method for radar systems that rely on sources of opportunity. We consider receivers that fly along arbitrary, but known, flight trajectories and develop a spatio-temporal correlation-based filtered-backprojection-type image reconstruction method. The method involves first correlating the measurements from two different receiver locations. This leads to a forward model where the radiance of the target scene is projected onto the intersection of certain hyperboloids with the surface topography. We next use microlocal techniques to develop a filtered-backprojection-type inversion method to recover the scene radiance. The method is applicable to both stationary and mobile, and cooperative and non-cooperative, sources of opportunity. Additionally, it is applicable to non-ideal imaging scenarios such as those involving arbitrary flight trajectories, and has the desirable property of preserving the visible edges of the scene radiance. We present an analysis of the computational complexity of the image reconstruction method and demonstrate its performance in numerical simulations for single and multiple transmitters of opportunity. Index Terms—Generalized filtered-backprojection, microlocal analysis, passive imaging, radar, synthetic aperture imaging.
Efficient Computation of Stochastic Complexity
 Proceedings of the Ninth International Conference on Artificial Intelligence and Statistics
, 2003
Abstract

Cited by 15 (11 self)
Stochastic complexity of a data set is defined as the shortest possible code length for the data obtainable by using some fixed set of models. This measure is of great theoretical and practical importance as a tool for tasks such as model selection or data clustering. Unfortunately, computing the modern version of stochastic complexity, defined as the Normalized Maximum Likelihood (NML) criterion, requires computing a sum with an exponential number of terms. Therefore, in order to be able to apply the stochastic complexity measure in practice, in most cases it has to be approximated. In this paper, we show that for some interesting and important cases with multinomial data sets, the exponentiality can be removed without loss of accuracy. We also introduce a new computationally efficient approximation scheme based on analytic combinatorics and assess its accuracy, together with earlier approximations, by comparing them to the exact form.
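For the Bernoulli (two-symbol multinomial) case, the exponential sum in the NML normalizer and its polynomial-time regrouping can be checked directly. This is only a sketch of the idea; the paper's algorithms for the general multinomial case are more elaborate.

```python
from math import comb, log

# Maximized Bernoulli likelihood of a binary sequence with k ones out of n.
def ml_prob(k, n):
    if k == 0 or k == n:
        return 1.0
    p = k / n
    return p ** k * (1 - p) ** (n - k)

n = 10

# The NML normalizer as defined: a sum of the maximized likelihood over all
# 2^n possible data sets -- the exponential sum the abstract refers to.
naive = sum(ml_prob(bin(x).count("1"), n) for x in range(2 ** n))

# Grouping sequences by their number of ones leaves only n + 1 terms;
# this is the kind of regrouping that removes the exponentiality.
fast = sum(comb(n, k) * ml_prob(k, n) for k in range(n + 1))

# Stochastic complexity (in nats) of an observed sequence with k_obs ones.
k_obs = 3
sc = -log(ml_prob(k_obs, n)) + log(fast)
```

Both sums agree, but the grouped form scales to data sizes where enumerating all sequences is hopeless.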
Ray theory for a locally layered random medium
, 2004
Abstract

Cited by 14 (8 self)
We consider acoustic pulse propagation in inhomogeneous media over relatively long propagation distances. Our main objective is to characterize the spreading of the travelling pulse due to microscale variations in the medium parameters. The pulse is generated by a point source and the medium is modeled by a smooth three-dimensional background that is modulated by stratified random fluctuations. We refer to such media as locally layered. We show that, when the pulse is observed relative to its random arrival time, it stabilizes to a shape determined by the slowly varying background convolved with a Gaussian. The width of the Gaussian and the random travel time are determined by the medium parameters along the ray connecting the source and the point of observation. The ray is determined by high frequency asymptotics (geometrical optics). If we observe the pulse in a deterministic frame moving with the effective slowness, it does not stabilize and its mean is broader because of the random component of the travel time. The analysis of this phenomenon involves the asymptotic solution of partial differential equations with randomly varying coefficients and is based on a new representation of the field in terms of generalized plane waves that travel in opposite directions relative to the layering.
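The stabilization result can be illustrated numerically by convolving a probe pulse with a Gaussian whose width stands in for the medium statistics along the ray. Both the pulse shape and the width below are assumed values, not taken from the paper.

```python
from math import exp, pi, sqrt

dt = 0.01
t = [i * dt for i in range(-200, 201)]

# Assumed probe pulse: a Gaussian of width 0.1.
pulse = [exp(-(x / 0.1) ** 2) for x in t]

# Stabilized pulse shape: the emitted pulse convolved with a Gaussian kernel
# whose width sigma encodes the random-medium statistics (assumed here).
sigma = 0.15
kernel = [dt * exp(-0.5 * (x / sigma) ** 2) / (sigma * sqrt(2 * pi)) for x in t]

def convolve(a, b):
    # Centered discrete convolution of two equal-length sequences.
    n = len(a)
    out = [0.0] * n
    for i in range(n):
        for j in range(n):
            k = i - j + n // 2
            if 0 <= k < n:
                out[i] += a[j] * b[k]
    return out

observed = convolve(pulse, kernel)
```

The convolved pulse is broader and lower than the emitted one while (approximately) conserving its area, which is the qualitative content of the spreading result.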
Pressure Fields Generated By Acoustical Pulses Propagating in Randomly Layered Media
, 1997
Abstract

Cited by 13 (3 self)
This paper investigates the pressure field generated at the bottom of a high-contrast randomly layered slab by an acoustical pulse emitted at the surface of the slab. This analysis takes place in the framework introduced by Asch, Kohler, Papanicolaou, Postel and White [1] where the incident pulse wavelength is long compared to the correlation length of the random inhomogeneities, but short compared to the size of the slab. This problem has been studied in the one-dimensional case simultaneously by Clouet and Fouque [4] and Lewicki, Burridge and Papanicolaou [6], and for multimode plane wave pulses in Lewicki, Burridge and De Hoop [7]. These situations require only the use of classical diffusion-approximation results, whereas the point-source problem studied in this paper requires a nontrivial combination of diffusion-approximation results with stationary phase methods. The stationary phase method has been used by De Hoop, Chang and Burridge [5] for weakly fluctuating media and in [1] fo...
Asymptotic Analysis Of Tail Probabilities Based On The Computation Of Moments
, 1995
Abstract

Cited by 13 (7 self)
Choudhury and Lucantoni recently developed an algorithm for calculating moments of a probability distribution by numerically inverting its moment generating function. They also showed that high-order moments can be used to calculate asymptotic parameters of the complementary cumulative distribution function when an asymptotic form is assumed, such as F^c(x) ~ α x^β e^(−ηx) as x → ∞. Moment-based algorithms for computing asymptotic parameters are especially useful when the transforms are not available explicitly, as in models of busy periods or polling systems. Here we provide additional theoretical support for this moment-based algorithm for computing asymptotic parameters and new refined estimators for the case β ≠ 0. The new refined estimators converge much faster (as a function of moment order) than the previous estimators, which means that fewer moments are needed, thereby speeding up the algorithm. We also show how to compute all the parameters in a multi-term asymptote of the form F^c(x) ~ Σ_{k=1}^{m} α_k x^(β−k+1) e^(−ηx). We identify conditions under which the estimators converge to the asymptotic parameters and we determine rates of convergence, focusing especially on the case β ≠ 0. Even when β = 0, we show that it is necessary to assume the asymptotic form for the complementary distribution function; the asymptotic form is not implied by convergence of the moment-based estimators alone. In order to get good estimators of the asymptotic decay rate η and the asymptotic power β when β ≠ 0, a multiple-term asymptotic expansion is required. Such asymptotic expansions typically hold when β ≠ 0, corresponding to the dominant singularity of the transform being a multiple pole (β a positive integer) or an algebraic singularity (branch point, β non-integer)...
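In the simplest case of a pure exponential tail, the moment-based idea reduces to estimating the decay rate from ratios of successive moments. The two-term exponential mixture below is an assumed example, not one from the paper:

```python
from math import factorial

# Assumed example: density f(x) = 0.5 * 2 * exp(-2x) + 0.5 * exp(-x),
# whose tail is F^c(x) ~ 0.5 e^(-x), i.e. decay rate eta = 1 (and beta = 0).
# The n-th moment of an Exp(rate) component is n! / rate^n.
def moment(n):
    return 0.5 * factorial(n) / 2.0 ** n + 0.5 * factorial(n) / 1.0 ** n

# Moment-based estimator of the decay rate: eta_n = n * m_{n-1} / m_n.
# High-order moments emphasize the tail, so eta_n converges to eta = 1.
estimates = [n * moment(n - 1) / moment(n) for n in range(1, 31)]
```

Low-order moments mix in the fast-decaying component, so the first estimates overshoot; by moment order 30 the estimate agrees with the true decay rate to many digits.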
Planar Maps and Airy Phenomena
, 2000
Abstract

Cited by 13 (4 self)
A considerable number of asymptotic distributions arising in random combinatorics and analysis of algorithms are of the exponential-quadratic type (e^(−x²)), that is, Gaussian. We exhibit here a new class of "universal" phenomena that are of the exponential-cubic type (e^(ix³)), corresponding to nonstandard distributions that involve the Airy function. Such Airy phenomena are expected to be found in a number of applications, when confluences of critical points and singularities occur. About a dozen classes of planar maps are treated in this way, leading to the occurrence of a common Airy distribution that describes the sizes of cores and of largest (multi)connected components. Consequences include the analysis and fine optimization of random generation algorithms for multiply connected planar graphs.