Results 11–20 of 37
A Set-Based Methodology for White Noise Modeling
, 1994
Abstract

Cited by 1 (0 self)
This paper provides a new framework for analyzing white noise disturbances in linear systems: rather than the usual stochastic approach, noise signals are described as elements in sets and their effect is analyzed from a worst-case perspective. The paper studies how these sets must be chosen in order to have adequate properties for system response in the worst case, statistics consistent with the stochastic point of view, and simple descriptions that allow for tractable worst-case analysis. The methodology is demonstrated by considering its implications in two problems: rejection of white noise signals in the presence of system uncertainty, and worst-case system identification. 1 Introduction A general feature of mathematical models in engineering science is the presence of modeling errors, which arise due to poorly understood or highly unpredictable phenomena, or from simplifications deliberately introduced for the sake of model tractability. Essentially two approaches are available ...
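The worst-case viewpoint in this abstract can be illustrated with a standard fact from set-membership analysis (this is not claimed to be the paper's own construction): for a disturbance bounded in amplitude by c, the worst-case output magnitude of a stable LTI system is c times the l1 norm of its impulse response. A minimal sketch, assuming a simple first-order filter:

```python
import numpy as np

# impulse response of a stable first-order filter y[k] = -0.8*y[k-1] + w[k]
a, N = -0.8, 200
h = a ** np.arange(N)

c = 1.0                            # set-membership bound: |w[k]| <= c
worst_gain = c * np.abs(h).sum()   # l1 norm of h times c: worst-case |y[k]|
# the extremal disturbance matches its signs to the time-reversed impulse response
w = c * np.sign(h[::-1])
y_end = np.convolve(w, h)[N - 1]   # output at time N-1 under that disturbance
print(worst_gain, y_end)
```

The extremal input attains the bound exactly, which is what makes the worst-case analysis tight for amplitude-bounded noise sets.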
PROPERTIES OF HIGH-REDSHIFT LYMAN ALPHA CLOUDS II. STATISTICAL PROPERTIES OF THE CLOUDS
, 1993
Abstract

Cited by 1 (0 self)
Curve of growth analysis, applied to the Lyman series absorption ratios deduced in our previous paper, yields a measurement of the logarithmic slope of the distribution of Lyman α clouds in column density N. The observed exponential distribution of the clouds' equivalent widths W is then shown to require a broad distribution of velocity parameters b, extending up to 80 km s⁻¹. We show how the exponential itself emerges in a natural way. An absolute normalization for the differential distribution of cloud numbers in z, N, and b is obtained. By detailed analysis of absorption fluctuations along the line of sight (including correlations among neighboring spectral frequency bins) we are able to put upper limits on the cloud-cloud correlation function ξ on several megaparsec length scales. We show that observed b values, if thermal, are incompatible, in several different ways, with the hypothesis of equilibrium heating and ionization by a background UV flux. Either a significant component of b is due to bulk motion (which we argue against on several grounds), or else the clouds are out of equilibrium, and hotter than is implied by their ionization state, a situation which could be indicative of recent adiabatic collapse. Subject headings: cosmology: observations – quasars – intergalactic medium
Goodness-of-Fit Test for Long-Range Dependent Processes
Abstract

Cited by 1 (0 self)
In this paper, we make use of the information measure introduced by Mokkadem (1997) to build a goodness-of-fit test for long-range dependent processes. Our test statistic is in the frequency domain and can be written as a nonlinear functional of the normalized periodogram. We establish the asymptotic distribution of our statistic under the null hypothesis. Under specific alternative hypotheses, we prove that the power converges to one. The performance of our test procedure is illustrated on different simulated series. In particular, we compare its size and power with those of the test of Chen and Deo.
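The normalized-periodogram statistic itself is not reproduced in the abstract, but its basic frequency-domain ingredient is easy to sketch. A minimal example, computing the periodogram at the Fourier frequencies and checking Parseval's identity (plain white noise stands in for a long-range dependent model; the normalization by a spectral density and Mokkadem's information measure are omitted):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1024
x = rng.standard_normal(n)   # placeholder series; an LRD model would go here

# periodogram I(lambda_j) = |sum_t x_t e^{-i t lambda_j}|^2 / (2 pi n)
# at the Fourier frequencies lambda_j = 2*pi*j/n
I = np.abs(np.fft.fft(x)) ** 2 / (2.0 * np.pi * n)

# Parseval check: (2*pi/n) * sum_j I_j equals the mean square of the series
mass = (2.0 * np.pi / n) * I.sum()
print(mass, np.mean(x ** 2))
```

Functionals of I evaluated on this frequency grid are exactly the kind of statistic the abstract describes.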
Measurements of Second-Order Properties of Point Processes
Abstract

Cited by 1 (1 self)
Abstract—The second-order statistical properties of point processes (PPs) are described by the coincidence function, which can be measured by a coincidence device, but such measurements are long and complicated. We propose another method of measurement, and we analyze its performance. The starting point is that the coincidence function can be deduced from the probability density functions of the lifetimes (the distances between points) of the process. The idea is to transform the PP into a positive signal whose values are these distances. From an appropriate processing of this signal, we deduce the coincidence function. To validate the method, we use PPs for which the coincidence function is known. The agreement between theory and experiment is, in general, excellent. Finally, the method is applied to measure the coincidence functions of some PPs for which no theoretical result is available. Index Terms—Point processes (PPs), signal processing, signal representation, statistical measurements.
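The validation idea can be illustrated on a case where the answer is known: for a homogeneous Poisson process the normalized coincidence function is identically 1, so an estimate built from inter-point distances should hover around that value. A rough sketch with hypothetical parameter choices, using a simple histogram of pairwise distances rather than the paper's signal-processing scheme:

```python
import numpy as np

rng = np.random.default_rng(0)
lam, T = 5.0, 2000.0                       # hypothetical rate and observation window
n = rng.poisson(lam * T)                   # homogeneous Poisson process on [0, T]
t = np.sort(rng.uniform(0.0, T, n))

tau_max, nbins = 2.0, 20
edges = np.linspace(0.0, tau_max, nbins + 1)

def pairs_within(d):
    """Number of ordered pairs (i < j) with t[j] - t[i] <= d."""
    idx = np.searchsorted(t, t + d, side="right")
    return int((idx - np.arange(n) - 1).sum())

counts = np.diff(np.array([pairs_within(d) for d in edges]))

# Poisson prediction: E[# pairs with distance in a bin] = lam^2 * (T - tau) * dtau
lam_hat = n / T
mid = 0.5 * (edges[:-1] + edges[1:])
g_hat = counts / (lam_hat ** 2 * (T - mid) * np.diff(edges))
print(g_hat.round(3))
```

For non-Poisson processes the same estimator traces out the structure (clustering or inhibition) that the coincidence function encodes.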
ANTIBODY REPERTOIRES AND PATHOGEN
, 1999
Abstract
This dissertation is approved, and it is acceptable in quality and form for publication on microfilm: Approved by the Dissertation Committee: Accepted:
Equilibrium Unemployment and Labour Market Flows in the UK
, 1999
Abstract
We argue that equilibrium unemployment has varied in the UK over the last twenty years, and that time series econometric methods have not always been suited to uncovering its evolution. Recent changes in the UK labour market seem to have had a significant impact on equilibrium unemployment, partly through a reduction in prime age participation. We calculate equilibrium unemployment using flows between employment, unemployment and inactivity for prime age members of the workforce. Labour market transitions do appear to have been eased by the acquisition of skills amongst the unemployed in the 1980s and 1990s, offsetting some of the potential effects of skill-biased technical progress. Equilibrium unemployment may have fallen by up to two percentage points in the 1990s as compared to the late 1980s. We would judge that the UK has been close to equilibrium unemployment since late 1996.
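The flows-based calculation can be sketched as the stationary distribution of a three-state Markov chain over employment, unemployment and inactivity. The transition probabilities below are purely illustrative, not the paper's estimates:

```python
import numpy as np

# illustrative monthly transition probabilities (rows: from E, U, N; rows sum to 1)
P = np.array([
    [0.97, 0.01, 0.02],   # employment   -> E, U, N
    [0.20, 0.70, 0.10],   # unemployment -> E, U, N
    [0.03, 0.02, 0.95],   # inactivity   -> E, U, N
])

# stationary distribution: pi P = pi, with pi summing to 1
w, v = np.linalg.eig(P.T)
pi = np.real(v[:, np.argmin(np.abs(w - 1.0))])
pi = pi / pi.sum()

# equilibrium unemployment rate among the active (E + U)
u_star = pi[1] / (pi[0] + pi[1])
print(pi.round(3), round(u_star, 3))
```

Changing the U-to-E exit rate (the skills channel the abstract mentions) shifts u_star directly, which is how flow-based estimates translate labour-market reforms into equilibrium unemployment.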
A Convergence Theorem for the Standard Luria-Delbrück Distribution
, 2003
Abstract
A new scaling for the standard discrete Luria-Delbrück distribution is provided which leads to a weak convergence result as the parameter of the distribution tends to infinity. We show that the stable limiting probability measure has the Fourier transform t ↦ exp(−π|t|/2 − it log |t|).
A Semidirect Approach to Structure from Motion
 The Visual Computer
, 2003
Abstract
The problem of structure from motion is often decomposed into two steps: feature correspondence and three-dimensional reconstruction. This separation often causes gross errors when establishing correspondence fails. Therefore, we advocate the necessity to integrate visual information not only in time (i.e. across different views), but also in space, by matching regions, rather than points, using explicit photometric deformation models. We present an algorithm that integrates image feature tracking and three-dimensional motion estimation into a closed loop, while detecting and rejecting outlier regions that do not fit the model. Due to occlusions and the causal nature of our algorithm, a drift in the estimates accumulates over time. We describe a method to perform global registration of local estimates of motion and structure by matching the appearance of feature regions stored over long time periods. We use image intensities to construct a score function that takes into account changes in brightness and contrast. Our algorithm is recursive and suitable for real-time implementation.
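The abstract mentions an intensity-based score that tolerates changes in brightness and contrast; zero-mean normalized cross-correlation is the textbook score with exactly that invariance (whether it is the paper's precise choice is not stated here). A minimal sketch:

```python
import numpy as np

def ncc(p, q):
    """Zero-mean normalized cross-correlation of two equal-size patches."""
    p = p - p.mean()
    q = q - q.mean()
    return float((p * q).sum() / (np.linalg.norm(p) * np.linalg.norm(q)))

rng = np.random.default_rng(2)
patch = rng.uniform(0.0, 1.0, (15, 15))
# an affine photometric change: contrast scaling and a brightness offset
changed = 1.7 * patch + 0.3
score = ncc(patch, changed)
print(score)   # 1.0 up to rounding: the score ignores brightness and contrast
```

Centering removes the brightness offset and the norm division removes the contrast scale, which is why such scores stay stable across long time periods of appearance change.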
STANDARD LURIA-DELBRÜCK DISTRIBUTION
, 2003
Abstract
A new scaling for the standard discrete Luria-Delbrück distribution is provided which leads to a weak convergence result as the parameter of the distribution tends to infinity. We show that the stable limiting probability measure has the Fourier transform t ↦ exp(−πt/2 − it log t). For the corresponding density an integral representation is derived, which differs from that found in a closely related paper of Kepler and Oprea [4]. In addition, we indicate how the approach is connected to more general compound Poisson distributions.
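The limit law can be checked numerically: Fourier inversion of φ(t) = exp(−π|t|/2 − it log|t|) yields a genuine probability density (a heavy-tailed stable law, so a finite window captures most but not all of the mass). A rough numerical sketch, with grid sizes chosen ad hoc:

```python
import numpy as np

# inversion formula: f(x) = (1/pi) * int_0^inf exp(-pi*t/2) * cos(t*x + t*log t) dt
t = np.linspace(1e-9, 30.0, 6001)       # exp(-pi*t/2) is negligible beyond t ~ 30
dt = t[1] - t[0]
x = np.linspace(-5.0, 30.0, 701)

phase = np.outer(x, t) + (t * np.log(t))[None, :]
f = (np.exp(-np.pi * t / 2.0)[None, :] * np.cos(phase)).sum(axis=1) * dt / np.pi

# trapezoid rule in x; the right tail decays like 1/x^2, so some mass lies beyond 30
mass = float(((f[:-1] + f[1:]) * 0.5 * (x[1] - x[0])).sum())
print(round(mass, 3))   # close to 1: most of the probability mass sits in [-5, 30]
```

The slowly decaying right tail is exactly the feature that makes the Luria-Delbrück ("jackpot") distribution heavy-tailed, and it is why the finite-window mass falls slightly short of 1.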
Bayesian Modelling of Music: Algorithmic Advances and . . .
, 2005
Abstract
In order to perform many signal processing tasks such as classification, pattern recognition and coding, it is helpful to specify a signal model in terms of meaningful signal structures. In general, designing such a model is complicated and for many signals it is not feasible to specify the appropriate structure. Adaptive models overcome this problem by learning structures from a set of signals. Such adaptive models need to be general enough, so that they can represent relevant structures. However, more general models often require additional constraints to guide the learning procedure. In this thesis ...