Results 1–10 of 27
On the Convergence of Monte Carlo Maximum Likelihood Calculations
 Journal of the Royal Statistical Society B
, 1992
Abstract

Cited by 59 (3 self)
Monte Carlo maximum likelihood for normalized families of distributions (Geyer and Thompson, 1992) can be used for an extremely broad class of models. Given any family {h_θ : θ ∈ Θ} of nonnegative integrable functions, maximum likelihood estimates in the family obtained by normalizing the functions to integrate to one can be approximated by Monte Carlo, the only regularity conditions being a compactification of the parameter space such that the evaluation maps θ ↦ h_θ(x) remain continuous. Then with probability one the Monte Carlo approximant to the log likelihood hypoconverges to the exact log likelihood, its maximizer converges to the exact maximum likelihood estimate, approximations to profile likelihoods hypoconverge to the exact profile, and level sets of the approximate likelihood (support regions) converge to the exact sets (in Painlevé-Kuratowski set convergence). The same results hold when there are missing data (Thompson and Guo, 1991, Gelfand and Carlin, 19...
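As a toy illustration of the scheme the abstract describes (not the authors' code), the normalizing-constant ratio c(θ)/c(ψ) can be estimated by importance sampling from one fixed member ψ of the family, and the resulting approximate log likelihood maximized directly. The unnormalized family h_θ(x) = exp(θx − x²/2) below is an illustrative choice: normalized, it is a Gaussian with mean θ, so the exact MLE is θ = x.

```python
import math
import random

random.seed(0)

# Illustrative unnormalised family: h_theta(x) = exp(theta*x - x^2/2),
# a Gaussian with mean theta once normalised (so the exact MLE is x).
def h(theta, x):
    return math.exp(theta * x - x * x / 2.0)

# Draw one Monte Carlo sample from the psi = 0 member (standard normal)
# and reuse it for every theta on the grid.
psi = 0.0
sample = [random.gauss(0.0, 1.0) for _ in range(10000)]

def mc_loglik(theta, x_obs):
    # log h_theta(x_obs) minus the log of the Monte Carlo estimate of
    # the normalising-constant ratio c(theta)/c(psi).
    ratio = sum(h(theta, xi) / h(psi, xi) for xi in sample) / len(sample)
    return math.log(h(theta, x_obs)) - math.log(ratio)

x_obs = 1.3
grid = [i / 50.0 for i in range(0, 100)]          # theta in [0, 2)
theta_hat = max(grid, key=lambda t: mc_loglik(t, x_obs))
```

Because every grid point reuses the same sample, the Monte Carlo error is smooth in θ, and the approximate maximizer lands close to the exact MLE x_obs, in keeping with the hypoconvergence result quoted above.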
Performance Engineering of the World Wide Web: Application to Dimensioning and Cache Design
, 1996
Abstract

Cited by 40 (0 self)
The quality of the service provided by the World Wide Web, namely convenient access to a tremendous amount of information in remote locations, depends in an important way on the time required to retrieve this information. This time in turn depends on a number of parameters, in particular the load at the server and in the network. Overloads are avoided by carefully dimensioning the server (so that it has enough resources such as CPU power and disk space to handle expected requests) and the network (so that it has enough resources such as bandwidth and buffers to transport requests and replies), and by using mechanisms such as caching that minimize the resource requirements of user requests. In this paper, we consider performance issues related to dimensioning and caching. Our contribution is twofold. Regarding dimensioning, we advocate the use of time series analysis techniques for Web traffic modeling and forecasting. We show using experimental data that quantities of interest such as t...
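A minimal sketch of the time-series viewpoint on dimensioning (the synthetic hourly request counts and the seasonal-naive baseline below are assumptions for illustration, not the paper's data or model):

```python
import math

# Synthetic hourly request counts with a 24-hour cycle plus a slow
# linear growth trend (a stand-in for measured server logs; the model
# and numbers are illustrative, not the paper's).
hours = range(24 * 14)                       # two weeks of hourly data
series = [100.0 + 40.0 * math.sin(2 * math.pi * (h % 24) / 24) + 0.1 * h
          for h in hours]

# Seasonal-naive forecast: predict each hour of the held-out last day
# with the value observed exactly 24 hours earlier.
history, actual = series[:-24], series[-24:]
forecast = history[-24:]

mae = sum(abs(f - a) for f, a in zip(forecast, actual)) / len(actual)
```

For this toy series the seasonal-naive baseline absorbs the daily cycle entirely, leaving only the trend (0.1 requests/hour × 24 hours = 2.4) as forecast error; a serious model would also capture trend and day-of-week effects.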
On the Applicability of Regenerative Simulation in Markov Chain Monte Carlo
, 2001
Abstract

Cited by 35 (24 self)
We consider the central limit theorem and the calculation of asymptotic standard errors for the ergodic averages constructed in Markov chain Monte Carlo. Chan & Geyer (1994) established a central limit theorem for ergodic averages by assuming that the underlying Markov chain is geometrically ergodic and that a simple moment condition is satisfied. While it is relatively straightforward to check Chan and Geyer's conditions, their theorem does not lead to a consistent and easily computed estimate of the variance of the asymptotic normal distribution. In contrast, Mykland, Tierney & Yu (1995) discuss the use of regeneration to establish an alternative central limit theorem with the advantage that a simple, consistent estimate of the asymptotic variance is readily available. However, their result assumes a pair of unwieldy moment conditions whose verification is difficult in practice. In this paper, we show that the conditions of Chan and Geyer's theorem are sufficient to establish Mykland, Tierney, and Yu's central limit theorem. This result, in conjunction with other recent developments, should pave the way for more widespread use of the regenerative method in Markov chain Monte Carlo. Our results are applied to the slice sampler for illustration.
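The regenerative method the abstract refers to can be sketched on a toy chain (the chain, the atom, and the estimator layout below are illustrative assumptions, not the paper's slice-sampler application): split the path into i.i.d. tours at visits to an atom, then estimate both the mean and its asymptotic variance from tour sums.

```python
import random

random.seed(1)

# Toy Markov chain on {0,...,5} whose visits to the atom {0} are exact
# regeneration times, so the path splits into i.i.d. tours. (Toy
# example for the regenerative method; the paper applies it to the
# slice sampler.)
n = 60000
chain = [random.randrange(6) for _ in range(n)]

# Split the path into tours, each starting at a visit to state 0;
# discard the partial stretches before the first and after the last visit.
tours, current = [], None
for state in chain:
    if state == 0:
        if current is not None:
            tours.append(current)
        current = []
    if current is not None:
        current.append(state)

# Regenerative estimates for E[X]: the point estimate is a ratio of
# total tour sums to total tour lengths, and the asymptotic variance
# comes from the centred tour sums Z_j = S_j - mu_hat * N_j.
S = [sum(t) for t in tours]
N = [len(t) for t in tours]
mu_hat = sum(S) / sum(N)
nbar = sum(N) / len(N)
sigma2_hat = sum((s - mu_hat * m) ** 2 for s, m in zip(S, N)) / len(N) / nbar
```

For this chain the stationary mean is 2.5 and the asymptotic variance of the ergodic average equals Var(X) = 35/12 ≈ 2.92, which sigma2_hat recovers without any spectral or batch-means machinery.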
Methods for quantifying the causal structure of bivariate time series
 Int. J. of Bifurcation and Chaos
, 2006
Abstract

Cited by 8 (1 self)
In the study of complex systems one of the major concerns is the detection and characterization of causal interdependencies and couplings between different subsystems. The nature of such dependencies is typically not only nonlinear but also asymmetric, and thus makes the use of symmetric and linear methods ineffective. Moreover, signals sampled from real-world systems are noisy and short, posing additional constraints on the estimation of the underlying couplings. In this article, we compare a set of six recently introduced methods for quantifying the causal structure of bivariate time series extracted from systems with complex dynamical behavior. We discuss the usefulness of the methods for detecting asymmetric couplings and directional flow of information in the context of uni- and bidirectionally coupled deterministic chaotic systems.
Key words: causal structure, nonlinear time series analysis, coupled systems, information flow
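The simplest directional-coupling measure in this family is linear Granger causality, sketched below on a pair of unidirectionally coupled AR(1) processes (a toy linear stand-in; the paper's six methods are mostly nonlinear and are tested on chaotic systems): x's past reduces the prediction error of y, but not vice versa.

```python
import math
import random

random.seed(2)

# Unidirectionally coupled linear AR(1) processes: x drives y, y does
# not drive x. A linear Granger-causality index is computed in both
# directions. (Toy sketch, not one of the paper's six measures.)
n = 5000
x, y = [0.0], [0.0]
for _ in range(n):
    x_prev, y_prev = x[-1], y[-1]
    x.append(0.5 * x_prev + random.gauss(0.0, 1.0))
    y.append(0.5 * y_prev + 0.8 * x_prev + random.gauss(0.0, 1.0))

def rss1(z, u):
    # Residual sum of squares of z regressed on u (zero-mean data, no intercept).
    a = sum(p * c for p, c in zip(u, z)) / sum(p * p for p in u)
    return sum((c - a * p) ** 2 for c, p in zip(z, u))

def rss2(z, u, v):
    # Residual sum of squares of z regressed on u and v (2x2 normal equations).
    suu = sum(p * p for p in u); svv = sum(p * p for p in v)
    suv = sum(p * q for p, q in zip(u, v))
    suz = sum(p * c for p, c in zip(u, z)); svz = sum(p * c for p, c in zip(v, z))
    det = suu * svv - suv * suv
    a = (suz * svv - svz * suv) / det
    b = (suu * svz - suv * suz) / det
    return sum((c - a * p - b * q) ** 2 for c, p, q in zip(z, u, v))

def granger(target, source):
    # log of the error-variance ratio: > 0 when the source's past helps.
    z, u, v = target[1:], target[:-1], source[:-1]
    return math.log(rss1(z, u) / rss2(z, u, v))

gc_x_to_y = granger(y, x)   # clearly positive: x's past helps predict y
gc_y_to_x = granger(x, y)   # near zero: y's past does not help predict x
```

The asymmetry gc_x_to_y >> gc_y_to_x ≈ 0 is exactly the directional signature the compared methods generalize to nonlinear, noisy, and short signals.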
Discriminating Mental Tasks Using EEG Represented by AR Models
 In Proceedings of the 1995 IEEE Engineering in Medicine and Biology Annual Conference
, 1995
Abstract

Cited by 6 (0 self)
EEG signals are modeled using single-channel and multichannel autoregressive (AR) techniques. The coefficients of these models are used to classify EEG data into one of two classes corresponding to the mental task the subjects are performing. A neural network is trained to perform the classification. When applying a trained network to test data, we find that the multivariate AR representation performs slightly better, resulting in an average classification accuracy of about 91%.
I. Introduction and Background
If different mental states can be reliably detected solely on the basis of EEG, then a new means of communication for paralyzed persons can be developed with which, for example, a wheelchair could be controlled. This difficult pattern recognition problem is primarily one of finding a signal representation that captures information related to the mental state of a person in a way that is invariant to time and subject. This paper describes the results of experiments that were perfor...
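The AR-coefficients-as-features idea can be sketched end to end (the synthetic AR(2) "task" signals and the nearest-centroid rule are assumptions for illustration; the paper uses real EEG and a neural-network classifier):

```python
import random

random.seed(3)

# Two synthetic "mental task" classes, each an AR(2) process with its
# own coefficients (stand-ins for EEG segments; a nearest-centroid rule
# in coefficient space replaces the paper's neural network).
def ar2_signal(a1, a2, n=400):
    s = [0.0, 0.0]
    for _ in range(n):
        s.append(a1 * s[-1] + a2 * s[-2] + random.gauss(0.0, 1.0))
    return s[2:]

def ar2_fit(s):
    # Least-squares AR(2) coefficients via the 2x2 normal equations.
    z, u, v = s[2:], s[1:-1], s[:-2]
    suu = sum(p * p for p in u); svv = sum(p * p for p in v)
    suv = sum(p * q for p, q in zip(u, v))
    suz = sum(p * c for p, c in zip(u, z)); svz = sum(p * c for p, c in zip(v, z))
    det = suu * svv - suv * suv
    return ((suz * svv - svz * suv) / det, (suu * svz - suv * suz) / det)

classes = {"task_a": (0.5, -0.3), "task_b": (-0.4, 0.2)}

# Train: average the fitted coefficients of 20 segments per class.
train = {k: [ar2_fit(ar2_signal(*c)) for _ in range(20)] for k, c in classes.items()}
centroid = {k: (sum(f[0] for f in fs) / len(fs), sum(f[1] for f in fs) / len(fs))
            for k, fs in train.items()}

def classify(segment):
    a1, a2 = ar2_fit(segment)
    return min(centroid,
               key=lambda k: (a1 - centroid[k][0]) ** 2 + (a2 - centroid[k][1]) ** 2)

# Evaluate on 25 fresh segments per class.
hits = sum(classify(ar2_signal(*c)) == k
           for k, c in classes.items() for _ in range(25))
accuracy = hits / 50
```

Because the two classes occupy well-separated regions of coefficient space, even this crude classifier is nearly perfect; the hard part on real EEG, as the abstract notes, is making the representation invariant to time and subject.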
Control Relevant Identification of a Compact Disc Pickup Mechanism
, 1993
Abstract

Cited by 3 (0 self)
This paper discusses the control relevant parametric identification of a servo system present in a Compact Disc player. In this application an approximate closed loop identification problem is...
Model Selection and Order Determination for Time Series by Information Between the Past and the Future
 Journal of Time Series Analysis
, 1996
Abstract

Cited by 2 (0 self)
In this paper, the information between the past and the future of a Gaussian stationary sequence is calculated either from its spectral density or from its autocovariances, and is related to the problem of model fitting. It is demonstrated that the criterion of minimum mutual information is a generalization of that of maximum entropy. By employing the above information quantity, we propose a procedure, called LIC for simplicity, to obtain a consistent estimate of the order of the Bloomfield model or the autoregressive model. In Monte Carlo studies, we illustrate the LIC procedure with several examples, and also estimate the spectral density of time series by the Bloomfield model and the LIC method.
Key Words and Phrases: time series, information, parametric models, order selection.
§1 Introduction
The main purpose of this paper is to exhibit some applications of the mutual information (that between the past and the future of a time series) in the field of parametric modelling. Suppose ...
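The order-determination task the abstract addresses can be sketched with a familiar penalized criterion (BIC is used below purely as a stand-in; the paper's LIC is built from the past-future mutual information, not from BIC): simulate an AR(2) series, fit AR(p) by least squares for each candidate p, and pick the order minimizing the criterion.

```python
import math
import random

random.seed(4)

# Simulate an AR(2) series and select the order with a BIC-style
# penalty on the least-squares residual variance. (BIC is a stand-in
# for illustration; the paper's LIC criterion is information-based.)
n = 2000
s = [0.0, 0.0]
for _ in range(n):
    s.append(0.6 * s[-1] - 0.3 * s[-2] + random.gauss(0.0, 1.0))
s = s[2:]

def solve(A, b):
    # Gaussian elimination with partial pivoting for a small p x p system.
    m = len(b)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for i in range(m):
        piv = max(range(i, m), key=lambda r: abs(M[r][i]))
        M[i], M[piv] = M[piv], M[i]
        for r in range(i + 1, m):
            f = M[r][i] / M[i][i]
            for c in range(i, m + 1):
                M[r][c] -= f * M[i][c]
    out = [0.0] * m
    for i in range(m - 1, -1, -1):
        out[i] = (M[i][m] - sum(M[i][j] * out[j] for j in range(i + 1, m))) / M[i][i]
    return out

def bic(p):
    if p == 0:
        rss = sum(v * v for v in s)
        return len(s) * math.log(rss / len(s))
    z = s[p:]
    X = [[s[t - j] for j in range(1, p + 1)] for t in range(p, len(s))]
    A = [[sum(r[i] * r[j] for r in X) for j in range(p)] for i in range(p)]
    b = [sum(r[i] * zi for r, zi in zip(X, z)) for i in range(p)]
    coef = solve(A, b)
    rss = sum((zi - sum(c * xi for c, xi in zip(coef, r))) ** 2
              for zi, r in zip(z, X))
    return len(z) * math.log(rss / len(z)) + p * math.log(len(z))

order = min(range(6), key=bic)   # recovers the true order p = 2
```

Any consistent criterion (BIC here, LIC in the paper) must balance the drop in residual variance against a penalty that grows with p; with 2000 observations the true order 2 wins clearly.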
Minimum weighted norm wavefield reconstruction for AVA imaging
 Geophysical Prospecting
, 2005
Abstract

Cited by 2 (2 self)
Seismic wavefield reconstruction is posed as an inversion problem where, from inadequate and incomplete data, we attempt to recover the data we would have acquired with a denser distribution of sources and receivers. A minimum weighted norm interpolation method is proposed to interpolate prestack volumes before wave-equation amplitude versus angle imaging. Synthetic and real data were used to investigate the effectiveness of our wavefield reconstruction scheme when preconditioning seismic data for wave-equation amplitude versus angle imaging.
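The flavor of spectrum-weighted reconstruction can be sketched in one dimension with the Papoulis-Gerchberg iteration (a simplified stand-in chosen for illustration, not the authors' minimum-weighted-norm algorithm): alternately truncate the spectrum to an assumed band and re-insert the known samples.

```python
import cmath
import math
import random

random.seed(5)

# A band-limited signal with roughly 30% of its samples missing is
# reconstructed by alternating projections: band-limit the spectrum,
# then restore the known samples. (Simplified stand-in for
# minimum-weighted-norm interpolation, not the authors' algorithm.)
n = 64
true = [math.sin(2 * math.pi * 3 * t / n) + 0.5 * math.cos(2 * math.pi * 5 * t / n)
        for t in range(n)]
known = [random.random() > 0.3 for _ in range(n)]

# Precomputed DFT twiddle factors (direct DFT is fine at this size).
W = [[cmath.exp(-2j * math.pi * k * t / n) for t in range(n)] for k in range(n)]

def dft(x):
    return [sum(x[t] * W[k][t] for t in range(n)) for k in range(n)]

def idft(X):
    return [sum(X[k] * W[k][t].conjugate() for k in range(n)).real / n
            for t in range(n)]

x = [v if keep else 0.0 for v, keep in zip(true, known)]
for _ in range(200):
    X = dft(x)
    X = [c if (k <= 8 or k >= n - 8) else 0.0 for k, c in enumerate(X)]  # band limit
    x = idft(X)
    x = [v if keep else xi for v, keep, xi in zip(true, known, x)]       # keep data

missing = [(a - b) ** 2 for a, b, keep in zip(x, true, known) if not keep]
rms_missing = math.sqrt(sum(missing) / max(1, len(missing)))
```

The spectral truncation plays the role of the weighted-norm prior: it forces the reconstruction toward the part of the spectrum where energy is expected, which is what makes the underdetermined interpolation problem well posed.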
Divide and Conquer: Pattern Recognition using Mixtures of Experts
, 1997
Abstract

Cited by 1 (0 self)
... speech recognition task. The mixture of experts is shown to be a superior method for speaker adaptation of connectionist models to new conditions. In addition, the significant improvement of the performance of an ensemble of classifiers via the mixture framework is demonstrated. In addition to these applications, a number of theoretical extensions of the mixture of experts have been made in this thesis. The link between hierarchical mixtures of experts (HME) and other tree based models is described and used to motivate a new training algorithm for the HME, known as tree growing. Tree growing is a constructive algorithm which results in faster training and a more efficient use of parameters than standard training methods. The second extension described is path pruning which is a fast training and evaluation algorithm for deep hierarchies in which paths through the tree which have low probability are ignored. A stabilising method for the algorithm based on weight decay regularisation is
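The divide-and-conquer principle behind the mixture of experts can be shown in a forward pass on a toy target y = |x| (parameters are hand-set for illustration; the thesis trains experts and gates, and hierarchies of them, from data):

```python
import math

# Mixture-of-experts forward pass: two linear experts specialise in the
# two regimes of y = |x|, and a logistic gate blends their outputs.
# (Hand-set parameters for illustration only.)
def expert_neg(x):
    return -x              # specialist for x < 0

def expert_pos(x):
    return x               # specialist for x > 0

def gate(x):
    # Probability assigned to expert_pos; a steep logistic splits the input space.
    return 1.0 / (1.0 + math.exp(-10.0 * x))

def mixture(x):
    g = gate(x)
    return (1.0 - g) * expert_neg(x) + g * expert_pos(x)

grid = [i / 100.0 for i in range(-200, 201)]
mse = sum((mixture(v) - abs(v)) ** 2 for v in grid) / len(grid)
```

Each expert is linear, yet the gated combination approximates the nonlinear target closely everywhere except a narrow band around the gate's decision boundary; an HME stacks such gates into a tree, which is what tree growing and path pruning exploit.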