Results 1 - 10 of 127,210
Generalized Autoregressive Conditional Heteroskedasticity
Journal of Econometrics, 1986. Cited by 2406 (30 self).
"... A natural generalization of the ARCH (Autoregressive Conditional Heteroskedastic) process introduced in Engle (1982) to allow for past conditional variances in the current conditional variance equation is proposed. Stationarity conditions and autocorrelation structure for this new class of parametric ..."
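For orientation, the conditional variance equation this abstract refers to is usually written, in standard textbook notation rather than necessarily the paper's own, as

    \sigma_t^2 = \omega + \sum_{i=1}^{q} \alpha_i \epsilon_{t-i}^2 + \sum_{j=1}^{p} \beta_j \sigma_{t-j}^2

where \epsilon_t is the innovation and \sigma_t^2 its conditional variance given past information; setting every \beta_j = 0 recovers Engle's ARCH(q), while the \beta_j terms are the "past conditional variances" the abstract mentions.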
Shallow Parsing with Conditional Random Fields
2003. Cited by 581 (8 self).
"... Conditional random fields for sequence labeling offer advantages over both generative models like HMMs and classifiers applied at each sequence position. Among sequence labeling tasks in language processing, shallow parsing has received much attention, with the development of standard evaluation ..."
Dynamic Conditional Correlation: A simple class of multivariate Generalized Autoregressive Conditional Heteroskedasticity Models
Journal of Business & Economic Statistics, 2002. Cited by 711 (17 self).
"... Time-varying correlations are often estimated with multivariate GARCH models that are linear in squares and cross products of the data. A new class of multivariate models called dynamic conditional correlation (DCC) models is proposed. These have the flexibility of univariate GARCH models ..."
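As a sketch of the model class named here, in the DCC(1,1) notation commonly quoted for it (which may differ from the paper's own), the correlation dynamics are

    Q_t = (1 - a - b)\,\bar{Q} + a\,u_{t-1} u_{t-1}' + b\,Q_{t-1}, \qquad R_t = \operatorname{diag}(Q_t)^{-1/2}\, Q_t\, \operatorname{diag}(Q_t)^{-1/2}

where u_t are returns standardized by their univariate GARCH volatilities, \bar{Q} is their unconditional covariance, and R_t is the time-varying correlation matrix; this is what makes the model "linear in squares and cross products of the data" while keeping the univariate GARCH machinery.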
Conditional random fields: Probabilistic models for segmenting and labeling sequence data
2001. Cited by 3485 (85 self).
"... We present conditional random fields, a framework for building probabilistic models to segment and label sequence data. Conditional random fields offer several advantages over hidden Markov models and stochastic grammars for such tasks, including the ability to relax strong independence assumptions ..."
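For reference, the linear-chain form of the model introduced here is commonly written

    p(y \mid x) = \frac{1}{Z(x)} \exp\Big( \sum_{t} \sum_{k} \lambda_k f_k(y_{t-1}, y_t, x, t) \Big)

where the f_k are feature functions over label transitions and the observation sequence, the \lambda_k are learned weights, and Z(x) normalizes over all label sequences. Conditioning on the whole observation sequence x globally, rather than generating it, is what lets the model relax the independence assumptions mentioned above.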
Initial Conditions and Moment Restrictions in Dynamic Panel Data Models
Journal of Econometrics, 1998. Cited by 2393 (16 self).
"... Estimation of the dynamic error components model is considered using two alternative linear estimators that are designed to improve the properties of the standard first-differenced GMM estimator. Both estimators require restrictions on the initial conditions process. Asymptotic efficiency comparisons ..."
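A minimal sketch of the kind of moment conditions at issue, written for a simple AR(1) panel model in standard textbook notation (not a quotation from the paper):

    y_{it} = \alpha y_{i,t-1} + \eta_i + u_{it}, \qquad E[\, y_{i,t-s}\,\Delta u_{it} \,] = 0 \ (s \ge 2), \qquad E[\, \Delta y_{i,t-1}\,(\eta_i + u_{it}) \,] = 0

The first set of moments underlies the standard first-differenced GMM estimator; the second set, which is valid only under a stationarity-type restriction on the initial conditions process, supplies additional moments for the equations in levels and is what the improved estimators exploit.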
CONDENSATION -- conditional density propagation for visual tracking
1998. Cited by 1503 (12 self).
"... The problem of tracking curves in dense visual clutter is challenging. Kalman filtering is inadequate because it is based on Gaussian densities which, being unimodal, cannot represent simultaneous alternative hypotheses. The Condensation algorithm uses “factored sampling”, previously applied to the interpretation of static images, in which the probability distribution of possible interpretations is represented by a randomly generated set. Condensation uses learned dynamical models, together with visual observations, to propagate the random set over time. The result is highly robust tracking of agile motion ..."
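The propagation scheme this abstract (and the next entry) describes is, in modern terms, a particle filter: a weighted sample set is pushed through a learned dynamical model and reweighted by the observation likelihood. The Python below is a generic illustration of one such select-predict-measure step, not the authors' code; the names (condensation_step, dynamics, likelihood) and the toy models are placeholders standing in for the paper's contour dynamics and image observation model.

    import numpy as np

    def condensation_step(particles, weights, observation, dynamics, likelihood, rng):
        """One select-predict-measure step of a Condensation-style particle filter."""
        n = len(particles)
        # Select: factored sampling -- draw states in proportion to their weights.
        idx = rng.choice(n, size=n, p=weights)
        # Predict: push each selected sample through the stochastic dynamical model.
        predicted = np.array([dynamics(s, rng) for s in particles[idx]])
        # Measure: reweight each prediction by the observation likelihood.
        w = np.array([likelihood(observation, s) for s in predicted])
        return predicted, w / w.sum()

    # Toy usage: 1-D random-walk dynamics and a Gaussian observation model.
    rng = np.random.default_rng(0)
    particles = rng.normal(0.0, 1.0, size=200)
    weights = np.full(200, 1.0 / 200)
    dynamics = lambda s, g: s + g.normal(0.0, 0.1)
    likelihood = lambda z, s: np.exp(-0.5 * (z - s) ** 2 / 0.25)
    particles, weights = condensation_step(particles, weights, 0.3, dynamics, likelihood, rng)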
Contour Tracking By Stochastic Propagation of Conditional Density
In Proc. European Conf. Computer Vision, pp. 343-356, Cambridge, UK, 1996. Cited by 661 (23 self).
"... The problem of tracking curves in dense visual clutter is a challenging one. Trackers based on Kalman filters are of limited use; because they are based on Gaussian densities which are unimodal, they cannot represent simultaneous alternative hypotheses. Extensions to the Kalman filter to handle multiple data associations work satisfactorily in the simple case of point targets, but do not extend naturally to continuous curves. A new, stochastic algorithm is proposed here, the Condensation algorithm -- Conditional ..."
Modeling and Forecasting Realized Volatility
2002. Cited by 549 (50 self).
"... this paper is built. First, although raw returns are clearly leptokurtic, returns standardized by realized volatilities are approximately Gaussian. Second, although the distributions of realized volatilities are clearly right-skewed, the distributions of the logarithms of realized volatilities are a ..."
"... -frequency models, we find that our simple Gaussian VAR forecasts generally produce superior forecasts. Furthermore, we show that, given the theoretically motivated and empirically plausible assumption of normally distributed returns conditional on the realized volatilities, the resulting lognormal-normal mixture ..."
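For context, "realized volatility" here refers to volatility measures built from high-frequency intraday returns. In the usual notation (which may differ from the paper's), the realized variance for day t constructed from M intraday returns r_{t,i} is

    RV_t = \sum_{i=1}^{M} r_{t,i}^2

with realized volatility commonly taken as its square root; the abstract's distributional claims concern returns standardized by these measures and the logarithms of the measures themselves.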
Bayesian Analysis of Stochastic Volatility Models
1994. Cited by 601 (26 self).
"... this article is to develop new methods for inference and prediction in a simple class of stochastic volatility models in which the logarithm of conditional volatility follows an autoregressive (AR) time series model. Unlike the autoregressive conditional heteroscedasticity (ARCH) and generalized ARCH ..."
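The model class described here is usually written in its canonical form (the article's parameterization may differ) as

    y_t = \exp(h_t / 2)\,\epsilon_t, \qquad h_t = \mu + \phi\,(h_{t-1} - \mu) + \sigma_\eta \eta_t, \qquad \epsilon_t, \eta_t \sim \text{i.i.d. } N(0, 1)

where h_t is the log conditional volatility. Unlike ARCH/GARCH, the volatility equation carries its own innovation \eta_t, so the likelihood involves integrating over the latent volatility path, which is why inference for such models is typically handled by simulation rather than direct maximum likelihood.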
Fit indices in covariance structure modeling: Sensitivity to underparameterized model misspecification
Psychological Methods, 1998. Cited by 543 (0 self).
"... This study evaluated the sensitivity of maximum likelihood (ML)-, generalized least squares (GLS)-, and asymptotic distribution-free (ADF)-based fit indices to model misspecification, under conditions that varied sample size and distribution. The effect of violating assumptions of asymptotic robustness ..."