Results 1–10 of 140
An empirical investigation of continuous-time equity return models
Journal of Finance, 2002
Abstract

Cited by 170 (11 self)
This paper extends the class of stochastic volatility diffusions for asset returns to encompass Poisson jumps of time-varying intensity. We find that any reasonably descriptive continuous-time model for equity-index returns must allow for discrete jumps as well as stochastic volatility with a pronounced negative relationship between return and volatility innovations. We also find that the dominant empirical characteristics of the return process appear to be priced by the option market. Our analysis indicates a general correspondence between the evidence extracted from daily equity-index returns and the stylized features of the corresponding options market prices.

Much asset and derivative pricing theory is based on diffusion models for primary securities. However, prescriptions for practical applications derived from these models typically produce disappointing results. A possible explanation could be that analytic formulas for pricing and hedging are available for only a limited set of continuous-time representations for asset returns.
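The model class described above can be sketched with a short Euler-scheme simulation: a square-root variance process, negatively correlated return/volatility innovations, and compound Poisson jumps. This is only an illustration of the jump-diffusion mechanics, not the authors' estimation procedure; the function name and all parameter values are illustrative assumptions.

```python
import numpy as np

def simulate_svj(n_days=2520, dt=1/252, mu=0.05, kappa=5.0, theta=0.04,
                 sigma_v=0.5, rho=-0.7, lam=2.0, jump_mu=-0.02,
                 jump_sigma=0.03, seed=0):
    """Euler scheme for a stochastic-volatility diffusion with Poisson jumps.

    Diffusive return shock sqrt(V) dW1, variance shock correlated with it
    (corr rho < 0, the leverage effect), plus jumps arriving at intensity
    lam per year with log-jump sizes N(jump_mu, jump_sigma^2).
    Parameter values are illustrative, not estimates from the paper.
    """
    rng = np.random.default_rng(seed)
    logS = np.zeros(n_days + 1)
    V = np.full(n_days + 1, theta)
    for t in range(n_days):
        z1 = rng.standard_normal()
        z2 = rho * z1 + np.sqrt(1 - rho**2) * rng.standard_normal()
        v = max(V[t], 0.0)                      # keep variance non-negative
        n_jumps = rng.poisson(lam * dt)         # jumps this step (usually 0)
        jump = rng.normal(jump_mu, jump_sigma, size=n_jumps).sum()
        logS[t + 1] = logS[t] + (mu - 0.5 * v) * dt + np.sqrt(v * dt) * z1 + jump
        V[t + 1] = v + kappa * (theta - v) * dt + sigma_v * np.sqrt(v * dt) * z2
    return logS, V

logS, V = simulate_svj()
returns = np.diff(logS)
dV = np.diff(V)
# the pronounced negative return/volatility-innovation relationship
print(np.corrcoef(returns, dV)[0, 1])
```

With rho set to -0.7, the sample correlation between daily returns and variance changes comes out strongly negative, mirroring the leverage effect the abstract emphasizes.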
An Econometric Model of Serial Correlation and Illiquidity in Hedge Fund Returns
Journal of Financial Economics, 2004
Abstract

Cited by 144 (7 self)
The returns to hedge funds and other alternative investments are often highly serially correlated, in sharp contrast to the returns of more traditional investment vehicles such as long-only equity portfolios and mutual funds. In this paper, we explore several sources of such serial correlation and show that the most likely explanation is illiquidity exposure, i.e., investments in securities that are not actively traded and for which market prices are not always readily available. For portfolios of illiquid securities, reported returns will tend to be smoother than true economic returns, which will understate volatility and increase risk-adjusted performance measures such as the Sharpe ratio. We propose an econometric model of illiquidity exposure and develop estimators for the smoothing profile as well as a smoothing-adjusted Sharpe ratio. For a sample of 908 hedge funds drawn from the TASS database, we show that our estimated smoothing coefficients vary considerably across hedge-fund style categories and may be a useful proxy for quantifying illiquidity exposure.
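The smoothing mechanism described above can be illustrated in a few lines: if reported returns are a weighted moving average of current and lagged true returns with weights summing to one, the reported mean is preserved while reported volatility shrinks, inflating the Sharpe ratio. The weights and return distribution below are illustrative assumptions, not the paper's estimated smoothing profile.

```python
import numpy as np

rng = np.random.default_rng(1)

# True economic returns: i.i.d. draws (mean and volatility are illustrative).
true_r = rng.normal(0.01, 0.05, size=5000)

# Reported returns: a weighted moving average of three consecutive true
# returns, with weights summing to one (the "smoothing profile").
theta = np.array([0.5, 0.3, 0.2])
reported = np.convolve(true_r, theta, mode="valid")

sharpe_true = true_r.mean() / true_r.std()
sharpe_reported = reported.mean() / reported.std()
# smoothing understates volatility and inflates the Sharpe ratio
print(reported.std(), true_r.std())
print(sharpe_reported, sharpe_true)
```

Because the weights sum to one, reported volatility is scaled by sqrt(sum of squared weights) < 1 while the mean is essentially unchanged, so the reported Sharpe ratio overstates the true one.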
How often to sample a continuous-time process in the presence of market microstructure noise
Review of Financial Studies, 2005
Abstract

Cited by 111 (13 self)
In theory, the sum of squares of log returns sampled at high frequency estimates their variance. When market microstructure noise is present but unaccounted for, however, we show that the optimal sampling frequency is finite and derive its closed-form expression. But even with optimal sampling, using say 5-minute returns when transactions are recorded every second, a vast amount of data is discarded, in contradiction to basic statistical principles. We demonstrate that modeling the noise and using all the data is a better solution, even if one misspecifies the noise distribution. So the answer is: sample as often as possible.

Over the past few years, price data sampled at very high frequency have become increasingly available in the form of the Olsen dataset of currency exchange rates or the TAQ database of NYSE stocks. If such data were not affected by market microstructure noise, the realized volatility of the process (i.e., the average sum of squares of log returns sampled at high frequency) would estimate the returns' variance, as is well known. In fact, sampling as often as possible would theoretically produce in the limit a perfect estimate of that variance. We start by asking whether it remains optimal to sample the price process at very high frequency in the presence of market microstructure noise, consistently with the basic statistical principle that, ceteris paribus, more data are preferred to less. We first show that, if noise is present but unaccounted for, then the optimal sampling frequency is finite.
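The bias analyzed above is easy to reproduce on simulated data: add i.i.d. noise to an efficient log-price and compare realized variance at tick-by-tick versus sparse sampling. A minimal sketch with illustrative noise and volatility levels (not the paper's closed-form optimum):

```python
import numpy as np

rng = np.random.default_rng(2)

# One trading day of second-by-second efficient log prices (illustrative scale).
n = 23400                          # seconds in a 6.5-hour session
sigma = 0.01                       # daily return volatility, so IV = sigma**2
eff = np.cumsum(rng.normal(0, sigma / np.sqrt(n), size=n + 1))
noise_sd = 0.0005                  # i.i.d. microstructure noise (illustrative)
obs = eff + rng.normal(0, noise_sd, size=n + 1)

def realized_variance(logp, step):
    """Sum of squared log returns sampled every `step` observations."""
    r = np.diff(logp[::step])
    return np.sum(r**2)

rv_1s = realized_variance(obs, 1)      # every tick: noise bias ~ 2*n*noise_sd**2
rv_5min = realized_variance(obs, 300)  # sparse sampling sheds most of the bias
print(rv_1s, rv_5min, sigma**2)
```

At the one-second frequency the noise term dominates and realized variance vastly overstates the true variance, while five-minute sampling is far closer; the paper's point is that discarding data this way is still wasteful once the noise is modeled explicitly.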
Estimating Stochastic Volatility Diffusion Using Conditional Moments of Integrated Volatility
2000
Abstract

Cited by 76 (8 self)
We exploit the distributional information contained in high-frequency intraday data in constructing a simple conditional moment estimator for stochastic volatility diffusions. The estimator is based on the analytical solutions of the first two conditional moments for the integrated volatility, which is effectively approximated by the quadratic variation of the process. We successfully implement the resulting GMM estimator with high-frequency five-minute foreign exchange and equity index returns. Our simulation evidence and actual empirical results indicate that the method is very reliable and accurate. The computational speed of the procedure compares very favorably to other existing estimation methods in the literature.
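The approximation step underlying the estimator can be sketched as follows: simulate a square-root variance path at five-minute resolution and compare the Riemann sum of the variance (the integrated volatility) with the sum of squared returns (the quadratic variation). Parameters are illustrative, and this shows only the approximation, not the GMM estimation itself.

```python
import numpy as np

rng = np.random.default_rng(3)

# One day at 5-minute resolution: 78 intervals (illustrative setup, daily units).
m = 78
dt = 1.0 / m
kappa, theta_v, sigma_v = 0.1, 1e-4, 1e-3   # square-root variance dynamics
V = np.empty(m + 1)
V[0] = theta_v
for t in range(m):
    dV = kappa * (theta_v - V[t]) * dt \
         + sigma_v * np.sqrt(max(V[t], 0.0) * dt) * rng.standard_normal()
    V[t + 1] = max(V[t] + dV, 0.0)          # Euler step, floored at zero

r = np.sqrt(V[:-1] * dt) * rng.standard_normal(m)   # 5-minute returns
integrated_var = np.sum(V[:-1]) * dt        # Riemann sum for the integrated variance
quadratic_var = np.sum(r**2)                # realized measure used in its place
print(integrated_var, quadratic_var)
```

With only 78 intraday observations the quadratic variation is a noisy but unbiased stand-in for the integrated variance; the paper's conditional-moment conditions are then built on this approximation.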
The Adaptive Markets Hypothesis: Market Efficiency from an Evolutionary Perspective
The Journal of Portfolio Management, 2004
Abstract

Cited by 50 (12 self)
The 30th anniversary of The Journal of Portfolio Management is a milestone in the rich intellectual history of modern finance, firmly establishing the relevance of quantitative models and scientific inquiry in the practice of financial management. One of the most enduring ideas from this intellectual history is the Efficient Markets Hypothesis (EMH), a deceptively simple notion that has become a lightning rod for its disciples and the proponents of behavioral economics and finance. In its purest form, the EMH obviates active portfolio management, calling into question the very motivation for portfolio research. It is only fitting that we revisit this groundbreaking idea after three very successful decades of this Journal. In this article, I review the current state of the controversy surrounding the EMH and propose a new perspective that reconciles the two opposing schools of thought. The proposed reconciliation, which I call the Adaptive Markets Hypothesis (AMH), is based on an evolutionary approach to economic interactions, as well as some recent research in the cognitive neurosciences that has been transforming and revitalizing the intersection of psychology and economics. Although some of these ideas have not yet been fully articulated within a rigorous quantitative framework, longtime students of the EMH and seasoned practitioners will no doubt recognize immediately the possibilities generated by this new perspective. Only time will tell whether its potential will be fulfilled. I begin with a brief review of the classic version of the EMH, and then summarize the most significant criticisms leveled against it by psychologists and behavioral economists.
Variation, jumps, market frictions and high frequency data in financial econometrics
2005
Estimating covariation: Epps effect and microstructure noise
Journal of Econometrics, forthcoming (2009)
Abstract

Cited by 37 (3 self)
This paper is about how to estimate the integrated covariance ⟨X, Y⟩_T of two assets over a fixed time horizon [0, T], when the observations of X and Y are "contaminated" and when such noisy observations are at discrete, but not synchronized, times. We show that the usual previous-tick covariance estimator is biased, and the size of the bias is more pronounced for less liquid assets. This is an analytic characterization of the Epps effect. We also provide the optimal sampling frequency which balances the trade-off between the bias and various sources of stochastic error terms, including nonsynchronous trading, microstructure noise, and time discretization. Finally, a two-scales covariance estimator is provided which simultaneously cancels (to first order) the Epps effect and the effect of microstructure noise. The gain is demonstrated in data.
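The Epps effect characterized above can be reproduced with a previous-tick estimator on simulated non-synchronous data: at very fine sampling grids the estimated covariance collapses toward zero, while coarser grids recover most of the true covariation. A minimal sketch under illustrative trade-arrival and volatility assumptions (not the paper's two-scales estimator):

```python
import numpy as np

rng = np.random.default_rng(4)

# Correlated efficient log prices on a fine (per-second) grid.
n = 23400
rho = 0.8
z = rng.standard_normal((2, n))
dx = 1e-4 * z[0]
dy = 1e-4 * (rho * z[0] + np.sqrt(1 - rho**2) * z[1])
X = np.concatenate(([0.0], np.cumsum(dx)))
Y = np.concatenate(([0.0], np.cumsum(dy)))

# Each asset trades at its own random times; Y is much less liquid.
tx = np.sort(np.concatenate(([0], rng.choice(np.arange(1, n + 1), 2000, replace=False))))
ty = np.sort(np.concatenate(([0], rng.choice(np.arange(1, n + 1), 500, replace=False))))

def previous_tick(times, vals, grid):
    """Value at each grid point = last observation at or before that time."""
    idx = np.searchsorted(times, grid, side="right") - 1
    return vals[idx]

def prev_tick_cov(step):
    """Previous-tick covariance estimator on a grid of spacing `step`."""
    grid = np.arange(0, n + 1, step)
    xs = previous_tick(tx, X[tx], grid)
    ys = previous_tick(ty, Y[ty], grid)
    return np.sum(np.diff(xs) * np.diff(ys))

true_cov = np.sum(dx * dy)   # realized covariation on the fine grid
print(prev_tick_cov(5), prev_tick_cov(600), true_cov)
```

At a 5-second grid most intervals contain no fresh observation of the illiquid asset, so the cross products miss most of the covariation and the estimate is biased toward zero; at a 10-minute grid both assets almost always refresh and the bias largely disappears, at the cost of a noisier estimate.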
The Price of Diversifiable Risk in Venture Capital and Private Equity
2002
Abstract

Cited by 31 (0 self)
This paper explores the private equity (PE) and venture capital (VC) markets and demonstrates that unavoidable principal-agent problems result in equilibrium competitive equity prices that are decreasing in the amount of idiosyncratic risk. The structure of information in these markets means that idiosyncratic risk will be priced even if investors can fully diversify and the private capital markets are competitive. VCs are agents who help investors (the principals) find positive-NPV projects. To ensure that VCs screen properly, they must receive compensation based on the performance of their recommendations. Significant time is required to determine if a project is NPV positive, which means that VCs will identify only a small number of investments, exposing them to idiosyncratic risk. Furthermore, VC compensation represents a significant fraction of their wealth. Therefore, they demand returns for the risk they hold. As a result, we show that VC investments have positive alphas while investors in VC funds earn zero alphas. In addition, some positive-NPV projects with significant idiosyncratic risk will not be financed. Furthermore, projects or funds that have more idiosyncratic risk will earn higher returns. This last result can be used to empirically distinguish our idea from fixed compensation or a lack of competition.