Results 1–10 of 282
Over-the-Counter Markets
, 2005
Abstract

Cited by 126 (4 self)
We study how intermediation and asset prices in over-the-counter markets are affected by illiquidity associated with search and bargaining. We compute explicitly the prices at which investors trade with each other, as well as marketmakers’ bid and ask prices, in a dynamic model with strategic agents. Bid-ask spreads are lower if investors can more easily find other investors, or have easier access to multiple marketmakers. With a monopolistic marketmaker, bid-ask spreads are higher if investors have …
Causal Parameters and Policy Analysis in Economics: A Twentieth Century Retrospective
 Quarterly Journal of Economics 115 (February
 In Means-Tested Transfers in the
Abstract

Cited by 122 (6 self)
JEL No. C10 The major contributions of twentieth century econometrics to knowledge were the definition of causal parameters when agents are constrained by resources and markets and causes are interrelated, the analysis of what is required to recover causal parameters from data (the identification problem), and clarification of the role of causal parameters in policy evaluation and in forecasting the effects of policies never previously experienced. This paper summarizes the development of those ideas by the Cowles Commission, the response to their work by structural econometricians and VAR econometricians, and the response to structural and VAR econometrics by calibrators, advocates of natural and social experiments, and by nonparametric econometricians and statisticians.
Sequential prediction of individual sequences under general loss functions
 IEEE Trans. on Information Theory
, 1998
Abstract

Cited by 93 (8 self)
Abstract—We consider adaptive sequential prediction of arbitrary binary sequences when the performance is evaluated using a general loss function. The goal is to predict on each individual sequence nearly as well as the best prediction strategy in a given comparison class of (possibly adaptive) prediction strategies, called experts. By using a general loss function, we generalize previous work on universal prediction, forecasting, and data compression. However, here we restrict ourselves to the case when the comparison class is finite. For a given sequence, we define the regret as the total loss on the entire sequence suffered by the adaptive sequential predictor, minus the total loss suffered by the predictor in the comparison class that performs best on that particular sequence. We show that for a large class of loss functions, the minimax regret is either (log N) …
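For concreteness, the regret defined in this abstract can be computed with the standard exponentially weighted average forecaster. The sketch below uses absolute loss and two constant experts; the algorithm, learning rate, and toy data are our illustrative choices, not taken from the paper.

```python
import math

def exp_weighted_forecast(expert_preds, outcomes, eta=0.5):
    """Exponentially weighted average forecaster under absolute loss.

    expert_preds: one prediction sequence (values in [0, 1]) per expert
    outcomes:     binary outcome sequence in {0, 1}
    Returns (learner_loss, best_expert_loss); the regret is their difference.
    """
    weights = [1.0] * len(expert_preds)
    learner_loss = 0.0
    expert_losses = [0.0] * len(expert_preds)
    for t, y in enumerate(outcomes):
        total = sum(weights)
        # Predict with the weighted average of the experts' predictions.
        p = sum(w * preds[t] for w, preds in zip(weights, expert_preds)) / total
        learner_loss += abs(p - y)
        for i, preds in enumerate(expert_preds):
            loss = abs(preds[t] - y)
            expert_losses[i] += loss
            weights[i] *= math.exp(-eta * loss)  # downweight poorly performing experts
    return learner_loss, min(expert_losses)

# Two constant experts on a mostly-ones sequence:
experts = [[0.0] * 8, [1.0] * 8]
outcomes = [1, 1, 1, 0, 1, 1, 1, 1]
L, L_star = exp_weighted_forecast(experts, outcomes)
regret = L - L_star  # learner's total loss minus the best expert's total loss
```

For a finite class of N experts this construction keeps the regret of order log N on bounded losses, which is the regime the abstract's minimax result refines.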
On Convergence Rates in the Central Limit Theorems for Combinatorial Structures
, 1998
Abstract

Cited by 82 (11 self)
Flajolet and Soria established several central limit theorems for the parameter "number of components" in a wide class of combinatorial structures. In this paper, we shall prove a simple theorem which applies to characterize the convergence rates in their central limit theorems. This theorem is also applicable to arithmetical functions. Moreover, asymptotic expressions are derived for moments of integral order. Many examples from different applications are discussed.
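As a reference point for what a "convergence rate" in a central limit theorem means, the classical Berry–Esseen bound for i.i.d. summands is the textbook prototype (shown here for orientation only; it is not the paper's theorem for combinatorial structures):

$$\sup_{x\in\mathbb{R}}\left|\,\mathbb{P}\!\left(\frac{S_n - n\mu}{\sigma\sqrt{n}} \le x\right) - \Phi(x)\right| \;\le\; \frac{C\,\mathbb{E}|X_1-\mu|^{3}}{\sigma^{3}\sqrt{n}},$$

where $S_n = X_1 + \cdots + X_n$, $\Phi$ is the standard normal distribution function, and $C$ is an absolute constant. The paper characterizes analogous rates when the quantity of interest counts components of a random combinatorial structure.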
Efficiency-driven heavy-traffic approximations for many-server queues with abandonments
 Management Science
, 2004
Abstract

Cited by 79 (38 self)
Motivated by the desire to understand the performance of service-oriented call centers, which often provide low-to-moderate quality of service, this paper investigates the efficiency-driven (ED) limiting regime for many-server queues with abandonments. The starting point is the realization that, in the presence of substantial customer abandonment, call-center service-level agreements (SLAs) can be met in the ED regime, where the arrival rate exceeds the maximum possible service rate. Mathematically, the ED regime is defined by letting the arrival rate and the number of servers increase together so that the probability of abandonment approaches a positive limit. To obtain the ED regime, it suffices to let the arrival rate and the number of servers increase with the traffic intensity ρ held fixed at ρ > 1 (so that the arrival rate exceeds the maximum possible service rate). Even though the probability of delay necessarily approaches 1 in the ED regime, the ED regime can be realistic because, due to the abandonments, the delays need not be excessively large. This paper establishes ED many-server heavy-traffic limits and develops associated approximations for performance measures in the M/M/s/r + M model, having a Poisson arrival process, exponential service times, s servers, r extra waiting spaces and exponential abandonment times (the final +M). In the ED regime, essentially the same limiting behavior occurs when the abandonment rate α approaches 0 as when the number of servers s approaches ∞; indeed, it suffices to assume that s/α → ∞. The ED approximations are shown to be useful by comparing them to exact numerical results for the M/M/s/r + M model obtained using an algorithm developed in Whitt (2003), which exploits numerical transform inversion.
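The M/M/s/r + M model described here is a birth-death chain, so its stationary distribution is easy to compute exactly, which makes the ED fluid approximation simple to check numerically. The sketch below (the parameter values are our own illustration, not the paper's) compares the exact abandonment probability with the fluid-limit value (ρ − 1)/ρ:

```python
def mms_r_M_stationary(lam, mu, s, r, alpha):
    """Stationary distribution of the M/M/s/r + M queue via birth-death
    balance: pi[n + 1] is proportional to pi[n] * lam / d(n + 1), where the
    departure rate in state n is d(n) = min(n, s)*mu + max(n - s, 0)*alpha."""
    K = s + r
    pi = [1.0]
    for n in range(1, K + 1):
        d = min(n, s) * mu + max(n - s, 0) * alpha
        pi.append(pi[-1] * lam / d)
    total = sum(pi)
    return [p / total for p in pi]

def abandon_prob(lam, mu, s, r, alpha):
    """Fraction of admitted customers who abandon, from flow balance."""
    pi = mms_r_M_stationary(lam, mu, s, r, alpha)
    admitted = lam * (1.0 - pi[-1])  # arrivals that are not blocked
    aband_rate = sum(p * max(n - s, 0) * alpha for n, p in enumerate(pi))
    return aband_rate / admitted

# ED regime: rho = 120/100 = 1.2 > 1; the fluid approximation predicts an
# abandonment fraction of (rho - 1)/rho = 1/6.
p = abandon_prob(lam=120.0, mu=1.0, s=100, r=200, alpha=1.0)
```

Even at s = 100 servers the exact abandonment probability is already within about a tenth of a percentage point of the fluid value, which is the sense in which the ED approximations are useful at realistic call-center sizes.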
Optimal Inference in Regression Models with Nearly Integrated Regressors
, 2004
Abstract

Cited by 49 (2 self)
This paper considers the problem of conducting inference on the regression coefficient in a bivariate regression model with a highly persistent regressor. Gaussian power envelopes are obtained for a class of testing procedures satisfying a conditionality restriction. In addition, the paper proposes feasible testing procedures that attain these Gaussian power envelopes whether or not the innovations of the regression model are normally distributed.
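The canonical setup behind "nearly integrated regressor" problems (a generic formulation given here for orientation; the notation is ours, not quoted from the paper) is a predictive regression with a local-to-unity autoregressive root:

$$y_t = \beta\, x_{t-1} + u_t, \qquad x_t = \rho\, x_{t-1} + v_t, \qquad \rho = 1 + \frac{c}{T},$$

where $T$ is the sample size and $c$ a fixed constant. Because $\rho$ is close to, but not exactly, one, standard asymptotics for stationary regressors are unreliable, and inference on $\beta$ must account for the persistence of $x_t$; the power envelopes in the abstract are benchmarks for tests in exactly this setting.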
Maximum Likelihood Estimation of a Binary Choice Model with Random Coefficients of Unknown Distribution
, 1993
Abstract

Cited by 48 (0 self)
We consider a binary response model y₁ = 1{x₁′β₁ + ε₁ ≥ 0} with x₁ independent of the unobservables (β₁, ε₁). No finite-dimensional parametric restrictions are imposed on F₀, the joint distribution of (β₁, ε₁). A nonparametric maximum likelihood estimator for F₀ is shown to be consistent. We analyze some conditions under which F₀ is or is not identified. We find that certain moments of F₀ are not identified, even when the model is normalized by fixing one variance. The correlation matrix of (β₁, ε₁) is not identified. We also provide some Monte Carlo evidence on the small-sample performance of our estimator.
State space collapse and diffusion approximation for a network operating under a fair bandwidth-sharing policy, in preparation
, 2004
Abstract

Cited by 46 (8 self)
We consider a connection-level model of Internet congestion control, introduced by Massoulié and Roberts [36], that represents the randomly varying number of flows present in a network. Here bandwidth is shared fairly amongst elastic document transfers according to a weighted α-fair bandwidth-sharing policy introduced by Mo and Walrand [37] (α ∈ (0, ∞)). Assuming Poisson arrivals and exponentially distributed document sizes, we focus on the heavy traffic regime in which the average load placed on each resource is approximately equal to its capacity. A fluid model (or functional law of large numbers approximation) for this stochastic model was derived and analyzed in a prior work [29] by two of the authors. Here we use the long time behavior of the solutions of this fluid model established in [29] to derive a property called multiplicative state space collapse, which loosely speaking shows that in diffusion scale the flow count process for the stochastic model can be approximately recovered as a continuous lifting of the workload process. Under weighted proportional fair sharing of bandwidth (α = 1) and a mild …
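The weighted α-fair criterion of Mo and Walrand admits a closed form in the simplest setting of a single shared link. The sketch below covers only that single-link special case (our simplification of the paper's network model) and shows how the allocation interpolates between proportional fairness (α = 1) and max-min fairness (α → ∞):

```python
def alpha_fair_single_link(weights, capacity, alpha):
    """Weighted alpha-fair shares of one link of the given capacity.

    Maximizes sum_i w_i * x_i**(1 - alpha) / (1 - alpha) (or w_i * log x_i
    when alpha == 1) subject to sum_i x_i <= capacity. On a single link the
    optimum has the closed form x_i proportional to w_i**(1/alpha)."""
    shares = [w ** (1.0 / alpha) for w in weights]
    total = sum(shares)
    return [capacity * sh / total for sh in shares]

print(alpha_fair_single_link([1.0, 2.0, 1.0], capacity=8.0, alpha=1.0))
# → [2.0, 4.0, 2.0]: with alpha = 1 the shares are proportional to the weights.
```

As α grows, the exponent 1/α flattens the weights, so the allocation approaches equal (max-min fair) shares; in a general network the allocation has no closed form and is defined as the optimizer of the same utility over the capacity constraints of all links.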
Multivariate Extremes, Aggregation and Risk Estimation
, 2000
Abstract

Cited by 33 (0 self)
We briefly introduce some basic facts about multivariate extreme value theory and present some new results regarding finite aggregates and multivariate extreme value distributions. Based on our results, high-frequency data can considerably improve the quality of estimates of extreme movements in financial markets. Second, we present an empirical exploration of what the tails really look like for four foreign exchange rates sampled at varying frequencies. Both temporal and spatial dependence are considered. In particular we estimate the spectral measure, which, along with the tail index, completely determines the extreme value distribution. Lastly we apply our results to the problem of portfolio optimisation or risk minimization. We analyze how the expected shortfall and VaR scale with time horizon and find that this scaling is not by a factor of the square root of time, as is frequently used, but by a different power of time. We show that the accuracy of risk estimation can be drast…
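The time-horizon scaling point can be made concrete with a small calculation comparing the square-root-of-time rule with a power-law rule T^(1/α) for tail index α, which is exact for sums of i.i.d. α-stable returns. The function and the numbers below are our illustration, not the paper's estimates.

```python
def scale_var(var_one_period, horizon, tail_index):
    """Scale a one-period risk figure (VaR or expected shortfall) to a
    longer horizon, two ways: the common square-root-of-time rule, and a
    power-law rule horizon**(1/tail_index), exact for i.i.d. alpha-stable
    returns and a far-tail approximation more generally."""
    sqrt_rule = var_one_period * horizon ** 0.5
    power_rule = var_one_period * horizon ** (1.0 / tail_index)
    return sqrt_rule, power_rule

# Daily VaR of 1% of portfolio value scaled to a 10-day horizon with a
# hypothetical tail index of 1.5; since 1/1.5 > 1/2, the heavy-tail rule
# gives the larger (more conservative) 10-day figure.
sqrt_v, pow_v = scale_var(0.01, horizon=10, tail_index=1.5)
```

Whether the power-law rule lies above or below the square-root rule depends on whether the estimated tail index is below or above 2, which is exactly why estimating the scaling exponent from data, as the abstract describes, matters for risk figures.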