Results 1–10 of 148
Over-the-Counter Markets
, 2005
Abstract

Cited by 75 (4 self)
We study how intermediation and asset prices in over-the-counter markets are affected by illiquidity associated with search and bargaining. We compute explicitly the prices at which investors trade with each other, as well as market-makers’ bid and ask prices, in a dynamic model with strategic agents. Bid-ask spreads are lower if investors can more easily find other investors, or have easier access to multiple market-makers. With a monopolistic market-maker, bid-ask spreads are higher if investors have
Sequential Prediction of Individual Sequences Under General Loss Functions
 IEEE Transactions on Information Theory
, 1998
Abstract

Cited by 74 (7 self)
We consider adaptive sequential prediction of arbitrary binary sequences when the performance is evaluated using a general loss function. The goal is to predict on each individual sequence nearly as well as the best prediction strategy in a given comparison class of (possibly adaptive) prediction strategies, called experts. By using a general loss function, we generalize previous work on universal prediction, forecasting, and data compression. However, here we restrict ourselves to the case when the comparison class is finite. For a given sequence, we define the regret as the total loss on the entire sequence suffered by the adaptive sequential predictor, minus the total loss suffered by the predictor in the comparison class that performs best on that particular sequence. We show that for a large class of loss functions, the minimax regret is either Θ(log N) or Ω(√(ℓ log N)), depending on the loss function, where N is the number of predictors in the comparison class and ℓ is the length of the sequence ...
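The regret behavior this abstract describes can be illustrated with the standard exponentially weighted average (Hedge) forecaster — a textbook construction, not necessarily the paper's exact algorithm; a minimal sketch:

```python
import math

def hedge(expert_losses, eta):
    """Exponentially weighted average forecaster over N experts.

    expert_losses: one list of N per-expert losses (each in [0, 1]) per round.
    Returns (cumulative expected loss of the forecaster,
             cumulative loss of the best expert in hindsight).
    """
    n = len(expert_losses[0])
    log_w = [0.0] * n          # log-weights, for numerical stability
    alg_loss = 0.0
    cum = [0.0] * n            # cumulative loss per expert
    for losses in expert_losses:
        m = max(log_w)
        w = [math.exp(lw - m) for lw in log_w]
        total = sum(w)
        # expected loss of the randomized forecaster on this round
        alg_loss += sum(wi * li for wi, li in zip(w, losses)) / total
        # exponential weight update
        for i, li in enumerate(losses):
            log_w[i] -= eta * li
            cum[i] += li
    return alg_loss, min(cum)
```

With the learning rate tuned as η = √(8 ln N / ℓ), the classical bound gives regret at most √((ℓ/2) ln N), matching the Ω(√(ℓ log N)) rate quoted above for non-mixable losses.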
On Convergence Rates in the Central Limit Theorems for Combinatorial Structures
, 1998
Abstract

Cited by 67 (8 self)
Flajolet and Soria established several central limit theorems for the parameter "number of components" in a wide class of combinatorial structures. In this paper, we shall prove a simple theorem which applies to characterize the convergence rates in their central limit theorems. This theorem is also applicable to arithmetical functions. Moreover, asymptotic expressions are derived for moments of integral order. Many examples from different applications are discussed.
Causal Parameters and Policy Analysis in Economics: A Twentieth Century Retrospective
 Quarterly Journal of Economics 115 (February
 In Means-Tested Transfers in the
Abstract

Cited by 57 (4 self)
JEL No. C10 The major contributions of twentieth century econometrics to knowledge were the definition of causal parameters when agents are constrained by resources and markets and causes are interrelated, the analysis of what is required to recover causal parameters from data (the identification problem), and clarification of the role of causal parameters in policy evaluation and in forecasting the effects of policies never previously experienced. This paper summarizes the development of those ideas by the Cowles Commission, the response to their work by structural econometricians and VAR econometricians, and the response to structural and VAR econometrics by calibrators, advocates of natural and social experiments, and by nonparametric econometricians and statisticians.
Efficiency-driven heavy-traffic approximations for many-server queues with abandonments
 Management Science
, 2004
Abstract

Cited by 47 (31 self)
Motivated by the desire to understand the performance of service-oriented call centers, which often provide low-to-moderate quality of service, this paper investigates the efficiency-driven (ED) limiting regime for many-server queues with abandonments. The starting point is the realization that, in the presence of substantial customer abandonment, call-center service-level agreements (SLAs) can be met in the ED regime, where the arrival rate exceeds the maximum possible service rate. Mathematically, the ED regime is defined by letting the arrival rate and the number of servers increase together so that the probability of abandonment approaches a positive limit. To obtain the ED regime, it suffices to let the arrival rate and the number of servers increase with the traffic intensity ρ held fixed with ρ > 1 (so that the arrival rate exceeds the maximum possible service rate). Even though the probability of delay necessarily approaches 1 in the ED regime, the ED regime can be realistic because, due to the abandonments, the delays need not be excessively large. This paper establishes ED many-server heavy-traffic limits and develops associated approximations for performance measures in the M/M/s/r + M model, having a Poisson arrival process, exponential service times, s servers, r extra waiting spaces and exponential abandon times (the final +M). In the ED regime, essentially the same limiting behavior occurs when the abandonment rate α approaches 0 as when the number of servers s approaches ∞; indeed, it suffices to assume that s/α → ∞. The ED approximations are shown to be useful by comparing them to exact numerical results for the M/M/s/r + M model obtained using an algorithm developed in Whitt (2003), which exploits numerical transform inversion.
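The M/M/s/r + M model is a finite birth-death chain, so its stationary distribution — and the abandonment probability that the ED approximations target — can be computed exactly by direct balance equations (a minimal sketch, not Whitt's transform-inversion algorithm):

```python
def mmsrm_stationary(lam, mu, s, r, alpha):
    """Stationary distribution of the M/M/s/r + M birth-death chain.

    State n = number in system, 0 <= n <= s + r.
    Birth rate: lam (while n < s + r);
    death rate: min(n, s)*mu + max(n - s, 0)*alpha.
    """
    w = [1.0]
    for n in range(1, s + r + 1):
        death = min(n, s) * mu + max(n - s, 0) * alpha
        w.append(w[-1] * lam / death)
    total = sum(w)
    return [x / total for x in w]

def abandonment_prob(lam, mu, s, r, alpha):
    """Long-run fraction of arrivals that abandon
    (total abandonment rate divided by the arrival rate; blocking ignored)."""
    pi = mmsrm_stationary(lam, mu, s, r, alpha)
    return sum(p * max(n - s, 0) * alpha for n, p in enumerate(pi)) / lam
```

In the ED regime the fluid limit predicts an abandonment probability of roughly (ρ − 1)/ρ; for example, with ρ = 1.2 that is about 1/6, which the exact computation matches closely for large s.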
Optimal Inference in Regression Models with Nearly Integrated Regressors
, 2004
Abstract

Cited by 33 (2 self)
This paper considers the problem of conducting inference on the regression coefficient in a bivariate regression model with a highly persistent regressor. Gaussian power envelopes are obtained for a class of testing procedures satisfying a conditionality restriction. In addition, the paper proposes feasible testing procedures that attain these Gaussian power envelopes whether or not the innovations of the regression model are normally distributed.
Multivariate Extremes, Aggregation and Risk Estimation
, 2000
Abstract

Cited by 22 (0 self)
We briefly introduce some basic facts about multivariate extreme value theory and present some new results regarding finite aggregates and multivariate extreme value distributions. Based on our results, high-frequency data can considerably improve the quality of estimates of extreme movements in financial markets. Secondly, we present an empirical exploration of what the tails really look like for four foreign exchange rates sampled at varying frequencies. Both temporal and spatial dependence are considered. In particular, we estimate the spectral measure, which, along with the tail index, completely determines the extreme value distribution. Lastly, we apply our results to the problem of portfolio optimisation or risk minimization. We analyze how the expected shortfall and VaR scale with time horizon and find that this scaling is not by a factor of the square root of time, as is frequently used, but by a different power of time. We show that the accuracy of risk estimation can be drast...
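The tail-index estimation step mentioned above can be sketched with the classical Hill estimator — an illustrative assumption here, since the paper's spectral-measure estimation is more involved:

```python
import math
import random

def hill_estimator(sample, k):
    """Hill estimator of the tail index xi = 1/alpha from the k largest
    order statistics of a positive, heavy-tailed sample."""
    xs = sorted(sample, reverse=True)
    threshold = xs[k]  # the (k+1)-th largest observation
    return sum(math.log(xs[i] / threshold) for i in range(k)) / k

# Synthetic check: exact Pareto tail P(X > x) = x^(-2), i.e. xi = 0.5.
random.seed(0)
data = [(1.0 - random.random()) ** -0.5 for _ in range(20000)]
xi_hat = hill_estimator(data, 1000)
```

For an exact Pareto sample the estimator is unbiased, and with k = 1000 upper order statistics its standard error is about ξ/√k ≈ 0.016, so the estimate lands close to the true value 0.5.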
State space collapse and diffusion approximation for a network operating under a fair bandwidth-sharing policy, in preparation
, 2004
Abstract

Cited by 20 (7 self)
We consider a connection-level model of Internet congestion control, introduced by Massoulié and Roberts [36], that represents the randomly varying number of flows present in a network. Here bandwidth is shared fairly amongst elastic document transfers according to a weighted α-fair bandwidth-sharing policy introduced by Mo and Walrand [37] (α ∈ (0,∞)). Assuming Poisson arrivals and exponentially distributed document sizes, we focus on the heavy traffic regime in which the average load placed on each resource is approximately equal to its capacity. A fluid model (or functional law of large numbers approximation) for this stochastic model was derived and analyzed in a prior work [29] by two of the authors. Here we use the long time behavior of the solutions of this fluid model established in [29] to derive a property called multiplicative state space collapse, which loosely speaking shows that in diffusion scale the flow count process for the stochastic model can be approximately recovered as a continuous lifting of the workload process. Under weighted proportional fair sharing of bandwidth (α = 1) and a mild
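On a single link the weighted α-fair allocation has a closed form, which illustrates the optimization the Mo–Walrand policy solves at each instant (multi-link networks require solving a convex program instead); a minimal sketch:

```python
def alpha_fair_single_link(weights, capacity, alpha):
    """Weighted alpha-fair sharing of one link of given capacity.

    Maximizes sum_i w_i * x_i**(1 - alpha) / (1 - alpha)
    subject to sum_i x_i <= capacity.  The first-order condition
    w_i * x_i**(-alpha) = const gives x_i proportional to w_i**(1/alpha).
    """
    shares = [w ** (1.0 / alpha) for w in weights]
    total = sum(shares)
    return [capacity * s / total for s in shares]
```

α = 1 recovers weighted proportional fairness (log utility, allocation proportional to the weights), while α → ∞ approaches max-min fairness, where the allocations equalize.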
Information percolation with equilibrium search dynamics
 Econometrica
, 2009
Abstract

Cited by 16 (2 self)
We solve for the equilibrium dynamics of information sharing in a large population. Each agent is endowed with signals regarding the likely outcome of a random variable of common concern. Individuals choose the effort with which they search for others from whom they can gather additional information. When two agents meet, they share their information. The information gathered is further shared at subsequent meetings, and so on. Equilibria exist in which agents search maximally until they acquire sufficient information precision, and then minimally. A tax whose proceeds are used to subsidize the costs of search improves information sharing and can in some cases increase welfare. On the other hand, endowing agents with public signals reduces information sharing and can in some cases decrease welfare.