Results 1-10 of 454
Testing Continuous-Time Models of the Spot Interest Rate
 Review of Financial Studies
, 1996
"... Different continuoustime models for interest rates coexist in the literature. We test parametric models by comparing their implied parametric density to the same density estimated nonparametrically. We do not replace the continuoustime model by discrete approximations, even though the data are rec ..."
Abstract

Cited by 194 (7 self)
 Add to MetaCart
Different continuous-time models for interest rates coexist in the literature. We test parametric models by comparing their implied parametric density to the same density estimated nonparametrically. We do not replace the continuous-time model by discrete approximations, even though the data are recorded at discrete intervals. The principal source of rejection of existing models is the strong nonlinearity of the drift. Around its mean, where the drift is essentially zero, the spot rate behaves like a random walk. The drift then mean-reverts strongly when far away from the mean. The volatility is higher when away from the mean. Continuous-time financial theory has developed extensive tools to price derivative securities when the underlying traded asset(s) or non-traded factor(s) follow stochastic differential equations [see Merton (1990) for examples]. However, as a practical matter, how to specify an appropriate stochastic differential equation is for the most part an unanswered question.
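The nonlinear drift the abstract describes can be sketched in a few lines. The cubic pull below is a hypothetical functional form of my choosing (not the paper's estimated drift), chosen only because it is essentially zero near the mean and mean-reverts strongly far from it, with a level-dependent volatility assumed for illustration:

```python
import numpy as np

def nonlinear_drift(r, mean=0.06, k=5000.0):
    # Hypothetical cubic drift: near zero around the mean (random-walk-like
    # behavior there), pulling back strongly when the rate strays far away.
    return -k * (r - mean) ** 3

def simulate_short_rate(r0=0.06, n=10_000, dt=1 / 252, sigma=0.02, seed=0):
    """Euler discretization of dr = mu(r) dt + sigma sqrt(r) dW; the
    square-root volatility is an assumption for illustration."""
    rng = np.random.default_rng(seed)
    r = np.empty(n)
    r[0] = r0
    for t in range(1, n):
        dw = rng.normal(0.0, np.sqrt(dt))
        r[t] = (r[t - 1] + nonlinear_drift(r[t - 1]) * dt
                + sigma * np.sqrt(max(r[t - 1], 0.0)) * dw)
    return r

rates = simulate_short_rate()
```

Near the mean the drift term is dwarfed by the diffusion term, so the simulated path wanders like a random walk; only large excursions trigger a visible pull back.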
Nettimer: A Tool for Measuring Bottleneck Link Bandwidth
 In Proceedings of the USENIX Symposium on Internet Technologies and Systems
, 2001
"... Measuring the bottleneck link bandwidth along a path is important for understanding the performance of many Internet applications. Existing tools to measure bottleneck bandwidth are relatively slow, can only measure bandwidth in one direction, and/or actively send probe packets. We present the netti ..."
Abstract

Cited by 171 (1 self)
 Add to MetaCart
Measuring the bottleneck link bandwidth along a path is important for understanding the performance of many Internet applications. Existing tools to measure bottleneck bandwidth are relatively slow, can only measure bandwidth in one direction, and/or actively send probe packets. We present the nettimer bottleneck link bandwidth measurement tool, the libdpcap distributed packet capture library, and experiments quantifying their utility. We test nettimer across a variety of bottleneck network technologies ranging from 19.2Kb/s to 100Mb/s, wired and wireless, symmetric and asymmetric bandwidth, across local area and cross-country paths, while using both one and two packet capture hosts. In most cases, nettimer has an error of less than 10%, but at worst has an error of 40%, even on cross-country paths of 17 or more hops. It converges within 10KB of the first large packet arrival while consuming less than 7% of the network traffic being measured.
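The packet-pair principle underlying tools like nettimer can be illustrated with a minimal sketch (not nettimer's actual code): the bottleneck link serializes two back-to-back packets of equal size, so the spacing ("dispersion") between their arrivals equals the second packet's transmission time on that link, and capacity follows from packet size over dispersion:

```python
def packet_pair_bandwidth(packet_size_bytes, dispersion_seconds):
    """Estimate bottleneck link bandwidth from the arrival spacing of two
    back-to-back packets of equal size: the bottleneck serializes the
    second packet in exactly `dispersion_seconds`."""
    if dispersion_seconds <= 0:
        raise ValueError("dispersion must be positive")
    return packet_size_bytes * 8 / dispersion_seconds  # bits per second

# Hypothetical numbers: 1500-byte packets arriving 120 microseconds apart
# imply a 100 Mb/s bottleneck.
bw = packet_pair_bandwidth(1500, 120e-6)
```

The hard part, which the tools above address, is that cross-traffic queuing perturbs the dispersion, so raw samples must be filtered before this formula gives a usable answer.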
Measuring Bandwidth
, 1999
"... Accurate network bandwidth measurement is important to a variety of network applications. Unfortunately, accurate bandwidth measurement is difficult. We describe some current bandwidth measurement techniques: using throughput, pathchar [8], and Packet Pair [2]. We explain some of the problems with t ..."
Abstract

Cited by 171 (4 self)
 Add to MetaCart
Accurate network bandwidth measurement is important to a variety of network applications. Unfortunately, accurate bandwidth measurement is difficult. We describe some current bandwidth measurement techniques: using throughput, pathchar [8], and Packet Pair [2]. We explain some of the problems with these techniques, including poor accuracy, poor scalability, lack of statistical robustness, poor agility in adapting to bandwidth changes, lack of flexibility in deployment, and inaccuracy when used on a variety of traffic types. Our solutions to these problems include using a packet window to adapt quickly to bandwidth changes, Receiver Only Packet Pair to combine accuracy and ease of deployment, and Potential Bandwidth Filtering to increase accuracy. Our techniques are at least as accurate as previously used filtering algorithms, and in some situations, our techniques are more than 37% more accurate.
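A toy illustration of the filtering problem: keep a sliding window of recent packet-pair bandwidth samples (so the estimate adapts to bandwidth changes) and return the sample sitting in the densest cluster under a Gaussian kernel. This is in the spirit of the window-plus-density-filtering ideas described above, not the paper's exact algorithms, and all numbers are made up:

```python
import numpy as np

def windowed_density_estimate(samples, window=30, bandwidth=0.05):
    """Keep only the most recent `window` samples (agility under bandwidth
    changes), then return the sample with the highest Gaussian-kernel
    density, i.e. the centre of the densest cluster of estimates, which
    discards cross-traffic outliers (robustness)."""
    recent = np.asarray(samples[-window:], dtype=float)
    # Pairwise kernel weights; bandwidth scales with the typical sample size.
    diffs = (recent[:, None] - recent[None, :]) / (bandwidth * recent.mean())
    density = np.exp(-0.5 * diffs ** 2).sum(axis=1)
    return recent[np.argmax(density)]

# Synthetic samples clustering near 10 Mb/s plus a few cross-traffic outliers:
rng = np.random.default_rng(1)
samples = list(10e6 + rng.normal(0, 0.2e6, 50)) + [3e6, 40e6, 55e6]
est = windowed_density_estimate(samples)
```

Taking the densest value rather than the mean or median is what gives kernel-style filters their robustness: a handful of compressed or expanded samples barely shifts the mode.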
A Nonparametric Model of Term Structure Dynamics and the Market Price of Interest Rate Risk
, 1997
"... This article presents a technique for nonparametrically estimating continuoustime di#usion processes which are observed at discrete intervals. We illustrate the methodology by using daily three and six month Treasury Bill data, from January 1965 to July 1995, to estimate the drift and di#usion of t ..."
Abstract

Cited by 126 (5 self)
 Add to MetaCart
This article presents a technique for nonparametrically estimating continuous-time diffusion processes which are observed at discrete intervals. We illustrate the methodology by using daily three- and six-month Treasury Bill data, from January 1965 to July 1995, to estimate the drift and diffusion of the short rate, and the market price of interest rate risk. While the estimated diffusion is similar to that estimated by Chan, Karolyi, Longstaff and Sanders (1992), there is evidence of substantial nonlinearity in the drift. This is close to zero for low and medium interest rates, but mean reversion increases sharply at higher interest rates.
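A standard way to estimate a drift nonparametrically from discretely sampled data is Nadaraya-Watson kernel regression of the scaled increments on the current level. The sketch below is an illustration of that generic idea, not the paper's exact estimator; run on a simulated mean-reverting series, it recovers a drift that decreases in the rate level:

```python
import numpy as np

def nw_drift(r, dt, grid, h=0.005):
    """Nadaraya-Watson kernel regression of (r[t+1] - r[t]) / dt on r[t]:
    a generic nonparametric estimate of the drift function mu(r)."""
    x, y = r[:-1], np.diff(r) / dt
    mu = np.empty(len(grid))
    for i, g in enumerate(grid):
        w = np.exp(-0.5 * ((x - g) / h) ** 2)  # Gaussian kernel weights
        mu[i] = (w * y).sum() / w.sum()
    return mu

# Synthetic check: data simulated from dr = 5 (0.06 - r) dt + 0.01 dW should
# yield an estimated drift that is positive below 0.06 and negative above it.
rng = np.random.default_rng(0)
dt, n = 1 / 252, 20_000
r = np.empty(n)
r[0] = 0.06
for t in range(1, n):
    r[t] = r[t - 1] + 5 * (0.06 - r[t - 1]) * dt + 0.01 * np.sqrt(dt) * rng.normal()
grid = np.array([0.05, 0.06, 0.07])
mu_hat = nw_drift(r, dt, grid)
```

Because the increments are very noisy relative to the drift, long samples and careful bandwidth choice matter; that is exactly where kernel estimates of the tails (the high-rate region where the paper finds sharp mean reversion) become delicate.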
Internet Tomography
 IEEE Signal Processing Magazine
, 2002
"... Today's Internet is a massive, distributed network which continues to explode in size as ecommerce and related activities grow. The heterogeneous and largely unregulated structure of the Internet renders tasks such as dynamic routing, optimized service provision, service level verification, and dete ..."
Abstract

Cited by 109 (11 self)
 Add to MetaCart
Today's Internet is a massive, distributed network which continues to explode in size as e-commerce and related activities grow. The heterogeneous and largely unregulated structure of the Internet renders tasks such as dynamic routing, optimized service provision, service level verification, and detection of anomalous/malicious behavior increasingly challenging. The problem is compounded by the fact that one cannot rely on the cooperation of individual servers and routers to aid in the collection of network traffic measurements vital for these tasks. In many ways, network monitoring and inference problems bear a strong resemblance to other "inverse problems" in which key aspects of a system are not directly observable. Familiar signal processing problems such as tomographic image reconstruction, system identification, and array processing all have interesting interpretations in the networking context. This article introduces the new field of network tomography, a field which we believe will benefit greatly from the wealth of signal processing theory and algorithms.
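The "inverse problem" framing can be made concrete with a toy linear model: end-to-end path measurements y relate to unobservable per-link quantities x through a known routing matrix A, and tomography amounts to inverting y = A x. All numbers below are made up for illustration:

```python
import numpy as np

# Toy tomography instance: 3 links, 3 end-to-end paths. Row i of the routing
# matrix A marks which links path i traverses; y holds the measured path
# delays; the per-link delays x are the hidden quantities to recover.
A = np.array([[1, 1, 0],
              [0, 1, 1],
              [1, 0, 1]], dtype=float)
x_true = np.array([2.0, 5.0, 1.0])   # hidden per-link delays (ms)
y = A @ x_true                       # observable end-to-end path delays
x_hat, *_ = np.linalg.lstsq(A, y, rcond=None)
```

Real instances are far harder than this sketch: A is huge, rank-deficient, and only partially known, and the measurements are noisy, which is precisely why statistical inference machinery is needed.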
Network tomography: recent developments
 Statistical Science
, 2004
"... Today's Int ernet is a massive, dist([/#][ net work which cont inuest o explode in size as ecommerce andrelatH actH]M/# grow. Thehet([H(/#]H( and largelyunregulatS stregula of t/ Int/HH3 renderstnde such as dynamicroutc/[ opt2]3fl/ service provision, service level verificatflH( and det(2][/ of anoma ..."
Abstract

Cited by 85 (4 self)
 Add to MetaCart
Today's Internet is a massive, distributed network which continues to explode in size as e-commerce and related activities grow. The heterogeneous and largely unregulated structure of the Internet renders tasks such as dynamic routing, optimized service provision, service level verification, and detection of anomalous/malicious behavior extremely challenging. The problem is compounded by the fact that one cannot rely on the cooperation of individual servers and routers to aid in the collection of network traffic measurements vital for these tasks. In many ways, network monitoring and inference problems bear a strong resemblance to other "inverse problems" in which key aspects of a system are not directly observable. Familiar signal processing or statistical problems such as tomographic image reconstruction and phylogenetic tree identification have interesting connections to those arising in networking. This article introduces network tomography, a new field which we believe will benefit greatly from the wealth of statistical theory and algorithms. It focuses especially on recent developments in the field, including the application of pseudo-likelihood methods and tree estimation formulations. Keywords: network tomography, pseudo-likelihood, topology identification, ... 1 Introduction. No network is an island, entire of itself; every network is a piece of an internetwork, a part of the main. Although administrators of small-scale networks can monitor local traffic conditions and identify congestion points and performance bottlenecks, very few networks are completely ... Rui Castro and Robert Nowak are with the Department of Electrical and Computer Engineering, Rice University, Houston TX; Mark Coates is with the Department of Electrical and Computer Engineering, McGill University, Montreal, Quebec, Canada; Gang Liang and Bin Yu are with the Department of Statistics, ...
SiZer for exploration of structures in curves
 Journal of the American Statistical Association
, 1997
"... In the use of smoothing methods in data analysis, an important question is often: which observed features are "really there?", as opposed to being spurious sampling artifacts. An approach is described, based on scale space ideas that were originally developed in computer vision literature. Assess ..."
Abstract

Cited by 82 (16 self)
 Add to MetaCart
In the use of smoothing methods in data analysis, an important question is often: which observed features are "really there?", as opposed to being spurious sampling artifacts. An approach is described, based on scale-space ideas that were originally developed in the computer vision literature. Assessment of SIgnificant ZERo crossings of derivatives results in the SiZer map, a graphical device for displaying the significance of features with respect to both location and scale. Here "scale" means "level of resolution", i.e. ...
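The core SiZer idea, testing whether the derivative of a kernel smooth is significantly nonzero at each location, can be sketched as follows. This is a simplified illustration with a crude standard-error band, not the paper's exact inference; a full SiZer map stacks one such row per bandwidth:

```python
import numpy as np

def sizer_row(x, y, grid, h, z=2.0):
    """One row of a simplified SiZer map at bandwidth h: classify the
    derivative of a local-linear kernel smooth at each grid point as
    significantly increasing (+1), decreasing (-1), or not significant (0)."""
    out = np.zeros(len(grid), dtype=int)
    for i, g in enumerate(grid):
        w = np.exp(-0.5 * ((x - g) / h) ** 2)   # Gaussian kernel weights
        xc = x - g
        sw, swx, swx2 = w.sum(), (w * xc).sum(), (w * xc ** 2).sum()
        swy, swxy = (w * y).sum(), (w * xc * y).sum()
        denom = sw * swx2 - swx ** 2
        slope = (sw * swxy - swx * swy) / denom        # smoothed derivative
        intercept = (swx2 * swy - swx * swxy) / denom
        resid = y - (intercept + slope * xc)
        sigma2 = (w * resid ** 2).sum() / sw           # crude local noise level
        se = np.sqrt(sigma2 * sw / denom)              # crude slope std. error
        if slope > z * se:
            out[i] = 1
        elif slope < -z * se:
            out[i] = -1
    return out

x = np.linspace(0.0, 1.0, 200)
grid = np.linspace(0.2, 0.8, 7)
row_up = sizer_row(x, 2.0 * x, grid, h=0.1)        # increasing everywhere
row_flat = sizer_row(x, np.ones_like(x), grid, h=0.1)  # no real feature
```

Sweeping h from small to large and coloring each (location, scale) cell by the returned sign is what produces the map: features that persist across scales are "really there", while those visible only at tiny bandwidths are likely sampling artifacts.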
CapProbe: a Simple and Accurate Capacity Estimation Technique
 in Proc. ACM SIGCOMM
, 2004
"... The problem of estimating the capacity of an Internet path is one of fundamental importance. Due to the multitude of potential applications, a large number of solutions have been proposed and evaluated. The proposed solutions so far have been successful in partially addressing the problem, but have ..."
Abstract

Cited by 80 (18 self)
 Add to MetaCart
The problem of estimating the capacity of an Internet path is one of fundamental importance. Due to the multitude of potential applications, a large number of solutions have been proposed and evaluated. The proposed solutions so far have been successful in partially addressing the problem, but have suffered from being slow, obtrusive or inaccurate. In this work, we evaluate CapProbe, a low-cost and accurate end-to-end capacity estimation scheme that relies on packet dispersion techniques as well as end-to-end delays. The key observation that enabled the development of CapProbe is that both compression and expansion of packet pair dispersion are the result of queuing due to cross-traffic. By filtering out queuing effects from packet pair samples, CapProbe is able to estimate capacity accurately in most environments, with minimal processing and probing traffic overhead. In fact, the storage and processing requirements of CapProbe are orders of magnitude smaller than most of the previously proposed schemes. We tested CapProbe through simulation, Internet, Internet2 and wireless experiments. We found that CapProbe's error percentage in capacity estimation was within 10% in almost all cases, and within 5% in most cases.
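The filtering observation above can be sketched in a few lines: if either packet of a pair queued behind cross-traffic, its one-way delay grows, so the pair whose two delays sum to the minimum observed value is taken as queuing-free, and its dispersion alone yields the capacity. The sample numbers below are hypothetical:

```python
def capprobe_estimate(samples, packet_size_bytes):
    """samples: list of (delay1, delay2, dispersion) tuples for packet pairs,
    in seconds. The pair with the minimum sum of one-way delays is assumed
    to have crossed the path without queuing, so its dispersion reflects
    the narrow-link capacity undistorted by cross-traffic."""
    best = min(samples, key=lambda s: s[0] + s[1])
    return packet_size_bytes * 8 / best[2]  # bits per second

# Hypothetical (d1, d2, dispersion) samples: the minimum-delay-sum pair has
# the clean 1.2 ms dispersion of a 10 Mb/s narrow link.
samples = [
    (0.020, 0.025, 0.0008),   # dispersion compressed by cross-traffic queuing
    (0.015, 0.015, 0.0012),   # minimum delay sum: assumed queuing-free
    (0.030, 0.018, 0.0031),   # dispersion expanded by queuing
]
cap = capprobe_estimate(samples, 1500)
```

This explains the low overhead the abstract claims: the filter needs only a running minimum and one dispersion value, not the sample histories that statistical mode-finding filters must store.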
Approximating Multi-Dimensional Aggregate Range Queries Over Real Attributes
, 2000
"... Finding approximate answers to multidimensional range queries over real valued attributes has significant applications in data exploration and database query optimization. In this paper we consider the following problem: given a table of d attributes whose domain is the real numbers, and a quer ..."
Abstract

Cited by 74 (8 self)
 Add to MetaCart
Finding approximate answers to multi-dimensional range queries over real-valued attributes has significant applications in data exploration and database query optimization. In this paper we consider the following problem: given a table of d attributes whose domain is the real numbers, and a query that specifies a range in each dimension, find a good approximation of the number of records in the table that satisfy the query. We present a new histogram technique that is designed to approximate the density of multi-dimensional datasets with real attributes. Our technique finds buckets of variable size, and allows the buckets to overlap. Overlapping buckets allow more efficient approximation of the density. The size of the cells is based on the local density of the data. This technique leads to a faster and more compact approximation of the data distribution. We also show how to generalize kernel density estimators, and how to apply them on the multi-dimensional query approxim...
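The kernel-density view of range-query approximation can be illustrated with a product Gaussian kernel: each record contributes the kernel mass falling inside the query box, and the sum approximates the record count. This is a generic KDE sketch, not the paper's histogram technique, and the data and bandwidths are made up:

```python
import numpy as np
from math import erf, sqrt

def kde_range_count(data, low, high, h):
    """Approximate the number of records inside the axis-aligned box
    [low, high] with a product Gaussian kernel density: each record
    contributes the product over dimensions of the Gaussian mass that
    its kernel places inside the query range."""
    def gauss_mass(x, a, b, hd):
        # Mass of N(x, hd^2) falling inside the interval [a, b].
        return 0.5 * (erf((b - x) / (hd * sqrt(2))) - erf((a - x) / (hd * sqrt(2))))
    total = 0.0
    for point in data:
        contrib = 1.0
        for x, a, b, hd in zip(point, low, high, h):
            contrib *= gauss_mass(x, a, b, hd)
        total += contrib
    return total

# 2000 uniform points on the unit square; the query box [0, 0.5]^2 holds
# about a quarter of them, so the estimate should land near 500.
rng = np.random.default_rng(0)
data = rng.uniform(0, 1, size=(2000, 2))
est = kde_range_count(data, low=[0.0, 0.0], high=[0.5, 0.5], h=[0.05, 0.05])
```

Unlike a fixed-grid histogram, nothing here is precomputed; the histogram techniques the paper develops can be seen as compact, bucketized summaries that answer the same kind of query without touching every record.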