Results 1-5 of 5
Wide-Area Traffic: The Failure of Poisson Modeling
IEEE/ACM Transactions on Networking, 1995
Abstract

Cited by 1772 (24 self)
Network arrivals are often modeled as Poisson processes for analytic simplicity, even though a number of traffic studies have shown that packet interarrivals are not exponentially distributed. We evaluate 24 wide-area traces, investigating a number of wide-area TCP arrival processes (session and connection arrivals, FTP data connection arrivals within FTP sessions, and TELNET packet arrivals) to determine the error introduced by modeling them using Poisson processes. We find that user-initiated TCP session arrivals, such as remote-login and file-transfer, are well-modeled as Poisson processes with fixed hourly rates, but that other connection arrivals deviate considerably from Poisson; that modeling TELNET packet interarrivals as exponential grievously underestimates the burstiness of TELNET traffic, but using the empirical Tcplib [Danzig et al., 1992] interarrivals preserves burstiness over many time scales; and that FTP data connection arrivals within FTP sessions come bunched into “connection bursts,” the largest of which are so large that they completely dominate FTP data traffic. Finally, we offer some results regarding how our findings relate to the possible self-similarity of wide-area traffic.
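The over-dispersion the abstract describes can be illustrated with a small simulation (an illustrative sketch, not the paper's methodology or data): exponential interarrivals, as implied by a Poisson process, have a coefficient of variation near 1, while a bursty mixture of mostly tiny gaps and occasional long idle periods is markedly over-dispersed.

```python
import random
import statistics

def coef_variation(xs):
    """CV = std/mean; exactly 1 for an exponential distribution."""
    return statistics.pstdev(xs) / statistics.fmean(xs)

random.seed(1)

# Exponential interarrivals (Poisson arrivals): CV close to 1.
expo = [random.expovariate(1.0) for _ in range(20000)]

# A bursty two-regime mixture (invented for illustration) loosely
# mimicking heavy-tailed packet interarrivals: 90% short gaps,
# 10% long idle periods.
bursty = [random.expovariate(50.0) if random.random() < 0.9
          else random.expovariate(0.1) for _ in range(20000)]

print(round(coef_variation(expo), 2))   # near 1.0
print(coef_variation(bursty) > 1.5)     # markedly over-dispersed
```

A CV well above 1 is the simplest symptom of the burstiness that an exponential model "grievously underestimates."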
Probability Bounds Analysis in Environmental Risk Assessment
Applied Biomathematics, Setauket, 2003
Abstract

Cited by 16 (1 self)
This document provides a detailed overview of probability bounds analysis. In the sections that follow, the conceptual background of the approach is briefly presented, followed by the mathematical derivation of probability bounds around parametric, nonparametric, empirical, and assumed or stipulated models. Computation with p-boxes is then described, and numerical examples of computations are provided. In the next section, probability bounds analysis is compared and contrasted with Monte Carlo simulation techniques. Methods used by Monte Carlo analysts for treating input variables, dependencies between input variables, and model uncertainty are compared to methods used in probability bounds analysis. Techniques for implementing micro-exposure event analysis models are also compared, along with methods for conducting sensitivity analysis. Finally, the use of probability bounds analysis within the tiered framework for conducting probabilistic risk assessments recommended by EPA is discussed.
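A p-box is a pair of bounding CDFs enclosing every distribution consistent with what is known. A minimal sketch of the idea (not taken from the document): if only an interval for the mean of a normal variable with known standard deviation is available, the envelope of the corresponding CDFs bounds P(X ≤ x).

```python
import math

def normal_cdf(x, mu, sigma):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def pbox_normal(x, mu_lo, mu_hi, sigma):
    """Bounds on P(X <= x) when only the interval [mu_lo, mu_hi]
    is known for the mean of a normal with known sigma."""
    lower = normal_cdf(x, mu_hi, sigma)  # largest mean -> smallest CDF
    upper = normal_cdf(x, mu_lo, sigma)  # smallest mean -> largest CDF
    return lower, upper

# Illustrative numbers: mean known only to [0.0, 0.5], sigma = 1.
lo, hi = pbox_normal(1.0, mu_lo=0.0, mu_hi=0.5, sigma=1.0)
print(round(lo, 3), round(hi, 3))  # roughly 0.691 0.841
```

Any exceedance probability computed from the true (unknown) distribution must fall between these two bounds, which is the conservatism the method trades for its freedom from distributional assumptions.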
Sensitivity in risk analyses with uncertain numbers
2006
Abstract

Cited by 10 (0 self)
Sensitivity analysis is a study of how changes in the inputs to a model influence the results of the model. Many techniques have recently been proposed for use when the model is probabilistic. This report considers the related problem of sensitivity analysis when the model includes uncertain numbers that can involve both aleatory and epistemic uncertainty and the method of calculation is Dempster-Shafer evidence theory or probability bounds analysis. Some traditional methods for sensitivity analysis generalize directly for use with uncertain numbers, but, in some respects, sensitivity analysis in this setting differs from traditional deterministic or probabilistic sensitivity analyses. A case study of a dike reliability assessment illustrates several methods of sensitivity analysis, including traditional probabilistic assessment, local derivatives, and a “pinching” strategy that hypothetically reduces the epistemic uncertainty or aleatory uncertainty, or both, in an input variable to estimate the reduction of uncertainty in the outputs. The prospects for applying the methods to black-box models are also considered.
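The "pinching" strategy can be sketched with plain interval arithmetic (the dike model itself is not reproduced here; the inputs and the toy load model below are invented for illustration): hypothetically collapse one epistemic interval to a point value, re-propagate, and report how much the output's uncertainty shrinks.

```python
def interval_add(a, b):
    return (a[0] + b[0], a[1] + b[1])

def interval_mul(a, b):
    prods = [a[0]*b[0], a[0]*b[1], a[1]*b[0], a[1]*b[1]]
    return (min(prods), max(prods))

def width(a):
    return a[1] - a[0]

# Hypothetical load model: load = wind * drag + tide,
# with epistemic intervals on each input (invented numbers).
wind = (2.0, 6.0)
drag = (0.8, 1.2)
tide = (0.5, 1.5)

base = interval_add(interval_mul(wind, drag), tide)

# "Pinch" wind to its midpoint and re-propagate.
pinched_wind = (4.0, 4.0)
pinched = interval_add(interval_mul(pinched_wind, drag), tide)

# Fractional reduction in output uncertainty if wind were known exactly.
reduction = 1.0 - width(pinched) / width(base)
print(round(reduction, 2))  # roughly 0.61
```

Pinching each input in turn and ranking the resulting reductions identifies which uncertainties are most worth reducing, which is the report's motivating question.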
Projecting uncertainty through black boxes
2008
Abstract
Computational models whose internal details are not accessible to the analyst are called black boxes. They arise because of security restrictions or because the source code for legacy software has been lost. Computational models whose internal details are extremely complex are also sometimes treated as black boxes. It is often important to assess the uncertainty that should be ascribed to the output from a black box owing to uncertainty about its input quantities, their statistical distributions, or interdependencies. Sensitivity or ‘what-if’ studies are commonly used for this purpose. In such studies, the space of possible inputs is sampled as a vector of real values which is then provided to the black box to compute the output(s) that correspond to those inputs. Such studies are often cumbersome to implement and understand, and they generally require many samples, depending on the complexity of the model and the dimensionality of the inputs. This report reviews methods that can be used to propagate uncertainty about inputs through black boxes, especially ‘hard’ black boxes whose computational complexity restricts the total number of samples that can be evaluated. The focus is on methods that estimate the uncertainty of the outputs from the outside inward; that is, we are interested in methods that produce conservative characterizations of uncertainty that become tighter and tighter as the total computational effort increases.
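One way to get outer bounds that tighten with effort (a sketch of the general idea only, under the extra assumption of a known Lipschitz constant for the black box, which the report does not require) is to subdivide the input range and pad each sampled output by the worst-case variation within its subinterval. The enclosure is always conservative, and more evaluations shrink it.

```python
def outer_bounds(f, lo_x, hi_x, lipschitz, n):
    """Conservative (outer) bounds on the range of f over [lo_x, hi_x],
    assuming |f'| <= lipschitz; bounds tighten as n grows."""
    step = (hi_x - lo_x) / n
    lower = float("inf")
    upper = -float("inf")
    for i in range(n):
        mid = lo_x + (i + 0.5) * step   # one black-box evaluation per cell
        y = f(mid)
        slack = lipschitz * step / 2.0  # worst-case deviation within cell
        lower = min(lower, y - slack)
        upper = max(upper, y + slack)
    return lower, upper

# Stand-in for the black box (invented): true range on [0, 2] is [-0.25, 2].
f = lambda x: x * x - x  # |f'| <= 3 on [0, 2]

coarse = outer_bounds(f, 0.0, 2.0, lipschitz=3.0, n=4)
fine = outer_bounds(f, 0.0, 2.0, lipschitz=3.0, n=64)
print(coarse)  # wide enclosure of [-0.25, 2]
print(fine)    # tighter enclosure, still containing [-0.25, 2]
```

Both enclosures contain the true output range; the fine one is nested inside the coarse one, illustrating the "outside inward" convergence the abstract describes.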