
## Experimental Uncertainty Estimation and Statistics for Data Having Interval Uncertainty

Venue: | 11733, SAND2007-0939. hal-00839639, version 1 - 28 Jun 2013 |

Citations: | 42 - 20 self |

### Citations

14074 | Computers and Intractability: A Guide to the Theory of NP-Completeness - Garey, Johnson - 1979 |

10603 | Introduction to Algorithms
- Cormen, Leiserson, et al.
- 2009
Citation Context ...ough there are algorithms to sort in only linear time O(N) in special cases, or with special knowledge about the data. Indeed, there are also linear-time algorithms to obtain the median of a data set (Cormen et al. 2001), although such algorithms are more complicated. ...problematic for interval data is really the same reason that it is problematic for real-valued data where, in principle, no two data values are like... |
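The linear-time median selection alluded to in this context can be sketched with randomized quickselect; this is an illustrative stand-in (expected O(N) time), not the deterministic worst-case-linear algorithm in Cormen et al.:

```python
import random

def quickselect(xs, k):
    """Return the k-th smallest element (0-indexed) of xs in expected O(N) time."""
    xs = list(xs)
    while True:
        pivot = random.choice(xs)
        lo = [x for x in xs if x < pivot]    # elements below the pivot
        eq = [x for x in xs if x == pivot]   # elements equal to the pivot
        hi = [x for x in xs if x > pivot]    # elements above the pivot
        if k < len(lo):
            xs = lo                          # answer lies in the lower part
        elif k < len(lo) + len(eq):
            return pivot                     # pivot is the k-th smallest
        else:
            k -= len(lo) + len(eq)           # discard lower and equal parts
            xs = hi

data = [7, 1, 5, 3, 9, 4, 8]
print(quickselect(data, len(data) // 2))     # median of the seven values: 5
```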

2966 |
Robust Statistics
- Huber
Citation Context ...erlying motivation behind developing statistics for interval data has close connections to some central ideas in statistics and engineering, including those behind the fields of robust statistics (Huber 1981) and sensitivity analysis (Saltelli et al. 2001). Like sensitivity analysis, interval statistics asks what can be inferred when data are relaxed from point estimates to reliable ranges. Like robust s... |

2766 | Statistical Analysis with Missing Data
- Little, Rubin
- 1987
Citation Context ...onal analyses typically make use of methods such as replacing values with the average of the available data, or the last value sampled (“hot deck imputation”) to try to minimize bias in the data set (Little and Rubin 1987). These methods are typically only appropriate if the absent data are missing at random, that is, in a way that is unrelated to their magnitudes. Sometimes this may be a reasonable assumption, and so... |

1509 |
Data Reduction and Error Analysis for the Physical Sciences
- Bevington, Robinson
- 1992
Citation Context ...o the important problem of evaluating, representing and propagating measurement uncertainty. It constitutes a possible alternative to or generalization of current approaches (Coleman and Steele 1999; Bevington and Robinson 1992; Taylor and Kuyatt 1994; Carroll et al. 1995; Dieck 1997) which require sometimes untenable assumptions and cannot really capture interval uncertainty. This approach will be of limited interest for t... |

1367 |
Fuzzy sets and fuzzy logic: theory and applications
- Klir, Yuan
- 1995
Citation Context ...r regression to data sets that contain interval uncertainty (Markov 1990; Manski and Tamer 2002; Marino and Palumbo 2002; 2003). Generalizations of regression for fuzzy data have also been described (Klir and Yuan 1995; Saila and Ferson 1998), and, because fuzzy data are generalizations of interval data, these methods can be applied immediately to interval data. Principal components analysis, sometimes called “mode... |

1274 |
Interval Analysis
- Moore
- 1966
Citation Context ...pecial cases of data sets containing intervals for which the analyses reviewed in sections 4 and 5 are computationally convenient. 3.1 Interval arithmetic Interval arithmetic (Young 1931; Dwyer 1951; Moore 1966; 1979; Goos and Hartmanis 1975; Neumaier 1990) is a special case of set arithmetic defined on intervals of the real line. An interval is a closed set of the real line consisting of all the values bet... |
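The definition quoted in this context is easy to make concrete. Below is a minimal sketch of interval addition and multiplication on (lo, hi) pairs; the helper names are illustrative, and outward rounding is ignored:

```python
def i_add(a, b):
    # [a1, a2] + [b1, b2] = [a1 + b1, a2 + b2]
    return (a[0] + b[0], a[1] + b[1])

def i_mul(a, b):
    # the product range is spanned by the four endpoint products
    p = (a[0] * b[0], a[0] * b[1], a[1] * b[0], a[1] * b[1])
    return (min(p), max(p))

print(i_add((1, 2), (3, 5)))   # (4, 7)
print(i_mul((-1, 2), (3, 5)))  # (-5, 10)
```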

747 | Methods and Applications of Interval Analysis - Moore - 1979 |

721 | An Introduction to Parallel Algorithms - Jaja - 1992 |

607 |
Introduction to interval computations
- Alefeld, Herzberger
- 1983
Citation Context ... be excluding some of the possible values in the theoretical interval, namely the values in the ranges [0.666, 0.667] and [3.333, 3.333]. In computer science applications of interval analysis (Alefeld and Herzberger 1983), this convention is used for internal calculations when the bytes devoted to a numeric quantity do not permit a perfect representation of the quantity. When there are many sequential calculations, t... |

387 |
Applied Interval Analysis
- Jaulin, Kieffer, et al.
- 2001
Citation Context ...ically manipulate the expression to reduce the occurrences of the repeated parameters, such as replacing the original expression x + x + x with the equivalent expression 3x, or completing the square (Jaulin et al. 2001) by which x + x² is re-expressed as (x + ½)² − ¼, which looks a bit more complex but has only one instance of the uncertain parameter x and therefore can be easily computed with the naive methods... |
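The repeated-parameter effect described in this context can be demonstrated with a small sketch (illustrative code, exact endpoint arithmetic): evaluating x + x² naively treats the two occurrences of x as if they were independent and widens the result, while the completed-square form (x + 1/2)² − 1/4 mentions x once and yields the exact range.

```python
def i_add(a, b):
    return (a[0] + b[0], a[1] + b[1])

def i_sqr(a):
    lo, hi = a
    if lo <= 0.0 <= hi:              # square attains its minimum at zero
        return (0.0, max(lo * lo, hi * hi))
    return (min(lo * lo, hi * hi), max(lo * lo, hi * hi))

x = (-1.0, 1.0)

naive = i_add(x, i_sqr(x))           # x + x^2: x appears twice
half = (0.5, 0.5)
s = i_sqr(i_add(x, half))            # (x + 1/2)^2
tight = (s[0] - 0.25, s[1] - 0.25)   # (x + 1/2)^2 - 1/4: x appears once

print(naive)  # (-1.0, 2.0): artificially wide
print(tight)  # (-0.25, 2.0): the true range of x + x^2 over [-1, 1]
```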

380 |
Statistical Methods for Environmental Pollution Monitoring
- Gilbert
- 1987
Citation Context ...nce between the two statistics, we cannot be sure about whether they will be over- or underestimated. When censoring is moderate, Cohen’s method is often used to account for non-detects (Gilbert 1987; EPA 2000, page 4-43ff). It is a maximum likelihood method for correcting the estimates of the sample mean and the sample variance to account for the presence of non-detects among the data. This meth... |

323 |
Statistical methods for reliability data
- Meeker, Escobar
- 1998
Citation Context ...principle, another lab using more sensitive methods might be able to quantify the concentration. There is a considerable statistical literature on handling censored data like this (Helsel 1990; 2005; Meeker and Escobar 1995). Although many of the available methods are rather sophisticated, the most common approach used in practice is to replace the non-detected datum with a quantity that is one half the detection limit,... |
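The half-detection-limit substitution described in this context, and the interval alternative that motivates the report, can be contrasted in a few lines (hypothetical numbers; `dl` stands for the detection limit):

```python
detects = [2.1, 3.4, 5.0]      # measured concentrations
dl = 1.0                       # detection limit for the non-detects
nondetect_count = 2
n = len(detects) + nondetect_count

# Substitution approach: replace each non-detect with DL/2, then average.
mean_subst = (sum(detects) + (dl / 2) * nondetect_count) / n

# Interval approach: bound the mean by setting non-detects to 0 and to DL.
lo = sum(detects) / n
hi = (sum(detects) + dl * nondetect_count) / n

print(mean_subst)   # 2.3
print((lo, hi))     # (2.1, 2.5): the substituted mean is one point in this range
```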

270 |
Experimentation and Uncertainty Analysis for Engineers, 2nd edition,
- Coleman, Steele
- 1999
Citation Context ...s report can be applied to the important problem of evaluating, representing and propagating measurement uncertainty. It constitutes a possible alternative to or generalization of current approaches (Coleman and Steele 1999; Bevington and Robinson 1992; Taylor and Kuyatt 1994; Carroll et al. 1995; Dieck 1997) which require sometimes untenable assumptions and cannot really capture interval uncertainty. This approach will... |

239 |
Measurement Error in Nonlinear Models.
- Carroll, Ruppert, et al.
- 1995
Citation Context ...nd propagating measurement uncertainty. It constitutes a possible alternative to or generalization of current approaches (Coleman and Steele 1999; Bevington and Robinson 1992; Taylor and Kuyatt 1994; Carroll et al. 1995; Dieck 1997) which require sometimes untenable assumptions and cannot really capture interval uncertainty. This approach will be of limited interest for the lucky metrologists who can properly neglec... |

216 | Computational complexity and feasibility of data processing and interval computations. Kluwer Academic Publishers
- Kahl, Kreinovich, et al.
- 1998
Citation Context ...statistics The table below summarizes the computability results for statistics of data sets containing intervals that have been established in this report and elsewhere (Ferson et al. 2002a,b; 2005a; Kreinovich and Longpré 2003; 2004; Kreinovich et al. 2003; 2004a,b; 2005a,b,c; Wu et al. 2003; Xiang 2006; Xiang et al. 2006; Dantsin et al. 2006; Xiang et al. 2007a). The column headings refer to exemplar problems. For instanc... |

124 | Sur les tableaux de corrélation dont les marges sont données, Annales de l’Université de - Fréchet - 1951 |

107 |
Different Methods are Needed to Propagate Ignorance and Variability
- Ferson, Ginzburg
- 1996
Citation Context ...te Carlo simulation as described in JCGM (2006). The gray p-box in Figure 40 was obtained by straightforward calculations described and implemented by Williamson and Downs (1990; Berleant 1993; 1996; Ferson and Ginzburg 1996; Berleant and Cheng 1998; Ferson et al. 2003; Ferson and Hajagos 2004). The black distribution characterizes the estimate of the sum 5.593+5.975 = 11.568 with combined uncertainty sqrt(1.361² + 0.67... |

87 |
Theoretical comparison of bootstrap confidence intervals
- Hall
- 1988
Citation Context ...nd 1972; 1975; Gilbert 1987; Zhou and Gao 1997), but these cannot readily be generalized for interval data because they use complex expressions based on maximum likelihood. Alternative methods (e.g., Hall 1988; 1992; Schulz and Griffin 1999) can also be complicated, and the basic problems of skewness and uncertainty about the shape of the underlying distribution—even before the issue of censoring or interv... |

85 | Inference on Regressions with Interval Data on a Regressor or Outcome - Manski, Tamer - 2002 |

75 | Propagation of Uncertainty in Risk Assessments: The Need to Distinguish Between Uncertainty Due to Lack of Knowledge and Uncertainty Due to Variability - Hoffman, Hammonds - 1994 |

67 |
Bounding the Results of Arithmetic Operations on Random Variables of Unknown Dependency Using Intervals,
- Berleant, Goodman-Strauss
- 1998
Citation Context ...sis (Wise and Henrion 1986), and the second is the probabilistic approach due to Fréchet (1935; 1951) of bounding results over all possible dependencies (Frank et al. 1987; Williamson and Downs 1990; Berleant and Goodman-Strauss 1998; Ferson et al. 2004a). Only these approaches truly relax dependence assumptions. The standard methods are making assumptions about the interactions of the sources of uncertainty that may or may not b... |

58 | Nonparametric analysis of randomized experiments with missing covariate and outcome data, - Horowitz, Manski - 2000 |

57 | Nondetects and data analysis: statistics for censored environmental data - Helsel - 2005 |

54 | Confidence limits for an unknown distribution function. - Kolmogorov - 1941 |

51 | Assessing uncertainty in physical constants. - Henrion, Fischhoff - 1986 |

49 |
An overview of robust Bayesian analysis (with discussion), Test 5
- Berger
- 1994
Citation Context ...ability as a measure of the degree of belief that an event will occur does not preclude being agnostic about one’s belief. Recent work in robust Bayes analysis (Insua and Ruggeri 2000; Pericchi 2000; Berger 1994; Pericchi and Walley 1991; Berger 1985) demonstrates that Bayesian methods can be usefully combined with methods that treat interval-like uncertainty. ...8.3 Harmonization We have argued that the me... |

49 | Statool: A Tool for Distribution Envelope Determination DEnv, An Interval-Based Algorithm for Arithmetic on Random Variables,” - Berleant, Xie, et al. - 2003 |

48 | Fuzzy Data Analysis, - Bandemer, Näther - 1992 |

43 |
Less than obvious: statistical treatment of data below the detection limit.
- Helsel
- 1990
Citation Context ...ng its methods. In principle, another lab using more sensitive methods might be able to quantify the concentration. There is a considerable statistical literature on handling censored data like this (Helsel 1990; 2005; Meeker and Escobar 1995). Although many of the available methods are rather sophisticated, the most common approach used in practice is to replace the non-detected datum with a quantity that i... |

40 |
Table of percentage points of Kolmogorov statistics.
- Miller
- 1956
Citation Context ...tinuous distribution. The confidence limits depend on the value of a special statistic given by Smirnov (1939), thus the bounds are often called the Kolmogorov-Smirnov confidence limits (Feller 1948; Miller 1956; Ferson et al. 2003, section 3.5.4). The bounds are computed by vertically expanding the upper and lower bounds of the distribution by a constant that depends on the sample size and the level of conf... |

37 |
On the Kolmogorov-Smirnov limit theorems for empirical distributions.
- Feller
- 1948
Citation Context ... a common continuous distribution. The confidence limits depend on the value of a special statistic given by Smirnov (1939), thus the bounds are often called the Kolmogorov-Smirnov confidence limits (Feller 1948; Miller 1956; Ferson et al. 2003, section 3.5.4). The bounds are computed by vertically expanding the upper and lower bounds of the distribution by a constant that depends on the sample size and the ... |

34 | On the removal of skewness by transformation, - Hall - 1992 |

33 |
Best-possible bounds on the distribution of a sum–a problem of Kolmogorov. Probability Theory and Related Fields
- Frank, Nelsen, et al.
- 1987
Citation Context ...between variables. The first is interval analysis (Wise and Henrion 1986), and the second is the probabilistic approach due to Fréchet (1935; 1951) of bounding results over all possible dependencies (Frank et al. 1987; Williamson and Downs 1990; Berleant and Goodman-Strauss 1998; Ferson et al. 2004a). Only these approaches truly relax dependence assumptions. The standard methods are making assumptions about the in... |

31 | Probability Bounds Analysis," - Ferson, Donald - 1998 |

30 | Automatically verified arithmetic on probability distributions and intervals - Berleant - 1996 |

30 |
Linear Computations
- Dwyer
- 1951
Citation Context ...ts several special cases of data sets containing intervals for which the analyses reviewed in sections 4 and 5 are computationally convenient. 3.1 Interval arithmetic Interval arithmetic (Young 1931; Dwyer 1951; Moore 1966; 1979; Goos and Hartmanis 1975; Neumaier 1990) is a special case of set arithmetic defined on intervals of the real line. An interval is a closed set of the real line consisting of all th... |

30 |
Boole’s Logic and Probability
- Hailperin
- 1986
Citation Context ...ation more than once. The appearance of repeated parameters in expressions is a well-known problem with interval arithmetic and, indeed, with all uncertainty calculi (e.g., Moore 1966; Manes 1982; Hailperin 1986; Ferson 1996). In interval analysis, the problem is less severe than in other calculi because its methods always fail safe in the sense that they cannot underestimate the uncertainty. This is not the... |

28 | Généralisations du théorème des probabilités totales - Fréchet - 1935 |

28 | Novel Approaches to Numerical Software with Result Verification - Granvilliers, Kreinovich, et al. - 2003 |

28 | Combining unbiased estimators - Graybill, Deal - 1959 |

24 |
Principal component analysis on interval data
- Gioia, Lauro
- 2006
Citation Context ...lysis (Billard and Diday 2000; Manski and Tamer 2002; Marino and Palumbo 2002; 2003), time series analysis (Möller and Reuter 2006), principal components analysis (Lauro and Palumbo 2000; 2003; 2005; Gioia and Lauro 2006; Lauro et al. 2007), factorial analysis (Palumbo and Irpino 2005), outlier detection (Kreinovich et al. 2005a; Neumann et al. 2006), hypothesis testing (Kutterer 2004), classification (Zaffalon 2002;... |

23 | An inequality paradigm for probabilistic knowledge. - Grosof - 1986 |

23 | Estimating and Validating the Cumulative Distribution of a Function of Random Variables: Toward the Development of Distribution Arithmetic - Lodwick, Jamison |

22 |
Arithmetic With Uncertain Numbers: Rigorous and (Often)
- Ferson, Hajagos
- 2004
Citation Context ...certainties are inserted into the evaluations multiple times. Even if we computed best possible bounds on v and w with special strategies such as subinterval reconstitution (Moore 1966; Corliss 1988; Ferson and Hajagos 2004), there would still be a problem because the resulting two interval parameters are not independent, and the p-box constructed from them would not be the best possible p-box given the sample data them... |

19 | Regression Analysis for Interval-valued Data. In - Billard, Diday - 2000 |

18 |
A Software Tool for Automatically Verified Operations on Intervals and Probability Distributions
- Berleant, Cheng
- 1998
Citation Context ...cribed in JCGM (2006). The gray p-box in Figure 40 was obtained by straightforward calculations described and implemented by Williamson and Downs (1990; Berleant 1993; 1996; Ferson and Ginzburg 1996; Berleant and Cheng 1998; Ferson et al. 2003; Ferson and Hajagos 2004). The black distribution characterizes the estimate of the sum 5.593+5.975 = 11.568 with combined uncertainty sqrt(1.361² + 0.6768²) = 1.520. The cover... |
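The combined uncertainty quoted in this context is the usual root-sum-square combination of the two standard uncertainties, which is easy to verify (values copied from the snippet):

```python
import math

u1, u2 = 1.361, 0.6768          # standard uncertainties of the two addends
combined = math.sqrt(u1**2 + u2**2)
print(round(combined, 3))       # 1.52
```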

18 | Conservative uncertainty propagation in environmental risk assessments, in Environmental Toxicology and - Ferson, Long |

18 | Variance of a weighted mean. - Meier - 1953 |

17 | Tables of confidence limits for linear functions of the normal mean and variance - Land - 1975 |

15 | Exact bounds on finite populations of interval data, - Ferson, Ginzburg, et al. - 2005 |

14 | Probability Bounds Analysis Solves the Problem of Incomplete Specification in Probabilistic Risk and Safety Assessments. Ninth Conference on Risk-Based Decisionmaking in Water Resources - Ferson - 2000 |

14 | Probabilities, intervals, what next? optimization problems related to extension of interval computations to situations with partial information about probabilities - Kreinovich - 2004 |

13 | Interval-Valued and Fuzzy-Valued Random Variables: From Computing Sample Variances to Computing Sample Covariances - Beck, Kreinovich, et al. |

13 |
Testing the Mean of Skewed Distributions
- Chen
- 1995
Citation Context ...s method can easily be generalized to handle interval data. However, the method is rather sensitive to skewness in the underlying distribution and cannot be recommended for skewed data (Johnson 1978; Chen 1995; Zhou and Gao 2000; EPA 2002). Specific methods for a few specific distributions have been described, including the gamma distribution (Wong 1993; Stuart and Ord 1994) and the lognormal distribution ... |

12 |
Modified t-test and confidence intervals for asymmetrical populations
- Johnson
- 1978
Citation Context ...bly large. This method can easily be generalized to handle interval data. However, the method is rather sensitive to skewness in the underlying distribution and cannot be recommended for skewed data (Johnson 1978; Chen 1995; Zhou and Gao 2000; EPA 2002). Specific methods for a few specific distributions have been described, including the gamma distribution (Wong 1993; Stuart and Ord 1994) and the lognormal di... |

12 | Measurement of Possibilistic Histograms from Interval Data - Joslyn - 1996 |

11 | Fast quantum algorithms for handling probabilistic, interval, and fuzzy uncertainty
- Martinez, Longpré, et al.
- 2003
Citation Context ...inty characterized by an interval (Ahmad 1975; Shafer 1976; Walley 1991; Ferson and Ginzburg 1996; Horowitz and Manski 2000; Manski 2003; Ferson et al. 2003; Ferson and Hajagos 2004; Kreinovich 2004; Kreinovich and Longpré 2004; Starks et al. 2004; Hajagos 2005, inter alia). The examples given in section 8.2 and elsewhere in this report suggest that ignoring incertitude, or modeling it with uniform distributions, can produc... |

11 |
A class of fuzzy theories
- Manes
- 1982
Citation Context ...o the calculation more than once. The appearance of repeated parameters in expressions is a well-known problem with interval arithmetic and, indeed, with all uncertainty calculi (e.g., Moore 1966; Manes 1982; Hailperin 1986; Ferson 1996). In interval analysis, the problem is less severe than in other calculi because its methods always fail safe in the sense that they cannot underestimate the uncertainty.... |

10 | Guidance for Data Quality Assessment. Office of Environmental Information - EPA (Environmental Protection Agency) - 2000 |

10 |
Replicability, confidence, and priors.
- Killeen
- 2005
Citation Context ...e mean are very widely used as a characterization of the location of a data set, just like the mean, median, and mode are, except that confidence intervals also capture the effect of sample size (cf. Killeen 2005). They are wide when there are few samples and tighten as the sample size increases. In practice, confidence limits on means are often used as conservative estimates of data location when the paucity... |

9 |
Statistical methods for astronomical data with upper limits. I - Univariate distributions,
- Feigelson, Nelson
- 1985
Citation Context ...ring in regression analyses. When only one of the two variables is left-censored, standard survival analysis methods are typically used in regression studies (Schmitt 1985; Feigelson and Nelson 1985; Isobe et al. 1986). Schmitt (1985) described a weighted least-squares regression method for data sets in which both the independent and dependent variables could include non-detects. Akritas et al. (1995) described th... |

8 | Outlier Detection Under Interval and Fuzzy Uncertainty: Algorithmic Solvability and
- Kreinovich, Patangay, et al.
- 2003
Citation Context ... 2. 4.8.2 Computing the outer bounds on confidence limits Computing the upper bound on the upper confidence limit U or computing the lower bound on the lower confidence limit L is an NP-hard problem (Kreinovich et al. 2003; 2004b; 2005a). If the quantity β = 1 + 1/K0² ≤ N, then the corresponding maximum and minimum are always attained at the endpoints of the intervals xi, so to compute the outer bounds on the confidence... |

8 | An approach to combining results from multiple methods motivated by the ISO - Levenson, Banks, et al. |

6 | The Theil-Sen estimator with doubly censored data and applications to astronomy. - Akritas, Murphy, et al. - 1995 |

6 | Applications of possibility and evidence theory in civil engineering. 1st International Symposium on Imprecise Probabilities and Their Applications, Ghent - Fetz |

5 | Probability and fuzzy reliability analysis of a sample slope near Aliano. Engineering Geology - Giasi, Masi, et al. - 2003 |

5 |
Statistics manual, with examples taken from ordnance development.
- Crow
- 1960
Citation Context ...hus, to compute 95% confidence limits, Dmax(0.05, N) ≈ 1.36/√N, so long as N is larger than about 50. For smaller values of N, the value of D must be read from a statistical table (e.g., Miller 1956; Crow et al. 1960, Table 11, page 248). The Kolmogorov-Smirnov confidence limits are distribution-free, which means that they do not depend on any particular knowledge about shape of the underlying distribution. They ... |
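The large-sample approximation quoted in this context can be sketched in code: clip the empirical CDF up and down by D ≈ 1.36/√N to get 95% Kolmogorov-Smirnov bands (an illustrative sketch, not the report's implementation; valid roughly for N > 50):

```python
import math

def ks_band(sorted_data, d):
    """Shift the empirical CDF up and down by d, clipped to [0, 1]."""
    n = len(sorted_data)
    band = []
    for i, x in enumerate(sorted_data, start=1):
        f = i / n                                  # empirical CDF at x
        band.append((x, max(0.0, f - d), min(1.0, f + d)))
    return band

n = 100
data = sorted(float(i) for i in range(n))          # stand-in sample; any data works
d95 = 1.36 / math.sqrt(n)                          # Dmax(0.05, N) ≈ 1.36/√N
x, lo, hi = ks_band(data, d95)[49]
print(x, round(lo, 3), round(hi, 3))               # 49.0 0.364 0.636
```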

5 | Deconvolution can reduce uncertainty in risk analyses - Ferson, Long - 1997 |

5 |
An evaluation of approximate confidence interval estimation methods for lognormal means
- Land
- 1972
Citation Context ... Zhou and Gao 2000; EPA 2002). Specific methods for a few specific distributions have been described, including the gamma distribution (Wong 1993; Stuart and Ord 1994) and the lognormal distribution (Land 1972; 1975; Gilbert 1987; Zhou and Gao 1997), but these cannot readily be generalized for interval data because they use complex expressions based on maximum likelihood. Alternative methods (e.g., Hall 19... |

5 | Interval arithmetic for the evaluation of imprecise data effects in least squares linear regression - Marino, Palumbo |

4 | Fuzzy models in geotechnical engineering and construction management. Computer-aided Civil and Infrastructure Engineering 14 - Fetz, Oberguggenberger, et al. - 1999 |

4 | Basic statistical methods for interval data
- Gioia, Lauro
- 2005
Citation Context ... this idea and have started developing formulas and algorithms necessary to compute basic statistics for interval data* (e.g., Ferson et al. 2002a,b; 2005; Wu et al. 2003; Kreinovich, et al. 2005b,c; Gioia and Lauro 2005; Xiang et al. 2006). In fact, there has been quite a lot of work on applying statistical methods to data sets containing interval uncertainty, although a great deal of it undoubtedly has not been her... |

4 | Accurately computing ecological risk under measurement uncertainty - Hajagos - 2005 |

4 | Interval Statistical Models (in Russian), Radio i Svyaz - Kuznetsov - 1991 |

4 | Measurement of the Velocity of Light in a Partial Vacuum - Michelson, Pease, et al. - 1935 |

3 |
How can you apply interval techniques in an industrial setting
- Corliss
- 1988
Citation Context ...vals, their uncertainties are inserted into the evaluations multiple times. Even if we computed best possible bounds on v and w with special strategies such as subinterval reconstitution (Moore 1966; Corliss 1988; Ferson and Hajagos 2004), there would still be a problem because the resulting two interval parameters are not independent, and the p-box constructed from them would not be the best possible p-box g... |

3 |
Prediction of uncertain structural responses using fuzzy time series
- Möller, Reuter
Citation Context ...ysis (Nivlet et al. 2001a,b), Bayesian methods (Fernández et al. 2001; 2004), regression analysis (Billard and Diday 2000; Manski and Tamer 2002; Marino and Palumbo 2002; 2003), time series analysis (Möller and Reuter 2006), principal components analysis (Lauro and Palumbo 2000; 2003; 2005; Gioia and Lauro 2006; Lauro et al. 2007), factorial analysis (Palumbo and Irpino 2005), outlier detection (Kreinovich et al. 2005a... |

2 |
A distribution-free interval mathematical analysis of probability density functions
- Ahmad
- 1975
Citation Context ...y (lack of knowledge) are present in measurements. As we and many others have argued, no single probability distribution can properly represent the epistemic uncertainty characterized by an interval (Ahmad 1975; Shafer 1976; Walley 1991; Ferson and Ginzburg 1996; Horowitz and Manski 2000; Manski 2003; Ferson et al. 2003; Ferson and Hajagos 2004; Kreinovich 2004; Kreinovich and Longpré 2004; Starks et al. 20... |

2 | A history of the speed of light”. http://www.sigmaengineering.co.uk/light/lightindex.shtml Cutler, A.N. 2001b. “Data from Michelson, Pease and Pearson” - Cutler - 1935 |

2 | Théorie analytique des probabilités (édition troisième) - Laplace, S - 1820 |

2 | The introduction (Essai philosophique sur les probabilités) is available in an English translation - Courcier - 1951 |

2 |
Principal components analysis of symbolic data described by intervals
- Lauro, Verde, Irpino
- 2008
Citation Context ...ay 2000; Manski and Tamer 2002; Marino and Palumbo 2002; 2003), time series analysis (Möller and Reuter 2006), principal components analysis (Lauro and Palumbo 2000; 2003; 2005; Gioia and Lauro 2006; Lauro et al. 2007), factorial analysis (Palumbo and Irpino 2005), outlier detection (Kreinovich et al. 2005a; Neumann et al. 2006), hypothesis testing (Kutterer 2004), classification (Zaffalon 2002; 2005), and cluster... |

1 | Sampling without probabilistic model. Pages 369-390 - Beer - 2006 |

1 | Towards adding probabilities and correlations to interval computations. International Journal of Approximate Reasoning [to appear] - Berleant, Ceberio, et al. - 2007 |

1 | New clustering methods of interval data. [manuscript] - Chavent, Carvalho, et al. - 2005 |

1 | Use of fuzzy sets for evaluating shear strength of soils - Chuang |

1 | Physics Documents, http://www.av8n.com/physics/ (especially webpage 13, “A discussion of how to report measurement uncertainties”) - Denker |

1 |
Measurement Uncertainty Methods and Applications (second edition
- Dieck
- 1997
Citation Context ...ement uncertainty. It constitutes a possible alternative to or generalization of current approaches (Coleman and Steele 1999; Bevington and Robinson 1992; Taylor and Kuyatt 1994; Carroll et al. 1995; Dieck 1997) which require sometimes untenable assumptions and cannot really capture interval uncertainty. This approach will be of limited interest for the lucky metrologists who can properly neglect resolution... |

1 |
A distribution-independent bound on the level of confidence in the result of a measurement
- Estler
- 1997
Citation Context ...se, it represents the only* known method for obtaining true confidence limits on the mean that does not depend on an analyst’s making shape assumptions about the distribution of the variable. *Contra Estler 1997; contra Singh et al. 1997; 2000; Schulz and Griffin 1999; see Huber 2001. Numerical example. Kolmogorov-Smirnov 95% confidence bands on the distributions for the skinny and puffy data distribution... |

1 | Probability bounds analysis. Pages 669–678 in Computing in Environmental Resource Management - Ferson - 1997 |

1 |
Computing variance for interval data is NP-hard
- 2002a
- 2002
Citation Context ...nce of non-detects among the data. This method requires that the detection limit be the same for all the data and that the underlying data are normally distributed. Current guidance from the U.S. EPA (2002) suggests an interval statistics (bounding) approach to computing the bounds on the confidence limits in the presence of non-detects and other forms of data censoring using mathematical optimization, ... |

1 | Fuzzy representation and reasoning in geotechnical site characterization - Huang, Siller - 1997 |

1 | [Quantitative Decisions] 2001. “Statistical tests: the Chebyshev UCL proposal”. http://www.quantdec.com/envstats/notes/class_12/ucl.htm#Chebyshev - Huber - 2000 |

1 | Rapport du groupe de travail sur l’expression des incertitudes au Comité International des Poids et Mesures - Kaarls - 1981 |

1 |
Computing Higher Central Moments for Interval Data. University of Texas at El Paso
- 2004a
Citation Context ...depend on the values of Xi that shift from the right endpoint to the left endpoint as k increases. Keeping track of these can reduce the overall effort required by the calculation. Granvilliers et al. (2004) showed that the computational effort needed by this algorithm in the worst case grows with the number of samples N as a function that is proportional to N log N. Computer scientists describe such an ... |

1 |
Outlier detection under interval uncertainty: algorithmic solvability and computational complexity. Reliable Computing 11 (1): 59-76. http://www.cs.utep.edu/vladik/2003/tr0310f.pdf Kreinovich
- 2005a
Citation Context ...resented by intervals, they would also be affected by the measurement precision. Using the algorithms described in section 4.8 to compute confidence limits on the mean for interval data sets, Hajagos (2005) described a nonlinear tradeoff between precision and sample size as they affect the upper bound on the upper confidence limit. He conducted Monte Carlo simulation studies in which he varied both the ... |

1 | 2006a. Towards combining probabilistic and interval uncertainty in engineering calculations: algorithms for computing statistics under interval uncertainty, and their computational complexity - Kandathi, Torres, et al. |

1 |
Computing mean and variance under Dempster-Shafer uncertainty: towards faster algorithms
- 2006b
- 2004
Citation Context ... kernel functions and the mixture operation extends in a straightforward way when the input data contain interval uncertainty, this method can easily be generalized to the case of interval data. Beer (2006) describes a novel method of iterated constrained bootstrapping with random modification by which a small sample (~200 values) may be used to generate an arbitrarily large sample, which is essentially... |

1 | Statistical Applications using Fuzzy Sets.Wiley - Manton, Woodbury, et al. - 1994 |