Results 11–20 of 340
Visualizing Large-Scale Uncertainty in Astrophysical Data
 IEEE TRANSACTIONS ON VISUALIZATION AND COMPUTER GRAPHICS
, 2007
"... Visualization of uncertainty or error in astrophysical data is seldom available in simulations of astronomical phenomena, and yet almost all rendered attributes possess some degree of uncertainty due to observational error. Uncertainties associated with spatial location typically vary significantly ..."
Abstract

Cited by 21 (0 self)
 Add to MetaCart
Visualization of uncertainty or error in astrophysical data is seldom available in simulations of astronomical phenomena, and yet almost all rendered attributes possess some degree of uncertainty due to observational error. Uncertainties associated with spatial location typically vary significantly with scale and thus introduce further complexity in the interpretation of a given visualization. This paper introduces effective techniques for visualizing uncertainty in large-scale virtual astrophysical environments. Building upon our previous transparently scalable visualization architecture, we develop tools that enhance the perception and comprehension of uncertainty across wide scale ranges. Our methods include a unified color-coding scheme for representing log-scale distances and percentage errors, an ellipsoid model to represent positional uncertainty, an ellipsoid envelope model to expose trajectory uncertainty, and a magic-glass design supporting the selection of ranges of log-scale distance and uncertainty parameters, as well as an overview mode and a scalable WIM tool for exposing the magnitudes of spatial context and uncertainty.
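The ellipsoid model for positional uncertainty described above can be sketched as an eigen-decomposition of a positional covariance matrix; the helper function and the parsec-scale numbers below are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def uncertainty_ellipsoid(cov, n_sigma=1.0):
    """Semi-axis lengths and directions of the n-sigma uncertainty ellipsoid
    for a 3x3 positional covariance matrix (symmetric, positive semi-definite)."""
    eigvals, eigvecs = np.linalg.eigh(cov)              # ascending eigenvalues
    radii = n_sigma * np.sqrt(np.clip(eigvals, 0.0, None))
    return radii, eigvecs                               # axis i is eigvecs[:, i]

# Hypothetical covariance (parsec^2) with independent axes:
radii, axes = uncertainty_ellipsoid(np.diag([4.0, 1.0, 0.25]))
# radii -> [0.5, 1.0, 2.0]
```

The radii scale as square roots of the eigenvalues, so the ellipsoid directly encodes how positional error stretches along each principal direction.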
Realistic Evaluation of the Precision and Accuracy of Instrument Calibration Systems
, 1963
"... “precision ” and “accuracy ” were used in a qualitative manner to characterize measurements. These terms appeared in many American Society for Testing Materials (ASTM) standards long before any common agreement or understanding had been reached as to their meanings and consequences. Circa 1950, indi ..."
Abstract

Cited by 20 (0 self)
 Add to MetaCart
(Show Context)
“precision” and “accuracy” were used in a qualitative manner to characterize measurements. These terms appeared in many American Society for Testing Materials (ASTM) standards long before any common agreement or understanding had been reached as to their meanings and consequences. Circa 1950, individuals and organizations began concerted efforts to right this situation. Churchill Eisenhart was drawn to this issue as it related to calibrations, which he called refined measurement methods. As Chief of the Statistical Engineering Laboratory (SEL), Applied Mathematics Division, he set out to put the concepts of accuracy and precision on a solid statistical basis for NBS scientists and metrologists. His paper on the subject, published in 1961 [1], was to become the preeminent publication on the subject. With impeccable scholarship and commitment to detail, Eisenhart synthesized his own work [2] and the writings of statistical theorists and practitioners, Walter Shewhart [3], Edwards Deming, Raymond Birge [4], and R. B. Murphy [5], into concepts of quality control that could be applied to measurement processes. Three basic concepts in the paper were immediately accepted by metrologists at NBS, namely: (1) a measurement process requires statistical control; (2) statistical control implies control of both reproducibility and repeatability; and (3) a measurement result requires an associated statement of uncertainty that includes any possible source of bias. In this paper, for the first time, measurements themselves were described as a process whose output can be controlled using statistical techniques.
Eisenhart reinforced the conclusion, probably first drawn by Murphy [5], that “Incapability of control implies that the results of measurement are not to be trusted as an indication of the physical property at hand—in short, we are not in any verifiable sense measuring anything”—when he says, “a measurement operation must have attained what is known in industrial quality control language as a state of statistical control... before it can be regarded in any logical sense as measuring anything at all.” Eisenhart’s paper, coupled with work by other SEL statisticians, had a lasting and profound effect on
Visualizing Geometric Uncertainty of Surface Interpolants
 Proc. Graphics Interface
, 1996
"... Evaluating and comparing the quality of surface interpolants is an important problem in computer graphics, computer aided geometric design and scientific visualization. We introduce geometric uncertainty as a measure of interpolation error, level of confidence or quality of an interpolant. Geometric ..."
Abstract

Cited by 19 (5 self)
 Add to MetaCart
(Show Context)
Evaluating and comparing the quality of surface interpolants is an important problem in computer graphics, computer-aided geometric design, and scientific visualization. We introduce geometric uncertainty as a measure of interpolation error, level of confidence, or quality of an interpolant. Geometric uncertainty can be estimated as a scalar or a vector-valued function that depends upon geometric characteristics of interpolants associated with the underlying data. These characteristics include position, normals, isophotes, principal curvatures and directions, and mean and Gaussian curvatures. We present several new techniques for visualizing geometric uncertainty of surface interpolants that combine the strengths of traditional techniques such as pseudo-coloring, differencing, overlay, and transparency with new glyph- and texture-based techniques. The viewer can control an interactive query-driven toolbox to create a wide variety of graphics that allow probing of geometric information in useful and convenient ways. We demonstrate the effectiveness of these techniques by visualizing geometric uncertainty of surfaces obtained by different interpolation techniques: bilinear, C^0 ...
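One scalar geometric-uncertainty measure in the spirit of the abstract, the pointwise positional difference between competing interpolants of the same data, can be sketched as follows (the two interpolant functions are hypothetical stand-ins, not the paper's surfaces):

```python
import numpy as np

def positional_uncertainty(f, g, xs, ys):
    """Pointwise positional difference between two interpolants of the same
    data, usable as a scalar geometric-uncertainty field for pseudo-coloring."""
    X, Y = np.meshgrid(xs, ys)
    return np.abs(f(X, Y) - g(X, Y))

# Hypothetical interpolants fitted to the same sample points:
f = lambda x, y: x * y                              # bilinear-like
g = lambda x, y: x * y + 0.1 * np.sin(np.pi * x)    # smoother alternative
u = positional_uncertainty(f, g, np.linspace(0, 1, 5), np.linspace(0, 1, 5))
# u.max() -> 0.1, attained where the interpolants disagree most (x = 0.5)
```

Analogous fields could be built from normals or curvatures instead of position, matching the characteristics listed above.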
Bayesian Reasoning Versus Conventional Statistics In High Energy Physics
"... . The intuitive reasoning of physicists in conditions of uncertainty is closer to the Bayesian approach than to the frequentist ideas taught at University and which are considered the reference framework for handling statistical problems. The combination of intuition and conventional statistics allo ..."
Abstract

Cited by 19 (7 self)
 Add to MetaCart
The intuitive reasoning of physicists in conditions of uncertainty is closer to the Bayesian approach than to the frequentist ideas taught at university, which are considered the reference framework for handling statistical problems. The combination of intuition and conventional statistics allows practitioners to get results which are very close, both in meaning and in numerical value, to those obtainable by Bayesian methods, at least in simple routine applications. There are, however, cases in which "arbitrary" probability inversions produce unacceptable or misleading results, and in these cases the conscious application of Bayesian reasoning becomes crucial. Starting from these considerations, I will finally comment on the often-debated question: "is there any chance that all physicists will become Bayesian?"
Key words: Subjective Bayesian Theory, High Energy Physics, Measurement Uncertainty
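The "arbitrary probability inversion" the abstract warns against, reading P(hypothesis | data) directly off P(data | hypothesis), can be made concrete with a minimal Bayes'-theorem sketch (all numbers hypothetical):

```python
def posterior_signal(p_data_given_s, p_data_given_b, prior_s):
    """Bayes' theorem for a two-hypothesis (signal vs. background) inversion."""
    prior_b = 1.0 - prior_s
    num = p_data_given_s * prior_s
    return num / (num + p_data_given_b * prior_b)

# A likelihood ratio of 10 does NOT mean P(signal | data) = 10/11:
p = posterior_signal(0.10, 0.01, prior_s=0.01)   # small signal prior
# p is about 0.092, far below the naive inversion's 0.91
```

The posterior depends on the prior as well as the likelihoods, which is exactly where naive inversions go wrong.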
Higher Dimensional Vector Field Visualization: A Survey
, 2009
"... Vector field visualization research has evolved very rapidly over the last two decades. There is growing consensus amongst the research community that the challenge of twodimensional vector field visualization is virtually solved as a result of the tremendous amount of effort put into this problem. ..."
Abstract

Cited by 19 (9 self)
 Add to MetaCart
Vector field visualization research has evolved very rapidly over the last two decades. There is growing consensus amongst the research community that the challenge of two-dimensional vector field visualization is virtually solved as a result of the tremendous amount of effort put into this problem. Two-dimensional flow, both steady and unsteady, can be visualized in real time, with complete coverage of the flow, without much difficulty. However, the same cannot be said of flow in higher spatial dimensions, e.g. surfaces in 3D (2.5D) or volumetric flow (3D). We present a survey of higher-spatial-dimensional flow visualization techniques based on the presumption that little work remains for the case of two-dimensional flow, whereas many challenges still remain for the cases of 2.5D and 3D domains. This survey provides the most up-to-date review of the state of the art of flow visualization in higher dimensions. The reader is provided with a high-level overview of research in the field, highlighting both solved and unsolved problems in this rapidly evolving direction of research.
Validation of abundance estimates from mark–recapture and removal techniques for rainbow trout by electrofishing in small streams
 North American Journal of Fisheries Management 25:1395–1410
, 2005
"... Abstract.—Estimation of fish abundance in streams using the removal model or the Lincoln– Peterson mark–recapture model is a common practice in fisheries. These models produce misleading results if their assumptions are violated. We evaluated the assumptions of these two models via electrofishing of ..."
Abstract

Cited by 18 (3 self)
 Add to MetaCart
(Show Context)
Estimation of fish abundance in streams using the removal model or the Lincoln–Peterson mark–recapture model is a common practice in fisheries. These models produce misleading results if their assumptions are violated. We evaluated the assumptions of these two models via electrofishing of rainbow trout Oncorhynchus mykiss in central Idaho streams. For one-, two-, three-, and four-pass sampling effort in closed sites, we evaluated the influences of fish size and habitat characteristics on sampling efficiency and the accuracy of removal abundance estimates. We also examined the use of models to generate unbiased estimates of fish abundance through adjustment of total catch or biased removal estimates. Our results suggested that the assumptions of the mark–recapture model were satisfied and that abundance estimates based on this approach were unbiased. In contrast, the removal model assumptions were not met. Decreasing sampling efficiencies over removal passes resulted in underestimated population sizes and overestimates of sampling efficiency. This bias decreased, but was not eliminated, with increased sampling effort. Biased removal estimates based on different levels of effort were highly correlated with each other but were less correlated with unbiased mark–recapture estimates. Stream size decreased sampling efficiency, and stream size and instream wood increased the negative bias of removal estimates. We found that reliable estimates of population abundance could be obtained from models of sampling efficiency
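For readers unfamiliar with the two estimators compared above, here is a minimal sketch of their basic textbook forms (Chapman-corrected Lincoln–Peterson and two-pass removal), not the paper's fitted efficiency models; the catch numbers are hypothetical:

```python
def lincoln_petersen(marked, caught, recaptured):
    """Chapman-corrected Lincoln-Peterson mark-recapture abundance estimate."""
    return (marked + 1) * (caught + 1) / (recaptured + 1) - 1

def two_pass_removal(c1, c2):
    """Two-pass removal estimate N = c1^2 / (c1 - c2); assumes equal capture
    efficiency on both passes -- the assumption the study found violated."""
    if c1 <= c2:
        raise ValueError("undefined unless the catch declines between passes")
    return c1 ** 2 / (c1 - c2)

# Hypothetical catches:
n_removal = two_pass_removal(60, 20)       # -> 90.0
n_mr = lincoln_petersen(50, 60, 20)        # -> about 147.1
```

When efficiency drops between passes, the catch declines too slowly, so the removal denominator shrinks less than it should and N is underestimated, the bias the study quantified.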
Exploratory visualization of multivariate data with variable quality
 In Proceedings of the IEEE Symposium on Visual Analytics Science & Technology
, 2006
"... Realworld data is known to be imperfect, suffering from various forms of defects such as sensor variability, estimation errors, uncertainty, human errors in data entry, and gaps in data gathering. Analysis conducted on variable quality data can lead to inaccurate or incorrect results. An effective ..."
Abstract

Cited by 16 (5 self)
 Add to MetaCart
(Show Context)
Real-world data is known to be imperfect, suffering from various forms of defects such as sensor variability, estimation errors, uncertainty, human errors in data entry, and gaps in data gathering. Analysis conducted on variable quality data can lead to inaccurate or incorrect results. An effective visualization system must make users aware of the quality of their data by explicitly conveying not only the actual data content, but also its quality attributes. While some research has been conducted on visualizing uncertainty in spatiotemporal data and univariate data, little work has been reported on extending this capability into multivariate data visualization. In this paper we describe our approach to the problem of visually exploring multivariate data with variable quality. As a foundation, we propose a general approach to defining quality measures for tabular data, in which data may experience quality problems at three granularities: individual data values, complete records, and specific dimensions. We then present two approaches to visual mapping of quality information into display space. In particular, one solution embeds the quality measures as explicit values into the original dataset by regarding value quality and record quality as new data dimensions. The other solution is to superimpose the quality information within the data visualizations using additional visual variables. We also report on user studies conducted to assess alternate mappings of quality attributes to visual variables for the second method. In addition, we describe case studies that expose some of the advantages and disadvantages of these two approaches.
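The first mapping approach above, embedding quality measures as explicit data dimensions, can be sketched on plain dicts (the field names and quality scores are hypothetical, not from the paper):

```python
def embed_quality(records, value_quality, record_quality):
    """Embed quality as new dimensions: one paired quality column per value,
    plus a record-level quality field (the value and record granularities)."""
    out = []
    for rec, vq, rq in zip(records, value_quality, record_quality):
        row = {}
        for (key, val), q in zip(rec.items(), vq):
            row[key] = val
            row[key + "_quality"] = q      # value-level quality dimension
        row["record_quality"] = rq         # record-level quality dimension
        out.append(row)
    return out

rows = embed_quality([{"temp": 21.5, "rh": 40}], [[0.9, 0.4]], [0.65])
# rows[0] gains temp_quality, rh_quality, and record_quality fields
```

Because quality becomes ordinary data, any multivariate display (e.g. parallel coordinates) can then show it without modification, which is the appeal of this approach.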
Analysis of image noise in multispectral color acquisition
, 1997
"... The design of a system for multispectral image capture will be influenced by the imaging application, such as image archiving, vision research, illuminant modification or improved (trichromatic) color reproduction. A key aspect of the system performance is the effect of noise, or error, when acquiri ..."
Abstract

Cited by 15 (1 self)
 Add to MetaCart
The design of a system for multispectral image capture will be influenced by the imaging application, such as image archiving, vision research, illuminant modification, or improved (trichromatic) color reproduction. A key aspect of the system performance is the effect of noise, or error, when acquiring multiple color image records and processing the data. This research provides an analysis that allows the prediction of the image-noise characteristics of systems for the capture of multispectral images. The effects of both detector noise and image-processing quantization on the color information are considered, as is the correlation between the errors in the component signals. The above multivariate error-propagation analysis is then applied to an actual prototype system. Sources of image noise in both digital camera and image processing are related to colorimetric errors. Recommendations for detector characteristics and image processing for future systems are then discussed. Indexing terms: color image capture, color image processing, image noise, error propagation, multispectral imaging.
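The multivariate error propagation underlying such an analysis, first-order propagation of a noise covariance through a linear channel transform, can be sketched as cov_out = A cov_in Aᵀ (the transform and noise levels below are hypothetical):

```python
import numpy as np

def propagate_noise(A, cov_in):
    """First-order propagation of detector-noise covariance through a
    linear transform c = A s: cov_out = A cov_in A^T."""
    return A @ cov_in @ A.T

A = np.array([[0.5, 0.5],
              [1.0, -1.0]])          # hypothetical 2-channel transform
cov_in = np.diag([1.0, 4.0])         # independent per-channel detector noise
cov_out = propagate_noise(A, cov_in)
# The nonzero off-diagonal terms of cov_out are the correlated errors
# between component signals that the analysis accounts for.
```

Even though the input channels are independent, the transform mixes them, so the output errors are correlated; ignoring that correlation would misstate the colorimetric error.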
Glyphs for Visualizing Uncertainty in Environmental Vector Fields
 IEEE Transactions on Visualization and Computer Graphics
, 1995
"... Environmental data have inherent uncertainty which is often ignored in visualization. For example, meteorological stations measure wind with good accuracy, but winds are often averaged over minutes or hours. As another example, doppler radars (wind profilers and ocean current radars) take thousands ..."
Abstract

Cited by 15 (3 self)
 Add to MetaCart
Environmental data have inherent uncertainty which is often ignored in visualization. For example, meteorological stations measure wind with good accuracy, but winds are often averaged over minutes or hours. As another example, Doppler radars (wind profilers and ocean current radars) take thousands of samples and average the possibly spurious returns. Other sources, including time series data, carry a wealth of uncertainty information that traditional vector visualization methods, such as wind barbs and arrow glyphs, simply ignore. We have developed new vector glyphs to visualize uncertain winds and ocean currents. Our approach is to include uncertainty in direction and magnitude, as well as the mean direction and length, in vector glyph plots. Our glyphs show the variation in uncertainty, and provide fair comparisons of data from instruments, models, and time averages of varying certainty. We use both qualitative and quantitative methods to compare our glyphs to traditional ones.
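The quantities such a glyph would encode, mean direction and length plus directional spread, can be sketched with circular statistics on sampled wind vectors (a simplified stand-in for the paper's glyph construction; the samples are hypothetical):

```python
import numpy as np

def glyph_parameters(speeds, directions):
    """Mean vector and angular spread of wind samples (directions in radians),
    the quantities an uncertainty glyph would encode."""
    u = speeds * np.cos(directions)
    v = speeds * np.sin(directions)
    mean_dir = np.arctan2(v.mean(), u.mean())
    mean_len = np.hypot(u.mean(), v.mean())
    # Circular standard deviation from the unit-vector resultant length R:
    R = np.hypot(np.cos(directions).mean(), np.sin(directions).mean())
    ang_std = np.sqrt(-2.0 * np.log(max(R, 1e-12)))
    return mean_dir, mean_len, ang_std

# Samples scattered around east (0 rad): mean near east, nonzero spread.
d, l, s = glyph_parameters(np.array([2.0, 2.0]), np.array([-0.2, 0.2]))
```

A glyph could then draw an arrow of length `l` at angle `d` with a wedge of half-angle proportional to `s`, making low-certainty instruments visually distinct from high-certainty ones.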
The Difficulty of Measuring Low Friction: Uncertainty Analysis for Friction Coefficient Measurements
 Tribology Transactions
, 2005
"... ..."