Results 1–10 of 43
Epipolar-plane image analysis: An approach to determining structure from motion
International Journal of Computer Vision, 1987
Cited by 208 (3 self)
Abstract
We present a technique for building a three-dimensional description of a static scene from a dense sequence of images. These images are taken in such rapid succession that they form a solid block of data in which the temporal continuity from image to image is approximately equal to the spatial continuity in an individual image. The technique utilizes knowledge of the camera motion to form and analyze slices of this solid. These slices directly encode not only the three-dimensional positions of objects, but also such spatiotemporal events as the occlusion of one object by another. For straight-line camera motions, these slices have a simple linear structure that makes them easier to analyze. The analysis computes the three-dimensional positions of object features, marks occlusion boundaries on the objects, and builds a three-dimensional map of "free space." In our article, we first describe the application of this technique to a simple camera motion, and then show how projective duality is used to extend the analysis to a wider class of camera motions and object types that include curved and moving objects.
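The "simple linear structure" for straight-line camera motion can be sketched concretely: for a camera translating laterally at speed v with focal length f (in pixels), a static point at depth Z traces a line of slope f·v/Z in the epipolar-plane image, so depth falls out of a line fit. All numbers and the setup below are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Hypothetical setup: lateral camera translation at speed v, focal length f
# (pixels). A static point at depth Z projects to u(t) = u0 + f*v*t/Z, so in
# the epipolar-plane image (u vs. t) it traces a line of slope f*v/Z.

def depth_from_epi_slope(t, u, f, v):
    """Fit a line u = a*t + b to one EPI trajectory; recover depth Z = f*v/a."""
    a, b = np.polyfit(t, u, 1)          # slope a encodes depth
    return f * v / a

# Synthetic, noise-free example (all numbers illustrative):
f, v, Z_true = 500.0, 0.1, 2.0          # pixels, m/frame, m
t = np.arange(30, dtype=float)          # frame indices
u = 100.0 + f * v * t / Z_true          # the EPI line this point traces
Z_est = depth_from_epi_slope(t, u, f, v)
```

Occlusions appear in such slices as one line terminating against another, which is why the slices also mark occlusion boundaries.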
Computations with Imprecise Parameters in Engineering Design: Background and Theory
ASME Journal of Mechanisms, Transmissions, and Automation in Design, 1989
Cited by 51 (18 self)
Abstract
A technique to perform design calculations on imprecise representations of parameters has been developed and is presented. The level of imprecision in the description of design elements is typically high in the preliminary phase of engineering design. This imprecision is represented using the fuzzy calculus. Calculations can be performed using this method to produce (imprecise) performance parameters from imprecise (input) design parameters. The Fuzzy Weighted Average technique is used to perform these calculations. A new metric, called the γ-level measure, is introduced to determine the relative coupling between imprecise inputs and outputs. The background and theory supporting this approach are presented, along with one example.
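As a rough sketch of how a Fuzzy Weighted Average can be evaluated, the snippet below computes it by brute force over α-cut interval endpoints for triangular fuzzy numbers. The specific values, weights, and triangular shapes are assumptions for illustration; practical FWA algorithms (e.g., Dong and Wong's) avoid the exponential enumeration used here.

```python
from itertools import product

# Each fuzzy number is triangular (lo, peak, hi); values and weights below
# are illustrative, not taken from the paper.

def alpha_cut(tri, alpha):
    """Interval of a triangular fuzzy number at membership level alpha."""
    lo, m, hi = tri
    return (lo + alpha * (m - lo), hi - alpha * (hi - m))

def fwa(values, weights, alpha):
    """Brute-force alpha-cut FWA: scan all endpoint combinations of the
    value and weight intervals, keep the extreme weighted averages."""
    v_ints = [alpha_cut(v, alpha) for v in values]
    w_ints = [alpha_cut(w, alpha) for w in weights]
    results = []
    for vs in product(*v_ints):
        for ws in product(*w_ints):
            results.append(sum(w * v for w, v in zip(ws, vs)) / sum(ws))
    return min(results), max(results)

# Two imprecise design parameters with imprecise weights:
values = [(1.0, 2.0, 3.0), (4.0, 5.0, 6.0)]
weights = [(0.2, 0.3, 0.4), (0.6, 0.7, 0.8)]
lo1, hi1 = fwa(values, weights, alpha=1.0)  # intervals collapse to the peaks
lo0, hi0 = fwa(values, weights, alpha=0.0)  # widest (support) interval
```

Sweeping α from 0 to 1 and stacking the resulting intervals reconstructs the membership function of the imprecise performance parameter.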
Separating processes within a trial in event-related functional MRI: I. The method
NeuroImage 13, 2001
Cited by 42 (1 self)
Abstract
Many cognitive processes occur on time scales that can significantly affect the shape of the blood oxygenation level-dependent (BOLD) response in event-related functional MRI. This shape can be estimated from event-related designs, even if these processes occur in a fixed temporal sequence (J. M. Ollinger, G. L. Shulman, and M. Corbetta. 2001. NeuroImage 13: 210–217). Several important considerations come into play when interpreting these time courses. First, in single subjects, correlations among neighboring time points give the noise a smooth appearance that can be confused with changes in the BOLD response. Second, the variance and degree of correlation among estimated time courses are strongly influenced by the timing of the experimental design. Simulations show that optimal results are obtained if the intertrial intervals are as short as possible, if they follow an exponential distribution with at least three distinct values, and if 40% of the trials are partial trials. These results are not particularly sensitive to the fraction of partial trials, so accurate estimation of time courses can be obtained with lower percentages of partial trials (20–25%). Third, statistical maps can be formed from F statistics computed with the extra sum of squares principle or by t statistics computed from the cross-correlation of the time courses with a model for the hemodynamic response. The latter method relies on an accurate model for the hemodynamic response. The most robust model among those tested was a single gamma function. Finally, the power spectrum of the measured BOLD signal in rapid event-related paradigms is similar to that of the noise. Nevertheless, high-pass filtering is desirable if the appropriate model
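The design recommendations above (short intertrial intervals that approximately follow an exponential distribution over at least three distinct values, plus a fraction of partial trials) can be sketched as a small design generator. The ITI values, trial count, partial-trial fraction, and seed are illustrative assumptions, not parameters from the paper.

```python
import random

# Sketch of an event-related design generator: ITIs drawn from an
# exponential-like (geometrically weighted) distribution over three distinct
# short values, with a chosen fraction of trials marked "partial".

def make_design(n_trials, itis=(2.0, 4.0, 6.0), partial_frac=0.25, seed=0):
    rng = random.Random(seed)
    # exponential-like weights: shorter ITIs occur more often
    weights = [2.0 ** (-i) for i in range(len(itis))]
    trials = []
    for _ in range(n_trials):
        iti = rng.choices(itis, weights=weights)[0]
        partial = rng.random() < partial_frac
        trials.append((iti, partial))
    return trials

design = make_design(400)
n_partial = sum(1 for _, p in design if p)
distinct_itis = {iti for iti, _ in design}
```

The resulting event onsets would then enter a linear-model deconvolution; jittering the ITIs this way is what decorrelates the time-course estimates of adjacent trial components.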
Sensitivity analysis of discrete stochastic systems
Biophysical Journal, 2005
Cited by 9 (0 self)
Abstract
Sensitivity analysis quantifies the dependence of system behavior on the parameters that affect the process dynamics. Classical sensitivity analysis, however, does not directly apply to discrete stochastic dynamical systems, which have recently gained popularity because of their relevance in the simulation of biological processes. In this work, sensitivity analysis for discrete stochastic processes is developed based on density function (distribution) sensitivity, using an analog of the classical sensitivity and the Fisher Information Matrix. There exist many circumstances, such as in systems with multistability, in which the stochastic effects become nontrivial and classical sensitivity analysis on the deterministic representation of a system cannot adequately capture the true system behavior. The proposed analysis is applied to a bistable chemical system (the Schlögl model) and to a synthetic genetic toggle-switch model. Comparisons between the stochastic and deterministic analyses show the significance of explicitly accounting for the probabilistic nature of this class of processes in the sensitivity analysis.
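A minimal sketch of the setting: the bistable Schlögl model simulated with Gillespie's stochastic simulation algorithm, plus a crude finite-difference sensitivity of a summary statistic with respect to one rate constant. The rate constants, initial state, horizon, and sample sizes below are assumed, textbook-style choices for illustration; the paper's actual analysis works with distribution (density-function) sensitivities and the Fisher Information Matrix rather than a mean alone.

```python
import random

def schlogl_ssa(k, x0=250, t_end=1.0, seed=None):
    """One SSA trajectory of the Schlogl model; returns X(t_end).
    k = (k1, k2, k3, k4); buffer species are folded into k1 and k3."""
    rng = random.Random(seed)
    x, t = x0, 0.0
    while True:
        a = [k[0] * x * (x - 1) / 2.0,            # 2X -> 3X
             k[1] * x * (x - 1) * (x - 2) / 6.0,  # 3X -> 2X
             k[2],                                # 0  -> X
             k[3] * x]                            # X  -> 0
        a0 = sum(a)
        if a0 == 0.0:
            return x
        t += rng.expovariate(a0)                  # time to next reaction
        if t > t_end:
            return x
        r, acc = rng.random() * a0, 0.0
        for j, aj in enumerate(a):                # pick the firing reaction
            acc += aj
            if r < acc:
                break
        x += (1, -1, 1, -1)[j]

k = (0.03, 1e-4, 200.0, 3.5)                      # illustrative rates
samples = [schlogl_ssa(k, seed=s) for s in range(30)]
mean_x = sum(samples) / len(samples)

# Finite-difference sensitivity of E[X(t_end)] w.r.t. k4; reusing the same
# seeds partially cancels Monte Carlo noise in the difference.
dk = 0.35
k_up = (k[0], k[1], k[2], k[3] + dk)
mean_up = sum(schlogl_ssa(k_up, seed=s) for s in range(30)) / 30
sens_k4 = (mean_up - mean_x) / dk
```

In the bistable regime the terminal-state histogram is the more informative object, which is exactly why the paper moves from mean-based to distribution-based sensitivities.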
Source Term Estimation of Pollution from an Instantaneous Point Source
MODSIM, 2002
Cited by 7 (3 self)
Abstract
The goal is to develop an inverse model capable of simultaneously estimating the parameters appearing in an air pollution model for an instantaneous point source by using measured gas concentration data. The approach taken was to formulate the inverse model as a nonlinear least squares estimation problem in which the source term is estimated from measurements of pollution concentration on the ground. The statistical basis of the least squares inverse model allows quantification of the uncertainty of the parameter estimates, which in turn allows estimation of the uncertainty of the simulation model predictions.
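The core of the inversion can be sketched in its simplest special case: ground-level concentration from an instantaneous point release is linear in the released mass Q, so with known dispersion coefficients Q has a closed-form least squares estimate. The puff model, dispersion coefficients, sensor layout, and "true" release below are all illustrative assumptions, not the paper's model; jointly estimating source location or release time makes the problem genuinely nonlinear.

```python
import math

def unit_puff(x, y, t, u=2.0, sy=5.0, sz=3.0):
    """Illustrative ground-level concentration per unit released mass for a
    ground-level instantaneous puff advected downwind at speed u, with
    horizontal/vertical spreads growing like sqrt(t)."""
    sy_t, sz_t = sy * math.sqrt(t), sz * math.sqrt(t)
    norm = (2.0 * math.pi) ** 1.5 * sy_t * sy_t * sz_t
    return (2.0 / norm) * math.exp(-((x - u * t) ** 2 + y ** 2)
                                   / (2.0 * sy_t ** 2))

# Synthetic, noise-free observations from a "true" source of Q = 50:
obs = [(x, y, t, 50.0 * unit_puff(x, y, t))
       for x in (10.0, 20.0, 30.0) for y in (-5.0, 0.0, 5.0)
       for t in (5.0, 10.0)]

# Least squares for Q: minimize sum_i (c_i - Q * f_i)^2, f_i = unit model.
f = [unit_puff(x, y, t) for x, y, t, _ in obs]
c = [ci for _, _, _, ci in obs]
Q_hat = sum(ci * fi for ci, fi in zip(c, f)) / sum(fi * fi for fi in f)
```

With noisy data the residual variance divided by Σf² gives the variance of Q̂, which is the "statistical basis" the abstract refers to for quantifying parameter uncertainty.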
Stochastic Simulation of Hepatic Preneoplastic Foci Development for Four Chlorobenzene Congeners in a Medium-Term Bioassay
Cited by 2 (0 self)
Abstract
A combination of experimental and simulation approaches was used to analyze clonal growth of glutathione-S-transferase π (GSTP) enzyme-altered foci during liver carcinogenesis in an initiation-promotion regimen for 1,4-dichlorobenzene (DCB), 1,2,4,5-tetrachlorobenzene (TECB), pentachlorobenzene (PECB), and hexachlorobenzene (HCB). Male Fischer 344 rats, eight weeks of age, were initiated with a single dose (200 mg/kg, ip) of diethylnitrosamine (DEN). Two weeks later, daily dosing of 0.1 mol/kg chlorobenzene was maintained for six weeks. Partial hepatectomy was performed three weeks after initiation. Liver weight, normal hepatocyte division rates, and the number and volume of GSTP-positive foci were obtained at 23, 26, 28, 47, and 56 days after initiation. A clonal growth stochastic model separating the initiated cell population into two distinct subtypes (referred to as A and B cells) was successfully used to describe the foci development.
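The two-subtype clonal growth idea can be caricatured as a linear birth-death process per focus, with subtype-specific division and death rates. The rates, horizon, and clone counts below are invented for illustration; they are not the paper's fitted values, and the actual model also handles initiation and the hepatectomy-driven proliferation burst.

```python
import random

def grow_clone(birth, death, t_end, rng):
    """Size of one clone at t_end under a linear birth-death process,
    started from a single initiated cell (0 means the clone went extinct)."""
    n, t = 1, 0.0
    while n > 0:
        rate = n * (birth + death)          # total event rate for n cells
        t += rng.expovariate(rate)
        if t > t_end:
            break
        n += 1 if rng.random() < birth / (birth + death) else -1
    return n

rng = random.Random(1)
# Hypothetical subtype rates (division, death) per cell per day:
subtypes = {"A": (0.30, 0.25), "B": (0.10, 0.05)}
foci = {name: [grow_clone(b, d, t_end=30.0, rng=rng) for _ in range(200)]
        for name, (b, d) in subtypes.items()}
mean_A = sum(foci["A"]) / len(foci["A"])
```

Comparing the simulated clone-size distributions (and extinction fractions) against the observed number and volume of GSTP-positive foci at each time point is how such a model gets fitted.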
Robust Bias-Corrected Least Squares Fitting of Ellipses
2000
Cited by 1 (0 self)
Abstract
This paper presents a robust and accurate technique for estimation of the best-fit ellipse through a given set of points. The approach is based on a least squares minimization of the algebraic distances of the points, with a correction of the statistical bias introduced during the computation. An accurate ellipse-specific solution is guaranteed even for scattered or noisy data with outliers. Although the final algorithm is iterative, it typically converges in a fraction of the time needed for a true orthogonal fitting based on Euclidean distances of points. Keywords: ellipses, least squares, robust fitting, M-estimators, statistical bias, renormalization
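The starting point, an algebraic least squares fit with an ellipse-specific constraint, can be sketched with the well-known direct method of Fitzgibbon, Pilu, and Fisher. Note this is a related but distinct approach: it enforces the ellipse constraint 4AC − B² > 0 but includes neither this paper's bias correction nor its M-estimator robustification. The test points and noise level are illustrative.

```python
import numpy as np

def fit_ellipse(x, y):
    """Direct ellipse-specific algebraic least squares fit: minimize ||D a||
    subject to 4*A*C - B^2 = 1 on the conic a = (A, B, C, D, E, F),
    solved as a generalized eigenproblem S a = lambda C a."""
    D = np.column_stack([x * x, x * y, y * y, x, y, np.ones_like(x)])
    S = D.T @ D                      # scatter matrix
    C = np.zeros((6, 6))             # constraint matrix encoding 4AC - B^2
    C[0, 2] = C[2, 0] = 2.0
    C[1, 1] = -1.0
    vals, vecs = np.linalg.eig(np.linalg.solve(S, C))
    k = np.argmax(vals.real)         # the unique positive eigenvalue
    return vecs[:, k].real

# Points on (x/3)^2 + (y/2)^2 = 1 with slight noise (noise keeps the
# scatter matrix nonsingular; exact conic data would make S rank-deficient):
rng = np.random.default_rng(0)
t = np.linspace(0.0, 2.0 * np.pi, 60, endpoint=False)
x = 3.0 * np.cos(t) + rng.normal(0.0, 1e-3, t.size)
y = 2.0 * np.sin(t) + rng.normal(0.0, 1e-3, t.size)
a = fit_ellipse(x, y)
is_ellipse = a[1] ** 2 - 4.0 * a[0] * a[2] < 0.0
residual = np.column_stack([x * x, x * y, y * y, x, y, np.ones_like(x)]) @ a
```

Minimizing this algebraic distance is what introduces the statistical bias toward smaller, more eccentric ellipses that the paper's renormalization step then corrects.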
Study of Cluster Formation and its Effects on Rayleigh and Raman Scattering Measurements in a Mach 6 Wind Tunnel
AIAA Paper 91-1496, 22nd Fluid Dynamics, Plasma Dynamics, & Lasers Conference, 1991
Cited by 1 (1 self)
Abstract
Using a frequency-doubled Nd:YAG pulsed laser and a single intensified CCD camera, Rayleigh scattering measurements have been performed to study cluster formation in a Mach 6 wind tunnel at NASA Langley Research Center. These studies were conducted both in the free stream and in a model flow field for various flow conditions to gain an understanding of the dependence of the Rayleigh scattering (by clusters) on the local pressures and temperatures in the facility. Using the same laser system, we have also performed simultaneous measurements of the local temperature using the rotational Raman scattering of molecular nitrogen and determined the densities of molecular oxygen and nitrogen by using the vibrational Raman scattering from these species. Quantitative results will be presented in detail, with emphasis on the applicability of Rayleigh scattering for obtaining quantitative measurements of molecular densities both in the free stream and in the model flow field.
Multidimensional successive categories scaling: A maximum likelihood method
Psychometrika, 1981
Cited by 1 (0 self)
Abstract
A single-step maximum likelihood estimation procedure is developed for multidimensional scaling of dissimilarity data measured on rating scales. The procedure can fit the Euclidean distance model to the data under various assumptions about category widths and under two distributional assumptions. The scoring algorithm for parameter estimation has been developed and implemented in the form of a computer program. Practical uses of the method are demonstrated, with an emphasis on the various advantages of the method as a statistical procedure. Key words: similarity, successive categories, ratings, maximum likelihood, multidimensional scaling (MDS).
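The paper's single-step maximum likelihood procedure is specific to successive-categories data; as a generic illustration of fitting the Euclidean distance model, the sketch below runs classical (Torgerson) metric MDS instead, which recovers a configuration exactly when the dissimilarities are true Euclidean distances. The four test points are invented.

```python
import numpy as np

def classical_mds(D, dim=2):
    """Embed points in `dim` dimensions so that Euclidean distances
    approximate the dissimilarity matrix D (double-centering + eigen)."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n      # centering matrix
    B = -0.5 * J @ (D ** 2) @ J              # double-centered Gram matrix
    vals, vecs = np.linalg.eigh(B)
    idx = np.argsort(vals)[::-1][:dim]       # keep the largest eigenvalues
    return vecs[:, idx] * np.sqrt(np.maximum(vals[idx], 0.0))

# Distances generated from known 2-D points are recovered up to
# rotation/reflection:
pts = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 2.0], [3.0, 1.0]])
D = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
X = classical_mds(D, dim=2)
D_rec = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
```

What the paper adds beyond this closed-form recovery is a likelihood over the rated categories themselves, so that category widths and the error distribution are estimated jointly with the configuration.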