Results 1–10 of 25
The Finite Volume, Finite Element, and Finite Difference Methods as Numerical Methods for Physical Field Problems
 Journal of Computational Physics
, 2000
Abstract

Cited by 47 (1 self)
The present work describes an alternative to the classical partial differential equations-based approach to the discretization of physical field problems. This alternative is based on a preliminary reformulation of the mathematical model in a partially discrete form, which preserves as much as possible the physical and geometrical content of the original problem, and is made possible by the existence and properties of a common mathematical structure of physical field theories. The goal is to maintain the focus, both in the modeling and in the discretization step, on the physics of the problem, thinking in terms of numerical methods for physical field problems, and not for a particular mathematical form (for example, a partial differential equation) into which the original physical problem happens to be translated.
Nonrigid Motion Analysis Based on Dynamic Refinement of Finite Element Models
 IEEE Trans. on Pattern Analysis and Machine Intelligence
, 1998
Abstract

Cited by 25 (7 self)
In this paper we propose new algorithms for accurate nonrigid motion tracking. Given only a set of sparse correspondences and incomplete or missing information about geometry or material properties, we recover dense motion vectors using nonlinear finite element models. The method is based on the iterative analysis of the differences between the actual and predicted behavior. Large differences indicate that an object's properties are not captured properly by the model. Feedback from the images during the motion allows the refinement of the model by minimizing the error between the expected and true position of the object's points. Unknown parameters are recovered using an iterative descent search for the best model that approximates nonrigid motion of the given object. Thus, during tracking the model is refined which, in turn, improves tracking quality. The method was applied successfully to man-made elastic materials and human skin to recover unknown elasticity, to complex 3D objects to find...
A Finite Element Method for Deformable Models
 Proceedings of the Fifth Alvey Vision Conference
, 1989
Abstract

Cited by 12 (2 self)
Deformable models of elastic structures have been proposed for use in image analysis. Previous work has used a variational approach, based on the Euler-Lagrange theory. In this paper an alternative mathematical treatment is introduced, based on a direct minimisation of the underlying energy integral using the Finite Element Method. The method is outlined and demonstrated, and its principal advantages for model-based image interpretation are explained.
A Vision-Based Technique for Objective Assessment of Burn Scars
 IEEE Trans. Med. Imag.
, 1998
Abstract

Cited by 10 (9 self)
In this paper a method for the objective assessment of burn scars is proposed. The quantitative measures developed in this research provide an objective way to calculate elastic properties of burn scars relative to the surrounding areas. The approach combines range data and the mechanics and motion dynamics of human tissues. Active contours are employed to locate regions of interest and to find displacements of feature points using automatically established correspondences. Changes in strain distribution over time are evaluated. Given images at two time instances and their corresponding features, the finite element method is used to synthesize strain distributions of the underlying tissues. This results in a physically based framework for motion and strain analysis. Relative elasticity of the burn scar is then recovered using an iterative descent search for the best nonlinear finite element model that approximates the stretching behavior of the region containing the burn scar. The results from the skin elasticity experiments illustrate the ability to objectively detect differences in elasticity between normal and abnormal tissue. These estimated differences in elasticity are correlated with the subjective judgments of physicians, which are presently the standard practice.
Fast Floating-Point Processing in Common Lisp
 ACM Trans. on Math. Software
, 1995
Abstract

Cited by 5 (1 self)
In this paper we explore an approach which enables all of the problems listed above to be solved at a single stroke: use Lisp as the source language for the numeric and graphical code! This is not a new idea; it was tried at MIT and UCB in the 1970s. While these experiments were modestly successful, the particular systems are obsolete. Fortunately, some of those ideas used in Maclisp [37], NIL [38] and Franz Lisp [20] were incorporated in the subsequent standardization of Common Lisp (CL) [35]. In this new setting it is appropriate to re-examine the theoretical and practical implications of writing numeric code in Lisp. The popular conceptions of Lisp's inefficiency for numerics have been based on rumor, supposition, and experience with early and (in fact) inefficient implementations. It is certainly possible to continue to write inefficient programs. As one example of the results of de-emphasizing numerics in the design, consider the situation of the basic arithmetic operators. The definitions of these functions require that they are generic (e.g., "+" must be able to add any combination of several precisions of floats, arbitrary-precision integers, rational numbers, and complexes). The very simple way of implementing this arithmetic, by subroutine calls, is also very inefficient. Even with appropriate declarations to enable more specific treatment of numeric types, compilers are free to ignore declarations, and such implementations naturally do not accommodate the needs of intensive number-crunching. (See the appendix for further discussion of declarations.) Be this as it may, the situation with respect to Lisp has changed for the better in recent years. With the advent of ANSI standard Common Lisp, several active vendors of implementations and one active universi...
Fusion of Physically-Based Registration and Deformation Modeling for Nonrigid Motion Analysis
, 2001
Abstract

Cited by 5 (0 self)
In our previous work we used finite element models to determine nonrigid motion parameters and recover unknown local properties of objects given correspondence data recovered with snakes or other tracking models. In this paper we present a novel multiscale approach to recovery of nonrigid motion from sequences of registered intensity and range images. The main idea of our approach is that a finite element (FEM) model incorporating material properties of the object can naturally handle both registration and deformation modeling using a single model-driving strategy. The method includes a multiscale iterative algorithm based on analysis of the undirected Hausdorff distance to recover correspondences. The method is evaluated with respect to speed and accuracy. Noise sensitivity issues are addressed. Advantages of the proposed approach are demonstrated using man-made elastic materials and human skin motion. Experiments with regular grid features are used for performance comparison with a conventional approach (separate snakes and FEM models). It is shown that the new method does not require a sampling/correspondence template and can adapt the model to available object features. Usefulness of the method is presented not only in the context of tracking and motion analysis, but also for a burn scar detection application.
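The undirected (symmetric) Hausdorff distance mentioned in the abstract above is a standard point-set measure; a minimal sketch over NumPy point arrays, as an illustration rather than the paper's implementation:

```python
import numpy as np

def directed_hausdorff(a, b):
    """Max over points of a of the distance to the nearest point of b."""
    # Pairwise Euclidean distances, shape (len(a), len(b)).
    d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=2)
    return d.min(axis=1).max()

def undirected_hausdorff(a, b):
    """Symmetric (undirected) Hausdorff distance between point sets."""
    return max(directed_hausdorff(a, b), directed_hausdorff(b, a))
```

Because the measure is a max over nearest-neighbor distances, it penalizes the single worst-matched point, which is what makes it useful for judging whether two feature sets are in correspondence.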
Automated Parameter Studies Using a Cartesian Method
 AIAA Paper
, 2004
Abstract

Cited by 5 (2 self)
A modular process for performing general parametric studies about an aerodynamic configuration using a Cartesian method is described. A novel part of this process is the automatic handling of general control-surface deflections based upon simple, user-specified inputs. The article focuses on the use of aerodynamic databases in the design process. Database fly-through is used to develop and analyze guidance and control systems, and to evaluate performance data. Validation comparisons with experimental data and Navier-Stokes simulations are presented for the Langley Glide-Back Booster vehicle. Two example parametric databases with control-surface deflections are presented: an autonomous Mars explorer aircraft which contains 4700 datapoints and two movable elevons, and the Space Shuttle launch vehicle in ascent configuration with gimbaling engine nozzles. The database for the Mars aircraft has been used to validate a generic neural-network control system, and trajectory simulations using the shuttle aerodynamic data are coupled with an optimization algorithm to develop a closed-loop feedback pitch controller.
Structural Design Optimization and Comparative Analysis of a New High-Performance Robot Arm via Finite Element Analysis
, 1997
Abstract

Cited by 4 (3 self)
This paper reports the structural design of a new high-performance robot arm. Design objectives for the new arm include large (12 meter) workspace, low weight, 5 kg payload capacity, high stiffness, high structural vibration frequencies, precise joint-level torque control, a total of three degrees-of-freedom, and mechanical simplicity. A comparative analysis is reported for four very different two degree-of-freedom linkage candidates using the finite element method. The authors gratefully acknowledge the support of the National Science Foundation under CAREER grant BES-9625143 awarded to the last author. Correspondence Address: 123 Latrobe Hall, 3400 North Charles Street, Baltimore, Maryland, 21218; email addresses: roy@jhu.edu, llw@jhu.edu. A preliminary version of this paper was presented at the 1997 IEEE International Conference on Robotics and Automation, Albuquerque, New Mexico.
1 Introduction
Our goal is to design and build a robot arm for high-performance tracking and force c...
Grid Filters For Local Nonlinear Image Restoration
, 1998
Abstract

Cited by 3 (0 self)
We describe a new approach to local nonlinear image restoration, based on approximating functions using a regular grid of points in a many-dimensional space. Symmetry reductions and compression of the sparse grid make it feasible to work with eight-dimensional grids as large as 14^8. Unlike polynomials and neural networks whose filtering complexity per pixel is linear in the number of filter coefficients, grid filters have O(1) complexity per pixel. Grid filters require only a single presentation of the training samples, are numerically stable, leave unusual image features unchanged, and are a superset of order statistic filters. Results are presented for blurring and additive noise.
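The core idea of a grid filter, a lookup table trained in a single pass whose per-pixel application cost is independent of the number of grid coefficients, can be sketched as follows. This is a toy 2-D grid of assumed size 8x8; the paper's eight-dimensional 14^8 grids, interpolation, symmetry reduction, and compression are all omitted:

```python
import numpy as np

GRID = 8  # assumed toy grid resolution per dimension

def train_grid(samples, targets):
    """One pass over (x, y) samples in [0,1)^2: accumulate targets
    into grid cells, then average each cell."""
    acc = np.zeros((GRID, GRID))
    cnt = np.zeros((GRID, GRID))
    for (x, y), t in zip(samples, targets):
        i = min(int(x * GRID), GRID - 1)
        j = min(int(y * GRID), GRID - 1)
        acc[i, j] += t
        cnt[i, j] += 1
    return np.where(cnt > 0, acc / np.maximum(cnt, 1), 0.0)

def apply_grid(grid, x, y):
    """O(1) per pixel: a single table lookup, regardless of grid size."""
    i = min(int(x * GRID), GRID - 1)
    j = min(int(y * GRID), GRID - 1)
    return grid[i, j]
```

The contrast with a polynomial or neural-network filter is that applying the trained filter costs one quantization and one array access per pixel, not a sum over all coefficients.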
Load Balancing for Problems with Good Bisectors, and Applications in Finite Element Simulations
 In Proceedings of the Fourth International Euro-Par Conference on Parallel Processing (Euro-Par'98), volume 1470 of LNCS
, 1998
Abstract

Cited by 2 (2 self)
This paper studies load balancing issues for classes of problems with certain bisection properties. A class of problems has $\alpha$-bisectors if every problem in the class can be subdivided into two subproblems whose weight is not smaller than an $\alpha$-fraction of the original problem. It is shown that the maximum weight of a subproblem produced by Algorithm HF, which partitions a given problem into N subproblems by always subdividing the problem with maximum weight, is at most a factor of $\lfloor 1/\alpha \rfloor \cdot (1 - \alpha)^{\lfloor 1/\alpha \rfloor - 2}$ greater than the theoretical optimum (uniform partition). This bound is proved to be tight. Two strategies to use Algorithm HF for load balancing distributed hierarchical finite element simulations are presented. For this purpose, a certain class of weighted binary trees representing the load of such applications is shown to have 1/4-bisectors. This establishes a performance guarantee of 9/4 for load balancing in this case.
1 Introduction
Load balancing is one of...
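Algorithm HF as described, always subdividing the subproblem of maximum weight, can be sketched with a max-heap. The `bisect` callback is a hypothetical stand-in for a problem-specific alpha-bisector, and only subproblem weights are modeled:

```python
import heapq

def algorithm_hf(weight, bisect, n):
    """Heaviest First: repeatedly bisect the maximum-weight subproblem
    until there are n subproblems; returns their weights, ascending."""
    heap = [-weight]              # heapq is a min-heap, so negate weights
    while len(heap) < n:
        w = -heapq.heappop(heap)  # subproblem of maximum weight
        w1, w2 = bisect(w)        # alpha-bisection: w1 + w2 == w
        heapq.heappush(heap, -w1)
        heapq.heappush(heap, -w2)
    return sorted(-w for w in heap)
```

With a perfect 1/2-bisector the partition is uniform; for a general alpha-bisector the abstract's bound of $\lfloor 1/\alpha \rfloor \cdot (1 - \alpha)^{\lfloor 1/\alpha \rfloor - 2}$ limits how far the heaviest subproblem can exceed the uniform share (for $\alpha = 1/4$: $4 \cdot (3/4)^2 = 9/4$, matching the guarantee above).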