Results 1 – 10 of 15
Lie-group methods
 ACTA NUMERICA
, 2000
"... Many differential equations of practical interest evolve on Lie groups or on manifolds acted upon by Lie groups. The retention of Liegroup structure under discretization is often vital in the recovery of qualitatively correct geometry and dynamics and in the minimization of numerical error. Having ..."
Abstract

Cited by 93 (18 self)
Many differential equations of practical interest evolve on Lie groups or on manifolds acted upon by Lie groups. The retention of Lie-group structure under discretization is often vital in the recovery of qualitatively correct geometry and dynamics and in the minimization of numerical error. Having introduced requisite elements of differential geometry, this paper surveys the novel theory of numerical integrators that respect Lie-group structure, highlighting theory, algorithmic issues and a number of applications.
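As a minimal sketch of the structure-preservation point (our own illustration with hypothetical names, not code from the paper): for Y' = A(t)Y with A(t) skew-symmetric, the Lie-group update Y <- exp(h A(t_n)) Y is an exact rotation and so stays on the orthogonal group, whereas classical forward Euler drifts off it.

```python
import math

def lie_euler_so2(a, y0, h, n):
    """Lie-Euler for Y' = A(t)Y with A(t) = [[0, -a(t)], [a(t), 0]]:
    each update Y <- exp(h*A(t_n))*Y is an exact rotation by h*a(t_n),
    so the invariant |Y| = 1 is preserved to machine precision."""
    y, t = list(y0), 0.0
    for _ in range(n):
        th = h * a(t)
        c, s = math.cos(th), math.sin(th)
        y = [c * y[0] - s * y[1], s * y[0] + c * y[1]]
        t += h
    return y

def forward_euler_so2(a, y0, h, n):
    """Classical forward Euler for the same system; |Y| is not preserved."""
    y, t = list(y0), 0.0
    for _ in range(n):
        y = [y[0] - h * a(t) * y[1], y[1] + h * a(t) * y[0]]
        t += h
    return y

rate = lambda t: 1.0 + t          # hypothetical time-dependent rotation rate
y_lie = lie_euler_so2(rate, [1.0, 0.0], 0.01, 1000)
y_cls = forward_euler_so2(rate, [1.0, 0.0], 0.01, 1000)
```

Here |y_lie| stays on the unit circle to roundoff while |y_cls| drifts visibly, which is the qualitative point the survey makes.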
On the Global Error of Discretization Methods for Highly Oscillatory Ordinary Differential Equations
, 2000
"... Commencing from a globalerror formula, originally due to Henrici, we investigate the accumulation of global error in the numerical solution of linear highlyoscillating systems of the form y 00 + g(t)y = 0, where g(t) t!1 \Gamma! 1. Using WKB analysis we derive an explicit form of the globalerror ..."
Abstract

Cited by 20 (5 self)
Commencing from a global-error formula, originally due to Henrici, we investigate the accumulation of global error in the numerical solution of linear highly oscillatory systems of the form y'' + g(t)y = 0, where g(t) → ∞ as t → ∞. Using WKB analysis we derive an explicit form of the global-error envelope for Runge–Kutta and Magnus methods. Our results are closely matched by numerical experiments. Motivated by the superior performance of Lie-group methods, we present a modification of the Magnus expansion which displays even better long-term behaviour in the presence of oscillations.
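To make the class of Magnus methods being analysed concrete (a sketch under our own naming, not the paper's code): writing y'' + g(t)y = 0 as Y' = A(t)Y with A(t) = [[0, 1], [-g(t), 0]], the second-order Magnus method is the exponential midpoint rule Y <- exp(h A(t + h/2)) Y, and for this A the step exponential has a closed form when g > 0.

```python
import math

def magnus2_step(g_mid, y, yp, h):
    """One step of the 2nd-order Magnus (exponential midpoint) method for
    y'' + g(t)y = 0, written as Y' = A(t)Y with A = [[0, 1], [-g, 0]].
    For g_mid > 0 and w = sqrt(g_mid) the step exponential is
    exp(h*A) = [[cos(w*h), sin(w*h)/w], [-w*sin(w*h), cos(w*h)]]."""
    w = math.sqrt(g_mid)
    c, s = math.cos(w * h), math.sin(w * h)
    return c * y + (s / w) * yp, -w * s * y + c * yp

def magnus2(g, y0, yp0, h, n):
    y, yp, t = y0, yp0, 0.0
    for _ in range(n):
        y, yp = magnus2_step(g(t + h / 2.0), y, yp, h)
        t += h
    return y, yp

# Sanity check: for constant g the midpoint exponential is the exact flow,
# so y(t) = cos(w*t) is reproduced up to roundoff.
y, yp = magnus2(lambda t: 4.0, 1.0, 0.0, 0.1, 100)   # w = 2, integrate to t = 10
```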
Improved high order integrators based on the Magnus expansion
 BIT
, 1999
"... We build high order efficient numerical integration methods for solving the linear differential equation X = A(t)X based on Magnus expansion. These methods ..."
Abstract

Cited by 17 (3 self)
We build high order efficient numerical integration methods for solving the linear differential equation X' = A(t)X based on the Magnus expansion. These methods ...
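A sketch of what an integrator of this family looks like (our own reconstruction of the classical fourth-order Magnus scheme, not the authors' improved methods): sample A at the two Gauss points of the step and exponentiate Ω = (h/2)(A₁ + A₂) + (√3/12) h² [A₂, A₁]. For traceless 2x2 matrices the exponential follows from Cayley–Hamilton.

```python
import math

def mmul(P, Q):
    return [[sum(P[i][k] * Q[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def exp2(M):
    """exp of a traceless 2x2 matrix via Cayley-Hamilton: M^2 = -det(M)*I,
    hence exp(M) = cos(s)*I + (sin(s)/s)*M with s = sqrt(det M)
    (hyperbolic functions when det M < 0, limit 1 when det M = 0)."""
    d = M[0][0] * M[1][1] - M[0][1] * M[1][0]
    if d > 0:
        s = math.sqrt(d); c, k = math.cos(s), math.sin(s) / s
    elif d < 0:
        s = math.sqrt(-d); c, k = math.cosh(s), math.sinh(s) / s
    else:
        c, k = 1.0, 1.0
    return [[c + k * M[0][0], k * M[0][1]],
            [k * M[1][0], c + k * M[1][1]]]

def magnus4_step(A, t, h, X):
    """One 4th-order Magnus step for X' = A(t)X with traceless 2x2 A(t):
    Omega = (h/2)(A1 + A2) + (sqrt(3)/12)*h^2*[A2, A1], with A sampled at
    the two Gauss points of [t, t + h]."""
    c = math.sqrt(3) / 6.0
    A1 = A(t + (0.5 - c) * h)
    A2 = A(t + (0.5 + c) * h)
    C21, C12 = mmul(A2, A1), mmul(A1, A2)
    omega = [[0.5 * h * (A1[i][j] + A2[i][j])
              + math.sqrt(3) / 12.0 * h * h * (C21[i][j] - C12[i][j])
              for j in range(2)] for i in range(2)]
    return mmul(exp2(omega), X)

# For constant A the commutator vanishes, Omega = h*A, and each step is the
# exact flow; with A = [[0, 1], [-4, 0]] (i.e. y'' + 4y = 0) the solution
# operator at time t is [[cos 2t, sin(2t)/2], [-2 sin 2t, cos 2t]].
A = lambda t: [[0.0, 1.0], [-4.0, 0.0]]
X = [[1.0, 0.0], [0.0, 1.0]]
t, h = 0.0, 0.1
for _ in range(100):
    X = magnus4_step(A, t, h, X)
    t += h
```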
Numerical Methods for Strong Solutions of Stochastic Differential Equations: an Overview
, 2003
"... This paper gives a review of recent progress in the design of numerical methods for computing the trajectories (sample paths) of solutions to stochastic differential equations (SDEs). We give a brief survey of the area focusing on a number of application areas where approximations to strong solution ..."
Abstract

Cited by 16 (1 self)
This paper gives a review of recent progress in the design of numerical methods for computing the trajectories (sample paths) of solutions to stochastic differential equations (SDEs). We give a brief survey of the area focusing on a number of application areas where approximations to strong solutions are important, with a particular focus on computational biology applications (section 1), and give the necessary analytical tools for understanding some of the important concepts associated with stochastic processes (section 2). In section 3 we present the stochastic Taylor series expansion as the fundamental mechanism for constructing effective numerical methods, give general results that relate local and global order of convergence and mention the Magnus expansion as a mechanism for designing methods which preserve the underlying structure of the problem. In sections 4 and 5 we present various classes of explicit and implicit methods for strong solutions, based on the underlying structure of the problem. Finally, in section 6 we discuss implementation issues relating to maintaining the Brownian path, efficient simulation of stochastic integrals and variable stepsize implementations based on various types of control.
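The simplest strong method of the kind surveyed is the Euler–Maruyama scheme; the sketch below uses our own notation (the section numbers in the abstract refer to the paper's internal layout): X <- X + f(X)h + g(X)ΔW with ΔW ~ N(0, h).

```python
import math
import random

def euler_maruyama(f, g, x0, h, n, seed=0):
    """Euler-Maruyama for the scalar SDE dX = f(X)dt + g(X)dW
    (strong order 1/2): X <- X + f(X)*h + g(X)*dW, dW ~ N(0, h)."""
    rng = random.Random(seed)
    x = x0
    for _ in range(n):
        dw = rng.gauss(0.0, math.sqrt(h))
        x = x + f(x) * h + g(x) * dw
    return x

# With g = 0 the scheme degenerates to deterministic forward Euler, e.g.
# dX = -X dt from X(0) = 1 gives X_n = (1 - h)^n, close to exp(-t).
x = euler_maruyama(lambda x: -x, lambda x: 0.0, 1.0, 0.01, 100)
```

Seeding a private `random.Random` instance keeps sample paths reproducible, which matters when the same Brownian path must be reused at several step sizes, as in the implementation issues the paper discusses.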
Quadrature Methods Based on the Cayley Transform
, 1999
"... Integration of Lie type equations on matrix Lie groups using Magnus series methods has recently been proposed by Iserles and Nørsett. The methods use the exponential mapping, whose computation maybevery costly. In this paper a smaller class of Lie groups is considered, for which the exponential mapp ..."
Abstract

Cited by 8 (4 self)
Integration of Lie-type equations on matrix Lie groups using Magnus series methods has recently been proposed by Iserles and Nørsett. The methods use the exponential mapping, whose computation may be very costly. In this paper a smaller class of Lie groups is considered, for which the exponential mapping can be replaced by e.g. the Cayley transform or the diagonal Padé approximants. Particular methods are derived, and numerical experiments that illustrate and verify properties of the new methods are included.
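The appeal of the Cayley alternative fits in a few lines (a sketch under our own naming): for quadratic Lie groups such as the orthogonal group, cay(Ω) = (I − Ω/2)⁻¹(I + Ω/2) maps a skew-symmetric Ω to an exactly orthogonal matrix using only a linear solve, with no exponential.

```python
import math

def cayley_so2(w):
    """Cayley transform cay(O) = (I - O/2)^(-1) (I + O/2) of the
    skew-symmetric 2x2 matrix O = [[0, -w], [w, 0]], worked out in closed
    form; the result is exactly orthogonal (a rotation by 2*atan(w/2))."""
    d = 1.0 + w * w / 4.0
    a = (1.0 - w * w / 4.0) / d
    b = -w / d
    return [[a, b], [-b, a]]

R = cayley_so2(0.7)
```

One can check that the rows are orthonormal by construction, and that the rotation angle is 2·atan(w/2) rather than w, i.e. the Cayley map is a rational approximant of the exponential that still lands exactly on the group.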
High order optimized geometric integrators for linear differential equations
, 2000
"... In this paper new integration algorithms for linear differential equations up to eighth order are obtained. Starting from Magnus expansion, methods based on Cayley transformation and Fer expansion are also built. The structure of the exact solution is retained while the computational cost is reduced ..."
Abstract

Cited by 7 (1 self)
In this paper new integration algorithms for linear differential equations up to eighth order are obtained. Starting from the Magnus expansion, methods based on the Cayley transformation and the Fer expansion are also built. The structure of the exact solution is retained while the computational cost is reduced compared to similar methods. Their relative performance is tested on some illustrative examples.
Interpolation in Lie groups and homogeneous spaces
 SIAM J. NUMER. ANAL
, 1998
"... We consider interpolation in Lie groups and homogeneous spaces. Based on points on the manifold together with tangent vectors at (some of) these points, we construct Hermite interpolation polynomials. If the points and tangent vectors are produced in the process of integrating an ordinary differenti ..."
Abstract

Cited by 6 (3 self)
We consider interpolation in Lie groups and homogeneous spaces. Based on points on the manifold together with tangent vectors at (some of) these points, we construct Hermite interpolation polynomials. If the points and tangent vectors are produced in the process of integrating an ordinary differential equation on a Lie group or a homogeneous space, we use the truncated inverse of the differential of the exponential mapping and the truncated Baker–Campbell–Hausdorff formula to construct an interpolation polynomial relatively cheaply. Much effort has lately been put into research on geometric integration, i.e. the process of integrating a differential equation in such a way that the configuration space is respected by the numerical solution. Some of these methods may be viewed as generalizations of classical methods, and we investigate the construction of intrinsic dense output devices as generalizations of the continuous Runge–Kutta methods.
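To make the truncated BCH ingredient concrete, here is a hypothetical check of the lowest truncation, bch(X, Y) ≈ X + Y + ½[X, Y], comparing exp(X)exp(Y) against exp(bch(X, Y)) for small 2x2 matrices with a plain series exponential (our illustration, not the paper's construction).

```python
def mmul(P, Q):
    return [[sum(P[i][k] * Q[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def mexp(M, terms=25):
    """Plain Taylor-series matrix exponential; adequate for the small
    matrices used in this illustration."""
    R = [[1.0, 0.0], [0.0, 1.0]]
    T = [[1.0, 0.0], [0.0, 1.0]]
    for k in range(1, terms):
        T = [[sum(T[i][m] * M[m][j] for m in range(2)) / k
              for j in range(2)] for i in range(2)]
        R = [[R[i][j] + T[i][j] for j in range(2)] for i in range(2)]
    return R

def bch2(X, Y):
    """Baker-Campbell-Hausdorff truncated after the commutator term:
    bch2(X, Y) = X + Y + 0.5*[X, Y]; the error is O(||.||^3) for small X, Y."""
    C, D = mmul(X, Y), mmul(Y, X)
    return [[X[i][j] + Y[i][j] + 0.5 * (C[i][j] - D[i][j])
             for j in range(2)] for i in range(2)]

X = [[0.0, -0.05], [0.05, 0.0]]       # small skew-symmetric test matrix
Y = [[0.0, 0.05], [0.05, 0.0]]        # small symmetric test matrix
lhs = mmul(mexp(X), mexp(Y))          # exp(X) exp(Y)
rhs = mexp(bch2(X, Y))                # exp(X + Y + [X, Y]/2)
err = max(abs(lhs[i][j] - rhs[i][j]) for i in range(2) for j in range(2))
```

The residual `err` is third order in the size of X and Y, which is why a cheap truncated BCH suffices for dense output at the accuracy of the underlying integrator.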
Complexity theory for Lie-group solvers
, 1999
"... Commencing with a brief survey of Liegroup theory and differential equations evolving on Lie groups, we describe a number of numerical algorithms designed to respect Liegroup structure: RungeKuttaMuntheKaas schemes, Fer and Magnus expansions. This is followed by complexity analysis of Fer and ..."
Abstract

Cited by 5 (0 self)
Commencing with a brief survey of Lie-group theory and differential equations evolving on Lie groups, we describe a number of numerical algorithms designed to respect Lie-group structure: Runge–Kutta–Munthe-Kaas schemes, Fer and Magnus expansions. This is followed by complexity analysis of Fer and Magnus expansions, whose conclusion is that for order four, six and eight an appropriately discretized Magnus method is always cheaper than a Fer method of the same order. Each Lie-group method of the kind surveyed in this paper requires the computation of a matrix exponential. Classical methods, e.g. Krylov-subspace and rational approximants, may fail to map elements in a Lie algebra to a Lie group. Therefore we survey a number of approximants based on the splitting approach and demonstrate that their cost is comparable with (and often superior to) that of classical methods.
Interpolation in Lie Groups
 SIAM J. NUMER. ANAL.
, 1999
"... We consider interpolation in Lie groups. Based on points on the manifold together with tangent vectors at (some of) these points, we construct Hermite interpolation polynomials. If the points and tangent vectors are produced in the process of integrating an ordinary differential equation in terms ..."
Abstract

Cited by 5 (1 self)
We consider interpolation in Lie groups. Based on points on the manifold together with tangent vectors at (some of) these points, we construct Hermite interpolation polynomials. If the points and tangent vectors are produced in the process of integrating an ordinary differential equation in terms of Lie-algebra actions, we use the truncated inverse of the differential of the exponential mapping and the truncated Baker–Campbell–Hausdorff formula to construct an interpolation polynomial relatively cheaply. Much effort has lately been put into research on geometric integration, i.e., the process of integrating differential equations in such a way that the configuration space of the true solution is respected by the numerical solution. Some of these methods may be viewed as generalizations of classical methods, and we investigate the construction of intrinsic dense output devices as generalizations of the continuous Runge–Kutta methods.
Accurate and Efficient Simulation of Rigid Body Rotations
 J. of Comp. Phys
, 2000
"... This paper introduces efficient and accurate algorithms for simulating the rotation of a threedimensional rigid object and compares them to several prior methods. The paper considers algorithms which exactly preserve angular momentum and either closely preserve or exactly conserve energy. First, we ..."
Abstract

Cited by 4 (0 self)
This paper introduces efficient and accurate algorithms for simulating the rotation of a three-dimensional rigid object and compares them to several prior methods. The paper considers algorithms which exactly preserve angular momentum and either closely preserve or exactly conserve energy. First, we introduce a second-order accurate method that incorporates a third-order correction; then a third-order accurate method; and finally a fourth-order accurate method. These methods are single-step, and the update operation is only a single rotation. The algorithms are derived in a general Lie group setting. Second, we introduce a near-optimal energy-correction method which allows exact conservation of energy. This algorithm is faster and easier to implement than implicit methods for exact energy conservation. Our third-order method with energy conservation is experimentally seen to perform better than a fourth-order accurate method. These new methods are superior to naive Runge–Kutta or predictor–corrector methods, which are only second-order accurate for sphere-valued functions. They are also superior to the explicit methods of Simo and Wong. The second-order symplectic McLachlan–Reich methods are observed to be excellent at approximate energy conservation for extended periods of time, but are not as good at long-term accuracy as our best methods. Finally we present comparisons with fourth-order accurate symplectic methods, which have good accuracy but higher computational cost.
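The single-rotation update idea can be sketched as follows (our own first-order illustration, not the authors' higher-order algorithms): for the free rigid body, the body angular momentum obeys m' = m × ω with ω_i = m_i/I_i, and advancing m by the rotation exp(−h ω̂) conserves |m| exactly, whatever the step size.

```python
import math

def rotate(v, axis, angle):
    """Rodrigues rotation of 3-vector v about `axis` by `angle`."""
    n = math.sqrt(sum(a * a for a in axis))
    if n == 0.0:
        return list(v)
    k = [a / n for a in axis]
    c, s = math.cos(angle), math.sin(angle)
    kxv = [k[1] * v[2] - k[2] * v[1],
           k[2] * v[0] - k[0] * v[2],
           k[0] * v[1] - k[1] * v[0]]
    kdv = sum(k[i] * v[i] for i in range(3))
    return [v[i] * c + kxv[i] * s + k[i] * kdv * (1.0 - c) for i in range(3)]

def rigid_body_step(m, inertia, h):
    """One single-rotation step for Euler's equations m' = m x omega,
    omega_i = m_i / I_i: freezing omega over the step, the exact flow is
    the rotation exp(-h*omega^), so |m| (the angular momentum magnitude
    in the body frame) is conserved exactly; accuracy is first order."""
    omega = [m[i] / inertia[i] for i in range(3)]
    speed = math.sqrt(sum(w * w for w in omega))
    return rotate(m, omega, -h * speed)

m = [1.0, 2.0, 3.0]         # hypothetical initial body angular momentum
inertia = [1.0, 2.0, 3.0]   # hypothetical principal moments of inertia
for _ in range(1000):
    m = rigid_body_step(m, inertia, 0.01)
```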