Results 1–10 of 15
A nonparametric approach to pricing and hedging derivative securities via learning networks
 Journal of Finance
, 1994
"... http://www.jstor.org/about/terms.html. JSTOR's Terms and Conditions of Use provides, in part, that unless you have obtained prior permission, you may not download an entire issue of a journal or multiple copies of articles, and you may use content in the JSTOR archive only for your personal, noncom ..."
Abstract

Cited by 104 (4 self)
 Add to MetaCart
Learning Nonlinear Dynamical Systems using an EM Algorithm
 Advances in Neural Information Processing Systems 11
, 1999
"... The Expectation Maximization (EM) algorithm is an iterative procedure for maximum likelihood parameter estimation from data sets with missing or hidden variables[2]. It has been applied to system identification in linear stochastic statespace models, where the state variables are hidden from the ob ..."
Abstract

Cited by 72 (2 self)
 Add to MetaCart
The Expectation Maximization (EM) algorithm is an iterative procedure for maximum likelihood parameter estimation from data sets with missing or hidden variables [2]. It has been applied to system identification in linear stochastic state-space models, where the state variables are hidden from the observer and both the state and the parameters of the model have to be estimated simultaneously [9]. We present a generalization of the EM algorithm for parameter estimation in nonlinear dynamical systems. The "expectation" step makes use of Extended Kalman Smoothing to estimate the state, while the "maximization" step re-estimates the parameters using these uncertain state estimates. In general, the nonlinear maximization step is difficult because it requires integrating out the uncertainty in the states. However, if Gaussian radial basis function (RBF) approximators are used to model the nonlinearities, the integrals become tractable and the maximization step can be solved via systems of linear equations.
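The tractable maximization step can be illustrated with a minimal sketch (not from the paper): under the simplifying assumption that the smoothed state estimates are treated as exact, fitting an RBF model of the dynamics reduces to an ordinary linear least-squares problem. The toy system, centers, and widths below are all illustrative choices.

```python
import numpy as np

def rbf_features(x, centers, width):
    """Gaussian RBF features phi_j(x) = exp(-(x - c_j)^2 / (2 w^2))."""
    d2 = (x[:, None] - centers[None, :]) ** 2
    return np.exp(-d2 / (2.0 * width ** 2))

# Toy 1-D dynamics x_{t+1} = f(x_t) + noise, with f nonlinear.
rng = np.random.default_rng(0)
T = 500
x = np.empty(T)
x[0] = 0.1
for t in range(T - 1):
    x[t + 1] = np.sin(2.0 * x[t]) + 0.05 * rng.standard_normal()

# "M-step" sketch: with the state estimates treated as exact, fitting
# f(x) = Phi(x) @ w is the linear subproblem the abstract describes.
centers = np.linspace(-1.5, 1.5, 15)
Phi = rbf_features(x[:-1], centers, width=0.3)
w, *_ = np.linalg.lstsq(Phi, x[1:], rcond=None)

pred = Phi @ w
rms = np.sqrt(np.mean((pred - x[1:]) ** 2))
print("RMS one-step prediction error:", rms)
```

In the full algorithm the expectations of the RBF features under the state posterior replace these point evaluations; with Gaussian RBFs those expectations have closed form, which is what keeps the M-step linear.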
Local Dimensionality Reduction for Locally Weighted Learning
, 1997
"... Incremental learning of sensorimotor transformations in high dimensional spaces is one of the basic prerequisites for the success of autonomous robot devices as well as biological movement systems. So far, due to sparsity of data in high dimensional spaces, learning in such settings requires a signi ..."
Abstract

Cited by 13 (6 self)
 Add to MetaCart
Incremental learning of sensorimotor transformations in high dimensional spaces is one of the basic prerequisites for the success of autonomous robot devices as well as biological movement systems. So far, due to sparsity of data in high dimensional spaces, learning in such settings requires a significant amount of prior knowledge about the learning task, usually provided by a human expert. In this paper we suggest a partial revision of this view. Based on empirical studies, it can be observed that, despite being globally high dimensional and sparse, data distributions from physical movement systems are locally low dimensional and dense. Under this assumption, we derive a learning algorithm, Locally Adaptive Subspace Regression, that exploits this property by combining a local dimensionality reduction as a preprocessing step with a nonparametric learning technique, locally weighted regression. The usefulness of the algorithm and the validity of its assumptions are illustrated for a synthetic data set and data of the inverse dynamics of an actual 7 degree-of-freedom anthropomorphic robot arm.
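The combination described above can be sketched in a few lines (an illustrative stand-in, not the paper's algorithm): weight the data by a Gaussian kernel around the query, find the top local principal directions, and regress in that local subspace. The manifold construction, bandwidth, and subspace dimension below are all assumptions for the example.

```python
import numpy as np

def lwr_local_pca(X, y, q, h, k):
    """Locally weighted regression with a local PCA pre-step.
    X: (n, d) inputs, y: (n,) targets, q: (d,) query point,
    h: Gaussian kernel bandwidth, k: local subspace dimension."""
    w = np.exp(-np.sum((X - q) ** 2, axis=1) / (2.0 * h ** 2))  # query weights
    mu = (w @ X) / w.sum()                    # weighted local mean
    Xc = X - mu
    C = (Xc * w[:, None]).T @ Xc / w.sum()    # weighted local covariance
    V = np.linalg.eigh(C)[1][:, -k:]          # top-k local principal directions
    Z = Xc @ V                                # project data into the subspace
    A = np.hstack([Z, np.ones((len(y), 1))])  # affine model in the subspace
    Aw = A * w[:, None]
    beta = np.linalg.solve(Aw.T @ A, Aw.T @ y)  # weighted least squares
    return np.append((q - mu) @ V, 1.0) @ beta

# Synthetic data: globally 5-D but locally 2-D (a noisy planar manifold),
# mimicking the "locally low dimensional and dense" situation.
rng = np.random.default_rng(1)
B = np.array([[1.0, 0.0, 1.0, 0.0, 1.0],
              [0.0, 1.0, 0.0, 1.0, 0.0]])
B /= np.linalg.norm(B, axis=1, keepdims=True)   # orthonormal rows
Zlat = rng.uniform(-1.0, 1.0, size=(400, 2))    # latent 2-D coordinates
X = Zlat @ B + 0.01 * rng.standard_normal((400, 5))
y = np.tanh(Zlat[:, 0]) + 0.01 * rng.standard_normal(400)

q = np.array([0.3, -0.2]) @ B                   # query on the manifold
pred = lwr_local_pca(X, y, q, h=0.3, k=2)
print("prediction:", pred, "truth:", np.tanh(0.3))
```

Even though the ambient space is 5-D, the local fit only ever solves a small (k+1)-dimensional system, which is the point of the dimensionality-reduction pre-step.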
Adaptive Signal Processing by Particle Filters and Discounting of Old Measurements
"... In adaptive signal processing the principle of exponentially weighted recursive leastsquares plays a major role in developing various estimation algorithms. It is based on the concept of discounting of old measurements and allows for better performance in problems with timevarying signals and sign ..."
Abstract

Cited by 5 (1 self)
 Add to MetaCart
In adaptive signal processing the principle of exponentially weighted recursive least-squares plays a major role in developing various estimation algorithms. It is based on the concept of discounting of old measurements and allows for better performance in problems with time-varying signals and signals in nonstationary noise. In this paper we show how this concept can be combined with the Bayesian methodology. We propose that the discounting of old measurements within the Bayesian framework be implemented by employing particle filters. The main idea is presented by way of a simple example. The methodology is very attractive and can be used in a very wide range of scenarios, including ones that involve highly nonlinear models and non-Gaussian noise.
Convergence to Global Minima for a Class of Diffusion Processes
 Physica A
, 2000
"... We prove that there exists a gain function (#(t);#(t)) t0 such that the solution of the SDE dx t =#(t)(grad U (x t )dt+#(t)dB t ) `settles' down on the set of global minima of U . In particular, the existence of a gain function (#(t)) t0 so that y t satisfying dy =#(t)(grad U (y t )dt+dB t ) conve ..."
Abstract

Cited by 5 (1 self)
 Add to MetaCart
We prove that there exists a gain function (a(t), b(t))_{t >= 0} such that the solution of the SDE dx_t = a(t)(-grad U(x_t) dt + b(t) dB_t) 'settles' down on the set of global minima of U. In particular, the existence of a gain function (a(t))_{t >= 0} so that y_t satisfying dy_t = a(t)(-grad U(y_t) dt + dB_t) converges to the set of the global minima of U is verified. Then we apply the results to the Robbins-Monro and the Kiefer-Wolfowitz procedures, which are of particular interest in statistics.
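The kind of diffusion studied here can be simulated directly with an Euler-Maruyama scheme. The sketch below uses an illustrative double-well potential and a geometrically decaying noise gain (not the paper's schedule, which comes with the convergence guarantee); it only shows the annealed dynamics settling into a well.

```python
import numpy as np

def grad_U(x):
    # Double-well potential U(x) = x**4/4 - x**2/2 + 0.1*x; the left
    # well (x ~ -1.05) is the global minimum, the right (x ~ 0.95) is local.
    return x ** 3 - x + 0.1

rng = np.random.default_rng(3)
x = 1.0                 # start in the shallower (non-global) well
dt = 1e-3
b0, decay = 1.2, 0.99995
for t in range(200_000):
    b = b0 * decay ** t                  # decaying noise gain b(t)
    # Euler-Maruyama step of dx_t = a(t)(-grad U(x_t) dt + b(t) dB_t),
    # with the drift gain a(t) held at 1 for simplicity.
    x += -grad_U(x) * dt + b * np.sqrt(dt) * rng.standard_normal()
print("settled near:", x)
```

Early on, the large noise gain lets the process hop between wells; as b(t) decays the dynamics reduce to gradient descent and the trajectory freezes at a minimum. The theorem's content is precisely which gain schedules make the freezing point the global one.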
Tracking Time-Varying Coefficient-Functions
"... A conditional parametric ARXmodel is an ARXmodel in which the parameters are replaced by smooth functions of an, possibly multivariate, external input signal. These functions are called coefficientfunctions. A method, which estimates these functions adaptively and recursively, and hence allows for ..."
Abstract

Cited by 4 (2 self)
 Add to MetaCart
A conditional parametric ARX-model is an ARX-model in which the parameters are replaced by smooth functions of a, possibly multivariate, external input signal. These functions are called coefficient-functions. A method, which estimates these functions adaptively and recursively, and hence allows for online tracking of the coefficient-functions, is suggested. Essentially, in its most simple form, this method is a combination of recursive least squares with exponential forgetting and local polynomial regression. However, it is argued that it is appropriate to let the forgetting factor vary with the value of the external signal which is the argument of the coefficient-functions. The properties of the modified method are studied by simulation. A particular feature is that the effective forgetting factor will adapt to the bandwidth used, so that the effective number of observations behind the estimates will be almost independent of the actual bandwidth or of the type of bandwidth selection used ...
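The recursive least squares with exponential forgetting component can be sketched in scalar form (the local-regression step and the varying forgetting factor of the paper are omitted; all signal parameters below are illustrative).

```python
import numpy as np

rng = np.random.default_rng(4)
T = 400
lam = 0.97                        # forgetting factor (effective memory ~ 1/(1-lam))
# Time-varying coefficient: the slope drifts from 1.0 to 2.0.
b_true = np.linspace(1.0, 2.0, T)
u = rng.standard_normal(T)        # input signal
y = b_true * u + 0.1 * rng.standard_normal(T)

theta, P = 0.0, 100.0             # scalar RLS state: estimate and "covariance"
hist = np.empty(T)
for t in range(T):
    k = P * u[t] / (lam + u[t] * P * u[t])       # gain
    theta = theta + k * (y[t] - u[t] * theta)    # update estimate
    P = (P - k * u[t] * P) / lam                 # dividing by lam discounts old data
    hist[t] = theta
print("final estimate:", hist[-1], "true:", b_true[-1])
```

In the paper's setting this update is additionally weighted by a kernel in the external signal, so that each local model only forgets observations relevant to its region, which is what motivates tying the forgetting factor to the bandwidth.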
Large Error Recovery for a Class of Frequency Tracking Algorithms
, 1997
"... The performance of a recently proposed high order adaptive notch filter (HANF) for frequency estimation and tracking is studied. An analysis technique utilizing approximations with linear filters is employed to derive closed form performance expressions for a noisy sinusoidal input signal. Important ..."
Abstract

Cited by 3 (2 self)
 Add to MetaCart
The performance of a recently proposed high order adaptive notch filter (HANF) for frequency estimation and tracking is studied. An analysis technique utilizing approximations with linear filters is employed to derive closed-form performance expressions for a noisy sinusoidal input signal. Important performance measures, such as stability, noise rejection, statistical efficiency, and tracking ability, are studied in detail, and rules for the design variables are given. A study is presented where the performance of HANF is compared with the performance of a minimal order adaptive notch filter (ANF), as well as with a frequency tracker based on least squares modeling, the multiple frequency tracker (MFT). The study reveals that HANF is a competitive alternative to ANF, but also that in general the MFT is the method of choice.
Fast learning of biomimetic oculomotor control with nonparametric regression networks
 In IEEE International Conference on Robotics and Automation (ICRA '00)
, 2000
"... Accurate oculomotor control is one of the essential prerequisites of successful visuomotor coordination. Given the variable nonlinearities of the geometry of binocular vision as well as the possible nonlinearities of the oculomotor plant, it is desirable to accomplish accurate oculomotor control th ..."
Abstract

Cited by 2 (1 self)
 Add to MetaCart
Accurate oculomotor control is one of the essential prerequisites of successful visuomotor coordination. Given the variable nonlinearities of the geometry of binocular vision as well as the possible nonlinearities of the oculomotor plant, it is desirable to accomplish accurate oculomotor control through learning approaches. In this paper, we investigate learning control for a biomimetic active vision system mounted on a humanoid robot. By combining a biologically inspired cerebellar learning scheme with a state-of-the-art statistical learning network, our robot system is able to acquire high-performance visual stabilization reflexes after about 40 seconds of learning, despite significant nonlinearities and processing delays in the system.