Application of statistical mechanics methodology to term-structure bond-pricing models
Mathl. Comput. Modelling, 1991
Abstract

Cited by 32 (28 self)
Recent work in statistical mechanics has developed new analytical and numerical techniques to solve coupled stochastic equations. This paper applies the very fast simulated reannealing and path-integral methodologies to the estimation of the Brennan and Schwartz two-factor term-structure model. It is shown that these methodologies can be utilized to estimate more complicated n-factor nonlinear models.
1. CURRENT MODELS OF TERM STRUCTURE
The modern theory of the term structure of interest rates is based on equilibrium and arbitrage models in which bond prices are determined in terms of a few state variables. The one-factor models of Cox, Ingersoll and Ross (CIR) [1-4] and the two-factor models of Brennan and Schwartz (BS) [5-9] have been instrumental in the development of the valuation of interest-dependent securities. The assumptions of these models include:
• Bond prices are functions of a number of state variables, one to several, that follow Markov processes.
• Investors are rational and prefer more wealth to less wealth.
• Investors have homogeneous expectations.
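The very fast simulated reannealing (VFSR) scheme named in the abstract can be sketched in a few lines: its defining feature is a temperature schedule T_k = T0 · exp(-c · k^(1/D)) for a D-dimensional parameter vector, which cools much faster than classical Boltzmann annealing. The objective function, the constants, and the Gaussian proposals below are illustrative simplifications (VFSR uses a special generating distribution), not the Brennan-Schwartz estimation problem itself:

```python
import math
import random

def vfsr_minimize(objective, x0, t0=1.0, c=1.0, iters=2000, seed=0):
    """Minimal sketch of very fast simulated reannealing (VFSR):
    temperature decays as T_k = t0 * exp(-c * k**(1/D)) in D dimensions.
    Gaussian proposals stand in for VFSR's generating distribution."""
    rng = random.Random(seed)
    d = len(x0)
    best = list(x0)
    best_f = objective(best)
    cur, cur_f = list(best), best_f
    for k in range(1, iters + 1):
        t = t0 * math.exp(-c * k ** (1.0 / d))
        cand = [xi + rng.gauss(0.0, max(t, 1e-9)) for xi in cur]
        f = objective(cand)
        # Metropolis acceptance at the current temperature.
        if f < cur_f or rng.random() < math.exp(-(f - cur_f) / max(t, 1e-12)):
            cur, cur_f = cand, f
            if f < best_f:
                best, best_f = list(cand), f
    return best, best_f

# Toy two-parameter "model calibration": fit (a, b) of a quadratic bowl.
obj = lambda p: (p[0] - 2.0) ** 2 + (p[1] + 1.0) ** 2
params, loss = vfsr_minimize(obj, [0.0, 0.0])
```

The tracked best value is monotone non-increasing, so the returned loss never exceeds the objective at the starting point.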
Comparing between estimation approaches: Admissible and dominating linear estimators
 August 2005, EE Dept., Technion–Israel Institute of Technology
Abstract

Cited by 13 (11 self)
We treat the problem of evaluating the performance of linear estimators for estimating a deterministic parameter vector x in a linear regression model, with the mean-squared error (MSE) as the performance measure. Since the MSE depends on the unknown vector x, direct comparison between estimators is a difficult problem. Here we consider a framework for examining the MSE of different linear estimation approaches based on the concepts of admissible and dominating estimators. We develop a general procedure for determining whether or not a linear estimator is MSE admissible, and for constructing an estimator strictly dominating a given inadmissible method, so that its MSE is smaller for all x. In particular, we show that both problems can be addressed in a unified manner for arbitrary constraint sets on x by considering a certain convex optimization problem. We then demonstrate the details of our method for the case in which x is constrained to an ellipsoidal set, and for unrestricted choices of x. As a by-product of our results, we derive a closed-form solution for the minimax MSE estimator on an ellipsoid, which is valid for arbitrary model parameters, as long as the signal-to-noise ratio exceeds a certain threshold. Key Words—Linear estimation, regression, admissible estimators, dominating estimators, mean-squared error (MSE) estimation, minimax MSE estimation.
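The domination idea can be illustrated in one dimension, where the "ellipsoid" collapses to an interval x² ≤ L: for the scalar model y = x + n, the MSE of the linear estimator x̂ = g·y splits into bias² + variance, and the classical minimax shrinkage g = L/(L + σ²) dominates the unbiased choice g = 1 over the whole constraint set. The numbers below are illustrative; the paper's general matrix and convex-optimization machinery is not reproduced here:

```python
def mse(g, x, sigma2):
    """MSE of the linear estimator xhat = g*y for the scalar model
    y = x + n, n ~ N(0, sigma2): squared bias plus variance."""
    return (g - 1.0) ** 2 * x ** 2 + g ** 2 * sigma2

sigma2 = 1.0
L = 4.0                         # constraint set: x^2 <= L (1-D "ellipsoid")
g_minimax = L / (L + sigma2)    # minimax shrinkage factor on the interval

xs = [i / 10.0 for i in range(-20, 21)]  # grid over the constraint set
# The shrinkage estimator dominates the unbiased choice g = 1 on the set:
dominates = all(mse(g_minimax, x, sigma2) <= mse(1.0, x, sigma2) for x in xs)
```

At the set boundary x² = L, the shrinkage MSE is L/(L + σ²) = 0.8, still below the constant MSE σ² = 1 of the unbiased estimator.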
A Parallel Cutting-Plane Algorithm for the Vehicle Routing Problem With Time Windows
1999
Abstract

Cited by 11 (1 self)
In the vehicle routing problem with time windows, a number of identical vehicles must be routed to and from a depot to cover a given set of customers, each of whom has a specified time interval indicating when they are available for service. Each customer also has a known demand, and a vehicle may only serve the customers on a route if the total demand does not exceed the capacity of the vehicle. The most effective solution method proposed to date for this problem is due to Kohl, Desrosiers, Madsen, Solomon, and Soumis. Their algorithm uses a cutting-plane approach followed by a branch-and-bound search with column generation, where the columns of the LP relaxation represent routes of individual vehicles. We describe a new implementation of their method, using Karger's randomized minimum-cut algorithm to generate cutting planes. The standard benchmark in this area is a set of 87 problem instances generated in 1984 by M. Solomon; making use of parallel processing in both the cutting-pla...
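In the column-generation view, each column is one vehicle's route, so a core subroutine is checking a single route against the capacity and time-window constraints described above. A minimal sketch with a hypothetical data layout (the function name, field layout, and instance values are illustrative, not from the paper):

```python
def route_feasible(route, demand, window, travel, capacity):
    """Check one vehicle route (depot -> customers -> depot) against
    capacity and time windows. `route` lists customer ids; `demand`
    and `window` map id -> demand and (earliest, latest) service start;
    `travel` maps (a, b) -> travel time, with 0 denoting the depot.
    Hypothetical data layout for illustration."""
    if sum(demand[c] for c in route) > capacity:
        return False
    t, prev = 0.0, 0
    for c in route:
        t += travel[(prev, c)]
        early, late = window[c]
        t = max(t, early)      # wait if the vehicle arrives early
        if t > late:           # service must start within the window
            return False
        prev = c
    return True

demand = {1: 3, 2: 4}
window = {1: (0, 10), 2: (5, 12)}
travel = {(0, 1): 4, (1, 2): 3, (0, 2): 2}
ok = route_feasible([1, 2], demand, window, travel, capacity=10)
```

The same check with capacity 5 fails, since the total demand 3 + 4 exceeds it.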
Design and Evaluation of Feature Detectors
1998
Abstract

Cited by 5 (0 self)
Many applications in both image processing and computational vision rely upon the robust detection of parametric image features and the accurate estimation of their parameters. In this thesis, I address three fundamental questions related to the design and evaluation of parametric feature detectors. Most feature detectors have been designed to detect a single type of feature, more often than not, the step edge. A large number of other features are also of interest. Since the task of designing a feature detector is very time consuming, repeating the design effort for each feature is wasteful. To address this deficiency, in the first part of this thesis I develop an algorithm that takes as input a description of a parametric feature and automatically constructs a detector for it. The development of many feature detectors begins with an ideal model of the feature. Since image data are noisy, feature detectors must actually detect features that are almost, but not quite, ideal. Many exist...
Data-based Techniques to Improve State Estimation in Model Predictive Control
Patch-based Near-Optimal Image Denoising
2012
Abstract

Cited by 2 (0 self)
In this paper, we propose a denoising method motivated by our previous analysis [1], [2] of the performance bounds for image denoising. Insights from that study are used here to derive a high-performance, practical denoising algorithm. We propose a patch-based Wiener filter that exploits patch redundancy for image denoising. Our framework uses both geometrically and photometrically similar patches to estimate the different filter parameters. We describe how these parameters can be accurately estimated directly from the input noisy image. Our denoising approach, designed for near-optimal performance (in the mean-squared error sense), has a sound statistical foundation that is analyzed in detail. The performance of our approach is verified experimentally on a variety of images and noise levels. The results presented here demonstrate that our proposed method is on par with or exceeds the current state of the art, both visually and quantitatively.
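The patch-based Wiener idea can be sketched per pixel: similar patches supply a pilot mean, and the noise-corrected empirical variance sets a shrinkage gain var/(var + σ²). This toy version on 1-D patches with hand-picked values illustrates only the principle, not the paper's actual estimator:

```python
def wiener_patch(noisy, similar, sigma2):
    """Toy per-pixel Wiener shrinkage in the spirit of patch-based
    denoising: the mean over photometrically similar patches is a
    pilot estimate; the noise-corrected empirical variance sets the
    gain var/(var + sigma2). A sketch, not the paper's estimator."""
    n = len(similar)
    out = []
    for i, y in enumerate(noisy):
        vals = [p[i] for p in similar]
        mu = sum(vals) / n
        var = sum((v - mu) ** 2 for v in vals) / n
        sig = max(var - sigma2, 0.0)   # estimated clean-signal variance
        gain = sig / (sig + sigma2) if sig + sigma2 > 0 else 0.0
        out.append(mu + gain * (y - mu))
    return out

similar = [[1.1, 2.0, 2.9], [0.9, 2.1, 3.1], [1.0, 1.9, 3.0]]
denoised = wiener_patch([1.4, 2.5, 2.6], similar, sigma2=0.25)
```

Here the spread across similar patches is well below the noise variance, so the gain is zero and the filter falls back to the patch mean at every pixel.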
Optimal Weighting Functions for Feature Detection
In Proc. of the 1998 DARPA IUW, 1998
Abstract

Cited by 1 (1 self)
One approach to feature detection is to match a parametric model of the feature to the image data. Naturally, the performance of such detectors is highly dependent upon the function used to measure the degree of fit between the feature model and the image data. In this paper, we first show how an existing detector can be extended to use a weighted L2 norm as the matching function with minimal extra computation. Next, we propose optimality criteria for the two fundamental aspects of feature detection performance: feature detection robustness and parameter estimation accuracy. We also show how to combine these criteria in various ways. We analyze the optimality criterion for parameter estimation under the approximating assumption that the feature manifold is locally linear. We also present a numerical algorithm that can be used to estimate the optimal weighting functions for the other optimality criteria. We include the results of applying this algorithm for step edge, line, and corne...
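A weighted L2 matching function is simply Σ_i w_i (model_i − data_i)²; the choice of the weighting function w is exactly what the paper optimizes. The center-emphasis weights below are an arbitrary illustration, not the optimal ones the paper derives:

```python
def weighted_l2(model, data, weight):
    """Weighted L2 matching distance between a sampled feature model
    and image data: sum_i w_i * (model_i - data_i)**2. The weights
    let the fit emphasize, e.g., the center of the feature window."""
    return sum(w * (m - d) ** 2 for m, d, w in zip(model, data, weight))

# A 1-D step-edge model matched against two candidate data windows;
# the center-weighted norm prefers the window that fits near the edge.
step = [0, 0, 0, 1, 1, 1]
w = [1, 2, 4, 4, 2, 1]                 # hypothetical center-emphasis weights
good = [0.1, 0.0, 0.1, 0.9, 1.0, 1.1]
bad = [0.5, 0.4, 0.6, 0.5, 0.6, 0.4]
better_fit = weighted_l2(step, good, w) < weighted_l2(step, bad, w)
```

With w identically 1 this reduces to the ordinary L2 (sum-of-squares) matching function.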
A New Method of Robust Linear Regression Analysis: Some Monte Carlo Experiments
Abstract

Cited by 1 (1 self)
I. Introduction: The outliers in a dataset are the points in a minority that are highly unlikely to belong to the population from which the other points (i.e., inliers), which are in a majority, have been drawn. Alternatively, the outliers exhibit a pattern or characteristics that are alien or non-conformal to those of the inliers. Stated differently, if a majority of data points, p_i ∈ p, lie in a range (a, b), then a minority of data points, q_j ∈ q, far exterior to (a, b), are outliers in the data
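The definition above translates directly into code: inliers fall inside (a, b), outliers lie far outside it. The margin used to operationalize "far exterior" is an illustrative tolerance, not something specified by the text:

```python
def split_outliers(points, a, b):
    """Partition points per the definition above: inliers lie in the
    range (a, b); outliers lie far outside it. The half-range margin
    for 'far exterior' is an illustrative choice, not from the text."""
    margin = 0.5 * (b - a)
    inliers = [p for p in points if a < p < b]
    outliers = [p for p in points if p <= a - margin or p >= b + margin]
    return inliers, outliers

data = [1.2, 1.5, 1.8, 2.0, 9.7, 1.1, -6.0]
inl, out = split_outliers(data, a=1.0, b=2.5)
```

Points between the range boundary and the margin are classified as neither, reflecting that the definition only labels points "far exterior" to the inlier range.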