Results 1 - 10 of 18,084
Valuing American options by simulation: A simple least-squares approach
 Review of Financial Studies, 2001
"... This article presents a simple yet powerful new approach for approximating the value of America11 options by simulation. The kcy to this approach is the use of least squares to estimate the conditional expected payoff to the optionholder from continuation. This makes this approach readily applicable ..."
Cited by 517 (9 self)
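The abstract above describes the least-squares Monte Carlo idea: at each exercise date, regress discounted continuation values on functions of the state to decide between exercise and continuation. A minimal sketch for an American put follows; the market parameters, path counts, and quadratic regression basis are invented for illustration, not taken from the paper.

```python
import numpy as np

# Minimal least-squares Monte Carlo for an American put (illustrative
# parameters; regression on in-the-money paths with a quadratic basis).
rng = np.random.default_rng(0)
S0, K, r, sigma, T = 100.0, 100.0, 0.05, 0.2, 1.0
n_paths, n_steps = 20000, 50
dt = T / n_steps
disc = np.exp(-r * dt)

# Simulate geometric Brownian motion paths.
z = rng.standard_normal((n_paths, n_steps))
S = S0 * np.exp(np.cumsum((r - 0.5 * sigma**2) * dt
                          + sigma * np.sqrt(dt) * z, axis=1))

# Backward induction: regress discounted continuation value on the state.
cash = np.maximum(K - S[:, -1], 0.0)             # payoff at maturity
for t in range(n_steps - 2, -1, -1):
    cash *= disc                                 # discount one step back
    itm = K - S[:, t] > 0                        # in-the-money paths only
    x = S[itm, t]
    A = np.column_stack([np.ones_like(x), x, x**2])
    coef, *_ = np.linalg.lstsq(A, cash[itm], rcond=None)
    continuation = A @ coef                      # estimated value of waiting
    exercise = K - x
    stop = exercise > continuation
    cash[itm] = np.where(stop, exercise, cash[itm])
price = disc * cash.mean()
```

With these inputs the estimate should land near the textbook American-put value of roughly 6 (the European Black-Scholes value is about 5.57, and early exercise adds a premium).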
Benchmarking Least Squares Support Vector Machine Classifiers
 Neural Processing Letters, 2001
"... In Support Vector Machines (SVMs), the solution of the classification problem is characterized by a (convex) quadratic programming (QP) problem. In a modified version of SVMs, called Least Squares SVM classifiers (LSSVMs), a least squares cost function is proposed so as to obtain a linear set of eq ..."
Cited by 476 (46 self)
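The point of the snippet above is that the LS-SVM training problem reduces to one linear system in the dual variables instead of a QP. A toy sketch on synthetic data; the RBF kernel width, the regularization constant gamma, and the data are all illustrative choices, not values from the paper.

```python
import numpy as np

# LS-SVM classifier sketch: solve the KKT linear system
#   [[0, y^T], [y, Omega + I/gamma]] [b; alpha] = [0; 1]
# where Omega_ij = y_i y_j K(x_i, x_j) (RBF kernel here).
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(-2, 1, (20, 2)), rng.normal(2, 1, (20, 2))])
y = np.array([-1.0] * 20 + [1.0] * 20)

def rbf(A, B, width=1.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * width**2))

gamma = 10.0                       # regularization constant (illustrative)
n = len(y)
Omega = (y[:, None] * y[None, :]) * rbf(X, X)
M = np.zeros((n + 1, n + 1))
M[0, 1:] = y
M[1:, 0] = y
M[1:, 1:] = Omega + np.eye(n) / gamma
rhs = np.concatenate([[0.0], np.ones(n)])
sol = np.linalg.solve(M, rhs)      # one linear solve replaces the QP
b, alpha = sol[0], sol[1:]

def predict(Xnew):
    return np.sign(rbf(Xnew, X) @ (alpha * y) + b)

train_acc = (predict(X) == y).mean()
```

The trade-off relative to standard SVMs is that equality constraints remove sparsity: every training point gets a nonzero alpha.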
Structural equation modeling in practice: a review and recommended two-step approach.
 Psychological Bulletin, 1988
"... In this article, we provide guidance for substantive researchers on the use of structural equation modeling in practice for theory testing and development. We present a comprehensive, twostep modeling approach that employs a series of nested models and sequential chisquare difference tests. We di ..."
Cited by 1825 (3 self)
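The sequential chi-square difference test mentioned above compares two nested models: the difference in their chi-square statistics is itself chi-square distributed, with degrees of freedom equal to the difference in model degrees of freedom. A small numerical sketch; the model statistics are invented, and the closed-form survival function used here only covers even degrees of freedom.

```python
import math

# Chi-square difference test between two nested models
# (statistics below are made up for illustration).
def chi2_sf_even_df(x, df):
    """Survival function of the chi-square distribution, even df only."""
    k = df // 2
    return math.exp(-x / 2) * sum((x / 2) ** i / math.factorial(i)
                                  for i in range(k))

chi2_constrained, df_constrained = 114.4, 50   # hypothetical nested model
chi2_free, df_free = 98.1, 46                  # hypothetical freer model
delta_chi2 = chi2_constrained - chi2_free      # about 16.3
delta_df = df_constrained - df_free            # 4
p = chi2_sf_even_df(delta_chi2, delta_df)
reject = p < 0.05   # constraints significantly worsen fit if True
```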
Some special types of Square difference graphs
 International Journal of Mathematical Archives, 2012
"... I defined a new labeling and a new graph called square difference labeling and the square difference graph. Let G be a (p, q) graph. G is said to be a square difference graph if there exists a bijection f: V(G) → { 0,1, …., p1} such that the induced function f * : E(G) → N given by f*(uv) =  [f(u ..."
Cited by 1 (0 self)
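The snippet truncates before stating the defining condition; the usual requirement in this literature is that the induced edge labels f*(uv) = |f(u)² − f(v)²| be pairwise distinct. Under that assumption, a labeling is easy to check mechanically, as in this sketch for a path on four vertices.

```python
# Check a square difference labeling, assuming the standard condition
# that the induced edge labels |f(u)^2 - f(v)^2| are pairwise distinct.
def is_square_difference_labeling(edges, f):
    labels = [abs(f[u] ** 2 - f[v] ** 2) for u, v in edges]
    return len(labels) == len(set(labels))

# Path P4 with vertices labeled 0..3 in order; edge labels are 1, 3, 5.
path_edges = [(0, 1), (1, 2), (2, 3)]
f = {0: 0, 1: 1, 2: 2, 3: 3}
ok = is_square_difference_labeling(path_edges, f)
```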
Algorithms for Non-negative Matrix Factorization
 In NIPS, 2001
"... Nonnegative matrix factorization (NMF) has previously been shown to be a useful decomposition for multivariate data. Two different multiplicative algorithms for NMF are analyzed. They differ only slightly in the multiplicative factor used in the update rules. One algorithm can be shown to minim ..."
Cited by 1246 (5 self)
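For the squared-error objective ||V − WH||², the multiplicative update rules the abstract refers to scale each factor entry by a ratio of nonnegative terms, so nonnegativity is preserved and the error never increases. A sketch with arbitrary dimensions and rank (the small epsilon guarding against division by zero is an implementation detail, not from the paper):

```python
import numpy as np

# Multiplicative NMF updates for the Euclidean objective ||V - W H||_F^2.
rng = np.random.default_rng(2)
V = rng.random((20, 30))           # nonnegative data matrix (illustrative)
k = 5                              # factorization rank (illustrative)
W = rng.random((20, k))
H = rng.random((k, 30))

def frob_err(V, W, H):
    return np.linalg.norm(V - W @ H)

err0 = frob_err(V, W, H)
for _ in range(100):
    # Each update multiplies by a ratio of nonnegative quantities,
    # so W and H stay nonnegative throughout.
    H *= (W.T @ V) / (W.T @ W @ H + 1e-12)
    W *= (V @ H.T) / (W @ H @ H.T + 1e-12)
err1 = frob_err(V, W, H)
```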
Square Difference Labeling for Some Graphs
 J. Shiama
"... Here I define a new labeling and a new graph called square difference labeling and the square difference graph. Let G be a (p, q) graph. G is said to be a square difference graph if there exists a bijection f: V(G) → { 0,1, …., p1} such that the induced function f * : E(G) → N given by f*(uv) = ..."
Bid, ask and transaction prices in a specialist market with heterogeneously informed traders
 Journal of Financial Economics, 1985
"... The presence of traders with superior information leads to a positive bidask spread even when the specialist is riskneutral and makes zero expected profits. The resulting transaction prices convey information, and the expectation of the average spread squared times volume is bounded by a number th ..."
Cited by 1273 (5 self)
Closed-form solution of absolute orientation using unit quaternions
 J. Opt. Soc. Am. A, 1987
"... Finding the relationship between two coordinate systems using pairs of measurements of the coordinates of a number of points in both systems is a classic photogrammetric task. It finds applications in stereophotogrammetry and in robotics. I present here a closedform solution to the leastsquares pr ..."
Cited by 989 (4 self)
translational offset is the difference between the centroid of the coordinates in one system and the rotated and scaled centroid of the coordinates in the other system. The best scale is equal to the ratio of the root-mean-square deviations of the coordinates in the two systems from their respective centroids
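A numerical sketch of the closed-form recipe quoted above: the rotation comes from the dominant eigenvector of a 4x4 matrix built from the cross-covariance of the centered point sets (the quaternion construction), the scale from the RMS-deviation ratio, and the translation from the centroids. The synthetic round-trip data are invented for the example.

```python
import numpy as np

def quat_to_rot(q):
    """Rotation matrix for unit quaternion q = (w, x, y, z)."""
    w, x, y, z = q
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ])

def absolute_orientation(A, B):
    """Find s, R, t with B ~= s * R @ A + t (A, B are 3 x n)."""
    a_bar, b_bar = A.mean(1, keepdims=True), B.mean(1, keepdims=True)
    Ac, Bc = A - a_bar, B - b_bar
    S = Ac @ Bc.T                                   # cross-covariance
    Sxx, Sxy, Sxz = S[0]; Syx, Syy, Syz = S[1]; Szx, Szy, Szz = S[2]
    N = np.array([
        [Sxx + Syy + Szz, Syz - Szy,        Szx - Sxz,        Sxy - Syx],
        [Syz - Szy,       Sxx - Syy - Szz,  Sxy + Syx,        Szx + Sxz],
        [Szx - Sxz,       Sxy + Syx,       -Sxx + Syy - Szz,  Syz + Szy],
        [Sxy - Syx,       Szx + Sxz,        Syz + Szy,       -Sxx - Syy + Szz],
    ])
    w, V = np.linalg.eigh(N)
    R = quat_to_rot(V[:, -1])                       # eigvec of largest eigval
    s = np.sqrt((Bc**2).sum() / (Ac**2).sum())      # RMS-deviation ratio
    t = b_bar - s * R @ a_bar
    return s, R, t

# Round trip on noise-free synthetic data: rotate, scale by 2, translate.
rng = np.random.default_rng(3)
A = rng.standard_normal((3, 10))
angle = 0.7
R_true = np.array([[np.cos(angle), -np.sin(angle), 0],
                   [np.sin(angle),  np.cos(angle), 0],
                   [0, 0, 1.0]])
B = 2.0 * R_true @ A + np.array([[1.0], [-2.0], [0.5]])
s, R, t = absolute_orientation(A, B)
err = np.abs(s - 2.0) + np.abs(R - R_true).max()
```

On noise-free data the recovery is exact up to floating-point error; with noisy correspondences the same formulas give the least-squares optimum.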
An empirical comparison of voting classification algorithms: Bagging, boosting, and variants.
 Machine Learning, 1999
"... Abstract. Methods for voting classification algorithms, such as Bagging and AdaBoost, have been shown to be very successful in improving the accuracy of certain classifiers for artificial and realworld datasets. We review these algorithms and describe a large empirical study comparing several vari ..."
Cited by 707 (2 self)
and variance decomposition of the error to show how different methods and variants influence these two terms. This allowed us to determine that Bagging reduced variance of unstable methods, while boosting methods (AdaBoost and Arc-x4) reduced both the bias and variance of unstable methods but increased
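The voting idea behind Bagging is simple enough to hand-roll: fit the base learner on bootstrap resamples and take a majority vote. The sketch below uses one-dimensional data and decision stumps purely for illustration; it is not the study's experimental setup, which used full tree learners and many datasets.

```python
import numpy as np

# Bagging sketch: bootstrap resamples of noisy labels, stump per replicate,
# majority vote at prediction time (all data and parameters invented).
rng = np.random.default_rng(4)
n = 200
x = rng.uniform(-1, 1, n)
y = (x > 0.1).astype(int)                            # clean labels
y_noisy = np.where(rng.random(n) < 0.1, 1 - y, y)    # 10% label noise

def fit_stump(x, y):
    """Best single threshold t for the classifier 1[x > t]."""
    ts = np.unique(x)
    errs = [np.mean((x > t).astype(int) != y) for t in ts]
    return ts[int(np.argmin(errs))]

thresholds = []
for _ in range(25):
    idx = rng.integers(0, n, n)          # bootstrap: sample with replacement
    thresholds.append(fit_stump(x[idx], y_noisy[idx]))

votes = np.mean([(x > t).astype(int) for t in thresholds], axis=0)
bagged_pred = (votes > 0.5).astype(int)  # majority vote
acc = np.mean(bagged_pred == y)          # accuracy against clean labels
```

Averaging the resampled stumps smooths out the threshold jitter that a single stump fit on noisy labels would exhibit, which is the variance reduction the abstract describes.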
Detection and Tracking of Point Features
 International Journal of Computer Vision, 1991
"... The factorization method described in this series of reports requires an algorithm to track the motion of features in an image stream. Given the small interframe displacement made possible by the factorization approach, the best tracking method turns out to be the one proposed by Lucas and Kanade i ..."
Cited by 629 (2 self)
in 1981. The method defines the measure of match between fixed-size feature windows in the past and current frame as the sum of squared intensity differences over the windows. The displacement is then defined as the one that minimizes this sum. For small motions, a linearization of the image intensities
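Linearizing the intensities turns "minimize the SSD over the window" into a 2x2 linear system G d = e for the displacement d, where G accumulates products of spatial gradients and e products of gradients with the temporal difference. A single-step sketch on a smooth synthetic image pair; the test image and the sub-pixel shift are invented.

```python
import numpy as np

# One Lucas-Kanade step: solve G d = e for the window displacement.
x = np.arange(64, dtype=float)
X, Y = np.meshgrid(x, x)

def sample(X, Y):
    """Smooth synthetic intensity pattern (illustrative)."""
    return np.sin(0.2 * X) + np.cos(0.15 * Y)

I = sample(X, Y)                        # frame at time t
true_d = np.array([0.4, -0.3])          # (dx, dy): small sub-pixel motion
J = sample(X + true_d[0], Y + true_d[1])  # frame at t+1, pattern shifted

# Spatial gradients and temporal difference over a central window.
gy, gx = np.gradient(I)                 # np.gradient returns d/drow first
w = (slice(16, 48), slice(16, 48))
gxw, gyw, dI = gx[w], gy[w], (J - I)[w]

G = np.array([[(gxw * gxw).sum(), (gxw * gyw).sum()],
              [(gxw * gyw).sum(), (gyw * gyw).sum()]])
e = np.array([(gxw * dI).sum(), (gyw * dI).sum()])
d_est = np.linalg.solve(G, e)           # estimated (dx, dy)
```

For larger motions the same step is iterated, warping one window toward the other until the update is negligible.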