Results 1 - 10 of 712

H Hessian matrix

by Christopher Tyler Jones, Roberto Celi
"... This paper presents the development of a frequency response sensitivity function that is applied to the determination of a state space coupled rotor-fuselage helicopter flight dy-namics model using frequency domain system identification. The new function exposes the frequency-dependent sensitivity o ..."
Abstract
the frequency response sensitivity function are introduced that substantially reduce the computational cost of the solution. Simulated flight test data are used to validate a direct state space matrix identification process incorporating the frequency response sensitivity function. Results demonstrate

Centerline Extraction Based on Hessian Matrix

by Li Guang-ming, Tian Jie, Zhao Ming-chang, He Hui-guang
"... Abstract: Virtual endoscopy, which is a noninvasive procedure for detecting anomalies inside human organs, is meaningful for medical diagnosis and surgery. In order to perform an accurate navigation, the centerline of the model must be extracted. In this paper, a new centerline extraction algorithm ..."
Abstract
based on Hessian Matrix is proposed. First, the distance transformation is performed. Then the initial path is obtained by computing the eigenvalues and eigenvectors of the Hessian matrix. After that, the visibility test with an adaptive visibility sphere radius, which is determined by the eigenvalues
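
For orientation, a minimal Python sketch of the Hessian-eigenvalue step this abstract describes: the Hessian of a smoothed 3-D volume is assembled from Gaussian second derivatives and its eigenvalues are computed per voxel. The function name, array shapes, and the scale parameter sigma are illustrative assumptions, not taken from the paper.

    import numpy as np
    from scipy.ndimage import gaussian_filter

    def hessian_eigenvalues(volume, sigma=1.5):
        """Per-voxel eigenvalues (sorted by magnitude) of the Gaussian-scale Hessian."""
        n = volume.ndim
        H = np.empty(volume.shape + (n, n))
        for i in range(n):
            for j in range(n):
                order = [0] * n
                order[i] += 1
                order[j] += 1
                # Second-order Gaussian derivative d^2 I / (dx_i dx_j)
                H[..., i, j] = gaussian_filter(volume, sigma, order=order)
        eig = np.linalg.eigvalsh(H)             # ascending eigenvalues of the symmetric Hessian
        idx = np.argsort(np.abs(eig), axis=-1)  # re-sort by magnitude
        return np.take_along_axis(eig, idx, axis=-1)

    vol = np.random.rand(32, 32, 32)
    print(hessian_eigenvalues(vol).shape)       # (32, 32, 32, 3)

Along a bright tubular structure, one eigenvalue is near zero (the axis direction) and the other two are strongly negative; tests of this kind are what Hessian-based path and centerline extraction build on.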

Discrete Hessian Matrix for L-convex Functions

by Satoko Moriguchi, Kazuo Murota, 2004
"... L-convex functions are nonlinear discrete functions on integer points that are computationally tractable in optimization. In this paper, a discrete Hessian matrix and a local quadratic expansion are defined for L-convex functions. We characterize L-convex functions in terms of the discrete Hessian m ..."
Abstract - Cited by 2 (1 self)
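A minimal sketch of a discrete Hessian built from second-order differences of a function on integer points. This is the standard second-difference form; the precise definition used for L-convex functions in the cited paper may differ in details.

    import numpy as np

    def discrete_hessian(f, x):
        """H[i, j] = f(x + e_i + e_j) - f(x + e_i) - f(x + e_j) + f(x) at an integer point x."""
        x = np.asarray(x, dtype=int)
        n = x.size
        e = np.eye(n, dtype=int)
        fx = f(x)
        H = np.empty((n, n))
        for i in range(n):
            for j in range(n):
                H[i, j] = f(x + e[i] + e[j]) - f(x + e[i]) - f(x + e[j]) + fx
        return H

    # Example: a separable convex quadratic has a diagonal discrete Hessian.
    f = lambda z: float(np.sum(z ** 2))
    print(discrete_hessian(f, [0, 1, 2]))   # 2 * identity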

Exploiting Eigenvalues of the Hessian Matrix for Volume Decimation

by Jirí Hladuvka, Eduard Gröller
"... In recent years the Hessian matrix and its eigenvalues became important in pattern recognition. Several algorithms based on the information they provide have been introduced. We recall the relationship between the eigenvalues of Hessian matrix and the 2nd order edge detection lter, show the usefu ..."
Abstract - Cited by 8 (2 self)
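One concrete form of the relationship the abstract alludes to: the Laplacian, a classic second-order edge-detection filter, is the trace of the Hessian and therefore the sum of its eigenvalues. A minimal sketch, with an illustrative random volume and scale:

    import numpy as np
    from scipy.ndimage import gaussian_filter, gaussian_laplace

    vol = np.random.rand(24, 24, 24)
    sigma = 2.0

    # Trace of the Hessian: sum of the second-order Gaussian derivatives along each axis
    trace = sum(
        gaussian_filter(vol, sigma, order=[2 if a == axis else 0 for a in range(3)])
        for axis in range(3)
    )

    # Laplacian-of-Gaussian response computed directly
    log_response = gaussian_laplace(vol, sigma)

    print(np.allclose(trace, log_response))   # True: the edge filter equals the eigenvalue sum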

ON DISCRETE HESSIAN MATRIX AND CONVEX EXTENSIBILITY

by Satoko Moriguchi, Kazuo Murota
"... ..."
Abstract - Cited by 2 (0 self)
Abstract not found

Hidden Layer Training via Hessian Matrix Information

by Changhua Yu, Michael T. Manry, Jiang Li
"... The output weight optimization-hidden weight optimization (OWO-HWO) algorithm for training the multilayer perceptron alternately updates the output weights and the hidden weights. This layer-by-layer training strategy greatly improves convergence speed. However, in HWO, the desired net function actu ..."
Abstract
actually evolves in the gradient direction, which inevitably reduces efficiency. In this paper, two improvements to the OWO-HWO algorithm are presented. New desired net functions are proposed for hidden layer training, which use Hessian matrix information rather than gradients. A weighted hidden layer
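
As a generic illustration of why curvature information helps (this is not the OWO-HWO update from the paper): on an ill-conditioned quadratic, a Newton step, which uses the Hessian, reaches the minimum at once, while a plain gradient step with a safe learning rate barely moves along the shallow direction.

    import numpy as np

    H = np.diag([100.0, 1.0])        # curvature of f(w) = 0.5 * w^T H w
    w = np.array([1.0, 1.0])
    g = H @ w                        # gradient at w

    w_newton = w - np.linalg.solve(H, g)   # Hessian-based step: lands at the optimum [0, 0]
    w_grad = w - 0.01 * g                  # gradient-only step; rate limited by the stiff axis

    print(w_newton)   # [0. 0.]
    print(w_grad)     # [0.   0.99]  -- almost no progress along the shallow direction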

Second Order Backpropagation - Efficient Computation of the Hessian Matrix for Neural Networks

by Raúl Rojas, 1993
"... Traditional learning methods for neural networks use some kind of gradient descent in order to determine the network's weights for a given task. Some second order learning algorithms deal with a quadratic approximation of the error function determined from the calculation of the Hessian matrix, ..."
Abstract - Cited by 1 (1 self)
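The quadratic approximation referred to here is the second-order Taylor model E(w + d) ~ E(w) + g.d + (1/2) d.H.d built from the gradient g and the Hessian H. A minimal sketch on a toy error function (not a neural-network error):

    import numpy as np

    def E(w):                            # toy error function
        return np.sum(w ** 4) + np.sum(w ** 2)

    def grad(w):
        return 4 * w ** 3 + 2 * w

    def hessian(w):
        return np.diag(12 * w ** 2 + 2)

    w = np.array([0.5, -0.3])
    d = np.array([0.01, 0.02])           # small weight perturbation
    quadratic_model = E(w) + grad(w) @ d + 0.5 * d @ hessian(w) @ d
    print(E(w + d), quadratic_model)     # agree up to O(|d|^3)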

Second-order backpropagation algorithms for a stagewise-partitioned separable Hessian matrix

by Eiji Mizutani, Stuart E. Dreyfus, James W. Demmel - In Proc. of 2005 Int'l Joint Conf. on Neural Networks (see WWW.IEOR.BERKELEY.EDU/PEOPLE/FACULTY/DREYFUS-PUBS/IJCNNESJ05.PDF), 2005
"... Recent advances in computer technology allow the implementation of some important methods that were assigned lower priority in the past due to their computational burdens. Second-order backpropagation (BP) is such a method that computes the exact Hessian matrix of a given objective function. We des ..."
Abstract - Cited by 6 (4 self)

Exact Calculation of the Product of the Hessian Matrix of Feed-Forward Network Error Functions and a Vector in O(N) Time

by Martin Møller
"... Several methods for training feed-forward neural networks require second order information from the Hessian matrix of the error function. Although it is possible to calculate the Hessian matrix exactly it is often not desirable because of the computation and memory requirements involved. Some learni ..."
Abstract - Cited by 4 (0 self)
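The cited work computes the Hessian-vector product exactly, at a cost comparable to a gradient evaluation, rather than forming the full Hessian. A minimal sketch of the idea using a central-difference approximation instead of the exact differential pass; grad() here is an assumed gradient callable, and the quadratic test function is illustrative:

    import numpy as np

    def hessian_vector_product(grad, w, v, eps=1e-5):
        """Approximate H(w) @ v as (grad(w + eps*v) - grad(w - eps*v)) / (2*eps)."""
        return (grad(w + eps * v) - grad(w - eps * v)) / (2.0 * eps)

    # Check against a known Hessian: E(w) = 0.5 * w^T A w has H = A.
    A = np.array([[3.0, 1.0], [1.0, 2.0]])
    grad = lambda w: A @ w
    w = np.array([0.2, -0.4])
    v = np.array([1.0, 1.0])
    print(hessian_vector_product(grad, w, v))   # ~ A @ v = [4. 3.]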

Exploiting the Hessian matrix for content-based retrieval of volume-data features

by Jirí Hladuvka, et al., 2002
"... ..."
Abstract - Cited by 3 (1 self)
Abstract not found