Results 1 - 10 of 576

Principal eigenvectors of irregular graphs

by Sebastian M. Cioabă, David A. Gregory - ELA , 2007
"... Let G be a connected graph. This paper studies the extreme entries of the principal eigenvector x of G, the unique positive unit eigenvector corresponding to the greatest eigenvalue λ1 of the adjacency matrix of G. If G has maximum degree ∆, the greatest entry xmax Õ of x isat most 1 / 1+λ2 1 /∆. ..."
Abstract - Cited by 5 (2 self)
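
The bound quoted in this entry can be checked numerically. Below is a small illustrative NumPy sketch (not code from the paper) that computes the principal eigenvector of the star K_{1,4}, an irregular connected graph, and compares its largest entry with the stated bound 1/√(1 + λ1²/∆):

# Illustrative check of x_max <= 1/sqrt(1 + lambda1^2/Delta) on the star K_{1,4};
# the graph choice and variable names are mine, not the paper's.
import numpy as np

n = 5
A = np.zeros((n, n))
A[0, 1:] = 1                          # center vertex 0 joined to leaves 1..4
A[1:, 0] = 1

eigvals, eigvecs = np.linalg.eigh(A)
lam1 = eigvals[-1]                    # greatest adjacency eigenvalue
x = np.abs(eigvecs[:, -1])            # principal eigenvector, taken positive
x /= np.linalg.norm(x)                # unit norm

Delta = int(A.sum(axis=1).max())      # maximum degree
bound = 1.0 / np.sqrt(1.0 + lam1**2 / Delta)
print(x.max(), bound)                 # for the star these coincide at 1/sqrt(2)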

CONTROLLING SPEECH DISTORTION IN ADAPTIVE FREQUENCY-DOMAIN PRINCIPAL EIGENVECTOR BEAMFORMING

by Ernst Warsitz, Reinhold Haeb-umbach
"... Broadband adaptive beamformers, which use a narrowband SNRmaximization optimization criterion for noise reduction, typically cause distortions of the desired speech signal at the beamformer output. In this paper two methods are investigated to control the speech distortion by comparing the eigenvect ..."
Abstract
the eigenvector beamformer with a maximum likelihood beamformer: One is an analytic solution for the ideal case of absence of reverberation and the other one is a statistically motivated approach. We use the recently introduced gradient-ascent algorithm for adaptive principal eigenvector beamforming
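
For readers unfamiliar with the term, a principal eigenvector beamformer steers toward the weight vector that maximizes output SNR, i.e. the principal generalized eigenvector of the signal and noise covariance matrices. The sketch below illustrates that idea with a plain gradient ascent on the generalized Rayleigh quotient; it is not the authors' adaptive algorithm, and the covariance matrices are synthetic stand-ins:

# Generic sketch (not the authors' algorithm): gradient ascent on the
# generalized Rayleigh quotient r(w) = (w^H Phi_xx w)/(w^H Phi_nn w), whose
# maximizer is the principal generalized eigenvector, i.e. the max-SNR weights.
import numpy as np

def max_snr_weights(Phi_xx, Phi_nn, steps=3000, mu=0.01):
    # mu must be small relative to the eigenvalue spread of the problem
    rng = np.random.default_rng(0)
    M = Phi_xx.shape[0]
    w = rng.standard_normal(M) + 1j * rng.standard_normal(M)
    w /= np.linalg.norm(w)
    for _ in range(steps):
        num = np.real(np.vdot(w, Phi_xx @ w))   # signal-plus-noise power
        den = np.real(np.vdot(w, Phi_nn @ w))   # noise power
        grad = (Phi_xx @ w - (num / den) * (Phi_nn @ w)) / den
        w = w + mu * grad                        # ascend the SNR objective
        w /= np.linalg.norm(w)                   # renormalize each step
    return w

# synthetic example: one source with steering vector d in spatially white noise
M = 4
d = np.exp(1j * np.pi * 0.3 * np.arange(M))
Phi_nn = np.eye(M, dtype=complex)
Phi_xx = Phi_nn + 10.0 * np.outer(d, d.conj())
w = max_snr_weights(Phi_xx, Phi_nn)              # aligns with d up to a phase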

Efficient Protocols for Principal Eigenvector Computation over Private Data

by Manas A. Pathak, Bhiksha Raj , 2011
"... In this paper we present a protocol for computing the principal eigenvector of a collection of data matrices belonging to multiple semi-honest parties with privacy constraints. Our proposed protocol is based on secure multi-party computation with a semi-honest arbitrator who deals with data encrypt ..."
Abstract

Blind Adaptive Principal Eigenvector Beamforming for Acoustical Source Separation

by Ernst Warsitz, Reinhold Haeb-umbach, Dang Hai, Tran Vu
"... For separating multiple speech signals given a convolutive mixture, time-frequency sparseness of the speech sources can be exploited. In this paper we present a multi-channel source separation method based on the concept of approximate disjoint orthogonality of speech signals. Unlike binary masking ..."
Abstract
of single-channel signals as e.g. applied in the DUET algorithm we use a likelihood mask to control the adaptation of blind principal eigenvector beamformers. Furthermore, orthogonal projection of the adapted beamformer filters leads to mutually orthogonal filter coefficients, thus enhancing the demixing performance

Decision-making with the AHP: Why is the principal eigenvector necessary

by Thomas L Saaty
"... Abstract In this paper it is shown that the principal eigenvector is a necessary representation of the priorities derived from a positive reciprocal pairwise comparison judgment matrix A ¼ ða ij Þ when A is a small perturbation of a consistent matrix. When providing numerical judgments, an individu ..."
Abstract
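
To make the computation this entry refers to concrete, here is a minimal hedged sketch (the 3×3 judgment values are made up for illustration): the AHP priority vector is the principal eigenvector of the reciprocal comparison matrix, normalized to sum to one.

# Illustrative sketch: AHP priorities as the normalized principal eigenvector
# of a positive reciprocal pairwise comparison matrix (judgments are made up).
import numpy as np

A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])       # reciprocal matrix: a_ji = 1/a_ij

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)           # index of the principal (Perron) eigenvalue
w = np.abs(eigvecs[:, k].real)        # principal eigenvector, sign-corrected
priorities = w / w.sum()              # priorities normalized to sum to 1

lam_max = eigvals.real[k]
n = A.shape[0]
CI = (lam_max - n) / (n - 1)          # Saaty's consistency index
print(priorities, CI)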

A LOWER BOUND ON THE ENTRIES OF THE PRINCIPAL EIGENVECTOR OF A GRAPH

by Felix Goldberg
"... Abstract. We obtain a lower bound on each entry of the princi-pal eigenvector of a non-regular connected graph. 1. ..."
Abstract - Cited by 1 (1 self)

SIMPLE, EFFECTIVE COMPUTATION OF PRINCIPAL EIGENVECTORS AND THEIR EIGENVALUES AND APPLICATION TO HIGH-RESOLUTION ESTIMATION OF FREQUENCIES

by Donald W. Tufts, Costas D. Melissinos , 1985
"... under Contracts NOOOi4-83-K-0664 & NOOO14-84-K-0445 ..."
Abstract

Stochastic Perturbation Theory

by G. W. Stewart , 1988
"... . In this paper classical matrix perturbation theory is approached from a probabilistic point of view. The perturbed quantity is approximated by a first-order perturbation expansion, in which the perturbation is assumed to be random. This permits the computation of statistics estimating the variatio ..."
Abstract - Cited by 907 (36 self)
and the eigenvalue problem. Key words. perturbation theory, random matrix, linear system, least squares, eigenvalue, eigenvector, invariant subspace, singular value AMS(MOS) subject classifications. 15A06, 15A12, 15A18, 15A52, 15A60 1. Introduction. Let A be a matrix and let F be a matrix valued function of A
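
The first-order idea in this entry can be illustrated with a short Monte Carlo experiment (a sketch under toy assumptions of my own, not Stewart's analysis): for a symmetric matrix A with a simple eigenvalue λ1 and unit eigenvector x, a small random symmetric perturbation E shifts that eigenvalue by approximately xᵀEx.

# Monte Carlo sketch: compare the exact largest eigenvalue of A + E with the
# first-order expansion lambda1(A) + x^T E x for small random symmetric E.
import numpy as np

rng = np.random.default_rng(1)
n = 6
B = rng.standard_normal((n, n))
A = (B + B.T) / 2                      # fixed symmetric matrix

lam, V = np.linalg.eigh(A)
lam1, x = lam[-1], V[:, -1]            # largest eigenvalue, unit eigenvector

eps = 1e-3
errors = []
for _ in range(2000):
    G = rng.standard_normal((n, n))
    E = eps * (G + G.T) / 2            # small random symmetric perturbation
    exact = np.linalg.eigvalsh(A + E)[-1]
    approx = lam1 + x @ E @ x          # first-order (linear) term
    errors.append(exact - approx)

print(np.std(errors))                  # residual is O(eps^2), so nearly zero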

Two-Dimensional PCA: A New Approach to Appearance-Based Face Representation and Recognition

by Jian Yang, David Zhang, Alejandro F. Frangi, Jing-yu Yang - IEEE Trans. Pattern Anal. Machine Intell , 2004
"... Abstract—In this paper, a new technique coined two-dimensional principal component analysis (2DPCA) is developed for image representation. As opposed to PCA, 2DPCA is based on 2D imagematrices rather than 1D vectors so the image matrix does not need to be transformed into a vector prior to feature e ..."
Abstract - Cited by 327 (13 self)
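
As a concrete illustration of the idea in this entry, here is a short sketch (random arrays stand in for face images; the array shapes and names are mine): 2DPCA forms an image covariance matrix directly from the 2D image matrices and projects each image onto its leading eigenvectors.

# Illustrative 2DPCA sketch: the image covariance matrix is built from the
# image matrices themselves, with no flattening into long vectors.
import numpy as np

rng = np.random.default_rng(2)
images = rng.standard_normal((100, 32, 24))   # stand-ins for 100 face images
d = 5                                         # number of projection axes

centered = images - images.mean(axis=0)

# image covariance matrix G = (1/N) * sum_i A_i^T A_i  (size 24 x 24)
G = np.einsum('nij,nik->jk', centered, centered) / len(images)

eigvals, eigvecs = np.linalg.eigh(G)
X = eigvecs[:, -d:]                           # top-d eigenvectors of G

features = centered @ X                        # each image -> a 32 x d feature matrix
print(features.shape)                          # (100, 32, d)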

Effects of Diffusion Weighting Schemes on the Reproducibility of DTI-derived Fractional Anisotropy, Mean Diffusivity, and Principal Eigenvector Measurements at 1.5T

by Bennett A. Landman, Jonathan A. D. Farrell, Craig K. Jones, Seth A. Smith, Jerry L. Prince, Susumu Mori
"... Abstract: 245 ..."
Abstract