Results 1 - 10 of 10,980
Matrix Factorization Techniques for Recommender Systems - IEEE Computer, 2009
"... As the Netflix Prize competition has demonstrated, matrix factorization models are superior to classic nearest-neighbor techniques for producing product recommendations, allowing the incorporation of additional information such as implicit feedback, temporal effects, and confidence levels. Modern co ..."
Abstract - Cited by 593 (4 self)
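The factorization approach the snippet above describes can be sketched with plain stochastic gradient descent on a small rating matrix. Everything below (the toy ratings, the rank k, the learning rate, the regularization weight) is an illustrative assumption, not taken from the paper.

```python
import numpy as np

def factorize(R, k=2, steps=2000, lr=0.01, reg=0.02, seed=0):
    """Approximate R (np.nan marks missing ratings) as P @ Q.T via SGD
    with L2 regularization on the latent factors."""
    rng = np.random.default_rng(seed)
    n_users, n_items = R.shape
    P = rng.normal(scale=0.1, size=(n_users, k))
    Q = rng.normal(scale=0.1, size=(n_items, k))
    observed = [(u, i) for u in range(n_users)
                for i in range(n_items) if not np.isnan(R[u, i])]
    for _ in range(steps):
        for u, i in observed:
            err = R[u, i] - P[u] @ Q[i]       # residual on one observed rating
            pu = P[u].copy()                  # update both factors from old values
            P[u] += lr * (err * Q[i] - reg * P[u])
            Q[i] += lr * (err * pu - reg * Q[i])
    return P, Q

# Toy 3-user x 3-item rating matrix; np.nan entries get predicted by P @ Q.T.
R = np.array([[5.0, 3.0, np.nan],
              [4.0, np.nan, 1.0],
              [1.0, 1.0, 5.0]])
P, Q = factorize(R)
```

Implicit feedback, temporal effects, and confidence weights (as in the paper) would enter as extra terms in the same objective; this sketch keeps only the core bilinear model.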
Learning the Kernel Matrix with Semi-Definite Programming, 2002
"... Kernel-based learning algorithms work by embedding the data into a Euclidean space, and then searching for linear relations among the embedded data points. The embedding is performed implicitly, by specifying the inner products between each pair of points in the embedding space. This information ..."
Abstract - Cited by 775 (21 self)
Processor Allocation for Matrix Products
"... In this paper, we present the problem of allocating processors for matrix products. First, we consider how many processors should be allocated for computing one matrix product on a parallel system. Then, we discuss how to allocate processors for a number of independent matrix products on the paralle ..."
Abstract
Caching in Matrix Product Algorithms
, 708
"... A new type of diagram is introduced for visualizing matrix product states which makes transparent a connection between matrix product states and complex weighted finite state automata. It is then shown how one can proceed in the opposite direction: writing an automaton that “generates” an ..."
Abstract
Decoding of Matrix-Product Codes
"... We propose a decoding algorithm for the (u | u+v)-construction that decodes up to half of the minimum distance of the linear code. We extend this algorithm to a class of matrix-product codes in two different ways. In some cases, one can decode beyond the error correction capability of the code. ..."
Abstract - Cited by 3 (1 self)
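The (u | u+v)-construction the snippet refers to is standard (also known as the Plotkin construction) and can be sketched over GF(2): codewords are (u, u+v) with u from one component code and v from the other. The two toy component codes below are assumptions for illustration, not taken from the paper.

```python
import numpy as np

def u_u_plus_v(G1, G2):
    """(u | u+v) (Plotkin) construction over GF(2).

    G1, G2: generator matrices of two codes of the same length n.
    Returns a generator matrix of the length-2n combined code whose
    codewords are (u, u + v) with u in C1, v in C2.
    """
    k1, n = G1.shape
    k2, n2 = G2.shape
    assert n == n2, "component codes must have the same length"
    top = np.hstack([G1, G1])                               # u contributes (u, u)
    bottom = np.hstack([np.zeros((k2, n), dtype=int), G2])  # v contributes (0, v)
    return np.vstack([top, bottom]) % 2

# Toy example (assumed data): C1 = full [2,2] code, C2 = [2,1] repetition code.
G1 = np.array([[1, 0], [0, 1]])
G2 = np.array([[1, 1]])
G = u_u_plus_v(G1, G2)
```

For the message (m1, m2) the encoding is (m1 G1, m1 G1 + m2 G2), which is exactly the (u | u+v) shape; the construction's minimum distance is min(2 d1, d2).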
Closed-form solution of absolute orientation using unit quaternions - J. Opt. Soc. Am. A, 1987
"... Finding the relationship between two coordinate systems using pairs of measurements of the coordinates of a number of points in both systems is a classic photogrammetric task. It finds applications in stereophotogrammetry and in robotics. I present here a closed-form solution to the least-squares pr ..."
Abstract - Cited by 989 (4 self)
. These exact results are to be preferred to approximate methods based on measurements of a few selected points. The unit quaternion representing the best rotation is the eigenvector associated with the most positive eigenvalue of a symmetric 4 × 4 matrix. The elements of this matrix are combinations of sums
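The closed-form recipe the abstract states (the best rotation's unit quaternion is the eigenvector of the most positive eigenvalue of a symmetric 4 × 4 matrix built from sums of coordinate products) can be sketched directly. The 4 × 4 matrix layout below follows the common Horn-style formulation, and the test rotation is an assumed toy example.

```python
import numpy as np

def best_rotation(A, B):
    """Closed-form least-squares rotation taking point set A onto B
    (rows are corresponding 3-D points). Returns a unit quaternion
    (w, x, y, z), as the top eigenvector of a symmetric 4x4 matrix."""
    A = A - A.mean(axis=0)          # remove translation first
    B = B - B.mean(axis=0)
    S = A.T @ B                     # 3x3 matrix of coordinate-product sums
    Sxx, Sxy, Sxz = S[0]
    Syx, Syy, Syz = S[1]
    Szx, Szy, Szz = S[2]
    N = np.array([
        [Sxx + Syy + Szz, Syz - Szy,        Szx - Sxz,        Sxy - Syx],
        [Syz - Szy,       Sxx - Syy - Szz,  Sxy + Syx,        Szx + Sxz],
        [Szx - Sxz,       Sxy + Syx,       -Sxx + Syy - Szz,  Syz + Szy],
        [Sxy - Syx,       Szx + Sxz,        Syz + Szy,       -Sxx - Syy + Szz],
    ])
    vals, vecs = np.linalg.eigh(N)          # symmetric, so eigh is appropriate
    return vecs[:, np.argmax(vals)]         # eigenvector of most positive eigenvalue

def quat_to_rot(q):
    """Rotation matrix of a unit quaternion (w, x, y, z)."""
    w, x, y, z = q
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ])

# Assumed toy example: recover a 90-degree rotation about z from 6 random points.
R_true = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
rng = np.random.default_rng(1)
A = rng.normal(size=(6, 3))
B = A @ R_true.T
q = best_rotation(A, B)
```

Note the sign ambiguity: q and -q represent the same rotation, so comparisons should go through the rotation matrix rather than the quaternion itself.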
Capacity of a Mobile Multiple-Antenna Communication Link in Rayleigh Flat Fading
"... We analyze a mobile wireless link comprising M transmitter and N receiver antennas operating in a Rayleigh flat-fading environment. The propagation coefficients between every pair of transmitter and receiver antennas are statistically independent and unknown; they remain constant for a coherence int ..."
Abstract - Cited by 495 (22 self)
signals. We prove that there is no point in making the number of transmitter antennas greater than the length of the coherence interval: the capacity for M > T is equal to the capacity for M = T. Capacity is achieved when the T × M transmitted signal matrix is equal to the product of two statistically
Extracellular Matrix Production by the Adherent Cells of Long-Term Murine Bone Marrow Cultures
"... Extracellular matrix production by the adherent cells of long-term murine bone marrow cultures ..."
Abstract - Cited by 3 (1 self)
On the Complexity of Matrix Product - SIAM J. Comput., 2002
"... We prove a lower bound of Ω(m² log m) for the size of any arithmetic circuit for the product of two matrices, over the real or complex numbers, as long as the circuit doesn't use products with field elements of absolute value larger than 1 (where m × m is the size of each m ..."
Abstract - Cited by 34 (2 self)
matrix). That is, our lower bound is super-linear in the number of inputs and is applied for circuits that use addition gates, product gates and products with field elements of absolute value up to 1.
Computing Roots of Matrix Products, 2000
"... Computing the square root of the product of two matrices can be used in model reduction methods based on the cross-Gramians; see [1] and the references therein. This led us to consider the more general problem of computing the kth root of a matrix product W = A_1 A_2 ⋯ ..."
Abstract
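A square root of a matrix product can be sketched via an eigendecomposition, assuming the product is diagonalizable with no eigenvalues on the closed negative real axis; the toy factors below are assumptions for illustration, and production code would prefer a Schur-based method such as `scipy.linalg.sqrtm`.

```python
import numpy as np

def principal_sqrt(W):
    """Principal square root of a diagonalizable matrix W whose
    eigenvalues avoid the closed negative real axis: diagonalize,
    take scalar square roots, transform back. Illustrative sketch only;
    a Schur-based method is numerically preferable in practice."""
    vals, vecs = np.linalg.eig(W)
    root = vecs @ np.diag(np.sqrt(vals.astype(complex))) @ np.linalg.inv(vecs)
    return root.real  # imaginary part is numerical noise for real spectra

# Assumed toy factors: W = A1 @ A2 has two distinct positive eigenvalues,
# so the principal square root exists and is real.
A1 = np.array([[4.0, 1.0], [0.0, 9.0]])
A2 = np.array([[2.0, 0.0], [1.0, 3.0]])
W = A1 @ A2
X = principal_sqrt(W)
```

The same diagonalization trick gives a kth root by replacing `np.sqrt` with `vals ** (1.0 / k)`, which is the general problem the snippet poses for W = A_1 A_2 ⋯.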