## Topographic Organization of User Preference Patterns in Collaborative Filtering

Venue: Neural Network World

Citations: 3 (1 self)

### BibTeX

@ARTICLE{Tino_topographicorganization,
  author  = {Peter Tino and Gabriela Polcicova},
  title   = {Topographic Organization of User Preference Patterns in Collaborative Filtering},
  journal = {Neural Network World},
  year    = {2003},
  volume  = {13}
}

### Abstract

We introduce topographic versions of two latent class models for collaborative filtering.

### Citations

8074 | Maximum likelihood from incomplete data via the EM algorithm
- Dempster, Laird, et al.
- 1977
Citation Context: ...e I and type II models, respectively. 5 Free parameters of the model, P(z^Y|u) and #(v, y, z^Z), are fitted to the data D by maximizing the likelihood L via the Expectation-Maximization (EM) algorithm [3, 14]. The EM algorithm is a standard method for maximum likelihood estimation in latent variable models. It consists of repeated alternation of two steps until a convergence criterion is satisfied. In the...
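The EM alternation described in this context can be sketched on a toy latent class model. The model below (a mixture of independent Bernoullis fitted to binary rating vectors) and all variable names in it are illustrative assumptions for exposition, not the paper's actual type I/II models:

```python
import numpy as np

def em_latent_class(X, K, n_iter=50, seed=None):
    """EM for a simple latent class model (mixture of independent
    Bernoullis). Illustrative sketch only, not the paper's model.
    X: (n, d) binary matrix; K: number of latent classes."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    pi = np.full(K, 1.0 / K)                  # class priors P(z)
    theta = rng.uniform(0.25, 0.75, (K, d))   # P(x_j = 1 | z)
    for _ in range(n_iter):
        # E-step: posterior responsibilities P(z | x) for each row
        log_p = (np.log(pi)
                 + X @ np.log(theta).T
                 + (1 - X) @ np.log(1 - theta).T)
        log_p -= log_p.max(axis=1, keepdims=True)  # numerical stability
        resp = np.exp(log_p)
        resp /= resp.sum(axis=1, keepdims=True)
        # M-step: re-estimate parameters from expected counts
        Nk = resp.sum(axis=0)
        pi = Nk / n
        theta = (resp.T @ X + 1e-6) / (Nk[:, None] + 2e-6)
    return pi, theta
```

The two steps are alternated for a fixed number of iterations here; in practice one would monitor the log-likelihood and stop when its change falls below a tolerance, as the context describes.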

3234 | The self-organizing map
- Kohonen
- 1990
Citation Context: ...each other. Many variants of such topographic representations of data patterns can be found in the machine learning literature. Perhaps the most famous example is the Kohonen self-organizing map (SOM) [10, 11]. More recently, statistically principled reformulations and extensions of SOMs appeared in e.g. [1, 9]. Of particular interest to us is the link between SOMs and vector quantization through noisy com...

1090 | Self-organized Formation of Topologically Correct Feature Maps
- Kohonen
- 1982
Citation Context: ...each other. Many variants of such topographic representations of data patterns can be found in the machine learning literature. Perhaps the most famous example is the Kohonen self-organizing map (SOM) [10, 11]. More recently, statistically principled reformulations and extensions of SOMs appeared in e.g. [1, 9]. Of particular interest to us is the link between SOMs and vector quantization through noisy com...

764 | A view of the EM algorithm that justifies incremental, sparse, and other variants
- Neal, Hinton
- 1998
Citation Context: ...e I and type II models, respectively. 5 Free parameters of the model, P(z^Y|u) and #(v, y, z^Z), are fitted to the data D by maximizing the likelihood L via the Expectation-Maximization (EM) algorithm [3, 14]. The EM algorithm is a standard method for maximum likelihood estimation in latent variable models. It consists of repeated alternation of two steps until a convergence criterion is satisfied. In the...

691 | Using Collaborative Filtering to Weave an Information Tapestry
- Goldberg, Nichols, et al.
- 1992
Citation Context: ...e to share their evaluations, several techniques for leveraging existing user data (in the form of ratings/profiles/logs) have been proposed. Among the most popular is collaborative filtering (CF) [4]. 1 There are two dominant approaches to CF, namely memory-based approaches [12] and (more principled) latent class models (LCM) presented in [6]. The underlying principle of the former approach is th...

591 | GroupLens: applying collaborative filtering to Usenet news
- Konstan, Miller, et al.
- 1997
Citation Context: ...ata (in the form of ratings/profiles/logs) have been proposed. Among the most popular is collaborative filtering (CF) [4]. 1 There are two dominant approaches to CF, namely memory-based approaches [12] and (more principled) latent class models (LCM) presented in [6]. The underlying principle of the former approach is that in order to recommend new items to a given user, ratings/judgments of people...

531 | Probabilistic latent semantic analysis
- Hofmann
- 1999
Citation Context: ...in order to recommend new items to a given user, ratings/judgments of people in the database with "similar" interests are used. The latter approach is based on Probabilistic Latent Semantic Analysis [5]. The main advantage of this approach is that it is able to automatically discover preference patterns in user profile data without suffering the flaw of memory-based approaches -- the inability to acc...

280 | GTM: The generative topographic mapping
- Bishop, Svensén, et al.
- 1998
Citation Context: ...e learning literature. Perhaps the most famous example is the Kohonen self-organizing map (SOM) [10, 11]. More recently, statistically principled reformulations and extensions of SOMs appeared in e.g. [1, 9]. Of particular interest to us is the link between SOMs and vector quantization through noisy communication channels established in [2, 7, 13]. Briefly, in the information theoretic interpretation of...

54 | Vector quantization with complexity costs
- Buhmann, Kühnel
- 1993

25 | Statistical Methods for Speech Recognition (Language, Speech, and Communication)
- Jelinek
- 1998
Citation Context: ...al distribution of ratings for film y by the users belonging to the SOM class z. Due to the data sparseness, we perform a smoothing of the empirical estimates by applying the Laplace correction (see e.g. [8]):

P(v|y, z) = (N(v, y, z) + m) / (m|V| + Σ_{v'∈V} N(v', y, z)),   (16)

where m is a positive number 3 and N(v, y, z) is the number of times in the data set D that users belonging to the SOM center z rated...
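Equation (16) is ordinary Laplace (add-m) smoothing of a categorical count vector. A minimal sketch, with the hypothetical argument `counts` standing in for the vector N(·, y, z) over the rating values V:

```python
import numpy as np

def laplace_smoothed(counts, m=1.0):
    """Laplace-corrected estimate of P(v | y, z), as in eq. (16):
    (N(v,y,z) + m) / (m*|V| + sum_{v'} N(v',y,z)).
    counts: 1-D array of rating counts N(., y, z); m: positive smoothing constant."""
    V = counts.shape[0]
    return (counts + m) / (m * V + counts.sum())
```

With m = 1 and counts [0, 3, 7, 0, 0], the result sums to one and every rating value, including the unobserved ones, receives a strictly positive probability, which is exactly what the smoothing is for under sparse data.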

21 | A combined latent class and trait model for the analysis and visualization of discrete data
- Kabán, Girolami
- 2001
Citation Context: ...e learning literature. Perhaps the most famous example is the Kohonen self-organizing map (SOM) [10, 11]. More recently, statistically principled reformulations and extensions of SOMs appeared in e.g. [1, 9]. Of particular interest to us is the link between SOMs and vector quantization through noisy communication channels established in [2, 7, 13]. Briefly, in the information theoretic interpretation of...

19 | Competitive learning algorithms for robust vector quantization
- Hofmann, Buhmann
- 1998
Citation Context: ...cipled reformulations and extensions of SOMs appeared in e.g. [1, 9]. Of particular interest to us is the link between SOMs and vector quantization through noisy communication channels established in [2, 7, 13]. Briefly, in the information theoretic interpretation of SOM, the topological organization of codebook vectors (that correspond to nodes (classes) on the latent grid) emerges through non-uniformity o...

18 | Hierarchical GTM: Constructing localized nonlinear projection manifolds in a principled way
- Tino, Nabney
- 2002

16 | What people (don’t) want
- Hofmann
Citation Context: ...odels to a large collection of user ratings for films. 1 Introduction When deciding which book to read, which movie to watch, or which Web page to visit, we often rely on advice given by other people [6]. To help people share their evaluations, several techniques for leveraging existing user data (in the form of ratings/profiles/logs) have been proposed. Among the most popular is the collaborative...

12 | Hierarchical Vector Quantization
- Luttrell
Citation Context: ...cipled reformulations and extensions of SOMs appeared in e.g. [1, 9]. Of particular interest to us is the link between SOMs and vector quantization through noisy communication channels established in [2, 7, 13]. Briefly, in the information theoretic interpretation of SOM, the topological organization of codebook vectors (that correspond to nodes (classes) on the latent grid) emerges through non-uniformity o...