Results 1–10 of 1,221
Eigentaste: A Constant Time Collaborative Filtering Algorithm
, 2000
Abstract

Cited by 368 (6 self)
Eigentaste is a collaborative filtering algorithm that uses universal queries to elicit real-valued user ratings on a common set of items and applies principal component analysis (PCA) to the resulting dense subset of the ratings matrix. PCA facilitates dimensionality reduction for offline clustering of users and rapid computation of recommendations. For a database of n users, standard nearest-neighbor techniques require O(n) processing time to compute recommendations, whereas Eigentaste requires O(1) (constant) time. We compare Eigentaste to alternative algorithms using data from Jester, an online joke recommending system. Jester has collected approximately 2,500,000 ratings from 57,000 users. We use the Normalized Mean Absolute Error (NMAE) measure to compare performance of different algorithms. In the Appendix we use Uniform and Normal distribution models to derive analytic estimates of NMAE when predictions are random. On the Jester dataset, Eigentaste computes recommendations two ...
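The constant-time lookup described above can be sketched as follows. This is an illustrative toy, not the paper's code: the component vectors, grid size, and per-cell recommendation table are hypothetical stand-ins (the paper derives the components by PCA on the dense gauge-set ratings and builds the clusters offline).

```python
import math

# Toy sketch of Eigentaste's constant-time recommendation lookup.
# Offline: users are projected onto the top-2 principal axes of the gauge-set
# ratings and bucketed into grid cells; each cell stores a precomputed
# recommendation. Online: a new user is projected and mapped to a cell --
# an O(1) operation independent of the number of users n.

# Hypothetical principal components for a 3-item gauge set (in practice from PCA).
PC1 = (0.58, 0.58, 0.58)
PC2 = (0.71, -0.71, 0.0)
GAUGE_MEAN = (0.0, 0.0, 0.0)  # assume ratings are already mean-centered

def project(ratings):
    """Project a gauge-rating vector onto the 2-D eigenplane."""
    centered = [r - m for r, m in zip(ratings, GAUGE_MEAN)]
    x = sum(c * w for c, w in zip(centered, PC1))
    y = sum(c * w for c, w in zip(centered, PC2))
    return x, y

def cell_of(x, y, cell_size=1.0):
    """Map eigenplane coordinates to a grid-cell id (constant time)."""
    return (math.floor(x / cell_size), math.floor(y / cell_size))

# Offline-built table: cell -> recommended (non-gauge) item id.
recommendations = {(0, 0): "joke_17", (1, 0): "joke_42", (0, -1): "joke_3"}

def recommend(gauge_ratings):
    cell = cell_of(*project(gauge_ratings))
    return recommendations.get(cell, "joke_default")
```

The point of the sketch is the shape of the computation: the per-user online cost is two dot products and a dictionary lookup, regardless of n.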
A Public Management for All Seasons
 Public Administration
, 1991
Abstract

Cited by 344 (1 self)
This article discusses: the doctrinal content of the group of ideas known as 'new public management' (NPM); the intellectual provenance of those ideas; explanations for their apparent persuasiveness in the 1980s; and criticisms which have been made of the new doctrines. Particular attention is paid to the claim that NPM offers an all-purpose key to better provision of public services. This article argues that NPM has been most commonly criticized in terms of a claimed contradiction between 'equity' and 'efficiency' values, but that any critique which is to survive NPM's claim to 'infinite reprogrammability' must be couched in terms of possible conflicts between administrative values. The conclusion is that the ESRC's 'Management in Government' research initiative has been more valuable in helping to identify, rather than to definitively answer, the key conceptual questions raised by NPM. THE RISE OF NEW PUBLIC MANAGEMENT (NPM) The rise of 'new public management' (hereafter NPM) over the past 15 years is one of the most striking international trends in public administration. Though the research reported in the other papers in this issue refers mainly to UK experience, NPM is emphatically not a uniquely British development. NPM's rise seems to be
Truthful Mechanisms for One-Parameter Agents
Abstract

Cited by 236 (3 self)
In this paper, we show how to design truthful (dominant strategy) mechanisms for several combinatorial problems where each agent’s secret data is naturally expressed by a single positive real number. The goal of the mechanisms we consider is to allocate loads placed on the agents, and an agent’s secret data is the cost she incurs per unit load. We give an exact characterization for the algorithms that can be used to design truthful mechanisms for such load balancing problems using appropriate side payments. We use our characterization to design polynomial time truthful mechanisms for several problems in combinatorial optimization to which the celebrated VCG mechanism does not apply. For scheduling related parallel machines (Q||C_max), we give a 3-approximation mechanism based on randomized rounding of the optimal fractional solution. This problem is NP-complete, and the standard approximation algorithms (greedy load-balancing or the PTAS) cannot be used in truthful mechanisms. We show our mechanism to be frugal, in that the total payment needed is only a logarithmic factor more than the actual costs incurred by the machines, unless one machine dominates the total processing power. We also give truthful mechanisms for maximum flow, Q||∑C_j (scheduling related machines to minimize the sum of completion times), optimizing an affine function over a fixed set, and special cases of uncapacitated facility location. In addition, for Q||∑w_jC_j (minimizing the weighted sum of completion times), we prove a lower bound of 2/√3 for the best approximation ratio achievable by a truthful mechanism.
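The monotone-allocation-plus-threshold-payment recipe behind such characterizations can be illustrated with a toy single-task procurement. This is a hedged simplification, not the paper's scheduling mechanism: award the task to the lowest bidder and pay her the second-lowest bid, i.e. the threshold bid above which she would lose the task.

```python
# Toy one-parameter mechanism: monotone allocation + threshold payment.
# Allocation is monotone (lowering your bid can only win you the task),
# and paying the winner her threshold makes truthful bidding a dominant
# strategy: misreporting never changes whether she wins profitably.

def truthful_procurement(bids):
    """bids: list of claimed per-unit costs, one per agent.
    Returns (winner_index, payment)."""
    order = sorted(range(len(bids)), key=lambda i: bids[i])
    winner = order[0]
    payment = bids[order[1]]  # threshold: the second-lowest bid
    return winner, payment
```

For example, with bids [5.0, 3.0, 4.0] the agent bidding 3.0 wins and is paid 4.0; underbidding would not raise her payment, and overbidding risks losing a profitable task.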
A Survey of Top-k Query Processing Techniques in Relational Database Systems
Abstract

Cited by 162 (6 self)
Efficient processing of top-k queries is a crucial requirement in many interactive environments that involve massive amounts of data. In particular, efficient top-k processing in domains such as the Web, multimedia search and distributed systems has shown a great impact on performance. In this survey, we describe and classify top-k processing techniques in relational databases. We discuss different design dimensions in the current techniques including query models, data access methods, implementation levels, data and query certainty, and supported scoring functions. We show the implications of each dimension on the design of the underlying techniques. We also discuss top-k queries in the XML domain, and show their connections to relational approaches.
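As a concrete instance of one classic technique in this area, here is a minimal sketch of Fagin's Threshold Algorithm (TA) for a monotone sum scoring function over two score lists sorted descending; the list contents and the use of dicts to simulate random access are illustrative assumptions.

```python
import heapq

# Sketch of the Threshold Algorithm (TA): perform sorted access on both
# lists in lockstep, use random access to complete partial scores, and stop
# once k seen items score at least the threshold (the best score any
# still-unseen item could achieve).
def threshold_topk(list_a, list_b, k):
    """list_a, list_b: [(item, score)] sorted by score descending."""
    rand_a = dict(list_a)   # random-access lookup for list A
    rand_b = dict(list_b)   # random-access lookup for list B
    best = {}               # item -> total score (sum of both lists)
    for (item_a, sa), (item_b, sb) in zip(list_a, list_b):
        for item in (item_a, item_b):
            if item not in best:
                best[item] = rand_a.get(item, 0.0) + rand_b.get(item, 0.0)
        threshold = sa + sb  # upper bound on any unseen item's score
        topk = heapq.nlargest(k, best.items(), key=lambda kv: kv[1])
        if len(topk) == k and topk[-1][1] >= threshold:
            return topk      # early stop: no unseen item can break in
    return heapq.nlargest(k, best.items(), key=lambda kv: kv[1])
```

The early-stop condition is what lets TA answer without scanning either list to the end, which is the performance point the survey's abstract alludes to.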
An Impossibility Theorem for Clustering
, 2002
Abstract

Cited by 138 (0 self)
Although the study of clustering is centered around an intuitively compelling goal, it has been very difficult to develop a unified framework for reasoning about it at a technical level, and profoundly diverse approaches to clustering abound in the research community. Here we suggest a formal perspective on the difficulty in finding such a unification, in the form of an impossibility theorem: for a set of three simple properties, we show that there is no clustering function satisfying all three. Relaxations of these properties expose some of the interesting (and unavoidable) tradeoffs at work in well-studied clustering techniques such as single-linkage, sum-of-pairs, k-means, and k-median.
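For reference, the three properties can be paraphrased as follows, for a clustering function f that maps a distance function d on a point set S to a partition of S (this is a summary of the theorem's statement, not its proof):

```latex
\textbf{Scale-invariance:}\quad f(\alpha \cdot d) = f(d) \ \text{for every}\ \alpha > 0.
\\
\textbf{Richness:}\quad \text{every partition of } S \text{ arises as } f(d) \text{ for some } d.
\\
\textbf{Consistency:}\quad \text{if } d' \text{ shrinks distances within clusters of } f(d)
\text{ and expands distances between them, then } f(d') = f(d).
\\
\textbf{Theorem:}\quad \text{for } |S| \ge 2, \text{ no clustering function satisfies all three.}
```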
On the Logic of Merging
, 1998
Abstract

Cited by 137 (12 self)
This work proposes an axiomatic characterization of merging operators. It underlines the differences between arbitration operators and majority operators. A representation theorem is stated showing that each merging operator corresponds to a family of partial preorders on interpretations. Examples of operators are given. They show the consistency of the axiomatic characterization. A new merging operator ∆GMax is provided. It is proved that it is actually an arbitration operator. 1 Introduction In a growing number of applications, we face conflicting information coming from several sources. The problem is to reach a coherent piece of information from these contradicting ones. Many different merging methods have already been given [BI84, LMa, BKM91, BKMS92, Sub94]. Instead of giving one particular merging method we propose, in this paper, a characterization of such methods based on the rationality postulates they satisfy. We shall call merging operators those meth...
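The majority-versus-arbitration distinction can be sketched with a toy distance-based semantics over propositional interpretations. This assumes Hamming distances and model-set representations for belief bases; the operator details are illustrative, not the paper's exact definitions.

```python
from itertools import product

# Toy merging over 3 propositional atoms. Each base is its set of models
# (tuples of 0/1); d(w, base) = min Hamming distance from world w to a
# model of the base. A majority-style operator minimizes the SUM of
# distances to the bases; a GMax-style arbitration operator compares the
# descending-sorted distance vectors lexicographically, preferring worlds
# that spread disagreement evenly across the bases.

def hamming(u, v):
    return sum(a != b for a, b in zip(u, v))

def dist(world, base):
    return min(hamming(world, m) for m in base)

def merge(bases, n_atoms=3, mode="sum"):
    worlds = list(product((0, 1), repeat=n_atoms))
    if mode == "sum":                      # majority
        key = lambda w: sum(dist(w, b) for b in bases)
    else:                                  # GMax-style arbitration
        key = lambda w: sorted((dist(w, b) for b in bases), reverse=True)
    best = min(key(w) for w in worlds)
    return [w for w in worlds if key(w) == best]
```

With two maximally conflicting bases (models (1,1,1) and (0,0,0)), the sum operator is indifferent among all worlds, while the arbitration-style operator keeps only the "compromise" worlds that are not too far from either base.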
Group modeling: Selecting a sequence of television items to suit a group of viewers. User Modeling and User-Adapted Interaction
, 2004
Abstract

Cited by 115 (14 self)
Abstract. Watching television tends to be a social activity. So, adaptive television needs to adapt to groups of users rather than to individual users. In this paper, we discuss different strategies for combining individual user models to adapt to groups, some of which are inspired by Social Choice Theory. In a first experiment, we explore how humans select a sequence of items for a group to watch, based on data about the individuals’ preferences. The results show that humans use some of the strategies such as the Average Strategy (a.k.a. Additive Utilitarian), the Average Without Misery Strategy and the Least Misery Strategy, and care about fairness and avoiding individual misery. In a second experiment, we investigate how satisfied people believe they would be with sequences chosen by different strategies, and how their satisfaction corresponds with that predicted by a number of satisfaction functions. The results show that subjects use normalization, deduct misery, and use the ratings in a nonlinear way. One of the satisfaction functions produced reasonable, though not completely correct predictions. According to our subjects, the sequences produced by five strategies give satisfaction to all individuals in the group. The results also show that subjects put more emphasis than expected on showing the best rated item to each individual (at a cost of misery for another individual), and that the ratings of the first and last items in the sequence are especially important. In a final experiment, we explore the influence viewing an item can have on the ratings of other items. This is important for deciding the order in which to present items. The results show an effect of both mood and topical relatedness.
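Three of the named strategies can be sketched as follows; the rating scale (1–10), the misery threshold, and the function names are illustrative assumptions rather than the paper's exact formulations.

```python
# Toy group-aggregation strategies over per-person item ratings.

def average(ratings_by_person, item):
    """Additive Utilitarian: mean of the group's ratings for the item."""
    scores = [r[item] for r in ratings_by_person]
    return sum(scores) / len(scores)

def least_misery(ratings_by_person, item):
    """Score an item by its unhappiest viewer's rating."""
    return min(r[item] for r in ratings_by_person)

def average_without_misery(ratings_by_person, item, threshold=4):
    """Average, but exclude items rated below the threshold by anyone."""
    scores = [r[item] for r in ratings_by_person]
    if min(scores) < threshold:            # item causes misery: rule it out
        return float("-inf")
    return sum(scores) / len(scores)

def rank(ratings_by_person, strategy, **kw):
    """Order items best-first under the given strategy."""
    items = ratings_by_person[0].keys()
    return sorted(items, key=lambda i: strategy(ratings_by_person, i, **kw),
                  reverse=True)
```

The strategies genuinely diverge: with ratings {"news": 9, "soap": 5} and {"news": 3, "soap": 6}, Average puts news first (mean 6.0 vs 5.5), while Least Misery puts soap first (minimum 5 vs 3).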
Rationality and its Roles in Reasoning
 Computational Intelligence
, 1994
Abstract

Cited by 114 (5 self)
The economic theory of rationality promises to equal mathematical logic in its importance for the mechanization of reasoning. We survey the growing literature on how the basic notions of probability, utility, and rational choice, coupled with practical limitations on information and resources, influence the design and analysis of reasoning and representation systems. 1 Introduction People make judgments of rationality all the time, usually in criticizing someone else's thoughts or deeds as irrational, or in defending their own as rational. Artificial intelligence researchers construct systems and theories to perform or describe rational thought and action, criticizing and defending these systems and theories in terms similar to but more formal than those of the man or woman on the street. Judgments of human rationality commonly involve several different conceptions of rationality, including a logical conception used to judge thoughts, and an economic one used to judge actions or...
Merging Information Under Constraints: A Logical Framework
, 2002
Abstract

Cited by 112 (12 self)
We consider the problem of merging several belief bases in the presence of integrity constraints.