Results 1–10 of 29
Learning permutations with exponential weights
In the 20th Annual Conference on Learning Theory, 2007
Cited by 28 (5 self)
Abstract. We give an algorithm for learning a permutation online. The algorithm maintains its uncertainty about the target permutation as a doubly stochastic matrix. This matrix is updated by multiplying the current matrix entries by exponential factors. These factors destroy the doubly stochastic property of the matrix and an iterative procedure is needed to renormalize the rows and columns. Even though the result of the normalization procedure does not have a closed form, we can still bound the additional loss of our algorithm over the loss of the best permutation chosen in hindsight.
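The renormalization the abstract alludes to is Sinkhorn balancing: alternately rescaling rows and columns until both sets of sums reach 1. A minimal sketch under that reading (the matrix values and iteration count are illustrative):

```python
import math

def sinkhorn(W, iters=200):
    """Alternately rescale rows and columns so that W approaches a
    doubly stochastic matrix (every row and column sums to 1)."""
    W = [row[:] for row in W]
    n = len(W)
    for _ in range(iters):
        for i in range(n):                      # normalize each row
            s = sum(W[i])
            W[i] = [w / s for w in W[i]]
        for j in range(n):                      # normalize each column
            s = sum(W[i][j] for i in range(n))
            for i in range(n):
                W[i][j] /= s
    return W

# An exponential-factor update destroys double stochasticity;
# iterative rebalancing restores it (entries here are illustrative).
W0 = [[0.5 * math.exp(-1.0), 0.5 * math.exp(-0.2)],
      [0.5 * math.exp(-0.3), 0.5 * math.exp(-0.8)]]
balanced = sinkhorn(W0)
```

As the abstract notes, the balanced matrix has no closed form; the loop only converges to it.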
A Practical Liquidity-Sensitive Automated Market Maker
In Proceedings of the 11th ACM Conference on Electronic Commerce (EC), 2010
Cited by 20 (6 self)
Current automated market makers over binary events suffer from two problems that make them impractical. First, they are unable to adapt to liquidity, so trades cause prices to move the same amount in both thick and thin markets. Second, under normal circumstances, the market maker runs at a deficit. In this paper, we construct a market maker that is both sensitive to liquidity and can run at a profit. Our market maker has bounded loss for any initial level of liquidity and, as the initial level of liquidity approaches zero, worst-case loss approaches zero. For any level of initial liquidity we can establish a boundary in market state space such that, if the market terminates within that boundary, the market maker books a profit regardless of the realized outcome. Furthermore, we provide guidance as to how our market maker can be implemented over very large event spaces through a novel cost-function-based sampling method.
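A cost function of the form C(q) = b(q) · log Σᵢ e^(qᵢ/b(q)) with b(q) = α Σᵢ qᵢ, so that depth grows with traded volume, is one way to realize the liquidity sensitivity described. A rough numeric sketch (α and the share vectors are chosen for illustration) showing that the same purchase moves the quoted price less in a thick market:

```python
import math

def cost(q, alpha=0.05):
    """Liquidity-sensitive cost: C(q) = b(q) * log(sum_i e^{q_i/b(q)}),
    with b(q) = alpha * sum_i q_i, so depth scales with total volume."""
    b = alpha * sum(q)
    m = max(qi / b for qi in q)                 # log-sum-exp stabilization
    return b * (m + math.log(sum(math.exp(qi / b - m) for qi in q)))

def price(q, i, alpha=0.05, eps=1e-5):
    """Numerical instantaneous price of outcome i (gradient of the cost)."""
    q2 = list(q); q2[i] += eps
    return (cost(q2, alpha) - cost(q, alpha)) / eps

def price_move(q, i, shares, alpha=0.05):
    """How much buying `shares` of outcome i moves its quoted price."""
    before = price(q, i, alpha)
    q2 = list(q); q2[i] += shares
    return price(q2, i, alpha) - before

thin = [10.0, 10.0]       # low outstanding volume
thick = [1000.0, 1000.0]  # high outstanding volume
```

With a fixed-b LMSR the two moves would be identical; here the thick market absorbs the trade with far less price impact.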
An Optimization-Based Framework for Automated Market-Making
In EC'11, 2011
Cited by 14 (6 self)
We propose a general framework for the design of securities markets over combinatorial or infinite state or outcome spaces. The framework enables the design of computationally efficient markets tailored to an arbitrary, yet relatively small, space of securities with bounded payoff. We prove that any market satisfying a set of intuitive conditions must price securities via a convex cost function, which is constructed via conjugate duality. Rather than deal with an exponentially large or infinite outcome space directly, our framework only requires optimization over a convex hull. By reducing the problem of automated market making to convex optimization, where many efficient algorithms exist, we arrive at a range of new polynomial-time pricing mechanisms for various problems. We demonstrate the advantages of this framework with the design of some particular markets. We also show that by relaxing the convex hull we can gain computational tractability without compromising the market institution’s bounded budget.
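The conjugate-duality construction can be checked numerically in the simplest complete-market case: with the negative-entropy regularizer R(x) = b Σᵢ xᵢ log xᵢ over the probability simplex, the induced cost sup_x ⟨q, x⟩ − R(x) equals the LMSR cost b log Σᵢ e^(qᵢ/b). A two-outcome sketch (grid resolution is illustrative):

```python
import math

def lmsr_cost(q, b=1.0):
    """Closed-form LMSR cost: C(q) = b * log(sum_i e^{q_i/b})."""
    m = max(q)
    return m + b * math.log(sum(math.exp((qi - m) / b) for qi in q))

def conjugate_cost(q, b=1.0, steps=2000):
    """C(q) = sup over the simplex of <q, x> - R(x), with the
    negative-entropy regularizer R(x) = b * sum_i x_i log x_i.
    Two outcomes, so the simplex is parameterized as x = (t, 1 - t)."""
    best = -float("inf")
    for k in range(1, steps):
        t = k / steps
        x = (t, 1.0 - t)
        R = b * sum(xi * math.log(xi) for xi in x)
        best = max(best, q[0] * x[0] + q[1] * x[1] - R)
    return best
```

For large outcome spaces the paper replaces this brute-force supremum with optimization over the convex hull of payoff vectors, which is the source of the polynomial-time mechanisms.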
Designing Markets for Prediction
2010
Cited by 14 (3 self)
We survey the literature on prediction mechanisms, including prediction markets and peer prediction systems. We pay particular attention to the design process, highlighting the objectives and properties that are important in the design of good prediction mechanisms.
Automated Market-Making in the Large: The Gates Hillman Prediction Market
Cited by 8 (5 self)
We designed and built the Gates Hillman Prediction Market (GHPM) to predict the opening day of the Gates and Hillman Centers, the new computer science buildings at Carnegie Mellon University. The market ran for almost a year and attracted 169 active traders who placed almost 40,000 bets with an automated market maker. Ranging over 365 possible opening days, the market’s event partition size is the largest ever elicited in any prediction market by an order of magnitude. A market of this size required new advances, including a novel span-based elicitation interface. The results of the GHPM are important for two reasons. First, we uncovered two flaws of current automated market makers, spikiness and liquidity-insensitivity, and we developed the mathematical underpinnings of these flaws. Second, the market provides a valuable corpus of identity-linked trades. We use this data set to explore whether the market reacted to or anticipated official communications, how self-reported trader confidence had little relation to actual performance, and how trade frequencies suggest a power law distribution. Most significantly, the data enabled us to evaluate two competing hypotheses about how markets aggregate information, the Marginal Trader Hypothesis and the Hayek Hypothesis; the data strongly support the former.
Parimutuel betting on permutations
In International Workshop on Internet and Network Economics, 2008
Cited by 8 (0 self)
We focus on a permutation betting market under the parimutuel call auction model, where traders bet on the final ranking of n candidates. We present a Proportional Betting mechanism for this market. Our mechanism allows the traders to bet on any subset of the n^2 ‘candidate-rank’ pairs, and rewards them proportionally to the number of pairs that appear in the final outcome. We show that the market organizer’s decision problem for this mechanism can be formulated as a convex program of polynomial size. More importantly, the formulation yields a set of n^2 unique marginal prices that are sufficient to price the bets in this mechanism, and are computable in polynomial time. The marginal prices reflect the traders’ beliefs about the marginal distributions over outcomes. We also propose techniques to compute the joint distribution over n! permutations from these marginal distributions. We show that using a maximum entropy criterion, we can obtain a concise parametric form (with only n^2 parameters) for the joint distribution, which is defined over an exponentially large state space. We then present an approximation algorithm for computing the parameters of this distribution. In fact, the algorithm addresses the generic problem of finding the maximum entropy distribution over permutations that has a given mean, and may be of independent interest.
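The reward rule of the Proportional Betting mechanism can be sketched directly: a trader holding a set of (candidate, rank) pairs is paid in proportion to how many of those pairs occur in the realized permutation. A minimal sketch (the unit payout is illustrative; the convex-program pricing is not shown):

```python
def proportional_payoff(bet_pairs, final_ranking, unit=1.0):
    """Pay `unit` per (candidate, rank) pair in the trader's bet that
    appears in the realized ranking (ranks are 1-based)."""
    realized = {(cand, rank) for rank, cand in enumerate(final_ranking, start=1)}
    return unit * sum(1 for pair in bet_pairs if pair in realized)

# Example: candidates A, B, C finish in the order A, B, C.
outcome = ["A", "B", "C"]
```

A bet on {(A, 1), (B, 3)} would collect for the one pair, (A, 1), realized by this outcome.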
Adaptive Polling for Information Aggregation
Cited by 8 (1 self)
The flourishing of online labor markets such as Amazon Mechanical Turk (MTurk) makes it easy to recruit many workers for solving small tasks. We study whether information elicitation and aggregation over a combinatorial space can be achieved by integrating small pieces of potentially imprecise information, gathered from a large number of workers through simple, one-shot interactions in an online labor market. We consider the setting of predicting the ranking of n competing candidates, each having a hidden underlying strength parameter. At each step, our method estimates the strength parameters from the collected pairwise comparison data and adaptively chooses another pairwise comparison question for the next recruited worker. Through an MTurk experiment, we show that the adaptive method effectively elicits and aggregates information, outperforming a naïve method using a random pairwise comparison question at each step.
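The abstract does not name its comparison model; Bradley–Terry is the standard choice for pairwise data with latent strengths. The sketch below fits strengths by gradient ascent and then asks about the pair with the closest estimates. Both the fitting routine and the selection rule are illustrative simplifications, not the paper's method:

```python
import math

def fit_bradley_terry(wins, n, iters=200, lr=0.1):
    """Crude gradient fit of Bradley-Terry strengths theta from
    wins[(i, j)] = number of times candidate i beat candidate j."""
    theta = [0.0] * n
    for _ in range(iters):
        grad = [0.0] * n
        for (i, j), w in wins.items():
            p = 1.0 / (1.0 + math.exp(theta[j] - theta[i]))  # P(i beats j)
            grad[i] += w * (1.0 - p)
            grad[j] -= w * (1.0 - p)
        theta = [t + lr * g for t, g in zip(theta, grad)]
        mean = sum(theta) / n
        theta = [t - mean for t in theta]    # pin down the scale's origin
    return theta

def next_question(theta):
    """One plausible adaptive rule: ask about the pair whose estimated
    strengths are closest (highest outcome uncertainty)."""
    n = len(theta)
    return min(((i, j) for i in range(n) for j in range(i + 1, n)),
               key=lambda ij: abs(theta[ij[0]] - theta[ij[1]]))
```

Each newly recruited worker would answer the pair returned by `next_question`, and the fit would be refreshed with their answer.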
Price Updating in Combinatorial Prediction Markets with Bayesian Networks
Cited by 6 (3 self)
To overcome the #P-hardness of computing/updating prices in logarithmic market scoring rule-based (LMSR-based) combinatorial prediction markets, Chen et al. [5] recently used a simple Bayesian network to represent the prices of securities in combinatorial prediction markets for tournaments, and showed that two types of popular securities are structure preserving. In this paper, we significantly extend this idea by employing Bayesian networks in general combinatorial prediction markets. We reveal a very natural connection between LMSR-based combinatorial prediction markets and probabilistic belief aggregation, which leads to a complete characterization of all structure preserving securities for decomposable network structures. Notably, the main results by Chen et al. [5] are corollaries of our characterization. We then prove that in order for a very basic set of securities to be structure preserving, the graph of the Bayesian network must be decomposable. We also discuss some approximation techniques for securities that are not structure preserving.
Efficient market making via convex optimization, and a connection to online learning
In ACM Transactions on Economics and Computation (to appear), 2012
Cited by 6 (2 self)
We propose a general framework for the design of securities markets over combinatorial or infinite state or outcome spaces. The framework enables the design of computationally efficient markets tailored to an arbitrary, yet relatively small, space of securities with bounded payoff. We prove that any market satisfying a set of intuitive conditions must price securities via a convex cost function, which is constructed via conjugate duality. Rather than deal with an exponentially large or infinite outcome space directly, our framework only requires optimization over a convex hull. By reducing the problem of automated market making to convex optimization, where many efficient algorithms exist, we arrive at a range of new polynomial-time pricing mechanisms for various problems. We demonstrate the advantages of this framework with the design of some particular markets. We also show that by relaxing the convex hull we can gain computational tractability without compromising the market institution’s bounded budget. Although our framework was designed with the goal of deriving efficient automated market makers for markets with very large outcome spaces, this framework also provides new insights into the relationship between market design and machine learning, and into the complete market setting. Using our framework, we illustrate the mathematical parallels between cost-function-based markets and online learning and establish a correspondence between cost-function-based markets and market scoring rules for complete markets.
Combinatorial Prediction Markets for Event Hierarchies
Cited by 5 (2 self)
We study combinatorial prediction markets where agents bet on the sum of values at any tree node in a hierarchy of events, for example the sum of page views among all the children within a web subdomain. We propose three expressive betting languages that seem natural, and analyze the complexity of pricing using Hanson’s logarithmic market scoring rule (LMSR) market maker. Sum of arbitrary subset (SAS) allows agents to bet on the weighted sum of an arbitrary subset of values. Sum with varying weights (SVW) allows agents to set their own weights in their bets but restricts them to only bet on subsets that correspond to tree nodes in a fixed hierarchy. We show that LMSR pricing is NP-hard for both SAS and SVW. Sum with predefined weights (SPW) also restricts bets to nodes in a hierarchy, but using predefined weights. We derive a polynomial time pricing algorithm for SPW. We discuss the algorithm’s generalization to other betting contexts, including betting on maximum/minimum and betting on the product of binary values. Finally, we describe a prototype we built to predict web site page views and discuss the implementation issues that arose.
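All three languages are priced against Hanson's LMSR, whose cost function is C(q) = b log Σᵢ e^(qᵢ/b): a trade moving outstanding shares from q to q + δ costs C(q + δ) − C(q), and instantaneous prices are the softmax of q/b. A minimal sketch (the liquidity parameter b is illustrative):

```python
import math

def lmsr_prices(q, b=100.0):
    """Instantaneous LMSR prices: p_i = e^{q_i/b} / sum_j e^{q_j/b}."""
    m = max(q)                                  # log-sum-exp stabilization
    ws = [math.exp((qi - m) / b) for qi in q]
    z = sum(ws)
    return [w / z for w in ws]

def lmsr_trade_cost(q, delta, b=100.0):
    """Cost charged for moving outstanding shares from q to q + delta:
    C(q + delta) - C(q), with C(q) = b * log(sum_i e^{q_i/b})."""
    def C(q):
        m = max(q)
        return m + b * math.log(sum(math.exp((qi - m) / b) for qi in q))
    return C([qi + di for qi, di in zip(q, delta)]) - C(q)
```

The hardness results concern evaluating these same quantities when a bet's payoff sums over exponentially many leaves of the hierarchy, not the per-outcome formulas above.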