Computation in a Distributed Information Market
, 2003
Abstract

Cited by 22 (4 self)
According to economic theory, supported by empirical and laboratory evidence, the equilibrium price of a financial security reflects all of the information regarding the security's value. We investigate the dynamics of the computational process on the path toward equilibrium, where information distributed among traders is revealed step by step over time and incorporated into the market price. We develop a simplified model of an information market, along with trading strategies, in order to formalize the computational properties of the process. We show that securities whose payoffs cannot be expressed as a weighted threshold function of distributed input bits are not guaranteed to converge to the proper equilibrium predicted by economic theory. On the other hand, securities whose payoffs are threshold functions are guaranteed to converge, for all prior probability distributions. Moreover, these threshold securities converge in at most n rounds, where n is the number of bits of distributed information. We also prove a lower bound, showing a type of threshold security that requires at least n/2 rounds to converge in the worst case.
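To make the central notion concrete, here is a minimal sketch (our illustration, not the paper's code) of a security payoff expressed as a weighted threshold function of n distributed input bits; the function name, weights, and threshold are hypothetical examples.

```python
# Toy sketch: a security whose payoff is a weighted threshold function
# of distributed input bits, the class the paper shows is guaranteed
# to converge in at most n rounds.

def threshold_payoff(bits, weights, theta):
    """Pays 1 iff the weighted sum of the input bits reaches theta."""
    return 1 if sum(w * b for w, b in zip(weights, bits)) >= theta else 0

# Majority of three bits is a threshold function (weights 1,1,1, theta 2).
print(threshold_payoff([1, 1, 0], [1, 1, 1], 2))  # -> 1
print(threshold_payoff([1, 0, 0], [1, 1, 1], 2))  # -> 0

# By contrast, XOR of two bits cannot be written in this form for any
# choice of weights and threshold, so by the paper's result a security
# paying off on XOR is not guaranteed to reach the predicted equilibrium.
```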
The power of play: Efficiency and forecast accuracy in web market games
 NEC RESEARCH INSTITUTE
, 2000
Abstract

Cited by 11 (1 self)
We analyze the efficiency and forecast accuracy of two market games on the World Wide Web: the Hollywood Stock Exchange (HSX) and the Foresight Exchange (FX). We quantify the degree of arbitrage available on HSX, and compare with a real-money market of a similar nature. We show that prices of HSX movie stocks provide good forecasts of actual box office returns, and that prices of HSX securities in Oscar, Emmy, and Grammy award outcomes constitute accurate assessments of the actual likelihoods that nominees will win. Similar investigations reveal that FX securities prices serve as reliable indicators of uncertain future events. We argue that, in certain circumstances, market simulations can furnish some of the same societal benefits as real markets, and can serve as acceptable substitute testbeds for conducting experiments that would otherwise be difficult or impossible.
Disagreement is Unpredictable
 Economics Letters
, 2002
Abstract

Cited by 7 (3 self)
Given common priors, no agent can publicly estimate a nonzero sign for the difference between his estimate and another agent’s future estimate. Thus rational agents cannot publicly anticipate the direction in which other agents will disagree with them.
Theoretical investigation of prediction markets with aggregate uncertainty
 In Proceedings of the Seventh International Conference on Electronic Commerce Research (ICECR-7)
, 2004
Abstract

Cited by 4 (3 self)
Much evidence supports that financial markets have the ability to aggregate information. When tied to a random variable, a financial market can forecast the value of the random variable; it then becomes a prediction market. We establish a model of prediction markets with aggregate uncertainty, and theoretically characterize some fundamental properties of prediction markets. Specifically, we show that a prediction market is guaranteed to converge to an equilibrium in which traders have consensus on the forecast. The best possible prediction a market can make is the direct communication equilibrium, but prediction markets do not always converge to it. We prove that a sufficient condition for convergence to the direct communication equilibrium under our model is that the private information of each trader, conditioned on the state of the world, is independently and identically distributed. Furthermore, if this condition is satisfied, the prediction market converges in at most two rounds.
Consensus By Identifying Extremists
Abstract

Cited by 4 (2 self)
Given a finite state space and common priors, common knowledge of the identity of an agent with the minimal (or maximal) expectation of a random variable implies “consensus”, i.e., common knowledge of common expectations. This “extremist” statistic induces consensus when repeatedly announced, and yet, with n agents, requires at most log₂ n bits to broadcast. Key words: consensus, common knowledge, information pooling, Bayesian learning.
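The log₂ n bound is just the cost of naming one agent out of n. A tiny sketch (our illustration, with hypothetical names and example expectations) makes that explicit:

```python
import math

# Toy sketch: identify the minimal-expectation ("extremist") agent and
# count the bits needed to announce that identity to the group.

def bits_to_broadcast_extremist(expectations):
    """Index of the minimal-expectation agent, and ceil(log2 n) bits."""
    n = len(expectations)
    extremist = min(range(n), key=lambda i: expectations[i])
    return extremist, math.ceil(math.log2(n))

idx, bits = bits_to_broadcast_extremist([0.42, 0.17, 0.55, 0.30])
print(idx, bits)  # agent 1 is the extremist; 2 bits suffice for n = 4
```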
Designing informative securities
 In UAI
, 2012
Abstract

Cited by 2 (2 self)
We create a formal framework for the design of informative securities in prediction markets. These securities allow a market organizer to infer the likelihood of events of interest as accurately as if he knew all of the traders’ private signals. We consider the design of markets that are always informative, markets that are informative for a particular signal structure of the participants, and informative markets constructed from a restricted selection of securities. We find that to achieve informativeness, it can be necessary to allow participants to express information that may not be directly of interest to the market organizer, and that understanding the participants’ signal structure is important for designing informative prediction markets.
Information Elicitation Sans Verification
, 2013
Abstract

Cited by 2 (1 self)
The recent advent of human computation — employing groups of nonexperts to solve problems — has motivated study of a question in mechanism design: How do we elicit useful information when we are unable to verify reports? Existing methods, such as peer prediction and Bayesian truth serum, require assumptions either on the mechanism’s knowledge about the participants or on the participants’ information structure in order to elicit their private information. Meanwhile, there are simple mechanisms in practice, such as the ESP Game, that seem to require no such assumptions, yet have achieved great empirical success. We attack this paradox from two directions. First, we provide a broad formalization of the problem of information elicitation without verification and show that, without assumptions on designer knowledge or participants’ information, there exist no mechanisms that can truthfully elicit the participants’ private information in this setting. Second, we define and analyze the output agreement class of mechanisms, an extremely broad but simple class in which players are rewarded based on the metric distance between their reports. Output agreement makes no assumptions on designer knowledge or participants’ information and thus cannot be “truthful”. We resolve the paradox by showing that it is nonetheless useful: it elicits the correct answer according to the common knowledge among the players. We conclude with an analysis of the assumptions and results of various popular mechanisms for information elicitation without verification.
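An output-agreement-style reward can be sketched in a few lines (a minimal illustration under our own assumptions, not the authors' mechanism): reports live in a metric space and each player's reward decreases with the distance to a peer's report.

```python
# Toy sketch of an output-agreement reward: players are paid based on
# the metric distance between their reports, with no verification of
# which report is "correct". The linear reward shape is our assumption.

def output_agreement_reward(report_a, report_b, scale=1.0):
    """Reward both players based on how close their reports are."""
    distance = abs(report_a - report_b)   # a metric on the report space
    return max(0.0, scale - distance)     # closer reports earn more

print(output_agreement_reward(0.7, 0.7))  # identical reports: full reward 1.0
print(output_agreement_reward(0.9, 0.2))  # distant reports: reward ~0.3
```

Because payment depends only on agreement, the mechanism cannot be truthful in the usual sense, which is exactly the tension the abstract describes.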
Polling games and information revelation in the Downsian framework
, 2005
Abstract

Cited by 1 (1 self)
We investigate the incentives faced by poll respondents when candidates use polling data to inform their selection of policy platforms. Focusing on models with a unidimensional policy space, single-peaked preferences, and two office-seeking candidates who observe a summary statistic from polls that ask respondents their preferences, we find that for most environments honest poll response cannot occur in a perfect Bayesian equilibrium. However, simple partially-revealing equilibria exist when the poll only asks respondents which party or candidate they prefer. When the candidates learn the sample average or see all the data, there are partially-revealing equilibria that mimic those of the binary message game. Interpretation of polling data requires knowledge of the equilibrium played, as the meanings of poll responses are endogenously determined. The analysis suggests that naive use of polling data may be misleading.
Making consensus tractable
, 2010
Abstract

Cited by 1 (1 self)
The process of consensus voting, or decision making by unanimous agreement, has many distinct advantages: it fosters discussion and participation, empowers minorities and independent thinkers, and is more likely, after a decision has been made, to secure the participants’ support for the chosen course of action. These considerations, among others, have led many institutions to adopt consensus voting as a practical method of decision making. The disadvantage of consensus decision making is, of course, the difficulty of reaching consensus. While this challenge is largely overcome in many theoretical settings, such as Aumann’s “agree to disagree” result and its related literature, a hitherto unsolved difficulty is the lack of a framework offering rational (i.e., Bayesian) consensus decision making that can be performed using simple and efficient calculations. We study a stochastic model featuring a finite group of agents that have to choose between one of two courses of action. Each member of the group has a private and independent signal at his or her disposal, giving some indication as to which action is optimal. To come to a common decision, the participants perform repeated rounds of voting. In each round, each agent casts a vote in favor of one of the two courses of action, reflecting his or her current conditional probabilities, and observes the votes of the rest in order to calculate an updated conditional probability. We prove four results: 1. Consensus is always reached. 2. Each round of voting improves the aggregation of information. 3. The chance of a correct decision quickly approaches one as the number of agents increases; this is achieved already at the second round of voting. 4. Most importantly, we provide an efficient algorithm for the calculations the agents have to perform.
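The voting dynamic above can be illustrated with a drastically simplified toy (our own assumption-laden sketch, not the paper's model or algorithm): a binary state with a uniform prior, conditionally i.i.d. binary signals correct with probability q, and agents who vote their current best guess. In this symmetric special case a round-1 vote reveals each agent's signal, every agent can then compute the common pooled posterior, and the round-2 vote is unanimous.

```python
from fractions import Fraction

# Toy sketch: binary state S in {0,1}, uniform prior, each agent's signal
# equals S with probability q. Round-1 votes reveal the signals; round 2
# is a unanimous vote on the common pooled posterior.

def pooled_posterior(signals, q=Fraction(2, 3)):
    """P(S=1 | all signals) under a uniform prior."""
    like1 = like0 = Fraction(1)
    for s in signals:
        like1 *= q if s == 1 else 1 - q
        like0 *= q if s == 0 else 1 - q
    return like1 / (like1 + like0)

def vote_rounds(signals):
    round1 = list(signals)                 # each agent votes its own signal
    p = pooled_posterior(round1)           # common posterior after observing votes
    round2 = [1 if p > Fraction(1, 2) else 0] * len(signals)
    return round1, round2                  # consensus reached by round 2

r1, r2 = vote_rounds([1, 1, 0])
print(r1, r2)  # [1, 1, 0] [1, 1, 1]
```

The paper's actual contribution is an efficient update rule that works round after round in a far more general setting; this sketch only shows why repeated public votes can pool dispersed private signals at all.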
Asymptotic Learning on Bayesian Social Networks
, 2012
Abstract

Cited by 1 (1 self)
Understanding information exchange and aggregation on networks is a central problem in theoretical economics, probability, and statistics. We study a standard model of economic agents on the nodes of a social network graph who learn a binary “state of the world” S, from initial signals, by repeatedly observing each other’s best guesses. Asymptotic learning is said to occur on a family of graphs Gn = (Vn, En) with |Vn| → ∞ if, with probability tending to 1 as n → ∞, all agents in Gn eventually estimate S correctly. We identify sufficient conditions for asymptotic learning and construct examples where learning does not occur when the conditions do not hold.