Results 1–10 of 25
Bayesian Algorithmic Mechanism Design
, 2010
Abstract

Cited by 45 (11 self)
The principal problem in algorithmic mechanism design is in merging the incentive constraints imposed by selfish behavior with the algorithmic constraints imposed by computational intractability. This field is motivated by the observation that the preeminent approach for designing incentive-compatible mechanisms, namely that of Vickrey, Clarke, and Groves, and the central approach for circumventing computational obstacles, that of approximation algorithms, are fundamentally incompatible: natural applications of the VCG approach to an approximation algorithm fail to yield an incentive-compatible mechanism. We consider relaxing the desideratum of (ex post) incentive compatibility (IC) to Bayesian incentive compatibility (BIC), where truthtelling is a Bayes-Nash equilibrium (the standard notion of incentive compatibility in economics). For welfare maximization in single-parameter agent settings, we give a general black-box reduction that turns any approximation algorithm into a Bayesian incentive-compatible mechanism with essentially the same approximation factor.
Bayesian combinatorial auctions: Expanding single buyer mechanisms to many buyers
 In FOCS. 512–521
Abstract

Cited by 40 (2 self)
• Bronze Medal, 13th International Olympiad in Informatics, Tampere, Finland,
When LP is the Cure for Your Matching Woes: Improved Bounds for Stochastic Matchings (Extended Abstract)
Abstract

Cited by 23 (5 self)
Abstract. Consider a random graph model where each possible edge e is present independently with some probability pe. We are given these numbers pe, and want to build a large/heavy matching in the randomly generated graph. However, the only way we can find out whether an edge is present or not is to query it, and if the edge is indeed present in the graph, we are forced to add it to our matching. Further, each vertex i is allowed to be queried at most ti times. How should we adaptively query the edges to maximize the expected weight of the matching? We consider several matching problems in this general framework (some of which arise in kidney exchanges and online dating, and others arise in modeling online advertisements); we give LP-rounding based constant-factor approximation algorithms for these problems. Our main results are: • We give a 5.75-approximation for weighted stochastic matching on general graphs, and a 5-approximation on bipartite graphs. This answers an open question from [Chen et al. ICALP 09]. • Combining our LP-rounding algorithm with the natural greedy algorithm, we give an improved 3.88-approximation for unweighted stochastic matching on general graphs and a 3.51-approximation on bipartite graphs. • We introduce a generalization of the stochastic online matching problem [Feldman et al. FOCS 09] that also models preference-uncertainty and timeouts of buyers, and give a constant-factor approximation algorithm.
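The query-commit model in this abstract is easy to simulate directly. Below is a hypothetical toy sketch (the probability-sorted probing heuristic and all names are illustrative, not the paper's LP-rounding algorithm): probe edges one at a time, a probed edge that turns out to exist must be added to the matching, and each vertex v may be probed at most t[v] times.

```python
import random

def greedy_stochastic_matching(edges, p, t, seed=0):
    """Simulate the query-commit model: probe edges in a fixed
    (here: probability-sorted) order; a probed edge that turns out
    to be present MUST be added to the matching, and each vertex v
    may be probed at most t[v] times (its "patience")."""
    rng = random.Random(seed)
    probes_left = dict(t)        # remaining patience per vertex
    matched = set()              # vertices already matched
    matching = []
    # One simple heuristic: probe the most likely edges first.
    for (u, v) in sorted(edges, key=lambda e: -p[e]):
        if u in matched or v in matched:
            continue
        if probes_left[u] == 0 or probes_left[v] == 0:
            continue
        probes_left[u] -= 1
        probes_left[v] -= 1
        if rng.random() < p[(u, v)]:     # edge turns out to exist
            matching.append((u, v))      # commitment: forced to take it
            matched.update((u, v))
    return matching
```

The commitment constraint is what makes the problem non-trivial: probing a present edge consumes both endpoints, so the order of probes matters even before any randomness resolves.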
Bayesian Optimal Auctions via Multi- to Single-agent Reduction
, 1203
Abstract

Cited by 21 (6 self)
We study an abstract optimal auction problem for a single good or service. This problem includes environments where agents have budgets, risk preferences, or multi-dimensional preferences over several possible configurations of the good (furthermore, it allows an agent’s budget and risk preference to be known only privately to the agent). These are the main challenge areas for auction theory. A single-agent problem is to optimize a given objective subject to a constraint on the maximum probability with which each type is allocated, a.k.a., an allocation rule. Our approach is a reduction from the multi-agent mechanism design problem to a collection of single-agent problems. We focus on maximizing revenue, but our results can be applied to other objectives (e.g., welfare). An optimal multi-agent mechanism can be computed by a linear/convex program on interim allocation rules by simultaneously optimizing several single-agent mechanisms subject to joint feasibility of the allocation rules. For single-unit auctions, Border (1991) showed that the space of all jointly feasible interim allocation rules for n agents is a D-dimensional convex polytope which can be specified by 2^D linear constraints, where D is the total number of all agents’ types.
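Border's characterization mentioned in this abstract can be checked by brute force when the type space is small. The sketch below (function name and tolerance are my own, assuming the symmetric i.i.d. single-item version of Border's theorem) enumerates the 2^D subset constraints: types in S cannot win more often than the probability that some agent with a type in S shows up.

```python
from itertools import combinations

def border_feasible(f, q, n, tol=1e-9):
    """Check Border's (1991) feasibility conditions for a symmetric
    single-item auction with n i.i.d. agents and a finite type space.
    f[t] is the prior probability of type t and q[t] the interim
    allocation probability. For every subset S of types, feasibility
    requires
        n * sum_{t in S} f[t] * q[t] <= 1 - (1 - F(S))**n,
    where F(S) = sum_{t in S} f[t]: the expected number of winners
    with a type in S is at most the chance such a type appears."""
    types = list(f)
    for r in range(1, len(types) + 1):
        for S in combinations(types, r):
            lhs = n * sum(f[t] * q[t] for t in S)
            FS = sum(f[t] for t in S)
            if lhs > 1 - (1 - FS) ** n + tol:
                return False
    return True
```

For example, with two agents and equally likely types H and L, always awarding the item to a (random) H-agent when one is present gives interim probabilities q(H) = 0.75 and q(L) = 0.25, which sits exactly on the S = {H} constraint.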
Bayesian Incentive Compatibility via Fractional Assignments
Abstract

Cited by 20 (3 self)
Very recently, Hartline and Lucier [14] studied single-parameter mechanism design problems in the Bayesian setting. They proposed a black-box reduction that converted Bayesian approximation algorithms into Bayesian-incentive-compatible (BIC) mechanisms while preserving social welfare. It remains a major open question whether one can find a similar reduction in the more important multi-parameter setting. In this paper, we give a positive answer to this question when the prior distribution has finite and small support. We propose a black-box reduction for designing BIC multi-parameter mechanisms. The reduction converts any algorithm into an ε-BIC mechanism with only marginal loss in social welfare. As a result, for combinatorial auctions with subadditive agents we get an ε-BIC mechanism that achieves a constant approximation.
The Exponential Mechanism for Social Welfare: Private, Truthful, and Nearly Optimal
, 2012
Abstract

Cited by 19 (2 self)
In this paper, we show that for any mechanism design problem, the exponential mechanism can be implemented as a truthful mechanism while still preserving differential privacy, if the objective is to maximize social welfare. Our instantiation of the exponential mechanism can be interpreted as a generalization of the VCG mechanism in the sense that the VCG mechanism is the extreme case when the privacy parameter goes to infinity. To our knowledge, this is the first general tool for designing mechanisms that are both truthful and differentially private.
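The selection rule underlying this abstract is the standard exponential mechanism. A hypothetical sketch of just that sampling step (names are my own; this omits the paper's payment scheme, which is what makes the mechanism truthful): sample an outcome with probability proportional to exp(eps * welfare / (2 * sensitivity)).

```python
import math
import random

def exponential_mechanism(outcomes, welfare, eps, sensitivity=1.0, seed=0):
    """Sample an outcome with probability proportional to
    exp(eps * welfare(o) / (2 * sensitivity)). As eps -> infinity the
    distribution concentrates on the welfare-maximizing outcome,
    matching the VCG limit described in the abstract."""
    rng = random.Random(seed)
    scores = [eps * welfare(o) / (2.0 * sensitivity) for o in outcomes]
    m = max(scores)                       # subtract max to avoid overflow
    weights = [math.exp(s - m) for s in scores]
    r = rng.random() * sum(weights)
    for o, w in zip(outcomes, weights):   # inverse-CDF sampling
        r -= w
        if r <= 0:
            return o
    return outcomes[-1]
```

Smaller eps gives stronger differential privacy but a noisier (lower-welfare) choice; the abstract's point is that this trade-off can be made truthful without losing privacy.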
Prior-independent multi-parameter mechanism design
 In Workshop on Internet and Network Economics (WINE)
, 2011
Abstract

Cited by 12 (5 self)
Abstract. In a unit-demand multi-unit multi-item auction, an auctioneer is selling a collection of different items to a set of agents, each interested in buying at most one unit. Each agent has a different private value for each of the items. We consider the problem of designing a truthful auction that maximizes the auctioneer’s profit in this setting. Previously, there has been progress on this problem in the setting in which each value is drawn from a known prior distribution. Specifically, it has been shown how to design auctions tailored to these priors that achieve a constant-factor approximation ratio [2, 5]. In this paper, we present a prior-independent auction for this setting. This auction is guaranteed to achieve a constant fraction of the optimal expected profit for a large class of so-called “regular” distributions, without specific knowledge of the distributions.
The Complexity of Optimal Mechanism Design
, 1211
Abstract

Cited by 10 (4 self)
Myerson’s seminal work provides a computationally efficient revenue-optimal auction for selling one item to multiple bidders [17]. Generalizing this work to selling multiple items at once has been a central question in economics and algorithmic game theory, but its complexity has remained poorly understood. We answer this question by showing that a revenue-optimal auction in multi-item settings cannot be found and implemented computationally efficiently, unless ZPP ⊇ P^{#P}. This is true even for a single additive bidder whose values for the items are independently distributed on two rational numbers with rational probabilities. Our result is very general: we show that it is hard to compute any encoding of an optimal auction of any format (direct or indirect, truthful or non-truthful) that can be implemented in expected polynomial time. In particular, under well-believed complexity-theoretic assumptions, revenue optimization in very simple multi-item settings can only be tractably approximated. We note that our hardness result applies to randomized mechanisms in a very simple setting, and is not an artifact of introducing combinatorial structure to the problem by allowing correlation among item values, introducing combinatorial valuations, or requiring the mechanism to be deterministic (whose structure is readily combinatorial). Our proof is enabled by a flow interpretation of the solutions of an exponential-size linear program for revenue maximization with an additional supermodularity constraint.
Approximation Schemes for Sequential Posted Pricing in Multi-Unit Auctions
, 2010
Abstract

Cited by 8 (4 self)
We design algorithms for computing approximately revenue-maximizing sequential posted-pricing mechanisms (SPM) in K-unit auctions, in a standard Bayesian model. A seller has K copies of an item to sell, and there are n buyers, each interested in only one copy, who have some value for the item. The seller must post a price for each buyer, the buyers arrive in a sequence enforced by the seller, and a buyer buys the item if its value exceeds the price posted to it. The seller does not know the values of the buyers, but has Bayesian information about them. An SPM specifies the ordering of buyers and the posted prices, and may be adaptive or non-adaptive in its behavior. The goal is to design an SPM in polynomial time to maximize expected revenue. We compare against the expected revenue of the optimal SPM, and provide a polynomial time approximation scheme (PTAS) for both non-adaptive and adaptive SPMs. This is achieved by two algorithms: an efficient algorithm that gives a (1 − 1/√(2πK))-approximation (and hence a PTAS for sufficiently large K), and another that is a PTAS for constant K. The first algorithm yields a non-adaptive SPM that achieves its approximation guarantee against an optimal adaptive SPM – this implies that the adaptivity gap in SPMs vanishes as K becomes large.
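Given a non-adaptive SPM (an ordering of buyers with one posted price each), its expected revenue has a simple dynamic-programming form. A hypothetical sketch (names are my own; this evaluates a given SPM, it does not compute the paper's near-optimal one): track the distribution over how many of the K copies have been sold so far.

```python
def spm_expected_revenue(prices, accept_prob, K):
    """Expected revenue of a non-adaptive sequential posted-price
    mechanism: buyers arrive in the given order, buyer i accepts the
    posted price with probability accept_prob[i] = Pr[v_i >= prices[i]],
    and selling stops once the K copies run out.
    dp[k] = probability that exactly k copies have been sold so far."""
    dp = [1.0] + [0.0] * K
    revenue = 0.0
    for price, q in zip(prices, accept_prob):
        new_dp = [0.0] * (K + 1)
        for k, pk in enumerate(dp):
            if k == K:                  # sold out: buyer sees no item
                new_dp[k] += pk
                continue
            revenue += pk * q * price   # a copy is sold at this price
            new_dp[k + 1] += pk * q     # buyer accepts
            new_dp[k] += pk * (1 - q)   # buyer declines
        dp = new_dp
    return revenue
```

With this evaluator in hand, optimizing over orderings and prices is exactly the (hard) design problem the abstract's PTAS addresses.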
Selective call out and real time bidding
 In Internet and Network Economics
, 2010
Abstract

Cited by 7 (4 self)
Ads on the Internet are increasingly sold via ad exchanges such as RightMedia, AdECN and Doubleclick Ad Exchange. These exchanges allow real-time bidding, that is, each time the publisher contacts the exchange, the exchange “calls out” to solicit bids from ad networks. This aspect of soliciting bids is novel relative to the existing literature, and suggests a joint optimization framework that optimizes over the allocation as well as the solicitation. We model this selective call out as an online recurrent Bayesian decision framework with bandwidth-type constraints. We obtain natural algorithms with bounded performance guarantees for several natural optimization criteria. We show that these results hold under different call-out constraint models and different arrival processes. Interestingly, the paper shows that under MHR assumptions, the expected revenue of the generalized second price auction with reserve is a constant factor of the expected welfare. The analysis herein also allows us to prove adaptivity-gap-type results for the adwords problem.