Results 11–20 of 72
Buying Cheap is Expensive: Hardness of Non-Parametric Multi-Product Pricing
 Electronic Colloquium on Computational Complexity, Report No. 68, 2006
"... We investigate nonparametric unitdemand pricing problems, in which the goal is to find revenue maximizing prices for products P based on a set of consumer profiles C obtained, e.g., from an eCommerce website. A consumer profile consists of a number of nonzero budgets and a ranking of all the pro ..."
Abstract

Cited by 18 (5 self)
We investigate non-parametric unit-demand pricing problems, in which the goal is to find revenue-maximizing prices for products P based on a set of consumer profiles C obtained, e.g., from an e-commerce website. A consumer profile consists of a number of non-zero budgets and a ranking of all the products the consumer is interested in. Once prices are fixed, each consumer chooses to buy one of the products she can afford based on some predefined selection rule. We distinguish between the min-buying, max-buying, and rank-buying models. For the min-buying and general rank-buying models the best known approximation ratio is O(log |C|) and, previously, the problem was only known to be APX-hard. We obtain the first (near) tight lower bound showing that the problem is not approximable within O(log^ε |C|) for some ε > 0, unless NP ⊆ DTIME(n^{log log n}). Going to slightly stronger (still reasonable) complexity-theoretic assumptions, we prove inapproximability within O(ℓ^ε) (ℓ being an upper bound on the number of non-zero budgets per consumer) and O(|P|^ε), and provide matching upper bounds. Surprisingly, these hardness results hold even if a price ladder constraint, i.e., a predefined total order on the prices of all products, is given. This changes if we require that in the rank-buying model consumers' budgets are consistent with their …
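The min-buying selection rule described in this abstract is easy to make concrete. The sketch below is illustrative only (the data representation and function name are ours, not the paper's): each consumer profile maps the products she is interested in to non-zero budgets, and under min-buying she purchases the cheapest product she can afford.

```python
# Illustrative sketch of revenue evaluation in the min-buying model.
# Representation (our assumption): prices is a dict product -> price;
# each consumer is a dict product -> budget over the products she wants.

def min_buying_revenue(prices, consumers):
    revenue = 0
    for budgets in consumers:
        # products the consumer can afford (price does not exceed budget)
        affordable = [prices[p] for p, b in budgets.items() if prices[p] <= b]
        if affordable:
            revenue += min(affordable)  # min-buying: cheapest affordable one
    return revenue

prices = {"A": 3, "B": 5}
consumers = [{"A": 4, "B": 6}, {"B": 5}]
# first consumer buys A at 3 (cheapest affordable), second buys B at 5
```

The pricing problem then asks for the price vector maximizing this quantity, which is exactly what the hardness results above concern.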
Single-value combinatorial auctions and algorithmic implementation in undominated strategies
 In ACM Symposium on Discrete Algorithms, 2011
"... In this paper we are interested in general techniques for designing mechanisms that approximate the social welfare in the presence of selfish rational behavior. We demonstrate our results in the setting of Combinatorial Auctions (CA). Our first result is a general deterministic technique to decouple ..."
Abstract

Cited by 18 (2 self)
In this paper we are interested in general techniques for designing mechanisms that approximate the social welfare in the presence of selfish rational behavior. We demonstrate our results in the setting of Combinatorial Auctions (CA). Our first result is a general deterministic technique to decouple the algorithmic allocation problem from the strategic aspects, by a procedure that converts any algorithm to a dominant-strategy ascending mechanism. This technique works for any single-value domain, in which each agent has the same value for each desired outcome, and this value is the only private information. In particular, for "single-value CAs", where each player desires any one of several different bundles but has the same value for each of them, our technique converts any approximation algorithm to a dominant-strategy mechanism that almost preserves the original approximation ratio. Our second result provides the first computationally efficient deterministic mechanism for the case of single-value multi-minded bidders (with private value and private desired bundles). The mechanism achieves an approximation to the social welfare which is close to the best possible in polynomial time (unless P = NP). This mechanism is an algorithmic implementation in undominated strategies, a notion that we define and justify, and which is of independent interest.
On the Computational Power of Iterative Auctions I: Demand Queries
 In Proceedings of the 6th ACM Conference on Electronic Commerce (EC), 2005
"... ..."
From convex optimization to randomized mechanisms: Toward optimal combinatorial auctions
 In Proceedings of the 43rd Annual ACM Symposium on Theory of Computing (STOC), 2011
"... We design an expected polynomialtime, truthfulinexpectation, (1 − 1/e)approximation mechanism for welfare maximization in a fundamental class of combinatorial auctions. Our results apply to bidders with valuations that are matroid rank sums (MRS), which encompass mostconcreteexamplesofsubmodular ..."
Abstract

Cited by 17 (4 self)
We design an expected polynomial-time, truthful-in-expectation, (1 − 1/e)-approximation mechanism for welfare maximization in a fundamental class of combinatorial auctions. Our results apply to bidders with valuations that are matroid rank sums (MRS), which encompass most concrete examples of submodular functions studied in this context, including coverage functions, matroid weighted-rank functions, and convex combinations thereof. Our approximation factor is the best possible, even for known and explicitly given coverage valuations, assuming P ≠ NP. Ours is the first truthful-in-expectation and polynomial-time mechanism to achieve a constant-factor approximation for an NP-hard welfare maximization problem in combinatorial auctions with heterogeneous goods and restricted valuations. Our mechanism is an instantiation of a new framework for designing approximation mechanisms based on randomized rounding algorithms. A typical such algorithm first optimizes over a fractional relaxation of the original problem, and then randomly rounds the fractional solution to an integral one. With rare exceptions, such algorithms cannot be converted into truthful mechanisms. The high-level idea of our mechanism design framework is to optimize directly …
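The generic "relax, then round" pattern that the abstract contrasts its framework against can be sketched as follows. This is a toy illustration of independent randomized rounding only, not the paper's mechanism; the fractional point is hard-coded where a real algorithm would solve an LP relaxation.

```python
import random

# Independent randomized rounding: take a fractional 0/1 solution
# (here a hard-coded stand-in for an LP optimum) and round each
# coordinate x_i to 1 independently with probability x_i, so the
# rounded solution matches the fractional one in expectation.

def round_fractional(x, rng):
    return [1 if rng.random() < xi else 0 for xi in x]

x_fractional = [0.9, 0.5, 0.1]  # stand-in for a fractional relaxation optimum
x_integral = round_fractional(x_fractional, random.Random(0))
```

The point made above is that composing such a rounding step with an LP solver does not, in general, yield a truthful mechanism; the paper's framework instead builds truthfulness in directly.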
Mechanism design for fractional scheduling on unrelated machines
 In Automata, Languages and Programming, 2007
"... machines ..."
Bayesian Incentive Compatibility via Fractional Assignments
"... Very recently, Hartline and Lucier [14] studied singleparameter mechanism design problems in the Bayesian setting. They proposed a blackbox reduction that converted Bayesian approximation algorithms into BayesianIncentiveCompatible (BIC) mechanisms while preserving social welfare. It remains a ma ..."
Abstract

Cited by 13 (3 self)
Very recently, Hartline and Lucier [14] studied single-parameter mechanism design problems in the Bayesian setting. They proposed a black-box reduction that converts Bayesian approximation algorithms into Bayesian Incentive-Compatible (BIC) mechanisms while preserving social welfare. It remains a major open question whether one can find a similar reduction in the more important multi-parameter setting. In this paper, we give a positive answer to this question when the prior distribution has finite and small support. We propose a black-box reduction for designing BIC multi-parameter mechanisms. The reduction converts any algorithm into an ε-BIC mechanism with only marginal loss in social welfare. As a result, for combinatorial auctions with subadditive agents we get an ε-BIC mechanism that achieves a constant approximation.
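For concreteness, the ε-BIC condition referenced above can be stated as follows (the notation is ours, not the paper's): writing u_i(v_i; b_i, v_{-i}) for agent i's utility under true value v_i when she reports b_i and the other agents report truthfully, truthful reporting must be an ε-best response in expectation over the prior:

```latex
% epsilon-BIC (notation ours): truthtelling is an epsilon-best response
% in expectation over the other agents' values drawn from the prior.
\mathbb{E}_{v_{-i}}\!\left[u_i(v_i;\, v_i, v_{-i})\right]
  \;\ge\;
\mathbb{E}_{v_{-i}}\!\left[u_i(v_i;\, v_i', v_{-i})\right] - \varepsilon
\qquad \text{for all } i,\ v_i,\ v_i'.
```

Exact BIC is the case ε = 0; the reduction above trades an ε slack in this constraint for computational tractability.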
Santa Claus Meets Hypergraph Matchings, 2008
"... We consider the problem of maxmin fair allocation of indivisible goods. Our focus will be on the restricted version of the problem in which there are m items, each of which associated with a nonnegative value. There are also n players and each player is only interested in some of the items. The go ..."
Abstract

Cited by 11 (1 self)
We consider the problem of max-min fair allocation of indivisible goods. Our focus is on the restricted version of the problem, in which there are m items, each associated with a non-negative value. There are also n players, and each player is only interested in some of the items. The goal is to distribute the items among the players such that the least happy person is as happy as possible, i.e., one wants to maximize the minimum of the sum of the values of the items given to any player. This problem is also known as the Santa Claus problem [3]. Feige [9] proves that the integrality gap of a certain configuration LP, described by Bansal and Sviridenko [3], is bounded from below by some (unspecified) constant. This gives an efficient way to estimate the optimum value of the problem within a constant factor. However, the proof in [9] is non-constructive: it uses the Lovász local lemma and does not provide a polynomial-time algorithm for finding an allocation. In this paper, we take a different approach to this problem, based upon local search techniques for finding perfect matchings in certain classes of hypergraphs. As a result, we prove that the integrality gap of the configuration LP is bounded by 1/5. Our proof is non-constructive in the following sense: it does provide a local search algorithm which finds the corresponding allocation, but this algorithm is not known to converge to a local optimum in a polynomial number of steps.
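The restricted max-min objective described above can be made concrete on a toy instance by brute force (the function name, data layout, and instance below are illustrative, not from the paper): item j contributes its value v_j to a player only if that player is interested in it, and we maximize the minimum total value any player receives.

```python
from itertools import product

# Brute-force optimum of the restricted Santa Claus objective on a tiny
# instance: enumerate every assignment of items to players and keep the
# one whose least happy player is happiest.

def santa_claus_opt(values, interest, n_players):
    items = list(values)
    best = 0
    for assign in product(range(n_players), repeat=len(items)):
        loads = [0] * n_players
        for j, i in zip(items, assign):
            # restricted values: v_j if player i wants item j, else 0
            loads[i] += values[j] if j in interest[i] else 0
        best = max(best, min(loads))
    return best

values = {"a": 3, "b": 2, "c": 2}
interest = [{"a"}, {"b", "c"}]
# player 0 can only benefit from a (value 3); player 1 from b and c
```

The configuration LP discussed above is a fractional relaxation of exactly this exponential search, and the paper's local search replaces the brute-force enumeration.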
Exponential Communication Inefficiency of Demand Queries
"... In the problem of finding an efficient allocation when agents' utilities are privately known, we examine the effect of restricting attention to mechanisms using demand queries, which ask agents to report an optimal allocation given a price list. We construct a combinatorial allocation problem wit ..."
Abstract

Cited by 11 (2 self)
In the problem of finding an efficient allocation when agents' utilities are privately known, we examine the effect of restricting attention to mechanisms using demand queries, which ask agents to report an optimal allocation given a price list. We construct a combinatorial allocation problem with m items and two agents whose valuations lie in a certain class, such that (i) efficiency can be obtained with a mechanism using O(m) bits, but (ii) any demand-query mechanism guaranteeing a higher efficiency than giving all items to one agent uses a number of queries that is exponential in m. The same is proven for any demand-query mechanism achieving an improvement in expected efficiency, for a constructed joint probability distribution over agents' valuations from the class. These results cast doubt on the usefulness of such common combinatorial allocation mechanisms as iterative auctions and other preference elicitation mechanisms using demand queries, as well as value queries and order queries (which are easily replicated with demand queries in our setting).
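A demand query, as the abstract uses the term, asks an agent for a utility-maximizing bundle at given prices. The sketch below spells that out for a tiny additive valuation by exhaustive enumeration (the function name and example valuation are ours, for illustration only).

```python
from itertools import chain, combinations

# Answering a demand query by brute force: given item prices, return a
# bundle S maximizing v(S) minus the total price of S.

def demand_query(valuation, prices, items):
    def utility(bundle):
        return valuation(bundle) - sum(prices[i] for i in bundle)
    all_bundles = chain.from_iterable(
        combinations(items, r) for r in range(len(items) + 1))
    return max(all_bundles, key=utility)

# toy additive valuation over two items
vals = {"x": 4, "y": 1}
v = lambda S: sum(vals[i] for i in S)
prices = {"x": 2, "y": 3}
bundle = demand_query(v, prices, ["x", "y"])
```

The lower bound above says that even though each such answer is a single bundle, exponentially many of these queries can be needed before the mechanism beats the trivial give-everything-to-one-agent allocation.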
A lower bound of 1+φ for truthful scheduling mechanisms
 In Proceedings of the 32nd International Symposium on Mathematical Foundations of Computer Science (MFCS)
"... Abstract. We give an improved lower bound for the approximation ratio of truthful mechanisms for the unrelated machines scheduling problem. The mechanism design version of the problem which was proposed and studied in a seminal paper of Nisan and Ronen is at the core of the emerging area of Algorith ..."
Abstract

Cited by 9 (2 self)
We give an improved lower bound for the approximation ratio of truthful mechanisms for the unrelated-machines scheduling problem. The mechanism design version of the problem, which was proposed and studied in a seminal paper of Nisan and Ronen, is at the core of the emerging area of Algorithmic Game Theory. The new lower bound 1 + φ ≈ 2.618 is a step towards the final resolution of this important problem.
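For reference, the numerical value of the bound follows directly from the definition of the golden ratio φ:

```latex
% \varphi = (1+\sqrt{5})/2 is the golden ratio, so
1 + \varphi \;=\; 1 + \frac{1+\sqrt{5}}{2}
           \;=\; \frac{3+\sqrt{5}}{2}
           \;\approx\; 2.618.
```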