Results 1 - 8 of 8
A Theory of Expressiveness in Mechanisms
2007
Cited by 17 (8 self)
Abstract:
A key trend in the world—especially in electronic commerce—is a demand for higher levels of expressiveness in the mechanisms that mediate interactions, such as the allocation of resources, matching of peers, and elicitation of opinions from large and diverse communities. Intuitively, one would think that this increase in expressiveness would lead to more efficient mechanisms (e.g., due to better matching of supply and demand). However, until now we have lacked a general way of characterizing the expressiveness of these mechanisms, analyzing how it impacts the actions taken by rational agents—and ultimately the outcome of the mechanism. In this technical report we introduce a general model of expressiveness for mechanisms. Our model is based on a new measure which we refer to as the maximum impact dimension. The measure captures the number of different ways that an agent can impact the outcome of a mechanism. We proceed to uncover a fundamental connection between this measure and the concept of shattering from computational learning theory. We also provide a way to determine an upper bound on the expected efficiency of any mechanism under its most efficient Nash equilibrium which, remarkably, depends only on the mechanism’s expressiveness. We show that for any setting and any prior over agent preferences, the …
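The "maximum impact dimension" described above can be made concrete with a small sketch: treat a mechanism as a map from action profiles to outcomes, and measure, for each agent, the largest number of distinct outcomes that agent can induce by varying its own action while everyone else's actions are held fixed. The toy mechanism and action sets below are hypothetical illustrations, not taken from the paper.

```python
from itertools import product

def max_impact_dimension(mechanism, action_sets):
    # For each agent: the largest number of distinct outcomes the agent can
    # induce by varying its own action while all other actions are fixed.
    n = len(action_sets)
    dims = []
    for i in range(n):
        others = [action_sets[j] for j in range(n) if j != i]
        best = 0
        for fixed in product(*others):
            outcomes = {mechanism(tuple(list(fixed[:i]) + [a] + list(fixed[i:])))
                        for a in action_sets[i]}
            best = max(best, len(outcomes))
        dims.append(best)
    return dims

# Toy mechanism: one item, two bidders each bidding 0, 1, or 2; the highest
# bid wins, with ties broken in favor of bidder 0.  Outcome = winner's index.
def winner(bids):
    return 0 if bids[0] >= bids[1] else 1

dims = max_impact_dimension(winner, [[0, 1, 2], [0, 1, 2]])  # [2, 2]
```

Each bidder here can force only two distinct outcomes (win or lose), so the measure is 2 per agent regardless of the size of the bid space.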
The Planning of Guaranteed Targeted Display Advertising
INFORMS Annual Conference, 2012
Cited by 5 (0 self)
Abstract:
As targeted advertising becomes prevalent in a wide variety of media vehicles, planning models become increasingly important to ad networks that need to match ads to appropriate audience segments, provide a high quality of service (meet advertisers' goals), and ensure that ad serving opportunities are not wasted. We define Guaranteed Targeted Display Advertising (GTDA) as a class of media vehicles that include webpage banner ads, video games, electronic outdoor billboards, and the next generation of digital TV, and formulate the GTDA planning problem as a transportation problem with quadratic objective. By modeling audience uncertainty, forecast errors, and the ad server's execution of the plan, we derive sufficient conditions that state when our quadratic objective is a good surrogate for several ad delivery performance metrics. Moreover, our quadratic objective allows us to construct duality-based bounds for evaluating aggregations of the audience space, leading to two efficient algorithms for solving large problems: the first intelligently refines the audience space into successively smaller blocks, and the second uses scaling to find a feasible solution given a fixed audience space partition. Near-optimal schedules can often be produced despite significant aggregation.
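The "transportation problem with quadratic objective" mentioned above can be illustrated on a tiny hypothetical instance: audience segments are supply nodes, guaranteed contracts are demand nodes, and the plan minimizes a quadratic deviation-from-goal penalty. The instance and the exact penalty are illustrative assumptions; the paper's formulation and its surrogate-quality conditions are richer.

```python
from itertools import product

# Hypothetical tiny GTDA instance: two audience segments with forecast
# supply, two guaranteed contracts with impression goals.  We brute-force
# all integer plans x[i][j] (segment i -> contract j) and keep the one
# minimizing the quadratic penalty sum_j (goal_j - delivered_j)^2.
supply = [2, 2]   # forecast impressions per segment
goals = [3, 1]    # contracted impressions per ad
n, m = len(supply), len(goals)

best_val, best_plan = float("inf"), None
for flat in product(*(range(supply[i] + 1) for i in range(n) for _ in range(m))):
    x = [flat[i * m:(i + 1) * m] for i in range(n)]
    if any(sum(row) > s for row, s in zip(x, supply)):
        continue  # cannot serve more impressions than a segment supplies
    delivered = [sum(x[i][j] for i in range(n)) for j in range(m)]
    val = sum((g - d) ** 2 for g, d in zip(goals, delivered))
    if val < best_val:
        best_val, best_plan = val, x
```

Since total supply here equals total demand, the optimal plan meets both goals exactly and the penalty is zero; at realistic scale this enumeration is replaced by the aggregation and scaling algorithms the abstract describes.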
Computational Bundling for Auctions
2013
Cited by 2 (1 self)
Abstract:
Revenue maximization in combinatorial auctions (and other multidimensional selling settings) is one of the most important and most elusive problems in mechanism design. The design problem is NP-complete, and the optimal designs include features that are not acceptable in many applications, such as favoring some bidders over others and randomization. In this paper, we instead study a common revenue-enhancement approach, bundling, in the context of the most commonly studied combinatorial auction mechanism, the Vickrey-Clarke-Groves (VCG) mechanism. A second challenge in mechanism design for combinatorial auctions is that the prior distribution on each bidder’s valuation can be doubly exponential. Such priors do not exist in most applications. Rather, in many applications (such as premium display advertising markets), there is essentially a point prior, which may not be accurate. We adopt the point prior model, and prove robustness to inaccuracy in the prior. Then, we present a branch-and-bound framework for finding the optimal bundling. We introduce several techniques for branching, upper bounding, lower bounding, …
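The core object the abstract manipulates, the revenue of the VCG mechanism under a given bundling of the items, can be sketched for a toy instance. The brute-force VCG evaluator and the valuations below are hypothetical illustrations (the paper's contribution is the branch-and-bound search over bundlings, not this enumeration), but they show how bundling can raise VCG revenue.

```python
from itertools import product

def vcg(bundles, valuations):
    # Brute-force VCG for a small combinatorial auction in which the items
    # have been pre-grouped into indivisible bundles.
    n = len(valuations)

    def welfare(assign, exclude=None):
        won = [frozenset() for _ in range(n)]
        for bundle, w in zip(bundles, assign):
            if w is not None and w != exclude:
                won[w] = won[w] | bundle
        return sum(valuations[i](won[i]) for i in range(n) if i != exclude)

    choices = list(range(n)) + [None]          # None = bundle left unsold
    best = max(product(choices, repeat=len(bundles)), key=welfare)
    payments = []
    for i in range(n):
        # Clarke pivot: others' best welfare without i, minus others' welfare
        # under the efficient allocation.
        without_i = max(welfare(a, exclude=i)
                        for a in product(choices, repeat=len(bundles)))
        payments.append(without_i - welfare(best, exclude=i))
    return best, sum(payments)

# Hypothetical valuations: bidder 0 only wants item A (value 4);
# bidder 1 values items additively at 3 each.
v0 = lambda s: 4 if "A" in s else 0
v1 = lambda s: 3 * len(s)

_, rev_separate = vcg([frozenset("A"), frozenset("B")], [v0, v1])  # items sold separately
_, rev_bundled = vcg([frozenset("AB")], [v0, v1])                  # items sold as one bundle
```

In this instance selling the items separately yields VCG revenue 3, while bundling A and B together yields 4, which is the kind of revenue gap the branch-and-bound search is hunting for.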
A Framework for Automated Bundling and Pricing Using Purchase Data
2011
Cited by 1 (1 self)
Abstract:
We present a framework for automatically suggesting high-profit bundle discounts based on historical customer purchase data. We develop several search algorithms that identify profit-maximizing prices and bundle discounts. We introduce a richer probabilistic valuation model than prior work by capturing complementarity, substitutability, and covariance, and we provide a hybrid search technique for fitting such a model to historical shopping cart data. As new purchase data is collected, it is integrated into the valuation model, leading to an online technique that continually refines prices and bundle discounts. To our knowledge, this is the first paper to study bundle discounting using shopping cart data. We conduct computational experiments using our fitting and pricing algorithms that demonstrate several conditions under which offering discounts on bundles can benefit the seller, the buyer, and the economy as a whole. One of our main findings is that, in contrast to products typically suggested by recommender systems, the most profitable products to offer bundle discounts on appear to be those that are occasionally purchased together and often separately.
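The closing finding, that the best bundle-discount candidates are bought together occasionally but separately often, suggests a simple counting heuristic over shopping-cart data. The cart log and the scoring rule below are hypothetical sketches, not the paper's probabilistic valuation model or search algorithms.

```python
from collections import Counter
from itertools import combinations

# Hypothetical shopping-cart log.
carts = [
    ["bread", "butter"], ["bread"], ["butter"], ["bread"], ["butter"],
    ["milk", "cereal"], ["milk", "cereal"],
]

together, alone = Counter(), Counter()
for cart in carts:
    items = sorted(set(cart))
    for pair in combinations(items, 2):
        together[pair] += 1
    for item in items:
        alone[item] += 1

def discount_score(pair):
    # Heuristic reading of the paper's finding: favor pairs bought together
    # occasionally (co-purchase count > 0) but separately often.
    a, b = pair
    separate = alone[a] + alone[b] - 2 * together[pair]
    return separate if together[pair] > 0 else 0

best_pair = max(together, key=discount_score)
```

Here bread and butter (co-purchased once, bought separately four times) outscore milk and cereal (always co-purchased), matching the intuition that a discount should nudge marginal co-purchases rather than subsidize purchases that would happen anyway.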
Posted Prices Exchange for Display Advertising Contracts
In Proceedings of the 27th AAAI Conference on Artificial Intelligence (AAAI’13), 2013
Cited by 1 (0 self)
Abstract:
We propose a new market design for display advertising contracts, based on posted prices. Our model and algorithmic framework address several major challenges: (i) the space of possible impression types is exponential in the number of attributes, which is typically large, so a complete price space cannot be maintained; (ii) advertisers are usually unable or reluctant to provide extensive demand (willingness-to-pay) functions; and (iii) the levels of detail with which supply and demand are specified are often not identical.
Large-scale Hierarchical Optimization for Online Advertising and Wind Farm Planning
Using Expressiveness to Increase Efficiency in Social and Economic Mechanisms
2011
Value-directed Compression of Large-scale Assignment Problems
"... Data-driven analytics—in areas ranging from consumer mar-keting to public policy—often allow behavior prediction at the level of individuals rather than population segments, of-fering the opportunity to improve decisions that impact large populations. Modeling such (generalized) assignment prob-lems ..."
Abstract
- Add to MetaCart
Data-driven analytics—in areas ranging from consumer marketing to public policy—often allow behavior prediction at the level of individuals rather than population segments, offering the opportunity to improve decisions that impact large populations. Modeling such (generalized) assignment problems as linear programs, we propose a general value-directed compression technique for solving such problems at scale. We dynamically segment the population into cells using a form of column generation, constructing groups of individuals who can provably be treated identically in the optimal solution. This compression allows problems, unsolvable using standard LP techniques, to be solved effectively. Indeed, once a compressed LP is constructed, problems can be solved in milliseconds. We provide a theoretical analysis of the methods, outline the distributed implementation of the requisite data processing, and show how a single compressed LP can be used to solve multiple variants of the original LP near-optimally in real-time (e.g., to support scenario analysis). We also show how the method can be leveraged in integer programming models. Experimental results on marketing contact optimization and political legislature problems validate the performance of our technique.
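The compression idea, collapsing individuals who can provably be treated identically into weighted cells, can be sketched in its simplest form: grouping individuals whose predicted-value rows are exactly equal. The data are hypothetical, and this is only the degenerate base case of the paper's column-generation construction, which builds cells dynamically under the LP's constraints.

```python
from collections import defaultdict

# Hypothetical predicted value of each (individual, action) pair,
# e.g. expected response to two alternative marketing treatments.
values = {
    "u1": (5, 1), "u2": (5, 1), "u3": (2, 4),
    "u4": (2, 4), "u5": (2, 4), "u6": (7, 0),
}

# Compression: individuals with identical value rows can be treated
# identically in an optimal assignment, so they collapse into one cell.
cells = defaultdict(list)
for person, row in values.items():
    cells[row].append(person)

# One decision per cell instead of per individual; with no capacity
# constraints in this toy, each cell simply takes its best action.
plan = {row: max(range(len(row)), key=row.__getitem__) for row in cells}
total = sum(len(members) * max(row) for row, members in cells.items())
```

Six individuals compress to three cells, and the compressed plan achieves the same total value (29) as solving per individual, which is the guarantee the paper establishes for its provably-equivalent cells at much larger scale.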