Results 1–10 of 69,642
Ranking with Submodular Valuations
"... We study the problem of ranking with submodular valuations. An instance of this problem consists of a ground set [m] and a collection of n monotone submodular set functions f_1, ..., f_n, where each f_i : 2^[m] → R_+. An additional ingredient of the input is a weight vector w ∈ R^n_+. The objective is ..."
Abstract

Cited by 7 (0 self)
We study the problem of ranking with submodular valuations. An instance of this problem consists of a ground set [m] and a collection of n monotone submodular set functions f_1, ..., f_n, where each f_i : 2^[m] → R_+. An additional ingredient of the input is a weight vector w ∈ R^n_+. The objective
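The objects in this abstract, monotone submodular set functions over a ground set [m], can be illustrated with a toy coverage function; the ground set and covered items below are hypothetical illustrations, not the paper's instance.

```python
from itertools import chain, combinations

# Hypothetical coverage function on ground set [m] = {0, 1, 2}: each
# element covers some items, and f(S) is the number of items covered
# by S. Coverage functions are a standard example of monotone
# submodular set functions f : 2^[m] -> R_+.
covers = {0: {"a", "b"}, 1: {"b", "c"}, 2: {"c", "d"}}

def f(S):
    """Size of the union of items covered by S."""
    return len(set().union(*(covers[e] for e in S)))

def subsets(U):
    return map(set, chain.from_iterable(
        combinations(U, r) for r in range(len(U) + 1)))

ground = set(covers)

# Submodularity (diminishing returns): for all S ⊆ T and e ∉ T,
# f(S ∪ {e}) − f(S) ≥ f(T ∪ {e}) − f(T).
for T in subsets(ground):
    for S in subsets(T):
        for e in ground - T:
            assert f(S | {e}) - f(S) >= f(T | {e}) - f(T)

# Monotonicity: adding elements never decreases f.
for S in subsets(ground):
    for e in ground - S:
        assert f(S | {e}) >= f(S)
```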
Submodular functions, matroids and certain polyhedra
, 2003
"... The viewpoint of the subject of matroids, and related areas of lattice theory, has always been, in one way or another, abstraction of algebraic dependence or, equivalently, abstraction of the incidence relations in geometric representations of algebra. Often one of the main derived facts is that all ..."
Abstract

Cited by 350 (0 self)
called independent subsets of E such that (a) Every subset of an independent set is independent, and (b) For every A ⊆ E, every maximal independent subset of A, i.e., every basis of A, has the same cardinality, called the rank, r(A), of A (with respect to M). (This definition is not standard. It is prompted
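Axioms (a) and (b) and the rank function r(A) described here can be sketched concretely; the uniform matroid below is a hypothetical illustration, not an example from the paper.

```python
from itertools import combinations

# Hypothetical example: the uniform matroid U(2, 4) on E = {0, 1, 2, 3},
# whose independent sets are exactly the subsets of size <= 2.
E = frozenset(range(4))

def independent(S):
    return len(S) <= 2

def rank(A):
    """r(A): by axiom (b), the common cardinality of every maximal
    independent subset (basis) of A."""
    return max(len(S) for r in range(len(A) + 1)
               for S in combinations(A, r) if independent(set(S)))

# Axiom (a): every subset of an independent set is independent.
for r in range(len(E) + 1):
    for S in combinations(E, r):
        if independent(set(S)):
            for T in combinations(S, max(len(S) - 1, 0)):
                assert independent(set(T))
```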
Minimum Latency Submodular Cover
, 2013
"... We study the Minimum Latency Submodular Cover problem (MLSC), which consists of a metric (V, d) with source r ∈ V and m monotone submodular functions f_1, f_2, ..., f_m : 2^V → [0, 1]. The goal is to find a path originating at r that minimizes the total cover time of all functions. This generalizes well- ..."
Abstract
studied problems, such as Submodular Ranking [1] and Group Steiner Tree [16]. We give a polynomial-time O(log(1/ε) · log^{2+δ} |V|)-approximation algorithm for MLSC, where ε > 0 is the smallest nonzero marginal increase of any f_i and δ > 0 is any constant. We also consider the Latency Covering Steiner Tree
Active Learning and Submodular Functions
, 2012
"... Active learning is a machine learning setting where the learning algorithm decides what data is labeled. Submodular functions are a class of set functions for which many optimization problems have efficient exact or approximate algorithms. We examine their connections. • We propose a new class of in ..."
Abstract
Active learning is a machine learning setting where the learning algorithm decides what data is labeled. Submodular functions are a class of set functions for which many optimization problems have efficient exact or approximate algorithms. We examine their connections. • We propose a new class
To steal or not to steal: firm attributes, legal environment, and valuation
 Journal of Finance
, 2005
"... Data on corporate governance and disclosure practices reveal wide within-country variation that decreases with the strength of investors' legal protection. A simple model identifies three firm attributes related to that variation: investment opportunities, external financing, and ownership structu ..."
Abstract

Cited by 216 (8 self)
structure. Using firm-level governance and transparency data from 27 countries, we find that all three firm attributes are related to the quality of governance and disclosure practices, and firms with higher governance and transparency rankings are valued higher in stock markets. All relations are stronger
Submodular Stochastic Probing on Matroids
"... In a stochastic probing problem we are given a universe E, where each element e ∈ E is active independently with probability pe ∈ [0, 1], and only a probe of e can tell us whether it is active or not. On this universe we execute a process that one by one probes elements — if a probed element is acti ..."
Abstract

Cited by 3 (1 self)
submodular objective function. We give a (1 − 1/e)/(k_in + k_out + 1)-approximation algorithm for the case in which we are given k_in ≥ 0 matroids as inner constraints and k_out ≥ 1 matroids as outer constraints. There are two main ingredients behind this result. First is a previously unpublished stronger bound
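The guarantee quoted in this snippet is a simple closed form in the number of inner and outer matroid constraints; reading the garbled ratio as (1 − 1/e)/(k_in + k_out + 1), a small sketch with arbitrary sample values:

```python
import math

def approx_factor(k_in, k_out):
    """The (1 - 1/e)/(k_in + k_out + 1) ratio quoted in the abstract
    (assumed reading of the garbled expression)."""
    assert k_in >= 0 and k_out >= 1
    return (1 - 1 / math.e) / (k_in + k_out + 1)

# One inner and one outer matroid constraint: (1 - 1/e)/3 ≈ 0.2107.
factor = approx_factor(1, 1)
```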
Representation, approximation and learning of submodular functions using low-rank decision trees
 In Proceedings of the Conference on Learning Theory (COLT)
, 2013
"... We study the complexity of approximate representation and learning of submodular functions over the uniform distribution on the Boolean hypercube {0, 1}^n. Our main result is the following structural theorem: any submodular function is ε-close in ℓ_2 to a real-valued decision tree (DT) of depth O(1/ε²) ..."
Abstract

Cited by 13 (8 self)
result is proved by constructing an approximation of a submodular function by a DT of rank 4/ε² and a proof that any rank-r DT can be approximated by a DT of depth (5/2)(r + log(1/ε)). We show that these structural results can be exploited to give an attribute-efficient PAC learning algorithm for submodular
Submodular Functions: Learnability, Structure, and Optimization
, 2012
"... Submodular functions are discrete functions that model laws of diminishing returns and enjoy numerous algorithmic applications. They have been used in many areas, including combinatorial optimization, machine learning, and economics. In this work we study submodular functions from a learning theoret ..."
Abstract

Cited by 6 (0 self)
Submodular functions are discrete functions that model laws of diminishing returns and enjoy numerous algorithmic applications. They have been used in many areas, including combinatorial optimization, machine learning, and economics. In this work we study submodular functions from a learning