The Thirty-Third AAAI Conference on Artificial Intelligence (AAAI-19)

Learning Set Functions with Limited Complementarity

Hanrui Zhang
Computer Science Department, Duke University
Durham, NC 27705
[email protected]

Abstract

We study PMAC-learning of real-valued set functions with limited complementarity. We prove, to our knowledge, the first nontrivial learnability result for set functions exhibiting complementarity, generalizing Balcan and Harvey's result for submodular functions. We prove a nearly matching information-theoretic lower bound on the number of samples required, complementing our learnability result. We conduct numerical simulations to show that our algorithm is likely to perform well in practice.

Introduction

A central problem in economics and algorithmic game theory is to price items. Intuitively, a seller would like to set the highest price at which the customer would still buy, which requires a decent understanding of the customer's valuation. When there are multiple items that can be sold in any combination, the valuation is usually modeled as a set function with the items being the ground set. That is, the valuation function of the customer maps each subset of the items to her utility when she gets that subset. To be able to better price the items for maximum profit, one needs to learn the customer's valuation function. Set functions are also used to model influence propagation in social networks (Kempe, Kleinberg, and Tardos 2003), and for solving clustering problems (Narasimhan and Bilmes 2007). In all these scenarios, learning the corresponding set function plays an essential part in solving the problem.

There is a rich body of research on learning of set functions, e.g. (Balcan and Harvey 2011; Balcan et al. 2012; Lin and Bilmes 2012; Bach et al. 2013). All of these results focus on an important class of monotone set functions — complement-free set functions. Such set functions model the natural property of diminishing returns, and are generally considered much easier to tackle than general monotone set functions. For example, various optimization problems admit efficient constant-factor approximations when the set function involved is complement-free or submodular (which is stronger than complement-free) (Nemhauser and Wolsey 1978; Vondrák 2008; Feige 2009), while for general monotone functions the best possible approximation ratio can be arbitrarily large.

However, in real-world scenarios, it is common that a valuation function exhibits limited complementarity. For example, a pen is useful only if accompanied by paper to write on. Paper therefore complements pens. This complementarity to pens is limited, in the sense that owning items other than paper, like computers, is unlikely to make pens more valuable. So in the above example, complementarity exists only between pens and paper. One significant real-world example of limited complementarity is spectrum auctions, where the rights to use specific bands in specific regions are sold. The complementarity there lies in the fact that a buyer would like the same band in neighboring regions (say states). Since there are only 50 states, one would naturally consider the degree of complementarity limited. More motivating everyday examples of limited complementarity can be found in (Feige et al. 2015; Eden et al. 2017; Chen, Teng, and Zhang 2019).

In the past decade, there has been a growing interest in studying set functions with limited complementarity, especially in the combinatorial optimization and algorithmic game theory communities. In particular, recent results seem to suggest that there exist smooth transitions from complement-free to completely arbitrary monotone set functions, parametrized by the degree of complementarity of the function. The transitions support graceful degradation of the approximation ratio for various combinatorial optimization tasks (Feige and Izsak 2013; Feldman and Izsak 2014; Feige et al. 2015; Chen, Teng, and Zhang 2019), and of the revenue and efficiency (measured by the Price of Anarchy) of well-studied simple protocols for combinatorial auctions (Feige et al. 2015; Feldman et al. 2016; Eden et al. 2017; Chen, Teng, and Zhang 2019).

So one natural question arises:

Is there a way to generalize learnability of complement-free set functions to those with limited complementarity, without incurring too much penalty?

In this paper, based on an understanding of the underlying combinatorial and statistical structures, we give, to our knowledge, the first nontrivial learnability result for monotone set functions with limited complementarity:

Theorem 1 (Main Theorem (Informal)). Restricted to product distributions, there is an efficient algorithm that O(log(1/ε))-approximately learns any monotone set function with fixed degree of complementarity.

The above theorem generalizes a central result of (Balcan and Harvey 2011) beyond complement-free functions. We also complement our result by a nearly matching information-theoretic lower bound. We conduct numerical simulations to show that our algorithm is likely to perform well in practice.

Preliminaries

Define the marginal of S given T, denoted by f(S|T), to be f(S|T) := f(S ∪ T) − f(T). Throughout the paper, when we refer to a set function f, unless otherwise specified, we always assume that:

• f has a ground set [n] = {1, 2, ..., n}. That is, f maps all subsets of [n], denoted by 2^[n], to real numbers.

• f is (weakly) monotone. That is, for any S ⊆ T ⊆ [n], f(S) ≤ f(T).

• f is 1-Lipschitz. That is, for any S ⊆ [n] and v ∈ [n], f(v|S) ≤ 1.
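As an illustration of these conventions, here is a minimal Python sketch (the representation and the names, e.g. f_example, are ours and purely illustrative, not part of the formal development) that encodes a set function as a callable on frozensets and checks monotonicity and 1-Lipschitzness by brute force over single-element marginals:

```python
from itertools import combinations

def powerset(ground):
    """All subsets of `ground`, as frozensets."""
    ground = list(ground)
    return [frozenset(c) for r in range(len(ground) + 1)
            for c in combinations(ground, r)]

def marginal(f, S, T):
    """f(S | T) := f(S ∪ T) - f(T)."""
    return f(S | T) - f(T)

def is_monotone_and_lipschitz(f, n):
    """Brute-force check of the two standing assumptions on f over [n]."""
    ground = frozenset(range(1, n + 1))
    for S in powerset(ground):
        for v in ground:
            m = marginal(f, frozenset({v}), S)
            if m < 0 or m > 1:   # monotone requires m >= 0; 1-Lipschitz requires m <= 1
                return False
    return True

# Hypothetical coverage-style valuation over [3], scaled so marginals are at most 1.
def f_example(S):
    items = {1: {"a", "b"}, 2: {"b", "c"}, 3: {"c"}}
    covered = set()
    for v in S:
        covered |= items[v]
    return len(covered) / 2.0

print(is_monotone_and_lipschitz(f_example, 3))   # True
```

Checking single-element marginals suffices here: f is monotone exactly when every marginal f(v|S) is nonnegative, and 1-Lipschitz exactly when every such marginal is at most 1.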
PMAC-Learning

To study learnability of real-valued functions, we use the Probably Mostly Approximately Correct (PMAC) model introduced by Balcan and Harvey in (Balcan and Harvey 2011).

Definition 1 (PMAC-Learning (Balcan and Harvey 2011)). Let F be a family of functions with domain 2^[n]. We say that an algorithm A PMAC-learns F with approximation factor α, if for any distribution D over 2^[n], target function f* ∈ F, and for any sufficiently small ε ≥ 0, δ ≥ 0, A takes as input a set of samples {(S_i, f*(S_i))}_{i∈[ℓ]} where each S_i is drawn independently from D, and outputs a function f : 2^[n] → R in F that satisfies

    Pr_{S_1,...,S_ℓ ∼ D} [ Pr_{S ∼ D} [ f(S) ≤ f*(S) ≤ α · f(S) ] ≥ 1 − ε ] ≥ 1 − δ,

where the number of samples ℓ and the running time of A are both poly(n, 1/ε, 1/δ).

In words, the definition says the algorithm succeeds with probability 1 − δ, upon which it outputs an approximation f of f* such that with probability 1 − ε, f*(S) is within factor α of f(S). Note that restricted to Boolean-valued functions and letting α = 1, PMAC-learning becomes exactly the classic PAC-learning.
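The inner probability in Definition 1 can be estimated empirically. The following sketch is illustrative only (the helpers sample_product and pmac_success_rate are hypothetical names of ours), with D taken to be a product distribution as in our main theorem:

```python
import random

def sample_product(n, p=0.5):
    """Draw S ⊆ [n] from a product distribution: each element included independently w.p. p."""
    return frozenset(v for v in range(1, n + 1) if random.random() < p)

def pmac_success_rate(f_learned, f_target, alpha, n, num_samples=10_000, p=0.5):
    """Monte Carlo estimate of Pr_{S ~ D}[ f(S) <= f*(S) <= alpha * f(S) ]."""
    good = 0
    for _ in range(num_samples):
        S = sample_product(n, p)
        if f_learned(S) <= f_target(S) <= alpha * f_learned(S):
            good += 1
    return good / num_samples
```

A PMAC learner with factor α must drive this success rate to at least 1 − ε, with probability at least 1 − δ over the draw of the training samples.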
Classes of Set Functions

Numerous classes of complement-free set functions have been proposed and studied, among which the following classes are particularly natural and useful: submodular, fractionally subadditive, and subadditive functions.

• Submodular. A set function f is submodular, if for any S ⊆ T ⊆ [n] and v ∈ [n], f(v|S) ≥ f(v|T). That is, the marginal of any element diminishes as the base set grows.

• Fractionally subadditive (or XOS). A set function f is fractionally subadditive, if for any S ⊆ [n], k ∈ N, T_1, ..., T_k ⊆ [n], 0 ≤ α_1, ..., α_k ≤ 1, f(S) ≥ Σ_{i∈[k]} α_i f(T_i), as long as the following holds: for any v ∈ S, Σ_{i∈[k]: v∈T_i} α_i ≥ 1. In other words, if {(T_i, α_i)}_i form a fractional cover of S, then the weighted sum of the f(T_i)'s is no smaller than f(S).

• Subadditive (or complement-free). A set function f is subadditive, if for any S, T ⊆ [n], f(S) + f(T) ≥ f(S ∪ T).

It can be shown that every submodular function is fractionally subadditive, and every fractionally subadditive function is subadditive.

Beyond complement-free functions, several measures of complementarity have been proposed, and the ones particularly helpful for our purposes are the supermodular degree (SD) hierarchy and the supermodular width (SMW) hierarchy. They build on the concepts of positive dependency and supermodular sets, respectively.

Definition 2 (Positive Dependency (Feige and Izsak 2013)). Given a set function f, an element u ∈ [n] depends positively on v ∈ [n], denoted by u →⁺ v, if there exists S ⊆ [n] ∖ {u}, such that f(u|S) > f(u|S ∖ {v}).

Definition 3 (Supermodular Degree Hierarchy (Feige and Izsak 2013)). The supermodular degree of a set function f, denoted by SD(f), is defined to be

    SD(f) := max_u |{v | u →⁺ v}|.

For any d ∈ {0, 1, ..., n − 1}, a function f is in the first d levels of the supermodular degree hierarchy, denoted by f ∈ SD-d, if SD(f) ≤ d.

The definitions essentially say that u depends positively on v if adding v to some set makes the marginal of u given that set strictly larger, and the supermodular degree of f is then the maximum number of elements on which some particular element positively depends. The degree then naturally categorizes functions into hierarchies.

Definition 4 (Supermodular Set (Chen, Teng, and Zhang 2019)). A set T ⊆ [n] is a supermodular set w.r.t. f, if there exist v ∈ [n] and S ⊆ [n], such that for all T′ ⊊ T,

    f(v|S ∪ T) > f(v|S ∪ T′).

Definition 5 (Supermodular Width Hierarchy (Chen, Teng, and Zhang 2019)). The supermodular width of a set function f, denoted by SMW(f), is defined to be

    SMW(f) := max{ |T| | T is a supermodular set }.

For any d ∈ {0, 1, ..., n − 1}, a function f is in the first d levels of the supermodular width hierarchy, denoted by f ∈ SMW-d, if SMW(f) ≤ d.

That is to say, T is a supermodular set, if given "environment" S, v has a larger marginal given T than given any proper subset of T.
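For intuition, both quantities can be computed by exhaustive search on small ground sets. The sketch below is illustrative only; the example valuation f_pens, encoding the pen-and-paper scenario from the introduction, is a hypothetical construction of ours:

```python
from itertools import combinations

def subsets(elems):
    """All subsets of `elems`, as frozensets, in order of increasing size."""
    elems = list(elems)
    return [frozenset(c) for r in range(len(elems) + 1)
            for c in combinations(elems, r)]

def marginal(f, v, X):
    """f(v | X) := f({v} ∪ X) - f(X)."""
    return f(frozenset({v}) | X) - f(X)

def supermodular_degree(f, n):
    """SD(f) = max_u |{v : u →⁺ v}|, by exhaustive search (exponential in n)."""
    ground = frozenset(range(1, n + 1))
    best = 0
    for u in ground:
        deps = set()
        for v in ground - {u}:
            # u →⁺ v iff some S ⊆ [n] \ {u} (necessarily with v ∈ S) has f(u|S) > f(u|S \ {v})
            for S in subsets(ground - {u}):
                if v in S and marginal(f, u, S) > marginal(f, u, S - {v}):
                    deps.add(v)
                    break
        best = max(best, len(deps))
    return best

def supermodular_width(f, n):
    """SMW(f) = size of the largest supermodular set, by exhaustive search."""
    ground = frozenset(range(1, n + 1))
    best = 0
    for T in subsets(ground):
        if len(T) <= best:
            continue
        # T is supermodular if some v, S make f(v | S ∪ T) exceed f(v | S ∪ T') for all T' ⊊ T
        found = any(
            all(marginal(f, v, S | T) > marginal(f, v, S | Tp)
                for Tp in subsets(T) if Tp != T)
            for v in ground for S in subsets(ground)
        )
        if found:
            best = len(T)
    return best

# Hypothetical example: a pen (1) and paper (2) complement each other; a ruler (3) is independent.
def f_pens(S):
    val = 0.0
    if 1 in S and 2 in S:
        val += 1.0   # pen and paper together are worth 1
    if 3 in S:
        val += 0.5
    return val

print(supermodular_degree(f_pens, 3))   # 1: the pen depends positively on the paper (and vice versa)
print(supermodular_width(f_pens, 3))    # 1: {paper} is a supermodular set, witnessed by v = pen
```

Both routines take time exponential in n and are meant only to illustrate the definitions. In the pen-and-paper example, the pen and the paper positively depend on each other while the ruler depends on nothing, so both SD and SMW equal 1.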