Aspects of Submodular Maximization Subject to a Matroid Constraint
Moran Feldman

Aspects of Submodular Maximization Subject to a Matroid Constraint
Moran Feldman

Based on:
- A Unified Continuous Greedy Algorithm for Submodular Maximization. Moran Feldman, Joseph (Seffi) Naor and Roy Schwartz (FOCS 2011).
- Submodular Maximization with Cardinality Constraints. Niv Buchbinder, Moran Feldman, Joseph (Seffi) Naor and Roy Schwartz (SODA 2014).
- Comparing Apples and Oranges: Query Tradeoff in Submodular Maximization. Niv Buchbinder, Moran Feldman and Roy Schwartz (SODA 2015).

Submodular Maximization Subject to a Matroid Constraint: What? Why?
- Generalizes classical problems: Max-SAT, Max-Cut, k-cover, GAP.
- Applications in machine learning, image processing and algorithmic game theory.

Region Sensor Coverage
- Sensors: k_L large sensors and k_S small sensors.
- Objective: cover as much of the region as possible with sensors.
- Observation: coverage exhibits diminishing returns.

Influence in Social Networks
- Objective: sell SuperPhones.
- Means: give k phones for free.
- Again, we have diminishing returns.

Image Summarization
- Objective: select k pictures representing as much of the trip as possible.
- Once more, diminishing returns, which can be quantified by image processing techniques. [Tschiatschek et al., NIPS 2014]

The 3 Components of the Problem
1. A ground set N of elements: possible positions for large/small sensors, social network users, images.
2. A valuation function f that assigns numerical values to subsets and exhibits diminishing returns. Such a function is called submodular.
3. A matroid constraint (described below).

Example and Formal Definition
For every two sets A ⊆ B ⊆ N and element u ∉ B:
f(A ∪ {u}) − f(A) ≥ f(B ∪ {u}) − f(B)

The Third Component: the Matroid Constraint
- Cardinality constraint: at most k elements.
- Partition matroid constraint: at most k_i elements from part i.
- More fancy constraints: graphical matroids, linear matroids.

The Problem
Instance: a ground set N, a submodular function f and a matroid constraint.
Objective: find a subset S ⊆ N obeying the constraint and maximizing f(S).
The problem generalizes NP-hard problems, so an exact algorithm is unlikely; we look for approximation algorithms.

- α-Approximation Algorithm: S is feasible and f(S) ≥ α · f(OPT).
- Randomized α-Approximation Algorithm: S is feasible and E[f(S)] ≥ α · f(OPT).

Aspects of Submodular Maximization Subject to a Matroid Constraint
- What can be done in polynomial time?
- Which additional properties of f can help us?
- How fast can it be done?
- Changing the model: online, streaming, secretary.

Remarks
- Non-negativity assumption: if negative values are allowed, f(OPT) can be assumed to be 0, and then any non-zero multiplicative approximation implies an exact algorithm. Hence, we assume f is non-negative.
- Value oracle: if f is represented by a table, the problem becomes trivial (the table has exponential size). Instead, we assume access to a value oracle that returns f(S) for a given set S, and ask for time complexity polynomial in |N|. Example table:

  S       f(S)
  ∅       0
  {a}     1
  {b}     2
  {a, b}  2

The Field of Submodular Maximization
Our problem is an important representative problem, demonstrating both techniques and aspects. It has many applications and a long history, going back at least to the 1970s [Fisher et al., Math. Program. Stud. 1978], and the field has been very active in recent years.

First Case: Cardinality Constraint
- Constraint: |S| ≤ k.
- Objective function f: submodular and non-negative. Some results assume a monotone objective (A ⊆ B implies f(A) ≤ f(B)).
- The hardness results in this presentation are unconditional; they are based on information-theoretic arguments.

Summary of Results
- The greedy algorithm: a natural algorithm achieving an approximation ratio of 1 − 1/e for monotone functions [Fisher et al., Math. Program. Stud. 1978], which is the best possible [Nemhauser and Wolsey, Math. Oper. Res. 1978].
- Random Greedy [Buchbinder et al., SODA 2014]: a simple variant of greedy achieving an approximation ratio of 1/e for non-monotone functions.
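As a concrete illustration of value-oracle access and of the diminishing-returns property, here is a toy coverage function in Python. The instance and all names are illustrative; the talk only assumes black-box oracle access to f.

```python
# Toy value oracle for a coverage function: a canonical non-negative,
# monotone submodular function. The instance below is illustrative only.

def make_coverage_oracle(cover):
    """Return a value oracle f(S) = number of region points covered by S."""
    def f(S):
        covered = set()
        for u in S:
            covered |= cover[u]
        return len(covered)
    return f

# Each "sensor" covers some points of the region.
cover = {"a": {1, 2, 3}, "b": {3, 4}, "c": {4, 5, 6}}
f = make_coverage_oracle(cover)

# Diminishing returns: the marginal gain of "b" shrinks as the set grows.
gain_small = f({"a", "b"}) - f({"a"})            # gain of b on top of {a}: 1
gain_large = f({"a", "c", "b"}) - f({"a", "c"})  # gain of b on top of {a, c}: 0
```

Coverage functions satisfy the formal submodularity inequality above: adding b to the smaller set {a} gains at least as much as adding it to the larger set {a, c}.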
In the same work [Buchbinder et al., SODA 2014], we also achieve the state-of-the-art approximation ratio for non-monotone functions, slightly improving over 1/e. The corresponding hardness result is due to [Oveis Gharan and Vondrak, SODA 2011].

Faster Algorithms
- Ratios: 1 − 1/e − ε for monotone functions, 1/e − ε for non-monotone functions.
- Previous results: [Badanidiyuru and Vondrak, SODA 2014], and O(nk) oracle queries via Random Greedy.
- Our result [Buchbinder et al., SODA 2015]: O(n · ln(1/ε)) oracle queries.

Oracle Queries as a Complexity Measure
- Queries are nontrivial in many applications.
- Independent of the model.
- Typically, the oracle queries represent the time complexity up to polylogarithmic factors.

The Greedy Algorithm
1. Start with the empty solution.
2. Do k times:
3.   Add to the solution the element contributing the most.

Analysis
- Monotonicity: f(S ∪ OPT) ≥ f(OPT), where S is the current solution.
- Submodularity: the elements of OPT \ S (together) increase the value by at least f(S ∪ OPT) − f(S) ≥ f(OPT) − f(S).
- Conclusion: since |OPT \ S| ≤ k, some element increases the value by at least [f(OPT) − f(S)] / k.

Analysis (cont.)
Applying the conclusion to each iteration gives the recursion
f(S_{i+1}) ≥ f(S_i) + [f(OPT) − f(S_i)] / k = (1 − 1/k) · f(S_i) + f(OPT) / k,
and unrolling it from f(S_0) ≥ 0 yields
f(S_k) ≥ [1 − (1 − 1/k)^k] · f(OPT) ≥ (1 − 1/e) · f(OPT).

The Average Observation
- Recall: the elements of OPT \ S (together) increase the value by at least f(S ∪ OPT) − f(S), so some element increases the value by at least [f(S ∪ OPT) − f(S)] / k.
- Let M be the set of the k elements with the largest marginal contributions to S.
- Conclusion: a random element of M increases the value, in expectation, by at least [f(S ∪ OPT) − f(S)] / k.
This simple observation has applications for non-monotone functions and for fast algorithms.

The Random Greedy Algorithm
1. Start with the empty solution.
2. Do k times:
3.   Let M be the set of the k elements with the largest marginal contributions to the solution.
4.   Add a random element of M to the solution.

Analysis
For monotone functions, the Average Observation and the above analysis give an approximation ratio of 1 − 1/e, in expectation.
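The greedy and random greedy procedures can be sketched as follows. This is a simplified sketch on a toy coverage instance; the actual Random Greedy analysis pads M with dummy elements when fewer than k candidates remain, which is omitted here.

```python
import random

def greedy(f, N, k):
    """Plain greedy: k rounds, each adding the element with the largest marginal gain."""
    S = set()
    for _ in range(k):
        u = max(N - S, key=lambda x: f(S | {x}) - f(S))
        S.add(u)
    return S

def random_greedy(f, N, k):
    """Random Greedy [Buchbinder et al., SODA 2014], simplified sketch: each
    round, pick uniformly among the (up to) k elements with the largest
    marginal contributions to the current solution."""
    S = set()
    for _ in range(k):
        M = sorted(N - S, key=lambda x: f(S | {x}) - f(S), reverse=True)[:k]
        S.add(random.choice(M))
    return S

# Toy coverage instance: f(S) = number of points covered by the sets in S.
cover = {"a": {1, 2, 3}, "b": {3, 4}, "c": {4, 5, 6}}
f = lambda S: len(set().union(*[cover[u] for u in S]))
S = greedy(f, set(cover), 2)  # picks two large-gain sensors, covering all 6 points
```

On this instance greedy always reaches the optimum {a, c}; random greedy may settle for slightly less, which is the price paid for the randomness that protects it against "evil" elements in the non-monotone case.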
For non-monotone functions, the analysis fails only because we can no longer bound f(S ∪ OPT) ≥ f(OPT).

Intuition: Why is Randomness Important?
- There might be an element u which looks good, i.e., has a large marginal contribution.
- However, this element might be "evil": any solution containing it is poor.
- The (deterministic) greedy might be tempted to take u; a randomized algorithm has a chance to avoid it.

How Bad Can f(S ∪ OPT) Be?
- In the worst case, the elements of S \ OPT (together) can decrease the value all the way to 0.
- What happens if S contains every element with probability at most p?
- Theorem: if S contains every element u with probability at most p, then E[f(S ∪ OPT)] ≥ (1 − p) · f(OPT). (The bound, viewed as a function of p, is concave by submodularity.)

Random Greedy and Non-monotone Functions
- An element is selected with probability at most 1/k in every iteration.
- Hence, each element belongs to S_i with probability at most 1 − (1 − 1/k)^i.
- Plugging this bound into the above analysis gives an approximation ratio of 1/e.

Making Random Greedy Fast
Let u_1, u_2, …, u_k be the elements of M in decreasing order of marginal contribution, and let p_i be the probability that u_i is added.
- For monotone functions it suffices that p_1 ≥ p_2 ≥ … ≥ p_k and p_1 + p_2 + … + p_k = 1.
- For non-monotone functions we need p_1 = p_2 = … = p_k = 1/k.

A Fast Algorithm for Monotone Functions
1. Start with the empty solution.
2. Do k times:
3.   Randomly choose a subset A containing (n/k) · ln(1/ε) elements.
4.   Add to the solution the element of A contributing the most.

Approximation Ratio
- A contains in expectation ln(1/ε) elements of M, so with probability at least 1 − ε: A ∩ M ≠ ∅.
- Hence, p_1 + p_2 + … + p_k ≥ 1 − ε, and by symmetry p_1 ≥ p_2 ≥ … ≥ p_k.
- This yields an approximation ratio of 1 − 1/e − ε.

How Fast is this Algorithm?
- Each iteration requires O((n/k) · log(1/ε)) oracle queries.
- k iterations, so O(n · log(1/ε)) oracle queries in total.

Fast Algorithm for Non-monotone Functions
- Why not use the same algorithm? If |A ∩ M| > 1, then we always select the best element. Consequently, p_1 >> p_2 >> … >> p_k, although we need them to be (roughly) equal.
- Desired solution: select an element u ∈ A ∩ M uniformly at random.
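The fast algorithm for monotone functions can be sketched as follows. This is a sketch under illustrative assumptions: the sample size is chosen so that A contains in expectation ln(1/ε) elements of M, and the toy coverage oracle stands in for an arbitrary monotone value oracle.

```python
import math
import random

def sample_greedy(f, N, k, eps=0.1):
    """Fast greedy for monotone f (sketch): each round, scan only a random
    sample A of about (n/k) * ln(1/eps) elements and add the best element
    of A, for roughly O(n * log(1/eps)) oracle queries in total."""
    n = len(N)
    size = min(n, math.ceil((n / k) * math.log(1 / eps)))
    S = set()
    for _ in range(k):
        pool = list(N - S)
        A = random.sample(pool, min(size, len(pool)))
        u = max(A, key=lambda x: f(S | {x}) - f(S))
        S.add(u)
    return S

# Toy coverage instance (illustrative): here the sample covers the whole
# ground set, so the sketch behaves exactly like plain greedy.
cover = {"a": {1, 2, 3}, "b": {3, 4}, "c": {4, 5, 6}}
f = lambda S: len(set().union(*[cover[u] for u in S]))
S = sample_greedy(f, set(cover), 2)
```

The point of the sampling is that each round touches only O((n/k) · log(1/ε)) elements instead of all n − |S|, at the cost of an additive ε in the approximation ratio.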
Unfortunately, determining |A ∩ M| requires us to look at all the elements, which is too costly.
Solution: to make |A ∩ M| concentrate, we make A larger.

Fast Algorithm for Non-monotone Functions (cont.)
1. Start with the empty solution.
2. Do k times:
3.   Randomly choose a subset A of elements (larger than before, so that |A ∩ M| concentrates).
4.   Let B be the set of the best |B| = E[|A ∩ M|] elements in A, i.e., the elements of A with the largest marginal gains.
5.   Add a uniformly random element of B to the solution.
Since |A ∩ M| concentrates around its expectation, a uniformly random element of B almost mimics a uniformly random element of A ∩ M.

What did We Get?
- Approximation ratio: the algorithm almost mimics the random greedy, and its approximation ratio is 1/e − ε.
- Oracle queries: each iteration requires O((n/k) · ε⁻² · log(1/ε)) oracle queries; with k iterations, O(n · ε⁻² · log(1/ε)) oracle queries in total.
Q.E.D.

General Matroid Constraint

General Algorithmic Scheme
1. Solve a fractional relaxation.
2. Round the solution.
In the next slides we will define the relaxation and explain how to approximately solve it. Rounding can be done without loss: pipage rounding [Calinescu et al., SIAM J. Comp. 2011] or swap rounding [Chekuri et al., FOCS 2010].

Summary of Results
- Continuous Greedy [Calinescu et al., SIAM J. Comp. 2011]: a (1 − 1/e)-approximation for monotone functions, which is optimal even for cardinality constraints [Nemhauser and Wolsey, Math. Oper. Res. 1978].
- Measured Continuous Greedy [Feldman et al., FOCS 2011]: a 1/e-approximation for non-monotone functions, the state of the art; the previous result was a weaker approximation [Chekuri et al., STOC 2011]. The corresponding hardness is due to [Oveis Gharan and Vondrak, SODA 2011].
- Improved results for special cases when the function is monotone: Submodular Max-SAT and Submodular Welfare.

Relaxation
- Matroid constraint → matroid polytope P: the convex hull of the characteristic vectors of the feasible solutions.
- Objective function → multilinear extension: for a vector x ∈ [0, 1]^N, let R(x) be a random set containing every element u ∈ N independently with probability x_u. The extension is F(x) = E[f(R(x))]. For integral points, f and F agree.
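The multilinear extension cannot be evaluated exactly in general, but a Monte Carlo estimate suffices. A minimal sketch, on an illustrative function:

```python
import random

def sample_R(x):
    """Sample R(x): include each element u independently with probability x[u]."""
    return {u for u, p in x.items() if random.random() < p}

def estimate_F(f, x, samples=20000):
    """Monte Carlo estimate of the multilinear extension F(x) = E[f(R(x))]."""
    return sum(f(sample_R(x)) for _ in range(samples)) / samples

f = lambda S: len(S)  # a simple (modular, hence submodular) function

# At integral points F agrees with f: here R(x) is always {"a", "c"}.
F_integral = estimate_F(f, {"a": 1.0, "b": 0.0, "c": 1.0})
# At a fractional point the estimate approaches the true value (1.0 here).
F_half = estimate_F(f, {"a": 0.5, "b": 0.5, "c": 0.0})
```

Standard concentration bounds show that polynomially many samples approximate F(x) arbitrarily well, which is what the discretized continuous algorithms below rely on.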
Example: for the ground set N = {a, b, c}, the set {a, b} corresponds to the characteristic vector (1, 1, 0). We optimize F over the polytope P ⊆ [0, 1]^N.

Before Presenting the Algorithms
- The algorithms we describe are continuous processes; an implementation has to discretize the process by working in small steps.
- The multilinear extension F cannot be evaluated exactly (in general), but it can be approximated arbitrarily well by sampling.

The Continuous Greedy
For every time point t ∈ [0, 1]:
- Consider the directions corresponding to the feasible sets.
- Move in the (locally) best direction at a speed of 1.
Feasibility: the output is a convex combination of feasible sets.

Approximation Ratio
- OPT is a good direction: moving from y toward y + 1_OPT is the real direction; moving toward y ∨ 1_OPT (i.e., toward y ∪ OPT) is an imaginary direction.
- Monotonicity: the real direction is at least as good as the imaginary one.
- Since F is concave along non-negative directions (by submodularity), the imaginary direction gains at least F(y ∨ 1_OPT) − F(y).

Approximation Ratio - Analysis
- By the above discussion and monotonicity: dF(y)/dt ≥ F(y ∨ 1_OPT) − F(y) ≥ f(OPT) − F(y).
- By non-negativity, F(y) ≥ 0 at time 0. Hence, F(y(t)) ≥ (1 − e^(−t)) · f(OPT).
- At time t = 1, for monotone functions: F(y) ≥ (1 − 1/e) · f(OPT).

Measured Continuous Greedy
Main idea:
- Find the best imaginary direction x ∈ P.
- Walk in the found imaginary direction, scaling each coordinate of the step by the remaining room (i.e., move in the direction x ∘ (1 − y) rather than x).
Feasibility: the matroid polytope is down-monotone, so reducing the step cannot get us outside of the polytope.
Approximation ratio: the previous analysis still works.
Gain: in some cases, this allows running for more time, which yields optimal approximation ratios for Submodular Max-SAT and Submodular Welfare.

Non-monotone Functions
Uses of monotonicity in the analysis:
- The real direction is better than the imaginary one.
- Bounding F(y ∨ 1_OPT).
In the measured continuous greedy, at time t every element u satisfies y_u ≤ 1 − e^(−t) (compared with y_u ≤ t in the continuous greedy). Hence, every element appears with probability at most 1 − e^(−t) in R(y), and an old trick (the theorem above, with p = 1 − e^(−t)) gives F(y ∨ 1_OPT) ≥ e^(−t) · f(OPT).

Non-monotone Functions (cont.)
- By the above discussion: dF(y)/dt ≥ e^(−t) · f(OPT) − F(y).
- By non-negativity, F(y) ≥ 0 at time 0. Hence, F(y(t)) ≥ t · e^(−t) · f(OPT).
- At time t = 1: F(y) ≥ (1/e) · f(OPT).

Future Work: Monotone Functions
The basic question is answered (what can be done in polynomial time).
Many open problems in other aspects:
- Fast algorithms: almost linear time algorithms for more general constraints.
- Online and streaming algorithms: getting tight bounds; important for big data applications.

Future Work (cont.): Non-monotone Functions
Largely terra incognita ("here be dragons").
- Optimal approximation ratio: for a cardinality constraint? For general matroids?
- Properties that can help: symmetry? Others?
- Other aspects: fast algorithms, online, ...

Other Main Fields of Interest
- Online and secretary algorithms: state of the art result for the Matroid Secretary Problem [Feldman et al., SODA 2015].
- Algorithmic game theory: mechanism design; analysis of game models inspired by combinatorial problems.

Additional Results on Submodular Maximization
- Nonmonotone Submodular Maximization via a Structural Continuous Greedy Algorithm. Moran Feldman, Joseph (Seffi) Naor and Roy Schwartz (ICALP 2011).
- Improved Competitive Ratios for Submodular Secretary Problems. Moran Feldman, Joseph (Seffi) Naor and Roy Schwartz (APPROX 2011).
- Improved Approximations for k-Exchange Systems. Moran Feldman, Joseph (Seffi) Naor, Roy Schwartz and Justin Ward (ESA 2011).
- A Tight Linear Time (1/2)-Approximation for Unconstrained Submodular Maximization. Niv Buchbinder, Moran Feldman, Joseph (Seffi) Naor and Roy Schwartz (FOCS 2012).
- Online Submodular Maximization with Preemption. Niv Buchbinder, Moran Feldman and Roy Schwartz (SODA 2015).