DOI: 10.5555/3600270.3601540 · NeurIPS Conference Proceedings · research-article

Submodular maximization in clean linear time

Published: 03 April 2024

Abstract

In this paper, we provide the first deterministic algorithm that achieves 1/2-approximation for monotone submodular maximization subject to a knapsack constraint, while making a number of queries that scales only linearly with the size of the ground set n. Moreover, our result automatically paves the way for developing a linear-time deterministic algorithm that achieves the tight 1 - 1/e approximation guarantee for monotone submodular maximization under a cardinality (size) constraint. To complement our positive results, we also show strong information-theoretic lower bounds. More specifically, we show that when the maximum cardinality allowed for a solution is constant, no deterministic or randomized algorithm making a sub-linear number of function evaluations can guarantee any constant approximation ratio. Furthermore, when the constraint allows the selection of a constant fraction of the ground set, we show that any algorithm making fewer than Ω(n/log n) function evaluations cannot perform better than an algorithm that simply outputs a uniformly random subset of the ground set of the right size. We extend our results to the general case of maximizing a monotone submodular function subject to the intersection of a p-set system and multiple knapsack constraints. Finally, we evaluate the performance of our algorithms on multiple real-life applications, including movie recommendation, location summarization, Twitter text summarization, and video summarization.
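To make the query-complexity claim concrete, the following is a minimal sketch of the problem setting, not the paper's algorithm: the classic greedy of Nemhauser, Wolsey, and Fisher for maximizing a monotone submodular function under a cardinality constraint. Greedy attains the tight (1 - 1/e) guarantee but issues O(nk) value-oracle queries; the paper's contribution is matching such guarantees with only linearly many queries. The function names `coverage` and `greedy` are illustrative choices, not from the paper.

```python
def coverage(sets, S):
    # Example monotone submodular function: number of ground
    # elements covered by the union of the chosen sets.
    covered = set()
    for i in S:
        covered |= sets[i]
    return len(covered)

def greedy(f, ground_set, k):
    # Standard greedy under a cardinality constraint |S| <= k.
    # Returns the solution and the number of value-oracle queries,
    # which grows as O(n * k) -- the cost the paper avoids.
    S, queries = [], 0
    for _ in range(k):
        best, best_gain = None, 0
        for e in ground_set:
            if e in S:
                continue
            gain = f(S + [e]) - f(S)  # marginal gain of e given S
            queries += 2
            if gain > best_gain:
                best, best_gain = e, gain
        if best is None:  # no element has positive marginal gain
            break
        S.append(best)
    return S, queries

# Toy instance: pick k = 2 of 4 sets to cover the most elements.
sets = {0: {1, 2, 3}, 1: {3, 4}, 2: {4, 5, 6}, 3: {1}}
f = lambda S: coverage(sets, S)
solution, queries = greedy(f, list(sets), k=2)
```

On this toy instance greedy selects sets 0 and 2, covering all six elements, while spending two oracle queries per candidate per round.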

Supplementary Material

Additional material (3600270.3601540_supp.pdf)
Supplemental material.



Published In

NIPS '22: Proceedings of the 36th International Conference on Neural Information Processing Systems, November 2022 (39114 pages)

Publisher

Curran Associates Inc.

Red Hook, NY, United States

Qualifiers

  • Research-article
  • Research
  • Refereed limited
