Abstract
For maximizing a monotone submodular set function under a knapsack constraint, existing approximation algorithms fall into two classes, with running times \(O(n^2)\) and \(O(n^5)\), respectively. With running time \(O(n^5)\), the best performance ratio is \(1-1/e\). With running time \(O(n^2)\), the well-known performance ratio is \((1-1/e)/2\), and an improved ratio of \((1-1/e^2)/2\) was claimed recently. In this paper, we design an algorithm with running time \(O(n^2)\) and performance ratio \(1-1/e^{2/3}\), and an algorithm with running time \(O(n^3)\) and performance ratio \(1/2\).
1 Introduction
Recently, many real-world problems have been formulated as monotone submodular maximization with a knapsack constraint, including multiplex influence maximization (Kuhnle et al. 2018), multiple-product viral marketing (Chen et al. 2020; Zhang et al. 2016), and viral marketing with recommendation (Guo and Wu 2019). In general, monotone submodular maximization with a knapsack constraint is NP-hard (Feige 1998). Therefore, approximation algorithms for this problem have received a lot of attention. An upper bound of \(1-1/e\) for the performance ratio has been proved (Feige 1998), and this bound is achieved by a greedy algorithm in Sviridenko (2004). However, its running time is \(O(n^5)\), which is too high for real-world problems. Therefore, efforts are still being made to find faster approximation algorithms. A \((1-1/e)/2\)-approximation with running time \(O(n^2)\) was obtained in Leskovec et al. (2007). An improvement to \((1-1/e^2)/2\) with the same running time was claimed in Zhang et al. (2016).
In this paper, we present an approximation algorithm with performance ratio \(1-e^{-2/3}\approx 0.486\) (\(> (1-1/e^2)/2 \approx 0.432\)) and running time \(O(n^2)\). Moreover, a generalization yields an approximation algorithm with running time \(O(n^3)\) and performance ratio \(1/2\).
2 \((1-e^{-2/3})\)-approximation
Consider a set function \(f: 2^V \rightarrow {\mathbb {R}}\). \(f\) is submodular if for any two subsets \(A, A' \in 2^V\),
\[ f(A) + f(A') \ge f(A \cup A') + f(A \cap A'). \]
\(f\) is monotone nondecreasing if for any two subsets \(A, A' \in 2^V\) with \(A \subset A'\),
\[ f(A) \le f(A'). \]
The following are well-known properties of monotone nondecreasing and submodular functions; they can be found in Du et al. (2022).
Lemma 2.1
A set function \(f: 2^V \rightarrow {\mathbb {R}}\) is submodular if and only if for any two subsets \(A\) and \(A'\) with \(A \subset A'\), and any \(x \in V\setminus A'\),
\[ \Delta _xf(A) \ge \Delta _xf(A'), \]
where \(\Delta _xf(A) = f(A \cup \{x\}) - f(A)\). A set function \(f: 2^V \rightarrow {\mathbb {R}}\) is monotone nondecreasing if and only if for any two subsets \(A\) and \(A'\) with \(A \subset A'\), and any \(x \in A'\),
\[ \Delta _xf(A) \ge 0. \]
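As a toy numerical illustration (our own example, not from the paper), the following Python snippet checks both characterizations of Lemma 2.1 on a small coverage function \(f(A)=\bigl|\bigcup _{x\in A}U_x\bigr|\), which is monotone nondecreasing and submodular; all names in the snippet are illustrative.

```python
# Toy check of Lemma 2.1 on a small coverage function (illustrative only).
U = {"a": {1, 2}, "b": {2, 3}, "c": {3, 4, 5}}

def f(A):
    """Coverage function: number of ground items covered by the set A."""
    return len(set().union(*(U[x] for x in A)))

def marginal(x, A):
    """Delta_x f(A) = f(A union {x}) - f(A)."""
    return f(A | {x}) - f(A)

A, A_prime = set(), {"a"}              # A is a proper subset of A'
print(marginal("b", A))                # 2
print(marginal("b", A_prime))          # 1  <= 2: diminishing marginal gains
print(marginal("b", A_prime) >= 0)     # True: marginal gains are nonnegative
```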
Lemma 2.2
Suppose that a set function \(f: 2^V \rightarrow {\mathbb {R}}\) is submodular with \(f(\emptyset )=0\). Then, for any two sets \(A\) and \(A'\) with \(A \cap A' = \emptyset \),
\[ f(A \cup A') \le f(A) + f(A'). \]
Proof
Since \(f\) is submodular, we have
\[ f(A) + f(A') \ge f(A \cup A') + f(A \cap A') = f(A \cup A') + f(\emptyset ) = f(A \cup A'). \]
\(\square \)
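As a quick sanity check on the toy coverage function from the snippet above (our example): with the disjoint sets \(A=\{a\}\) and \(A'=\{b\}\), we get \(f(A \cup A') = 3 \le f(A) + f(A') = 2 + 2 = 4\).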
In this paper, we study the following problem.
Problem 2.3
Let \(V\) be a set of elements with nonnegative cost \(c: V \rightarrow {\mathbb {R}}^+\). Let \(f: 2^V \rightarrow {\mathbb {R}}\) be a monotone nondecreasing and submodular function with \(f(\emptyset )=0\). Given a positive integer \(B\) (the budget), the problem is as follows:
\[ \max \; f(A) \quad \text {subject to}\quad c(A) \le B,\ A \subseteq V, \]
where \(c(A) = \sum _{x \in A} c(x)\).
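As a toy instance (our own, for illustration only): let \(V=\{a,b,c\}\) with \(c(a)=c(b)=1\) and \(c(c)=2\), let \(f\) be the coverage function from the snippet above (\(U_a=\{1,2\}\), \(U_b=\{2,3\}\), \(U_c=\{3,4,5\}\)), and let \(B=2\). Then \(f\) is monotone nondecreasing and submodular with \(f(\emptyset )=0\), and the optimal value is \(3\), attained by \(\{a,b\}\) and by \(\{c\}\).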
In Algorithm 1, we select two subsets \(S^1, S^2\) in two different ways and pick the one with the larger \(f(\cdot )\) value as the solution \(S^+\).
For \(S^1\), among all subsets of \(V\) with at most two elements, we pick the one with the highest \(f(\cdot )\) value.
For \(S^2\), we iteratively select the element with the largest marginal gain per unit cost in \(f(\cdot )\) and add it to \(S^2\), until adding any not-yet-selected element would violate the budget constraint.
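Since the paper describes Algorithm 1 only in words, the following Python sketch records one possible reading of it; the function name algorithm1 and the data layout (a cost dictionary c, a set-valued oracle f) are our own conventions, and the budget check on \(S^1\) is our assumption to keep the returned set feasible.

```python
from itertools import combinations

def algorithm1(V, f, c, B):
    """One possible reading of Algorithm 1: return the better of S1 and S2."""
    # S1: the best budget-feasible subset with at most two elements.
    S1 = frozenset()
    for k in (1, 2):
        for T in combinations(V, k):
            T = frozenset(T)
            if sum(c[x] for x in T) <= B and f(T) > f(S1):
                S1 = T

    # S2: repeatedly add the element with the largest marginal gain per
    # unit cost among the not-yet-selected elements that still fit.
    S2, cost = frozenset(), 0
    while True:
        candidates = [x for x in V if x not in S2 and cost + c[x] <= B]
        if not candidates:
            break
        x = max(candidates, key=lambda y: (f(S2 | {y}) - f(S2)) / c[y])
        S2, cost = S2 | {x}, cost + c[x]

    # S+: the better of the two candidate solutions.
    return S1 if f(S1) > f(S2) else S2
```

On the toy instance above, this sketch returns a solution of value \(3\), which matches the optimum.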
Theorem 2.4
Algorithm 1 produces a \((1-1/e^{2/3})\)-approximation solution in \(O(n^2)\) time for Problem 2.3.
Proof
Clearly, Algorithm 1 runs in \(O(n^2)\) time. Next, we analyze its performance ratio. Note that \(1/2 > 1 - 1/e^{2/3}\). Consider two cases.
Case 1. \(c(S^2) \le \frac{2B}{3}\). Let \(OPT\) be an optimal solution. In this case, for every \(x \in V \setminus S^2\), \(c(x) > B/3\) because, otherwise, \(x\) could have been added to \(S^2\). Since \(c(OPT) \le B\), \(OPT\) contains at most two elements outside of \(S^2\) (three such elements would cost more than \(B\)). If \(OPT \subseteq S^2\), then by the monotone nondecreasing property, \(f(S^2) = opt\), so that \(f(S^+)=opt\). Therefore, we may assume that \(OPT\) contains at least one element outside of \(S^2\). Thus, \(f(OPT {\setminus } S^2) \le f(S^1)\). It follows from Lemma 2.2 and the monotone nondecreasing property that
\[ opt = f(OPT) \le f(OPT \cap S^2) + f(OPT \setminus S^2) \le f(S^2) + f(S^1). \]
Therefore, \(f(S^+)=\max (f(S^1), f(S^2)) \ge \frac{1}{2}\cdot opt\) where \(opt = f(OPT)\).
Case 2. \(c(S^2) > \frac{2B}{3}\). Let \(S^2 = \{x_1, x_2, \ldots , x_g \}\), where the elements are indexed in the order in which they are selected, and denote \(S_i=\{x_1, x_2, \ldots , x_i\}\) with \(S_0 = \emptyset \). Then, by the greedy rule, for every \(i\) and every \(y \in V \setminus S_{i-1}\) that still fits within the budget,
\[ \frac{\Delta _{x_i}f(S_{i-1})}{c(x_i)} \ge \frac{\Delta _{y}f(S_{i-1})}{c(y)}, \]
and \(c(S_g) =c(S^2) > 2B/3 \ge \frac{2}{3}\cdot c(OPT)\). Let \(OPT = \{y_1, y_2, \ldots , y_h\}\) be an optimal solution and denote \(OPT_j=\{y_1, y_2, \ldots , y_j\}\) with \(OPT_0 = \emptyset \). Then, by the monotone nondecreasing property and submodularity,
\[ opt - f(S_{i-1}) \le f(OPT \cup S_{i-1}) - f(S_{i-1}) = \sum _{j=1}^{h} \Delta _{y_j}f(S_{i-1} \cup OPT_{j-1}) \le \sum _{j=1}^{h} \Delta _{y_j}f(S_{i-1}) \le \frac{c(OPT)}{c(x_i)}\,\Delta _{x_i}f(S_{i-1}). \]
Denote \(\alpha _i = opt - f(S_i)\). Then we have
\[ \alpha _{i-1} - \alpha _i = f(S_i) - f(S_{i-1}) = \Delta _{x_i}f(S_{i-1}) \ge \frac{c(x_i)}{c(OPT)}\,\alpha _{i-1}. \]
Hence,
\[ \alpha _i \le \left( 1 - \frac{c(x_i)}{c(OPT)}\right) \alpha _{i-1} \le e^{-c(x_i)/c(OPT)}\,\alpha _{i-1}. \]
Therefore,
\[ \alpha _g \le \alpha _0 \cdot e^{-\sum _{i=1}^{g} c(x_i)/c(OPT)} = opt \cdot e^{-c(S_g)/c(OPT)} \le opt \cdot e^{-2/3}. \]
Hence,
\[ f(S^+) \ge f(S^2) = opt - \alpha _g \ge \left( 1 - e^{-2/3}\right) opt. \]
\(\square \)
3 1/2-approximation
In Algorithm 2, \(S^2\) is selected in the same way as in Algorithm 1; the difference is in how \(S^1\) is generated. In Algorithm 2, we first create a subset \(W\) that contains all elements \(e\) with cost \(c(e) > 3B/10\). Then, among all subsets of \(W\) with at most three elements, the one with the largest \(f(\cdot )\) value is selected as \(S^1\).
As in Algorithm 1, the subset with the larger \(f(\cdot )\) value among \(S^1\) and \(S^2\) is selected as the solution \(S^+\).
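A minimal Python sketch of this enumeration step, under the same illustrative conventions as the sketch of Algorithm 1 (the naming and the budget check on \(S^1\) are our assumptions, not the paper's):

```python
from itertools import combinations

def select_S1_algorithm2(V, f, c, B):
    """One possible reading of how Algorithm 2 builds S1."""
    # W keeps only the "heavy" elements, i.e., those with cost > 3B/10.
    W = [x for x in V if c[x] > 3 * B / 10]
    # S1: the best budget-feasible subset of W with at most three elements.
    S1 = frozenset()
    for k in (1, 2, 3):
        for T in combinations(W, k):
            T = frozenset(T)
            if sum(c[x] for x in T) <= B and f(T) > f(S1):
                S1 = T
    return S1
```

Since \(|W| \le n\), this step evaluates \(f\) on \(O(n^3)\) candidate subsets, which dominates the running time and is consistent with the \(O(n^3)\) bound in Theorem 3.1.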
Theorem 3.1
Algorithm 2 produces a 1/2-approximation solution in \(O(n^3)\) time for Problem 2.3.
Proof
Clearly, Algorithm 2 runs in \(O(n^3)\) time. We next analyze its performance ratio. Consider two cases.
Case 1. \(c(S^2) \le \frac{7B}{10}\). Let \(OPT\) be an optimal solution. In this case, for every \(x \in V \setminus S^2\), \(c(x) > 3B/10\) because, otherwise, \(S^2\) should include \(x\). This fact implies that \(OPT\) contains at most three elements outside of \(S^2\) (four such elements would cost more than \(B\)), and all of them belong to \(W\). If \(OPT \subseteq S^2\), then by the monotone nondecreasing property, \(f(S^2) = opt\), so that \(f(S^+)=opt\). Therefore, we may assume that \(OPT\) contains at least one element outside of \(S^2\). Thus, \(f(OPT {\setminus } S^2) \le f(S^1)\). It follows from Lemma 2.2 and the monotone nondecreasing property that
\[ opt = f(OPT) \le f(OPT \cap S^2) + f(OPT \setminus S^2) \le f(S^2) + f(S^1). \]
Therefore, \(f(S^+)=\max (f(S^1), f(S^2)) \ge \frac{1}{2}\cdot opt\) where \(opt = f(OPT)\).
Case 2. \(c(S^2) > \frac{7B}{10}\). Let \(S^2 = \{x_1, x_2, \ldots , x_g \}\), where the elements are indexed in the order in which they are selected, and denote \(S_i=\{x_1, x_2, \ldots , x_i\}\) with \(S_0 = \emptyset \). Then, by the greedy rule, for every \(i\) and every \(y \in V \setminus S_{i-1}\) that still fits within the budget,
\[ \frac{\Delta _{x_i}f(S_{i-1})}{c(x_i)} \ge \frac{\Delta _{y}f(S_{i-1})}{c(y)}, \]
and \(c(S_g) =c(S^2) > \frac{7B}{10} \ge \frac{7}{10}\cdot c(OPT)\). Let \(OPT = \{y_1, y_2, \ldots , y_h\}\) be an optimal solution and denote \(OPT_j=\{y_1, y_2, \ldots , y_j\}\) with \(OPT_0 = \emptyset \). Then, as in the proof of Theorem 2.4,
\[ opt - f(S_{i-1}) \le \frac{c(OPT)}{c(x_i)}\,\Delta _{x_i}f(S_{i-1}). \]
Denote \(\alpha _i = opt - f(S_i)\). Then we have
\[ \alpha _{i-1} - \alpha _i = \Delta _{x_i}f(S_{i-1}) \ge \frac{c(x_i)}{c(OPT)}\,\alpha _{i-1}. \]
Hence,
\[ \alpha _i \le \left( 1 - \frac{c(x_i)}{c(OPT)}\right) \alpha _{i-1} \le e^{-c(x_i)/c(OPT)}\,\alpha _{i-1}. \]
Therefore,
\[ \alpha _g \le \alpha _0 \cdot e^{-c(S_g)/c(OPT)} \le opt \cdot e^{-7/10}. \]
Hence, since \(e^{-7/10} < 1/2\),
\[ f(S^+) \ge f(S^2) = opt - \alpha _g \ge \left( 1 - e^{-7/10}\right) opt > \frac{1}{2}\cdot opt. \]
\(\square \)
Data availability
There exists no data associated with this manuscript.
References
Kuhnle A, Alim MA, Li X, Zhang H, Thai MT (2018) Multiplex influence maximization in online social networks with heterogeneous diffusion models. IEEE Trans Comput Soc Syst 5(2):418–429
Chen T, Liu B, Liu W, Fang Q, Yuan J, Wu W (2020) A random algorithm for profit maximization in online social networks. Theor Comput Sci 803:36–47
Zhang H, Zhang H, Kuhnle A, Thai MT (2016) Profit maximization for multiple products in online social networks. In: IEEE INFOCOM 2016 - The 35th annual IEEE international conference on computer communications, pp 1–9
Guo J, Wu W (2019) Viral marketing for complementary products. In: Nonlinear combinatorial optimization, pp 309–315
Feige U (1998) A threshold of ln n for approximating set cover. J ACM 45(4):634–652
Sviridenko M (2004) A note on maximizing a submodular set function subject to a knapsack constraint. Oper Res Lett 32(1):41–43
Leskovec J, Krause A, Guestrin C, Faloutsos C, VanBriesen J, Glance N (2007) Cost-effective outbreak detection in networks. In: Proceedings of the 13th ACM SIGKDD international conference on knowledge discovery and data mining, pp 420–429
Du DZ, Pardalos PM, Hu X, Wu W (2022) Introduction to combinatorial optimization. Springer
Funding
Open access funding provided by SCELC, Statewide California Electronic Library Consortium. Xiang Li was supported by NSF CNS-1948550.
Ethics declarations
Conflict of interest
The authors declare that they have no conflict of interest.
Additional information
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.