Abstract
In this paper, we introduce a new algorithm for solving variational inequality problems with monotone and Lipschitz-continuous mappings in real Hilbert spaces. The algorithm requires only one projection onto the feasible set per iteration. Under mild assumptions, we prove a strong convergence theorem: the sequence generated by the proposed algorithm converges strongly to a solution of the variational inequality problem. Finally, we present numerical experiments illustrating the performance of the proposed algorithm on variational inequality problems.
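To make the one-projection-per-iteration property concrete, the following is a minimal sketch of the classical Tseng forward-backward-forward step that the proposed algorithm builds on; it is not the authors' strongly convergent variant, and the names tseng_step, A, project_C, and lam are illustrative placeholders.

```python
import numpy as np

def tseng_step(x, A, project_C, lam):
    """One iteration of the classical Tseng forward-backward-forward scheme.

    Only a single projection onto the feasible set C is needed per iteration:
        y      = P_C(x - lam * A(x))
        x_next = y - lam * (A(y) - A(x))
    Here A is assumed monotone and L-Lipschitz, and lam should satisfy 0 < lam < 1/L.
    """
    Ax = A(x)
    y = project_C(x - lam * Ax)        # the only projection in this iteration
    x_next = y - lam * (A(y) - Ax)     # correction step, no second projection
    return x_next

# Toy example: the variational inequality for A(x) = M x + q over the
# nonnegative orthant (a linear complementarity problem), solved by
# repeatedly applying the Tseng step.
if __name__ == "__main__":
    M = np.array([[4.0, 1.0], [1.0, 3.0]])
    q = np.array([-1.0, -2.0])
    A = lambda x: M @ x + q
    project_C = lambda z: np.maximum(z, 0.0)   # projection onto C = R^n_+
    L = np.linalg.norm(M, 2)                   # Lipschitz constant of A
    lam = 0.9 / L

    x = np.zeros(2)
    for _ in range(200):
        x = tseng_step(x, A, project_C, lam)
    print("approximate solution:", x)
```

The sketch shows why the per-iteration cost is a single projection plus two evaluations of the mapping; the paper's contribution is a modification of this scheme that retains that cost while guaranteeing strong (rather than weak) convergence in Hilbert spaces.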
Acknowledgements
The authors would like to thank the two anonymous reviewers for their comments, which helped us greatly in improving the presentation of the original version of this paper. This paper was completed while the first two authors were visiting the Vietnam Institute for Advanced Study in Mathematics (VIASM); they thank the VIASM for its financial support and hospitality. The second-named author is funded by the Vietnam National Foundation for Science and Technology Development (NAFOSTED) under Grant No. 101.01-2017.08.
Cite this article
Thong, D.V., Vinh, N.T. & Cho, Y.J. A strong convergence theorem for Tseng’s extragradient method for solving variational inequality problems. Optim Lett 14, 1157–1175 (2020). https://doi.org/10.1007/s11590-019-01391-3