In this short note, we provide a simple version of an accelerated forward-backward method (a.k.a. Nesterov’s accelerated proximal gradient method) that can rely on approximate proximal operators and exploit strong convexity of the objective function. The method supports both relative and absolute errors, and its behavior is illustrated on a set of standard numerical experiments.
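For context, one common way of formalizing an approximate proximal operator in this literature relies on the ε-subdifferential of the nonsmooth part h; it is given here only as an illustration, and the precise error notions and conditions used in the note may differ:
\[
z \approx_{\varepsilon} \operatorname{prox}_{\lambda h}(x)
\quad\Longleftrightarrow\quad
\frac{x - z}{\lambda} \in \partial_{\varepsilon} h(z),
\]
where an absolute-error rule prescribes the accuracy sequence $(\varepsilon_k)$ in advance (for instance, a summable sequence), whereas a relative-error rule lets $\varepsilon_k$ depend on the current iterates, e.g. $\varepsilon_k \le \sigma \|x_k - z_k\|^2 / (2\lambda_k)$ for some $\sigma \in [0,1)$.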
Using the same developments, we further provide a version of the accelerated hybrid proximal extragradient method of [21] that can also exploit strong convexity of the objective function.
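To make the setting concrete, the sketch below shows a classical accelerated forward-backward scheme of this type: a gradient step on the smooth part f followed by a (possibly approximate) proximal step on h, with FISTA-type momentum when the strong convexity parameter mu is zero and constant momentum (1 − √(mu/L))/(1 + √(mu/L)) otherwise. It is a minimal illustration of the algorithmic template, not the specific method or error rules analyzed in the note; the function and parameter names (accelerated_forward_backward, approx_prox_h, etc.) are ours.

# Minimal sketch of an accelerated forward-backward (proximal gradient) method
# accepting a possibly *approximate* proximal oracle and a strong-convexity
# parameter mu.  Illustrative only: classical scheme, not the exact method or
# error criteria analyzed in the paper.
import numpy as np

def accelerated_forward_backward(grad_f, approx_prox_h, x0, L, mu=0.0, n_iter=100):
    """Minimize f(x) + h(x), with f L-smooth and mu-strongly convex.

    grad_f(x)               -> gradient of the smooth part f
    approx_prox_h(v, gamma) -> (approximate) proximal point of gamma * h at v
    """
    x = x_prev = np.asarray(x0, dtype=float)
    y = x.copy()
    t = 1.0
    gamma = 1.0 / L                                   # step size
    beta_sc = (1 - np.sqrt(mu / L)) / (1 + np.sqrt(mu / L)) if mu > 0 else None
    for _ in range(n_iter):
        # forward (gradient) step on f, then (approximate) backward step on h
        x = approx_prox_h(y - gamma * grad_f(y), gamma)
        if mu > 0:                                    # constant momentum (strongly convex case)
            beta = beta_sc
        else:                                         # FISTA-type momentum (mu = 0)
            t_next = (1 + np.sqrt(1 + 4 * t ** 2)) / 2
            beta = (t - 1) / t_next
            t = t_next
        y = x + beta * (x - x_prev)
        x_prev = x
    return x

# Usage example: LASSO, min 0.5*||Ax - b||^2 + lam*||x||_1.  Here the proximal
# operator of the l1 norm (soft-thresholding) is exact, but any approximate
# oracle with the same signature could be plugged in instead.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A, b, lam = rng.standard_normal((50, 20)), rng.standard_normal(50), 0.1
    L = np.linalg.norm(A, 2) ** 2                     # Lipschitz constant of the smooth gradient
    grad_f = lambda x: A.T @ (A @ x - b)
    soft_thr = lambda v, gamma: np.sign(v) * np.maximum(np.abs(v) - gamma * lam, 0.0)
    x_hat = accelerated_forward_backward(grad_f, soft_thr, np.zeros(20), L, n_iter=300)
    print(0.5 * np.linalg.norm(A @ x_hat - b) ** 2 + lam * np.abs(x_hat).sum())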
Barré, Mathieu; Taylor, Adrien; Bach, Francis. A note on approximate accelerated forward-backward methods with absolute and relative errors, and possibly strongly convex objectives. Open Journal of Mathematical Optimization, Volume 3 (2022), article no. 1, 15 p. doi: 10.5802/ojmo.12. http://www.numdam.org/articles/10.5802/ojmo.12/
[1] Variants of the A-HPE and large-step A-HPE algorithms for strongly convex problems with applications to accelerated high-order tensor methods (2021) (https://arxiv.org/abs/2102.02045v1)
[2] Potential-function proofs for gradient methods, Theory Comput., Volume 15 (2019) no. 1, 4, 32 pages | MR | Zbl
[3] Principled Analyses and Design of First-Order Methods with Inexact Proximal Operators (2020) (https://arxiv.org/abs/2006.06041v2)
[4] Principled Analyses and Design of First-Order Methods with Inexact Proximal Operators (2021) (https://arxiv.org/abs/2006.06041)
[5] A fast iterative shrinkage-thresholding algorithm for linear inverse problems, SIAM J. Imaging Sci., Volume 2 (2009) no. 1, pp. 183-202 | DOI | MR | Zbl
[6] On Inexact Accelerated Proximal Gradient Methods with Relative Error Rules (2020) (https://arxiv.org/abs/2005.03766)
[7] On the subdifferentiability of convex functions, Proc. Am. Math. Soc., Volume 16 (1965) no. 4, pp. 605-611 | DOI | MR | Zbl
[8] Enlargement of monotone operators with applications to variational inequalities, Set-Valued Anal., Volume 5 (1997) no. 2, pp. 159-180 | DOI | MR | Zbl
[9] ε-Enlargements of maximal monotone operators: Theory and applications, Reformulation: nonsmooth, piecewise smooth, semismooth and smoothing methods, Springer, 1998, pp. 25-43 | Zbl
[10] Robustness of the hybrid extragradient proximal-point algorithm, J. Optim. Theory Appl., Volume 111 (2001) no. 1, pp. 117-136 | DOI | MR | Zbl
[11] An introduction to continuous optimization for imaging, Acta Numer., Volume 25 (2016), pp. 161-319 | DOI | MR | Zbl
[12] LIBSVM: A library for support vector machines, ACM Trans. Intell. Syst. Technol., Volume 2 (2011), 27, 37 pages (Software available at http://www.csie.ntu.edu.tw/~cjlin/libsvm)
[13] The Proximity Operator Repository. User’s guide (2020) http://proximity-operator.net/download/guide.pdf
[14] Acceleration methods, Foundations and Trends® in Optimization, Volume 5 (2021) no. 1-2, pp. 1-245 | DOI
[15] New proximal point algorithms for convex minimization, SIAM J. Optim., Volume 2 (1992) no. 4, pp. 649-664 | DOI | MR | Zbl
[16] Convex analysis and minimization algorithms I: Fundamentals, Grundlehren der Mathematischen Wissenschaften, 305, Springer, 2013
[17] Proximal methods for sparse hierarchical dictionary learning, Proceedings of the 27th International Conference on Machine Learning (ICML), Association for Computing Machinery (2010), pp. 487-494 | Zbl
[18] An inexact accelerated proximal gradient method for large scale linearly constrained convex SDP, SIAM J. Optim., Volume 22 (2012) no. 3, pp. 1042-1064 | DOI | MR | Zbl
[19] Convex and network flow optimization for structured sparsity, J. Mach. Learn. Res., Volume 12 (2011) no. Sep, pp. 2681-2720 | MR | Zbl
[20] Inexact proximal ε-subgradient methods for composite convex optimization problems, J. Glob. Optim., Volume 75 (2019) no. 4, pp. 1029-1060 | DOI | Zbl
[21] An accelerated hybrid proximal extragradient method for convex optimization and its implications to second-order methods, SIAM J. Optim., Volume 23 (2013) no. 2, pp. 1092-1125 | DOI | MR | Zbl
[22] A method of solving a convex programming problem with convergence rate O(1/k^2), Sov. Math., Dokl., Volume 27 (1983), pp. 372-376
[23] Introductory Lectures on Convex Optimization: a Basic Course, Applied Optimization, Kluwer Academic Publishing, 2004 | DOI
[24] Gradient methods for minimizing composite functions, Math. Program., Volume 140 (2013) no. 1, pp. 125-161 | DOI | MR | Zbl
[25] Proximal algorithms, Foundations and Trends® in Optimization, Volume 1 (2014) no. 3, pp. 127-239 | DOI
[26] Convex Analysis, Princeton Landmarks in Mathematics, Princeton University Press, 1997
[27] Total variation based image restoration with free local constraints, Proceedings of 1st International Conference on Image Processing, IEEE (1994), pp. 31-35 | DOI
[28] Nonlinear total variation based noise removal algorithms, Physica D, Volume 60 (1992) no. 1-4, pp. 259-268 | DOI | MR | Zbl
[29] Primer on monotone operator methods, Appl. Comput. Math., Volume 15 (2016) no. 1, pp. 3-43 | MR | Zbl
[30] Inexact and Accelerated Proximal Point Algorithms, J. Convex Anal., Volume 19 (2012) no. 4, pp. 1167-1192 | MR | Zbl
[31] Convergence rates of inexact proximal-gradient methods for convex optimization, Advances in neural information processing systems (NIPS), MIT Press (2011), pp. 1458-1466
[32] A hybrid approximate extragradient–proximal point algorithm using the enlargement of a maximal monotone operator, Set-Valued Anal., Volume 7 (1999) no. 4, pp. 323-345 | DOI | MR | Zbl
[33] A comparison of rates of convergence of two inexact proximal point algorithms, Nonlinear optimization and related topics, Springer, 2000, pp. 415-427 | DOI | Zbl
[34] Smooth strongly convex interpolation and exact worst-case performance of first-order methods, Math. Program., Volume 161 (2017) no. 1-2, pp. 307-345 | DOI | MR | Zbl
[35] On accelerated proximal gradient methods for convex-concave optimization (2008) (Technical report)
[36] Accelerated and inexact forward-backward algorithms, SIAM J. Optim., Volume 23 (2013) no. 3, pp. 1607-1633 | DOI | MR | Zbl
[37] A new alternating minimization algorithm for total variation image reconstruction, SIAM J. Imaging Sci., Volume 1 (2008) no. 3, pp. 248-272 | DOI | MR | Zbl
[38] A Lyapunov Analysis of Accelerated Methods in Optimization, J. Mach. Learn. Res., Volume 22 (2021) no. 113, pp. 1-34 | MR | Zbl