- PII
- S0044466925030105-1
- DOI
- 10.31857/S0044466925030105
- Publication type
- Article
- Status
- Published
- Authors
- Volume / Issue
- Volume 65 / Issue number 3
- Pages
- 364-375
- Abstract
- This paper proposes a new variant of the adaptive Frank–Wolfe algorithm for relatively smooth convex minimization problems. The key idea is to use a divergence other than half of the squared Euclidean norm in the step-size adjustment formula. Convergence rate estimates are proven for minimization problems with relatively smooth convex objective functions satisfying the triangle scaling property. Computational experiments are reported for the Poisson linear inverse problem and SVM models. The paper also identifies the conditions under which the proposed algorithm shows a clear advantage over the adaptive proximal gradient Bregman method and its accelerated variants.
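- To illustrate the idea described in the abstract, below is a minimal sketch (not the authors' exact method and without their convergence guarantees) of an adaptive Frank–Wolfe step in which the usual (L/2)·||s − x||² term of the backtracking test is replaced by L·V(s, x) for a user-supplied divergence V. All names (`fw_adaptive`, `lmo`, `divergence`) and the simplex example are hypothetical.

```python
# Illustrative sketch of an adaptive Frank-Wolfe step with a divergence-based
# backtracking test; not the published algorithm, just the general idea.
import numpy as np

def fw_adaptive(f, grad, lmo, divergence, x0, L0=1.0, n_iter=100):
    """Adaptive Frank-Wolfe with a divergence V(s, x) in the step-size test (sketch)."""
    x, L = x0, L0
    for _ in range(n_iter):
        g = grad(x)
        s = lmo(g)                      # linear minimization oracle over the feasible set
        d = s - x
        V = divergence(s, x)            # replaces 0.5 * ||s - x||^2 of the Euclidean variant
        L = max(L / 2.0, 1e-12)         # optimistic halving of the smoothness estimate
        while True:
            # Step size minimizing the model f(x) + gamma*<g, d> + L*gamma^2*V, clipped to [0, 1].
            gamma = 0.0 if V < 1e-16 else min(1.0, max(0.0, -(g @ d) / (2.0 * L * V)))
            x_new = x + gamma * d
            # Accept the step if the upper-bound model holds; otherwise increase L and retry.
            if f(x_new) <= f(x) + gamma * (g @ d) + L * gamma**2 * V:
                break
            L *= 2.0
        x = x_new
    return x

# Usage example: a quadratic over the probability simplex; the LMO returns the
# vertex (coordinate basis vector) with the smallest gradient entry.
if __name__ == "__main__":
    A = np.array([[2.0, 0.5], [0.5, 1.0]])
    f = lambda x: 0.5 * x @ A @ x
    grad = lambda x: A @ x
    lmo = lambda g: np.eye(len(g))[np.argmin(g)]
    divergence = lambda s, x: 0.5 * np.sum((s - x) ** 2)   # Euclidean case, for illustration only
    x_star = fw_adaptive(f, grad, lmo, divergence, x0=np.array([0.5, 0.5]))
    print(x_star, f(x_star))
```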
- Keywords
- relative smoothness, Frank–Wolfe algorithm, adaptive algorithms, convex optimization, convex minimization, Bregman divergence
- Date of publication
- 17.09.2025
- Year of publication
- 2025
- References
- 1. Bauschke H. H., Bolte J., Teboulle M. A descent lemma beyond Lipschitz gradient continuity: first-order methods revisited and applications // Mathematics of Operations Research. 2017. V. 42. P. 330–348.
- 2. Lu H., Freund R. M., Nesterov Y. Relatively smooth convex optimization by first-order methods, and applications // SIAM Journal on Optimization. 2018. V. 28. P. 333–354.
- 3. Hendrikx H., Xiao L., Bubeck S., Bach F., Massoulie L. Statistically preconditioned accelerated gradient method for distributed optimization // International Conference on Machine Learning. PMLR. 2020. P. 4203–4227.
- 4. Lu H. “Relative continuity” for non-Lipschitz nonsmooth convex optimization using stochastic (or deterministic) mirror descent // INFORMS Journal on Optimization. 2019. V. 1. P. 288–303.
- 5. Nesterov Y. Implementable tensor methods in unconstrained convex optimization // Mathematical Programming. 2021. V. 186. P. 157–183.
- 6. Stonyakin F., Titov A., Alkousa M., Savchuk O., Gasnikov A. Adaptive algorithms for relatively Lipschitz continuous convex optimization problems // Pure and Applied Functional Analysis. 2023. V. 8. P. 1505–1526.
- 7. Dragomir R. A., Taylor A. B., d’Aspremont A., Bolte J. Optimal complexity and certification of Bregman first-order methods // Mathematical Programming. 2022. P. 1–43.
- 8. Hanzely F., Richtarik P., Xiao L. Accelerated Bregman proximal gradient methods for relatively smooth convex optimization // Computational Optimization and Applications. 2021. V. 79. P. 405–440.
- 9. Dragomir R. A. Bregman gradient methods for relatively-smooth optimization // PhD thesis. UT1 Capitole, 2021.
- 10. Combettes C. W., Pokutta S. Complexity of linear minimization and projection on some sets // Operations Research Letters. 2021. V. 49. P. 565–571.
- 11. Bomze I. M., Rinaldi F., Zeffiro D. Frank–Wolfe and friends: a journey into projection-free first-order optimization methods // 4OR-Q J Oper Res. 2021. V. 14. P. 313–345.
- 12. Frank M., Wolfe P. An algorithm for quadratic programming // Naval Research Logistics Quarterly. 1956. V. 3. P. 95–110.
- 13. Aivazian G., Stonyakin F. S., Pasechnyk D., Alkousa M. S., Raigorodsky A., Baran I. Adaptive variant of the Frank–Wolfe algorithm for convex optimization problems // Programming and Computer Software. 2023. V. 49. P. 493–504.
- 14. Polyak B. Introduction to Optimization. 2020.
- 15. Braun G., Carderera A., Combettes C. W., et al. Conditional gradient methods // arXiv preprint arXiv:2211.14103. 2022.