On the worst-case complexity of the gradient method with exact line search for smooth strongly convex functions

Journal Article (2016)
Author(s)

E. De Klerk (Tilburg University, TU Delft - Discrete Mathematics and Optimization)

F. Glineur (Université Catholique de Louvain)

Adrien B. Taylor (Université Catholique de Louvain)

Research Group
Discrete Mathematics and Optimization
Copyright
© 2016 E. de Klerk, François Glineur, Adrien B. Taylor
DOI related publication
https://doi.org/10.1007/s11590-016-1087-4
Publication Year
2016
Language
English
Issue number
7
Volume number
11
Pages (from-to)
1185–1199
Reuse Rights

Other than for strictly personal use, it is not permitted to download, forward or distribute the text or part of it, without the consent of the author(s) and/or copyright holder(s), unless the work is under an open content license such as Creative Commons.

Abstract

We consider the gradient (or steepest) descent method with exact line search applied to a strongly convex function with Lipschitz continuous gradient. We establish the exact worst-case rate of convergence of this scheme, and show that this worst-case behavior is exhibited by a certain convex quadratic function. We also give the tight worst-case complexity bound for a noisy variant of the gradient descent method, where exact line search is performed in a search direction that differs from the negative gradient by at most a prescribed relative tolerance. The proofs are computer-assisted, and rely on the solution of semidefinite programming performance estimation problems, as introduced by Drori and Teboulle (Math Progr 145(1–2):451–482, 2014).
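As an illustration of the scheme analyzed in the abstract, the following minimal sketch (not taken from the paper) runs gradient descent with exact line search on a convex quadratic f(x) = ½ xᵀAx, the setting in which the exact step size has a closed form; the matrix, starting point, and iteration count are arbitrary choices for demonstration only.

```python
import numpy as np

def gradient_descent_exact_ls(A, x0, num_iters=50):
    """Gradient descent with exact line search on f(x) = 0.5 * x^T A x
    (A symmetric positive definite). Illustrative sketch only."""
    x = x0.astype(float)
    for _ in range(num_iters):
        g = A @ x                        # gradient of 0.5 * x^T A x
        if np.allclose(g, 0.0):
            break
        t = (g @ g) / (g @ (A @ g))      # exact line-search step along -g
        x = x - t * g
    return x

# Hypothetical example: a 2x2 quadratic with strong convexity parameter
# mu = 1 and gradient Lipschitz constant L = 10 (condition number 10).
A = np.diag([1.0, 10.0])
x_final = gradient_descent_exact_ls(A, np.array([10.0, 1.0]))
```

On such ill-conditioned quadratics the iterates zigzag between two alternating directions, which is the kind of quadratic worst-case behavior the abstract refers to.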