Title: On the worst-case complexity of the gradient method with exact line search for smooth strongly convex functions
Authors: de Klerk, E. (TU Delft Discrete Mathematics and Optimization; Tilburg University); Glineur, François (Université Catholique de Louvain); Taylor, Adrien B. (Université Catholique de Louvain)
Date: 2016
Abstract: We consider the gradient (or steepest) descent method with exact line search applied to a strongly convex function with Lipschitz continuous gradient. We establish the exact worst-case rate of convergence of this scheme, and show that this worst-case behavior is exhibited by a certain convex quadratic function. We also give the tight worst-case complexity bound for a noisy variant of the gradient descent method, where exact line search is performed in a search direction that differs from the negative gradient by at most a prescribed relative tolerance. The proofs are computer-assisted, and rely on solving semidefinite programming performance estimation problems as introduced by Drori and Teboulle (Math. Program. 145(1–2):451–482, 2014).
Subjects: Gradient method; Steepest descent; Semidefinite programming; Performance estimation problem
To reference this document use: http://resolver.tudelft.nl/uuid:b3ecfccf-5481-4660-ba40-d7d477ae2779
DOI: https://doi.org/10.1007/s11590-016-1087-4
ISSN: 1862-4472
Source: Optimization Letters, 11 (7), 1185–1199
Part of collection: Institutional Repository
Document type: journal article
Rights: © 2016 E. de Klerk, François Glineur, Adrien B. Taylor
Files: PDF, s11590_016_1087_4.pdf (677.12 KB)
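As an illustration of the setting in the abstract: for a quadratic f(x) = ½ xᵀAx, the exact line-search step along the negative gradient g is t = (gᵀg)/(gᵀAg), and on a 2×2 quadratic with eigenvalues μ and L the function value can contract by exactly ((L−μ)/(L+μ))² per iteration (the classical worst-case quadratic; this is the kind of example the paper refers to). The sketch below is only a numerical illustration, not the paper's proof; the values of mu, L and the starting point are illustrative choices.

```python
import numpy as np

# Steepest descent with exact line search on f(x) = 0.5 * x^T A x,
# A = diag(mu, L). Starting from x0 = (1/mu, 1/L), each iteration
# shrinks f by exactly ((L - mu)/(L + mu))**2 (worst-case quadratic).
mu, L = 1.0, 10.0          # illustrative strong-convexity / Lipschitz constants
A = np.diag([mu, L])

def f(x):
    return 0.5 * x @ A @ x

x = np.array([1.0 / mu, 1.0 / L])   # worst-case starting point for this A
rate = ((L - mu) / (L + mu)) ** 2   # predicted per-step contraction factor

ratios = []
for _ in range(5):
    g = A @ x                      # gradient of the quadratic
    t = (g @ g) / (g @ A @ g)      # exact line-search step length
    x_new = x - t * g
    ratios.append(f(x_new) / f(x))
    x = x_new

print(rate, ratios)
```

Observed per-step ratios f(x_{k+1})/f(x_k) match the predicted contraction factor to machine precision on this instance, which is what "the worst-case behavior is exhibited by a certain convex quadratic function" means in the abstract.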