"The exact worst-case convergence rate of the gradient method with fixed step lengths for L-smooth functions"
Date: 9 June 2021
10:00-11:00 - Seminar, open to everyone
11:00-12:00 - Project meeting, by invitation only
Non-convex optimization arises naturally in many applications, including machine learning and recommendation systems. Recently, semidefinite programming performance estimation has been employed as a framework for the analysis of first-order methods. Using this approach, we study the convergence rate of the gradient (or steepest descent) method with fixed step lengths for finding a stationary point of an L-smooth function. We establish a new worst-case convergence rate and, by giving an example, demonstrate that the bound may be exact in some cases. In addition, based on the bound, we derive an optimal step length. Furthermore, we discuss the application of performance estimation to the investigation of stochastic gradient descent, which is commonly used in training large neural networks.
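For readers unfamiliar with the method under study, the following is a minimal sketch (not the analysis presented in the talk) of the gradient method with a fixed step length applied to a simple L-smooth test function, tracking the gradient norm as the measure of progress toward a stationary point. The test function and step length 1/L are illustrative choices, not taken from the paper.

```python
import math

def gradient_method_fixed_step(grad, x0, step, iters):
    """Run x_{k+1} = x_k - step * grad(x_k) and record |grad(x_k)|."""
    x = x0
    grad_norms = []
    for _ in range(iters):
        g = grad(x)
        grad_norms.append(abs(g))
        x = x - step * g
    return x, grad_norms

# Illustrative L-smooth, non-convex function: f(x) = x^2 + 3*sin(x).
# Its second derivative is 2 - 3*sin(x), so |f''(x)| <= 5 and L = 5.
grad_f = lambda x: 2 * x + 3 * math.cos(x)

# Classic fixed step length 1/L (the talk derives an optimal choice).
x_final, norms = gradient_method_fixed_step(grad_f, x0=2.0, step=1.0 / 5, iters=200)
```

After the run, `abs(grad_f(x_final))` is small, i.e. the iterate is near a stationary point; worst-case analyses of this method bound how fast the smallest gradient norm along the trajectory decreases with the iteration count.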
*Co-authored with Etienne de Klerk (Tilburg University).
The research project Optimization for and with Machine Learning (OPTIMAL) is an ENW-GROOT research project funded by NWO. It is a collaboration between researchers from Centrum Wiskunde & Informatica (CWI), Delft University of Technology (TU Delft) and the University of Amsterdam (UvA). It started in 2020 and will end in 2025. Research from this project will be shared in various events. For more information about this research project visit the project site OPTIMAL.uva.nl and the events calendar.
This seminar will be held via Zoom. Attendance is by invitation only. Please send an e-mail to email@example.com if you are interested in attending this seminar.