Conference 2023

François Glineur:
An introduction to performance estimation of first-order methods

Abstract: First-order methods are well suited to solving large-scale continuous optimization models because of their simplicity and relatively cheap iteration cost. We will start this presentation with a very high-level overview of first-order methods and their main advantages. We will then describe the performance estimation methodology, originating in the work of Drori and Teboulle in 2014, which allows tight performance guarantees to be computed for a large class of first-order methods in a completely automated manner.

We consider a large class of black-box, oracle-based first-order optimization methods designed to solve convex problems, which includes classical and accelerated gradient methods (including constrained and proximal variants) for smooth (strongly) convex optimization.
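
To make the methodology concrete, here is a schematic statement of the performance estimation problem (PEP) for N steps of gradient descent with step size 1/L on an L-smooth convex function f, where R bounds the initial distance to a minimizer x_* (N, R and x_* are notation introduced here for illustration only):

\begin{align*}
\max_{f,\;x_0,\dots,x_N,\;x_*}\ \ & f(x_N) - f(x_*) \\
\text{s.t.}\ \ & f \text{ is convex and } L\text{-smooth},\\
& x_{k+1} = x_k - \tfrac{1}{L}\,\nabla f(x_k), \quad k = 0,\dots,N-1,\\
& \nabla f(x_*) = 0, \qquad \|x_0 - x_*\| \le R.
\end{align*}

By construction, the optimal value of this problem is the tight worst-case accuracy of the method after N iterations over the whole problem class.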

We show that the worst-case performance of these methods can be computed exactly by solving a semidefinite optimization problem (often improving the previously best known rates), while also producing an explicit problem instance for which the worst case is attained. Our procedure also provides independently checkable proofs of the exact convergence rates (in function value, gradient residual norm and distance to the solution).
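
To sketch the key step behind this reformulation (in the smooth convex setting above): the functional constraint "f is convex and L-smooth" can be replaced by a finite set of interpolation conditions involving only the iterates, gradients and function values, namely, for all pairs of indices i, j in {0, ..., N, *},

\[
f(x_i) \;\ge\; f(x_j) + \langle \nabla f(x_j),\, x_i - x_j \rangle + \frac{1}{2L}\,\|\nabla f(x_i) - \nabla f(x_j)\|^2 .
\]

These conditions are linear in the function values and in the entries of the Gram matrix of the iterates and gradients, which is what turns the a priori infinite-dimensional performance estimation problem into a tractable semidefinite program.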

We will conclude with a quick overview of two toolboxes implementing our framework: PESTO (Performance Estimation Toolbox, for MATLAB) and PEPit (Performance Estimation in Python), both available from https://github.com/PerformanceEstimation
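
As an illustration of what these toolboxes automate, the following minimal sketch (written against the PEPit interface as described in its documentation; names and signatures may differ slightly between versions) computes the exact worst-case of n steps of gradient descent with step size 1/L on an L-smooth convex function:

    from PEPit import PEP
    from PEPit.functions import SmoothConvexFunction

    L, n = 1.0, 5                      # smoothness constant and number of iterations

    problem = PEP()

    # Declare an L-smooth convex objective, a minimizer xs and its value fs
    func = problem.declare_function(SmoothConvexFunction, L=L)
    xs = func.stationary_point()
    fs = func(xs)

    # Starting point at distance at most 1 from the minimizer
    x0 = problem.set_initial_point()
    problem.set_initial_condition((x0 - xs) ** 2 <= 1)

    # Run n steps of gradient descent with step size 1/L
    x = x0
    for _ in range(n):
        x = x - (1 / L) * func.gradient(x)

    # Worst-case value of f(x_n) - f_* over all admissible problem instances
    problem.set_performance_metric(func(x) - fs)
    tau = problem.solve()
    print(tau)   # expected to match the tight bound L / (4*n + 2) of Drori and Teboulle

The returned worst-case value comes with an associated worst-case problem instance, in line with the tightness guarantees described above.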

(This part is mainly joint work with Adrien B. Taylor and Julien M. Hendrickx, as well as with Baptiste Goujaud and Céline Moucer for the PEPit toolbox.)