LNMB Conference 2016

James Renegar: Greater Conic Structure Provides the Basis for Faster Algorithms

Abstract: The framework presented in the previous lecture applies to optimization problems defined over arbitrary convex cones. It yields a convex objective function that is Lipschitz continuous but not smooth; to apply accelerated gradient methods, however, smoothness of the objective function is essential.
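For reference, the standard convergence rates explain why smoothness matters here (the notation below is generic and not taken from the lecture; f^\star denotes the optimal value and x^\star a minimizer). For a convex f that is merely G-Lipschitz, subgradient methods guarantee only
\[
  f(x_k) - f^\star \;=\; O\!\left(\frac{G\,\|x_0 - x^\star\|}{\sqrt{k}}\right),
\]
whereas for a convex f with L-Lipschitz gradient, accelerated gradient methods achieve
\[
  f(x_k) - f^\star \;=\; O\!\left(\frac{L\,\|x_0 - x^\star\|^2}{k^{2}}\right).
\]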
In this lecture we present the basics of hyperbolic programming, which generalizes semidefinite programming. Most practical optimization problems are hyperbolic programs. Hyperbolic programming was introduced by Güler in the mid-1990s and has direct ties to many research topics, including the recent, celebrated work of Marcus, Spielman and Srivastava resolving the long-open Kadison-Singer problem.
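For readers unfamiliar with the terminology, the standard definitions are as follows (the notation is generic and intended only as a reminder). A homogeneous polynomial p of degree d on \mathbb{R}^n is hyperbolic in direction e if p(e) \neq 0 and, for every x, the univariate polynomial t \mapsto p(x - te) has only real roots. The associated hyperbolicity cone is
\[
  \Lambda_{+}(p,e) \;=\; \bigl\{\, x \in \mathbb{R}^n \;:\; \text{every root of } t \mapsto p(x - te) \text{ is nonnegative} \,\bigr\},
\]
and a hyperbolic program minimizes a linear functional over the intersection of \Lambda_{+}(p,e) with an affine subspace. Taking p(X) = \det X on the space of symmetric matrices and e = I gives the cone of positive semidefinite matrices, which is how semidefinite programming arises as a special case.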
Hyperbolicity cones - the convex cones underlying hyperbolic programs - have pronounced algebraic/geometric structure that can be used to design faster algorithms. We show this for both first-order and second-order methods. Regarding first-order methods in particular, we show that the structure provides a natural smoothing of the objective function in the previous framework, thereby allowing accelerated gradient methods to be applied.
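To make the nonsmoothness concrete (again in generic notation, not that of the lecture): writing \lambda_1(x) \le \dots \le \lambda_d(x) for the roots of t \mapsto p(te - x), one has \Lambda_{+}(p,e) = \{ x : \lambda_1(x) \ge 0 \}, and the minimum-eigenvalue function
\[
  \lambda_{\min}(x) \;=\; \min\bigl\{\, t \;:\; p(te - x) = 0 \,\bigr\}
\]
is concave and Lipschitz (with respect to a suitable norm) but fails to be differentiable wherever the smallest root is multiple; for the semidefinite cone it is the usual minimum eigenvalue of a symmetric matrix. Eigenvalue structure of this kind, available for every hyperbolicity cone, is an example of the algebraic/geometric structure alluded to above.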