Caltech

Computing and Mathematical Sciences Colloquium

Monday, January 9, 2017
4:00pm to 5:00pm
Annenberg 105
Subgradient Methods for General Convex Optimization
Professor James Renegar, Operations Research and Information Engineering, Cornell University

Subgradient methods are the most basic algorithms in convex optimization, applicable quite generally and having simple convergence proofs, modulo assumptions such as orthogonal projections onto convex sets being readily computable. The fundamentals regarding subgradient methods have changed little in decades, perhaps in part due to a supposition by researchers that not much else can be said for algorithms relying on hardly any problem structure other than convexity and Lipschitz continuity. We explain that, to the contrary, further advances are possible. Under the assumption that a strictly feasible point is known, we display an elementary algorithmic device that removes the need for Lipschitz continuity and replaces orthogonal projections with line searches.
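For context, the classical projected subgradient method the abstract refers to can be sketched as follows. This is a generic illustration (not the speaker's new algorithm): it assumes the orthogonal projection onto the feasible set is readily computable, exactly the assumption the talk aims to relax. The function names, the diminishing step size 1/sqrt(k), and the example problem are all illustrative choices, not taken from the talk.

```python
import numpy as np

def projected_subgradient(f, subgrad, project, x0, steps=5000):
    """Classical projected subgradient method with diminishing steps.

    Assumes `project` computes the orthogonal projection onto the
    feasible convex set -- the "readily computable" assumption noted
    in the abstract. Tracks the best iterate, since subgradient steps
    need not decrease f monotonically.
    """
    x = project(np.asarray(x0, dtype=float))
    best_x, best_f = x.copy(), f(x)
    for k in range(1, steps + 1):
        g = subgrad(x)                      # any subgradient of f at x
        x = project(x - g / np.sqrt(k))     # step size 1/sqrt(k), then project
        fx = f(x)
        if fx < best_f:
            best_x, best_f = x.copy(), fx
    return best_x, best_f

# Illustrative problem: minimize the nonsmooth f(x) = ||x||_1
# over the Euclidean ball of radius 1 centered at c = (2, 0);
# the minimizer is (1, 0) with optimal value 1.
c = np.array([2.0, 0.0])
f = lambda x: np.abs(x).sum()
subgrad = lambda x: np.sign(x)              # a valid subgradient of the 1-norm

def project(x):
    """Orthogonal projection onto the ball ||x - c|| <= 1."""
    d = x - c
    n = np.linalg.norm(d)
    return c + d / n if n > 1.0 else x

x_star, f_star = projected_subgradient(f, subgrad, project, x0=[3.0, 3.0])
```

Note that the method handles the nonsmooth objective without any gradient existing at the kinks, but the convergence guarantee leans on a Lipschitz bound for f and on the projection being cheap; the talk's device dispenses with both, given a strictly feasible point.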

For more information, please contact Carmen Nemer-Sirois by phone at (626) 395-4561 or by email at [email protected].