EE Systems Seminar
Abstract:
This talk introduces new scalable and universal convex optimization algorithms for the prototypical constrained minimization template, which has a host of applications in signal processing, control, and machine learning. The new algorithms not only match theoretical lower bounds on their iteration complexity, but also rigorously trade off their convergence rates with their per-iteration time by optimally adapting to the unknown smoothness structures in the optimization problem.
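For concreteness, one common instance of the constrained minimization template referred to above (written here in a standard form; the talk may treat a more general variant) is

```latex
\min_{x \in \mathcal{X}} \; f(x)
\quad \text{subject to} \quad Ax = b,
```

where $f$ is a convex function, $\mathcal{X}$ is a convex set, and the linear constraint $Ax = b$ couples the variables; signal recovery, control, and machine learning problems are routinely cast in this form.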
We first describe our new model-based gap reduction technique [1] for designing optimal first-order primal-dual methods. Through a dual smoothing strategy, our technique also handles the augmented Lagrangian and alternating methods as special cases. As a result of our technique, we identify a fundamental uncertainty principle describing how the primal optimality of the algorithmic iterates must compete with their feasibility. We illustrate the algorithmic framework for a large-scale compressive sensing problem, which involves recovering a 1-Gigapixel image.
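As background for the augmented Lagrangian methods mentioned above (these are the textbook definitions, not the talk's specific gap-reduction scheme), the augmented Lagrangian of the template attaches a dual variable $\lambda$ and a penalty $\beta > 0$ to the constraint:

```latex
\mathcal{L}_\beta(x, \lambda)
  = f(x) + \langle \lambda, \, Ax - b \rangle
  + \frac{\beta}{2} \, \| Ax - b \|_2^2,
```

and a basic method alternates a primal minimization $x_{k+1} \in \arg\min_x \mathcal{L}_\beta(x, \lambda_k)$ with a dual ascent step $\lambda_{k+1} = \lambda_k + \beta \, (A x_{k+1} - b)$. The trade-off between decreasing $f$ and driving $\|Ax - b\|$ to zero in such schemes is exactly the primal optimality versus feasibility tension described above.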
We then discuss how to remove the standard proximal mapping computations taken for granted in primal-dual constrained convex minimization [2]. We instead present a computationally cheaper alternative, which generalizes the linear minimization oracle, a hallmark of Frank-Wolfe-type algorithms. Based on our generalization, we build a new universal primal-dual scheme, which provides optimal convergence guarantees for the objective residual, as well as the feasibility gap, without having to know the smoothness structures of the problem. We illustrate the framework for solving robust low-rank matrix recovery problems (matrix completion with sparse perturbations and quantum tomography) with rank-1 updates.
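To illustrate what a linear minimization oracle buys, here is a minimal sketch of the classical Frank-Wolfe method (not the universal primal-dual scheme from the talk) on a least-squares problem over the $\ell_1$ ball. The oracle only has to minimize a linear function over the constraint set, which is far cheaper than a projection or proximal step; over the nuclear-norm ball the analogous oracle returns a rank-1 matrix, which is the source of the rank-1 updates mentioned above. The problem sizes and the sparse test signal below are illustrative choices.

```python
import numpy as np

def lmo_l1(grad, radius=1.0):
    # Linear minimization oracle over the l1 ball:
    # argmin_{||s||_1 <= radius} <grad, s> is attained at a
    # signed, scaled coordinate vector (a vertex of the ball).
    i = np.argmax(np.abs(grad))
    s = np.zeros_like(grad)
    s[i] = -radius * np.sign(grad[i])
    return s

def frank_wolfe(A, b, radius=1.0, iters=1000):
    # Minimize 0.5 * ||A x - b||^2 over the l1 ball of the given radius
    # using only gradient evaluations and the linear minimization oracle.
    x = np.zeros(A.shape[1])
    for k in range(iters):
        grad = A.T @ (A @ x - b)
        s = lmo_l1(grad, radius)
        gamma = 2.0 / (k + 2)            # standard Frank-Wolfe step size
        x = (1 - gamma) * x + gamma * s  # convex combination stays feasible
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))
x_true = np.zeros(20)
x_true[:3] = [0.5, -0.3, 0.2]            # sparse signal with ||x||_1 = 1
b = A @ x_true
x_hat = frank_wolfe(A, b, radius=1.0, iters=1000)
print(f"residual: {np.linalg.norm(A @ x_hat - b):.4f}")
```

Each iterate is a convex combination of at most $k$ oracle outputs, so sparsity (or low rank, in the matrix setting) of the iterates comes for free, at the price of the slower $O(1/k)$ convergence rate that the universal scheme in [2] is designed to improve upon adaptively.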
Biography:
Volkan Cevher received the B.S. degree (valedictorian) in electrical engineering in 1999 from Bilkent University in Ankara, Turkey, and the Ph.D. degree in electrical and computer engineering in 2005 from the Georgia Institute of Technology in Atlanta. He held research scientist positions at the University of Maryland, College Park, from 2006 to 2007 and at Rice University in Houston, Texas, from 2008 to 2009. Currently, he is an Assistant Professor at the Swiss Federal Institute of Technology Lausanne with a complementary appointment in the Electrical and Computer Engineering Department at Rice University. His research interests include machine learning, optimization, signal processing, and information theory. He received a Best Paper Award at SPARS in 2009 and an ERC Starting Grant in 2011.