Caltech

IST LUNCH BUNCH

Tuesday, January 17, 2017
12:00pm to 1:00pm
Annenberg 105
Learning to Optimize for Structured Output Spaces
Yisong Yue, Assistant Professor, Computing and Mathematical Sciences, California Institute of Technology

In many settings, predictions must be made over structured output spaces. Examples include discrete structures, such as sequences and clusterings, as well as continuous ones, such as trajectories. The conventional machine learning approach to such "structured prediction" problems is to learn over a holistically pre-specified structured model class (e.g., via conditional random fields or structural SVMs). In this talk, I will discuss recent work along an alternative direction of using learning reductions, or "learning to optimize".

In learning to optimize, the goal is to reduce the structured prediction problem into a sequence of standard prediction problems that can be solved via conventional supervised learning. Such an approach is attractive because it can easily leverage powerful function classes such as random forests and deep neural nets. The main challenge lies in identifying a good learning reduction that is both principled and practical. I will discuss two projects in detail: contextual submodular optimization, and smooth online sequence prediction.
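To make the reduction idea concrete, here is a minimal toy sketch (all names, features, and data are illustrative assumptions, not taken from the talk or the papers it covers): a set-selection problem with a submodular coverage objective is reduced to ordinary regression. Rolling out a greedy policy under the true objective yields plain (features, marginal gain) training pairs; any conventional regressor fit on those pairs can then drive greedy selection at test time.

```python
# Hypothetical illustration of a learning reduction for contextual
# submodular optimization. Toy universe: each item covers a set of elements.
ITEMS = {
    "a": {1, 2, 3},
    "b": {3, 4},
    "c": {5, 6, 7, 8},
    "d": {1, 5},
}

def coverage(chosen):
    """Submodular objective: number of distinct elements covered."""
    return len(set().union(*(ITEMS[i] for i in chosen)))

def true_gain(item, chosen):
    """Marginal gain of adding `item` to the current set."""
    return coverage(chosen + [item]) - coverage(chosen)

def features(item, chosen):
    """Context features: item size and overlap with what is already covered.
    (Chosen so the true gain is exactly linear: size - overlap.)"""
    covered = set().union(*(ITEMS[i] for i in chosen)) if chosen else set()
    return [len(ITEMS[item]), len(ITEMS[item] & covered)]

def greedy(k, gain_fn):
    """Pick k items, each time taking the largest predicted marginal gain."""
    chosen = []
    for _ in range(k):
        best = max((i for i in ITEMS if i not in chosen),
                   key=lambda i: gain_fn(i, chosen))
        chosen.append(best)
    return chosen

# Reduction, step 1: roll out greedy under the TRUE objective, harvesting
# standard supervised regression examples at every step.
X, y, rollout = [], [], []
for _ in range(2):
    for i in ITEMS:
        if i not in rollout:
            X.append(features(i, rollout))
            y.append(true_gain(i, rollout))
    rollout.append(max((i for i in ITEMS if i not in rollout),
                       key=lambda i: true_gain(i, rollout)))

# Reduction, step 2: fit the regressor (here, least squares via plain SGD).
w = [0.0, 0.0]
for _ in range(2000):
    for xi, yi in zip(X, y):
        err = w[0] * xi[0] + w[1] * xi[1] - yi
        w[0] -= 0.01 * err * xi[0]
        w[1] -= 0.01 * err * xi[1]

def learned_gain(item, chosen):
    f = features(item, chosen)
    return w[0] * f[0] + w[1] * f[1]

# The learned predictor now replaces the true objective inside greedy.
picked = greedy(2, learned_gain)
print(picked, coverage(picked))
```

The point of the sketch is the structure of the reduction, not the learner: the trivial linear model could be swapped for a random forest or a deep net without touching the greedy policy, which is precisely what makes this style of reduction attractive.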

This is joint work with Stephane Ross, Robin Zhou, Hoang Le, Jimmy Chen, Debadeepta Dey, Andrew Kang, Drew Bagnell, Jim Little, and Peter Carr.

For more information, please contact Diane Goodfellow by email at [email protected].