This Friday, Simon Lacoste-Julien, a researcher at INRIA in the SIERRA project team (part of the Computer Science Department of École Normale Supérieure in Paris), will give a talk on Frank-Wolfe Optimization for Structured Machine Learning.
Looking forward to seeing many of you there,
j
--
Title: Frank-Wolfe Optimization for Structured Machine Learning
Who: Simon Lacoste-Julien
Where: AA1207
When: Friday, 18th December, 2:30pm
Abstract:
The Frank-Wolfe (FW) optimization algorithm has lately re-gained popularity thanks in particular to its ability to nicely handle the structured constraints appearing in machine learning applications. However, its convergence rate is known to be slow (sublinear) when the solution lies at the boundary. In the first part of the talk, I will present some less well-known variants of the FW algorithm for which we recently proved global linear convergence rates for the first time, highlighting at the same time an interesting geometric notion of "condition number" of the constraint set appearing in the constant.

In the second part of the talk, I will present an application of these variants to approximate marginal inference in a Markov random field, by optimizing the TRW variational objective over the marginal polytope. The proposed algorithm, called "barrier FW" due to its similarities with barrier methods in optimization, is the first provably convergent algorithm for optimizing the TRW objective over the marginal polytope, and gives more accurate marginals than previous methods in our experiments. If time permits, I will also present how FW can be used to obtain adaptive quadrature rules, and in particular how it can be used in a particle filter to obtain better accuracy than the usual random sampling.

This is joint work with David Sontag (NYU), Rahul Krishnan (NYU), Martin Jaggi (ETH), Fredrik Lindsten (U of Cambridge) and Francis Bach (INRIA).
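For those unfamiliar with the algorithm the talk builds on, here is a minimal sketch (not from the talk, and not the speaker's variants) of the classic Frank-Wolfe iteration on a toy problem: minimizing a quadratic over the probability simplex, which stands in for a structured constraint set with a cheap linear minimization oracle. The names frank_wolfe, lmo and b are illustrative choices, and the 2/(t+2) step size is the standard one giving the sublinear rate mentioned in the abstract.

import numpy as np

# Minimal illustrative sketch of classic Frank-Wolfe (not the talk's variants):
# minimize f(x) = 0.5 * ||x - b||^2 over the probability simplex.
def frank_wolfe(grad, lmo, x0, iters=200):
    x = x0.copy()
    for t in range(iters):
        s = lmo(grad(x))               # linear minimization oracle over the constraint set
        gamma = 2.0 / (t + 2.0)        # standard step size; yields the O(1/t) sublinear rate
        x = (1 - gamma) * x + gamma * s
    return x

b = np.array([0.2, 0.7, 0.1, 0.5])
grad = lambda x: x - b                                 # gradient of 0.5 * ||x - b||^2
lmo = lambda g: np.eye(len(g))[np.argmin(g)]           # best simplex vertex for a linear objective
print(frank_wolfe(grad, lmo, np.full(4, 0.25)))

The appeal for structured machine learning, as the abstract notes, is that the only work per iteration is the linear minimization oracle, which remains tractable for many combinatorial or polytope constraints even when projection is expensive.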