[Lisa_seminaires] [mila-tous] [Tea Talk] Nicolas Loizou (FAIR) Fri November 16 2018 10:30 AM AA3195

Pablo Fonseca palefo at gmail.com
Fri 16 Nov 10:28:21 EST 2018


The streaming link is: https://mila.bluejeans.com/4255239897/webrtc




On Wed, Nov 14, 2018 at 12:44 PM <rim.assouel at gmail.com> wrote:

> This week we have *Nicolas Loizou* from * FAIR* giving a talk on *Fri November
> 16 2018* at *10:30 AM* in room *AA3195*
>
> Will this talk be streamed <https://mila.bluejeans.com/809027115/webrtc>? Yes
> Recorded? Yes
> And you can sign up to meet the speaker here:
>
>
> Getting lazy on Fridays? FAIR enough, but Momentum is all you need ;)
>
> See you there!
> Rim and Sai
>
> *TITLE* Momentum and Stochastic Momentum for Stochastic Gradient, Newton,
> Proximal Point and Subspace Descent Methods
>
> *ABSTRACT*
> In this paper we study several classes of stochastic optimization
> algorithms enriched with heavy ball momentum. Among the methods studied
> are: stochastic gradient descent, stochastic Newton, stochastic proximal
> point and stochastic dual subspace ascent. This is the first time momentum
> variants of several of these methods are studied. We choose to perform our
> analysis in a setting in which all of the above methods are equivalent. We
> prove global non-asymptotic linear convergence rates for all methods and
> various measures of success, including primal function values, primal
> iterates (in L2 sense), and dual function values. We also show that the
> primal iterates converge at an accelerated linear rate in the L1 sense.
> This is the first time a linear rate is shown for the stochastic heavy ball
> method (i.e., stochastic gradient descent method with momentum). Under
> somewhat weaker conditions, we establish a sublinear convergence rate for
> Cesaro averages of primal iterates. Moreover, we propose a novel concept,
> which we call stochastic momentum, aimed at decreasing the cost of
> performing the momentum step. We prove linear convergence of several
> stochastic methods with stochastic momentum, and show that in some sparse
> data regimes and for sufficiently small momentum parameters, these methods
> enjoy better overall complexity than methods with deterministic momentum.
> Finally, we perform extensive numerical testing on artificial and real
> datasets, including data coming from average consensus problems.
>
> *BIO*
> Nicolas is a final-year PhD student in the School of Mathematics at The
> University of Edinburgh. More specifically, he is a member of the
> Operational Research and Optimization Group (ERGO) under the supervision
> of Dr. Peter Richtarik. Before moving to Edinburgh, he spent 4 years in
> Athens as an undergraduate student in the Department of Mathematics at the
> National and Kapodistrian University of Athens <http://en.uoa.gr/> and 1
> year as a postgraduate student at Imperial College London, where he
> obtained an MSc in Computing (Computational Management Science)
> <http://www.imperial.ac.uk/computing/prospective-students/courses/pg/specialist-degrees/cms/>.
> His research interests include (but are not limited to): large-scale
> optimization, machine learning, deep learning, randomized numerical linear
> algebra, and randomized and distributed algorithms.
> Website: https://www.maths.ed.ac.uk/~s1461357/
>
> --
> You received this message because you are subscribed to the Google Groups
> "MILA Tous" group.
> To unsubscribe from this group and stop receiving emails from it, send an
> email to mila-tous+unsubscribe at mila.quebec.
>
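
For readers curious how the abstract's heavy ball momentum and stochastic momentum ideas look in practice, below is a minimal, illustrative Python sketch on a consistent least-squares problem. It is not the paper's code: the Kaczmarz-style stochastic step, the particular momentum values, and the coordinate-sampled form of the stochastic momentum term are assumptions made for this toy example; see the paper for the exact methods and conditions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 200, 20
A = rng.standard_normal((n, d))
x_star = rng.standard_normal(d)
b = A @ x_star  # consistent linear system, so the least-squares optimum is x_star

def momentum_sgd(steps=5000, lr=1.0, beta=0.4, stochastic_momentum=False):
    """Heavy-ball SGD with a Kaczmarz-style stochastic step (toy sketch)."""
    x_prev = x = np.zeros(d)
    for _ in range(steps):
        i = rng.integers(n)                            # sample one equation (row of A)
        g = (A[i] @ x - b[i]) / (A[i] @ A[i]) * A[i]   # normalized stochastic step direction
        if stochastic_momentum:
            j = rng.integers(d)                        # sample one coordinate of the momentum
            m = np.zeros(d)
            m[j] = d * (x - x_prev)[j]                 # cheap unbiased estimate (assumed form)
        else:
            m = x - x_prev                             # classical heavy-ball momentum term
        x, x_prev = x - lr * g + beta * m, x
    return np.linalg.norm(x - x_star)                  # distance to the solution

print("no momentum        :", momentum_sgd(beta=0.0))
print("heavy ball         :", momentum_sgd(beta=0.4))
print("stochastic momentum:", momentum_sgd(beta=0.05, stochastic_momentum=True))
```

The coordinate-sampled momentum term costs O(1) per iteration instead of O(d), which is the kind of saving behind the abstract's claim of better overall complexity in some sparse-data regimes; the abstract also notes that this holds for sufficiently small momentum parameters, hence the smaller beta in the last call.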


More information about the Lisa_seminaires mailing list