[Lisa_seminaires] Talk by Surya Ganguli, this Friday 17th Jan, 14h00-15h00 AA3195

Ian Goodfellow goodfellow.ian at gmail.com
Tue 14 Jan 10:17:08 EST 2014


To give some context, this work is also an ICLR submission:
http://openreview.net/document/17b56863-5cf0-4567-b725-eb5bf149773d#17b56863-5cf0-4567-b725-eb5bf149773d
The first author is my friend Andrew Saxe.


2014/1/14 Razvan Pascanu <r.pascanu at gmail.com>

> Hi all,
>
>  This Friday we have a talk by Surya Ganguli at 14h00 in room 3195 of the
> André-Aisenstadt pavilion.
>
> Exact Solutions to the Nonlinear Dynamics of Learning in Deep Linear
> Networks
>
> Despite the widespread practical success of deep learning methods, our
> theoretical understanding of the dynamics of learning in deep neural
> networks remains quite sparse. We attempt to bridge the gap between the
> theory and practice of deep learning by systematically analyzing learning
> dynamics for the restricted case of deep linear neural networks. Despite
> the linearity of their input-output map, such networks have nonlinear
> gradient descent dynamics on weights that change with the addition of each
> new hidden layer. We show that deep linear networks exhibit nonlinear
> learning phenomena similar to those seen in simulations of nonlinear
> networks, including long plateaus followed by rapid transitions to lower
> error solutions, and faster convergence from greedy unsupervised
> pretraining initial conditions than from random initial conditions. We
> provide an analytical description of these phenomena by finding new exact
> solutions to the nonlinear dynamics of deep learning. Our theoretical
> analysis also reveals the surprising finding that as the depth of a network
> approaches infinity, learning speed remains finite: for a special class of
> initial conditions on the weights, very deep networks incur only a finite
> delay in learning time relative to shallow networks. We further show that,
> under certain conditions on the training data, unsupervised pretraining can
> find this special class of initial conditions. We also discuss applications
> to infant semantic development.
>
> I hope to see many of you there!
>
> Best,
> Razvan
>
> _______________________________________________
> Lisa_seminaires mailing list
> Lisa_seminaires at iro.umontreal.ca
> https://webmail.iro.umontreal.ca/mailman/listinfo/lisa_seminaires
>
>
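For a rough, unofficial illustration of the learning dynamics described in the
abstract above, the sketch below trains a three-layer deep linear network with
plain full-batch gradient descent on a synthetic linear task. All layer sizes,
the learning rate, and the initialization scale are arbitrary illustrative
choices, not taken from the paper.

# Minimal sketch (assumptions: toy sizes, synthetic data; not the paper's code).
# A three-layer *linear* network y = W3 @ W2 @ W1 @ x has a linear input-output
# map, but gradient descent on (W1, W2, W3) is a coupled nonlinear dynamical
# system; with small random initial weights the loss typically shows plateaus
# followed by rapid drops as successive input-output modes are learned.
import numpy as np

rng = np.random.default_rng(0)

d_in, d_hid, d_out, n = 15, 15, 10, 300
A = rng.standard_normal((d_out, d_in))   # linear "teacher" map
X = rng.standard_normal((d_in, n))       # inputs
Y = A @ X                                # targets

scale = 0.2                              # small initial weights
W1 = scale * rng.standard_normal((d_hid, d_in))
W2 = scale * rng.standard_normal((d_hid, d_hid))
W3 = scale * rng.standard_normal((d_out, d_hid))

lr = 5e-3
for step in range(10001):
    H1 = W1 @ X                          # first hidden layer
    H2 = W2 @ H1                         # second hidden layer
    E = W3 @ H2 - Y                      # residual of the composed linear map
    loss = 0.5 * np.mean(np.sum(E ** 2, axis=0))

    # Gradients of the mean squared error with respect to each layer.
    gW3 = (E @ H2.T) / n
    gW2 = (W3.T @ E @ H1.T) / n
    gW1 = (W2.T @ W3.T @ E @ X.T) / n

    W1 -= lr * gW1
    W2 -= lr * gW2
    W3 -= lr * gW3

    if step % 1000 == 0:
        print(f"step {step:5d}   loss {loss:.4f}")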


More information about the Lisa_seminaires mailing list