[Lisa_teatalk] Tea talk Wednesday (12th March) -- 13:00 AA 3195

Razvan Pascanu r.pascanu at gmail.com
Sun Mar 9 20:14:46 EDT 2014


Hi all,

 This Wednesday at 13:00, in room AA 3195,
I will give a tea talk on my work with Guido Montufar, KyungHyun Cho and
Yoshua regarding the importance of depth for feedforward neural networks
with piece-wise linear activation functions.

This work (split into two papers) was submitted to ICLR and COLT this year.


Title: On the number of linear regions of deep neural networks

Abstract:
We study the complexity of functions computable by deep feedforward neural
networks with piece-wise linear activations in terms of the number of
regions of linearity that they have. Deep networks are able to sequentially
map portions of each layer's input space to the same output. In this way,
deep models compute functions with a compositional structure that is able
to re-use pieces of computation exponentially often in terms of their
depth. This note investigates the complexity of such compositional maps and
contributes new theoretical results regarding the advantage of depth for
neural networks with piece-wise linear activation functions.
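
For anyone curious before the talk, below is a toy numerical sketch (not code
from either paper; the layer widths, random weights and grid resolution are
arbitrary choices) that makes the notion of "linear regions" concrete: it
counts the distinct rectifier activation patterns a small network produces
over a grid of 2-D inputs, which gives a lower bound on the number of linear
regions met on that grid. With random weights the deep/shallow gap is usually
modest; the papers construct weights for which the count grows exponentially
with depth.

import numpy as np

rng = np.random.default_rng(0)

def random_relu_net(widths):
    # Random weights and biases for a fully connected rectifier network
    # with the given layer widths, e.g. [2, 8, 8].
    return [(rng.standard_normal((m, n)), rng.standard_normal(m))
            for n, m in zip(widths[:-1], widths[1:])]

def activation_pattern(params, x):
    # Tuple of 0/1 signs of every rectifier unit's pre-activation at input x.
    bits = []
    for W, b in params:
        pre = W @ x + b
        bits.extend(int(v > 0) for v in pre)
        x = np.maximum(pre, 0.0)
    return tuple(bits)

def count_patterns(params, grid=200, lim=2.0):
    # Count distinct activation patterns over a grid x grid sample of inputs;
    # each pattern corresponds to one region on which the network is linear.
    xs = np.linspace(-lim, lim, grid)
    seen = {activation_pattern(params, np.array([a, b])) for a in xs for b in xs}
    return len(seen)

shallow = random_relu_net([2, 16])    # 16 units in one layer
deep = random_relu_net([2, 8, 8])     # 16 units split over two layers
print("shallow (16 units, 1 layer):", count_patterns(shallow))
print("deep    (16 units, 2 layers):", count_patterns(deep))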

Paper submitted to COLT: http://arxiv.org/abs/1402.1869
Paper submitted to ICLR: http://arxiv.org/abs/1312.6098

Hope to see many of you there,
Razvan