[Lisa_teatalk] [Lisa_seminaires] Eugene Vorontsov, July 21, AA6214, 13:45

Tariq Daouda tariq.daouda at gmail.com
Mon Jul 17 13:14:14 EDT 2017


I'd be interested.

Tariq Daouda

PhD candidate in bioinformatics
Institute for Research in Immunology and Cancer (IRIC).
Tel.: 514-343-6111, ext. 0582
University of Montreal - Marcelle-Coutu building, #1375
http://www.tariqdaouda.com

2017-07-17 2:57 GMT-04:00 Junyoung Chung <elecegg at gmail.com>:

> We can use a Hangouts video conference during the talk.
> If there are other people who also want to remotely watch the talk, please
> leave a reply here.
>
> Best,
> --Junyoung
>
> On Mon, Jul 17, 2017 at 2:10 AM, Yoshua Bengio <yoshua.umontreal at gmail.com> wrote:
>
>> I won't make it, but I would really like to watch the presentation and
>> get the slides, if possible.
>>
>> 2017-07-17 12:48 GMT+09:00 Junyoung Chung <elecegg at gmail.com>:
>>
>>> Hi all,
>>>
>>> Our next speaker is Eugene Vorontsov. Hope to see many of you there.
>>> When: 13:45, July 21
>>> Where: AA6214
>>>
>>> Title: On orthogonality and learning recurrent networks with long term
>>> dependencies
>>>
>>> Abstract:
>>>
>>> It is well known that deep neural networks and recurrent neural
>>> networks are challenging to train on tasks that exhibit long-term
>>> dependencies, and the vanishing and exploding gradient problems are
>>> closely associated with these challenges. One approach to addressing
>>> vanishing and exploding gradients is to place soft or hard constraints
>>> on weight matrices so as to encourage or enforce orthogonality.
>>> Orthogonal matrices preserve gradient norm during backpropagation, so
>>> orthogonality may be a desirable property. This paper explores issues
>>> with optimization convergence, speed, and gradient stability when
>>> encouraging or enforcing orthogonality. To perform this analysis, we
>>> propose a weight matrix factorization and parameterization strategy
>>> through which we can bound matrix norms and thereby control the degree
>>> of expansivity induced during backpropagation. We find that hard
>>> constraints on orthogonality can negatively affect the speed of
>>> convergence and model performance.
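
[Editor's note: for readers who want a concrete picture of the parameterization the abstract alludes to, here is a minimal, purely illustrative Python/NumPy sketch (not code from the talk or the paper). It factorizes a transition matrix as W = U diag(s) V^T and squashes the singular values s into a margin around 1, so the degree of expansivity during backpropagation stays bounded. The helper names (random_orthogonal, make_transition_matrix, margin) are placeholders, not identifiers from the paper.

import numpy as np

def random_orthogonal(n, rng):
    # Draw an n x n orthogonal matrix via QR decomposition of a Gaussian matrix.
    q, r = np.linalg.qr(rng.standard_normal((n, n)))
    # Sign fix so the result is uniformly distributed over orthogonal matrices.
    return q * np.sign(np.diag(r))

def make_transition_matrix(p, u, v, margin):
    # Build W = U diag(s) V^T with every singular value inside [1 - margin, 1 + margin].
    # p is an unconstrained parameter vector; the sigmoid-style squash keeps s in the margin.
    s = 1.0 + margin * (2.0 / (1.0 + np.exp(-p)) - 1.0)
    return u @ np.diag(s) @ v.T

rng = np.random.default_rng(0)
n, margin = 4, 0.1
u, v = random_orthogonal(n, rng), random_orthogonal(n, rng)
p = rng.standard_normal(n)                  # unconstrained spectrum parameters
W = make_transition_matrix(p, u, v, margin)
print(np.linalg.svd(W, compute_uv=False))   # all singular values land in [0.9, 1.1]

With margin = 0 this reduces to a hard orthogonality constraint; a larger margin lets the spectrum deviate from 1, which is the trade-off the talk examines.]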
>>>
>>> Bio:
>>> I am a PhD student with professors Chris Pal and Samuel Kadoury at École
>>> Polytechnique de Montréal and MILA. I am working on medical image
>>> segmentation models and have recently begun developing an interest in
>>> optimization and regularization of deep neural networks. Prior to MILA, I
>>> studied Engineering Science at the University of Toronto, specializing in
>>> biomedical engineering. An aspect of computer science that appealed to me
>>> is the fast production of experimental results -- unfortunately, I also
>>> like big models.
>>>
>>> --Junyoung
>>>
>>> _______________________________________________
>>> Lisa_seminaires mailing list
>>> Lisa_seminaires at iro.umontreal.ca
>>> https://webmail.iro.umontreal.ca/mailman/listinfo/lisa_seminaires
>>>
>>>
>>
>
> _______________________________________________
> Lisa_teatalk mailing list
> Lisa_teatalk at iro.umontreal.ca
> https://webmail.iro.umontreal.ca/mailman/listinfo/lisa_teatalk
>
>