Hi all,
Our next speaker is Eugene Vorontsov. Hope to see many of you there.
When: 13:45, July 21
Where: AA6214
Title: On orthogonality and learning recurrent networks with long term dependencies
Abstract:
It is well known that it is challenging to train deep neural networks and recurrent neural networks on tasks that exhibit long-term dependencies. The vanishing or exploding gradient problem is a well-known issue associated with these challenges. One approach to addressing vanishing and exploding gradients is to use either soft or hard constraints on weight matrices so as to encourage or enforce orthogonality. Orthogonal matrices preserve gradient norm during backpropagation, and orthogonality may therefore be a desirable property. This paper explores issues with optimization convergence, speed, and gradient stability when encouraging or enforcing orthogonality. To perform this analysis, we propose a weight matrix factorization and parameterization strategy through which we can bound matrix norms and thereby control the degree of expansivity induced during backpropagation. We find that hard constraints on orthogonality can negatively affect the speed of convergence and model performance.
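(For the curious: the "soft constraints" mentioned above are typically penalties such as lambda * ||W^T W - I||_F^2 added to the training loss, while the norm-bounding strategy can be pictured as factoring the recurrent matrix and keeping its singular values close to 1. The short NumPy sketch below illustrates the second idea only in spirit; the sigmoid-based bounding and all names in it are illustrative assumptions, not necessarily the exact parameterization used in the paper.)

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def build_recurrent_weight(U, V, p, margin=0.1):
    # Compose W = U @ diag(s) @ V.T with every singular value s_i
    # squashed into (1 - margin, 1 + margin). With U and V orthogonal,
    # the singular values of W are exactly the entries of s, so the
    # spectral norm of W -- and hence how much gradients can be scaled
    # by W during backpropagation -- is bounded by 1 + margin.
    s = 1.0 + margin * (2.0 * sigmoid(p) - 1.0)
    return U @ np.diag(s) @ V.T

# Illustration with random orthogonal factors from QR decompositions.
rng = np.random.default_rng(0)
n = 64
U, _ = np.linalg.qr(rng.standard_normal((n, n)))
V, _ = np.linalg.qr(rng.standard_normal((n, n)))
p = rng.standard_normal(n)
W = build_recurrent_weight(U, V, p, margin=0.1)
print(np.linalg.norm(W, 2))  # stays <= 1.1 by construction

A gradient flowing through such a matrix cannot be expanded or contracted by more than a factor of 1 + margin by this matrix alone, which is the kind of "expansivity" control the abstract refers to.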
Bio: I am a PhD student with Professors Chris Pal and Samuel Kadoury at École Polytechnique de Montréal and MILA. I am working on medical image segmentation models and have recently developed an interest in the optimization and regularization of deep neural networks. Prior to MILA, I studied Engineering Science at the University of Toronto, specializing in biomedical engineering. An aspect of computer science that appeals to me is the fast production of experimental results -- unfortunately, I also like big models.
--Junyoung
Hi all,
We will have a tea talk tomorrow. Eugene will present his recent work on RNNs, which was accepted at this year's ICML.
Best, --Junyoung
We will have the tea talk in 30 minutes!
--Junyoung
It's starting soon!
--Junyoung