[Lisa_teatalk] Tea Talk Tomorrow!

Aaron Courville aaron.courville at gmail.com
Wed Feb 23 06:41:39 EST 2011


Hi Gang,

Tomorrow we will have Razvan tell us about stuff he's been working on recently.

Date: Thursday, Feb. 24th, 2011
Time: 14h30
Location: LISA lab (AA3256)

Abstract:
Recurrent Neural Networks are a natural framework for modelling
complex non-linear temporal dependencies. Unfortunately, all
gradient-based methods for training them suffer from the "vanishing
gradient" problem, which means that an RNN can discover only
short-term temporal dependencies in the data. There have been a few
attempts to address this problem, the most noteworthy being the Long
Short-Term Memory (LSTM) network, which tackles it by modifying the
structure of the network. We will look in a different direction,
namely at how the problem can be addressed through the optimization
algorithm (Backpropagation Through Time in this case). We propose a
regularization term that forces the RNN to look back in time, and
show a few results on synthetic data.
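
For anyone who wants a concrete picture before the talk, below is a
minimal NumPy sketch of the vanishing gradient under Backpropagation
Through Time: a vanilla tanh RNN with recurrent weights rescaled to
spectral radius 0.9 (these choices are ours, purely for illustration).
The last lines compute a toy penalty that pushes successive
back-propagated gradient norms towards 1, which is one plausible
reading of "forcing the RNN to look back in time", not necessarily the
regularizer Razvan will present.

import numpy as np

rng = np.random.default_rng(0)
T, n = 50, 20                          # sequence length, hidden size

# Recurrent weights rescaled to spectral radius 0.9: the contractive
# regime in which back-propagated gradients shrink with the time lag.
W = rng.normal(size=(n, n))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))

# Forward pass of a vanilla RNN, h_t = tanh(W h_{t-1}), storing states.
hs = [rng.normal(size=n)]
for _ in range(T):
    hs.append(np.tanh(W @ hs[-1]))

# Backpropagation Through Time: push a unit gradient in at the last
# step and record how its norm decays as it travels back in time.
g = rng.normal(size=n)
g /= np.linalg.norm(g)                 # stand-in for dC/dh_T
norms = [1.0]
for t in range(T, 0, -1):
    g = W.T @ ((1.0 - hs[t] ** 2) * g) # J_t^T g, J_t = diag(1 - h_t^2) W
    norms.append(np.linalg.norm(g))

print(f"||dC/dh_T|| = {norms[0]:.3e}")
print(f"||dC/dh_0|| = {norms[-1]:.3e}")

# A hedged sketch of the kind of penalty the abstract hints at (the
# exact form is not given in this announcement): penalize successive
# back-propagated gradient norms for shrinking, i.e. push their ratio
# towards 1 so the error signal survives the trip back in time.
ratios = np.array(norms[1:]) / np.array(norms[:-1])
omega = np.sum((ratios - 1.0) ** 2)
print(f"omega = {omega:.3f}")

With spectral radius 0.9 the gradient norm at the first time step comes
out several orders of magnitude below the one injected at the last
step; rescaling the weights above 1 produces the exploding-gradient
counterpart instead.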


Cheers,
Aaron


-- 
Aaron C. Courville
Département d’Informatique et
de recherche opérationnelle
Université de Montréal
email: Aaron.Courville at gmail.com

