---------------------------- Original Message ----------------------------
Subject: Predoc oral of James Bergstra, Thursday, August 21
From: "Pierre L'Ecuyer" lecuyer@iro.umontreal.ca
Date: Fri, August 15, 2008, 15:40
To: "seminaires" seminaires@iro.umontreal.ca, "Pierre L'Ecuyer" lecuyer@iro.umontreal.ca
--------------------------------------------------------------------------
Hello,
You are all cordially invited to the predoc oral exam of *James Bergstra*, a student at DIRO under the supervision of Yoshua Bengio.
Date: Thursday, August 21
Time: 4:30 PM
Room: 3195, Pav. Aisenstadt
Title: Object Recognition and Multiscale Prediction with Recurrent Neural Networks
Abstract: Object recognition rates in humans are much higher when subjects are allowed to look around an image and reflect a little, but shallow feedforward models are unable to exhibit this sort of behaviour. Deep models with multiple layers of nonlinear processing may be said to reflect a little, but my doctoral work will look at strategies for training recurrent neural networks with this capacity. Deep neural networks are much like recurrent networks: an unfolded recurrent architecture is a deep network with tied weights, and any deep network can be transformed into a recurrent architecture by adding hidden units to the recurrent state. I will translate recent techniques for learning deep neural networks into the domain of recurrent networks. My thesis proposal will outline new recurrent models, new training strategies, and a new way to perform static classification with temporal models. My doctoral work so far has centered on the use of physiologically motivated cell models in artificial neural networks, and on techniques for training recurrent networks to learn over short- and medium-term timescales.
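As a rough illustration of the tied-weights correspondence mentioned in the abstract (not code from the talk; all names and sizes below are hypothetical), the minimal NumPy sketch unrolls a small recurrent network on a static input for a fixed number of steps and compares it to a "deep" stack whose layers all share the same parameters:

```python
import numpy as np

# Hypothetical sketch: unrolling a recurrent net for T steps gives a
# deep feedforward network whose layers all reuse (tie) the same weights.

rng = np.random.default_rng(0)
n_in, n_hid = 4, 8
W_in = rng.standard_normal((n_in, n_hid)) * 0.1    # input-to-hidden weights
W_rec = rng.standard_normal((n_hid, n_hid)) * 0.1  # hidden-to-hidden (recurrent) weights

def rnn_forward(x, T=3):
    """Run the recurrent network for T steps on a static input x."""
    h = np.zeros(n_hid)
    for _ in range(T):                       # same W_in, W_rec applied at every step
        h = np.tanh(x @ W_in + h @ W_rec)
    return h

def deep_tied_forward(x, T=3):
    """Equivalent 'deep' view: T stacked layers holding identical parameters."""
    h = np.zeros(n_hid)
    layers = [(W_in, W_rec)] * T             # every layer shares the same weights
    for W_i, W_r in layers:
        h = np.tanh(x @ W_i + h @ W_r)
    return h

x = rng.standard_normal(n_in)
assert np.allclose(rnn_forward(x), deep_tied_forward(x))
```

The two functions produce identical hidden states, which is one way to read the claim that an unfolded recurrent architecture is simply a deep network with tied weights.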