Hi all,
Junyoung Chung will defend his PhD thesis on Tuesday, March 20th, at 9 am in PAA Z-209. Please come and celebrate his achievement.
Here is the title and abstract of the thesis:
On Deep Multiscale Recurrent Neural Networks
In this thesis, a subgroup of deep learning models, known as recurrent neural networks, is studied in depth. Recurrent neural networks are a special type of artificial neural network that is particularly well suited to modeling the temporal structure of sequential data such as text and speech. Recurrent neural networks serve as the core module of many practical applications, including speech recognition, text-to-speech, machine translation, machine comprehension, and question answering. This thesis comprises a series of studies on deep multiscale recurrent neural networks and novel architectures that overcome the inherent problems of recurrent neural networks.
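For context, all of these models build on the basic recurrence of a vanilla recurrent neural network, in which a hidden state is carried across time steps; the following is a generic textbook sketch in Python, not code from the thesis.

```python
import numpy as np

def rnn_step(x, h_prev, Wx, Wh, b):
    # The hidden state summarizes the sequence seen so far, which is what
    # lets the network model temporal structure in the data.
    return np.tanh(Wx @ x + Wh @ h_prev + b)
```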
There are three articles that propose advanced network architectures for deep multiscale recurrent neural networks. In the first article, we introduce a new type of network architecture that adds more communication channels to recurrent neural networks. Recurrence is not restricted to self-connections, as in conventional recurrent neural networks, but is fully connected between all hidden layers at consecutive time steps. The influence of the information passing through each channel is adaptively controlled by parameterized gating units.

In the second article, we study a neural machine translation system that exploits a character-level decoder. The motivation behind this work is to answer a fundamental question: can a translation be generated as a sequence of characters instead of a sequence of words? We design a two-layered recurrent neural network architecture that captures the fast and slow components of a sequence separately.

In the third article, we investigate a recurrent neural network architecture that can change the states of its hidden layers at multiple timescales in order to capture the hierarchical temporal structure of sequences. The proposed framework introduces a set of boundary-detecting units that find the ends of meaningful chunks. The inclusion of the boundary detectors leads to a novel update mechanism that allows the network to update each hidden layer at a different timescale based on the states of the boundary detectors; a rough sketch of this update follows below.
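To make the boundary-gated update concrete, here is a minimal Python sketch assuming a single fast/slow layer pair, a scalar sigmoid boundary unit, and a hard threshold; the parameter names and shapes are illustrative, not the thesis's exact formulation.

```python
import numpy as np

def multiscale_step(x, h_fast, h_slow, params, threshold=0.5):
    """One step of a two-layer multiscale RNN with a boundary detector.

    The fast (lower) layer updates at every time step; the slow (upper)
    layer updates only when the boundary unit fires, i.e. when the fast
    layer signals the end of a meaningful chunk.
    """
    Wx, Wff, Wtd, w_b, Wss, Wfs = (
        params[k] for k in ("Wx", "Wff", "Wtd", "w_b", "Wss", "Wfs"))

    # Fast layer: driven by the input, its own past state, and a
    # top-down connection from the slow layer.
    h_fast = np.tanh(Wx @ x + Wff @ h_fast + Wtd @ h_slow)

    # Boundary detector: a scalar gate read out of the fast layer.
    z = 1.0 / (1.0 + np.exp(-(w_b @ h_fast)))   # sigmoid
    boundary = float(z > threshold)             # hard decision, for illustration

    # Slow layer: updated only at detected boundaries, otherwise copied.
    candidate = np.tanh(Wss @ h_slow + Wfs @ h_fast)
    h_slow = boundary * candidate + (1.0 - boundary) * h_slow

    return h_fast, h_slow, boundary

# Example with illustrative dimensions.
rng = np.random.default_rng(0)
d_in, d = 4, 8
params = {"Wx": rng.normal(size=(d, d_in)), "Wff": rng.normal(size=(d, d)),
          "Wtd": rng.normal(size=(d, d)), "w_b": rng.normal(size=d),
          "Wss": rng.normal(size=(d, d)), "Wfs": rng.normal(size=(d, d))}
h_fast, h_slow = np.zeros(d), np.zeros(d)
for x in rng.normal(size=(5, d_in)):
    h_fast, h_slow, boundary = multiscale_step(x, h_fast, h_slow, params)
```

In the full model described in the article, the binary boundary decision would be trained end-to-end (e.g., with a straight-through gradient estimator); the sketch only shows where the gate enters the update.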
Finally, in the fourth article, we study the inclusion of latent variables in recurrent neural networks. The complexity and variability of sequential data such as speech make it difficult to learn meaningful structure from the data. We propose a recurrent extension of the variational auto-encoder that introduces high-level latent variables into recurrent neural networks, and we show significant performance improvements on sequence modeling tasks such as modeling human speech signals and handwriting.
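As a rough illustration, the sketch below shows one generative step of such a recurrent variational auto-encoder, assuming Gaussian latent variables whose prior is conditioned on the recurrent state; all parameter names and dimensions are hypothetical, not the article's exact model.

```python
import numpy as np

rng = np.random.default_rng(0)

def vrnn_generate_step(h, p):
    """One generative step of a recurrent VAE sketch.

    The prior over the latent variable depends on the hidden state, so the
    latent code can capture high-level structure that changes over time.
    """
    # Prior over z_t, conditioned on the previous hidden state.
    mu, log_sigma = p["W_mu"] @ h, p["W_sig"] @ h
    z = mu + np.exp(log_sigma) * rng.normal(size=mu.shape)  # reparameterized sample

    # Decode an observation x_t from the latent sample and the state.
    x = np.tanh(p["W_dz"] @ z + p["W_dh"] @ h)

    # Update the recurrent state with both the observation and the latent code.
    h = np.tanh(p["W_hx"] @ x + p["W_hz"] @ z + p["W_hh"] @ h)
    return x, z, h

# Illustrative dimensions: hidden d, latent k, observation m.
d, k, m = 8, 3, 4
p = {name: 0.1 * rng.normal(size=shape) for name, shape in
     {"W_mu": (k, d), "W_sig": (k, d), "W_dz": (m, k), "W_dh": (m, d),
      "W_hx": (d, m), "W_hz": (d, k), "W_hh": (d, d)}.items()}
h = np.zeros(d)
for _ in range(5):
    x, z, h = vrnn_generate_step(h, p)
```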
---------- Forwarded message ----------
From: Celine Begin <beginc@iro.umontreal.ca>
Date: March 8, 2018 at 07:51
Subject: Announcement of the Thesis Defense of CHUNG, Junyoung - Ph.D. at University of Montreal
To: Miklos Csuros <miklos.csuros@umontreal.ca>, Yoshua Bengio <bengioy@iro.umontreal.ca>, Alain Tapp <tappa@iro.umontreal.ca>, hochreit@bioinf.jku.at, Junyoung Chung <elecegg@gmail.com>
Good morning,
You are cordially invited to the Thesis Defense of Junyoung Chung, which will take place on Tuesday, March 20th at 9:00 am in room Z-209 of the Claire McNicoll building.
You will find attached the announcement of this Thesis Defense.
So far, I have not received confirmation regarding the representative of the dean of the FAS. Maybe next week.
Thanks and best regards,