[Lisa_seminaires] Talk by David Grangier Thursday 15h30 room 1360

Pascal Vincent vincentp at iro.umontreal.ca
Wed 11 Oct 11:04:56 EDT 2017


Hello MILA friends,

A reminder that tomorrow, Thursday, at 15:30 we will have a talk by David
Grangier on a novel and much faster seq2seq architecture for translation:
"Neural Machine Translation: Achieving Fast Training and Fast Inference
with Gated Convolutions."

This is part of the "Colloques du DIRO" series, so the traditional
coffee and biscuits will be served half an hour before.


Reposting the full announcement below (url:
http://diro.umontreal.ca/departement/colloques/une-nouvelle/news/david-grangier-neural-machine-translation-achie-42551/ )

Speaker: David Grangier, Facebook AI Research, Menlo Park, CA

Title: Neural Machine Translation:
       Achieving Fast Training and Fast Inference with Gated Convolutions

Joint work with Michael Auli, Yann Dauphin, Angela Fan, Jonas Gehring
               and Sergey Edunov

Room: André-Aisenstadt 1360

Date and time: Thursday, October 12, 15:30; coffee and biscuits at 15:00

Abstract:
Neural architectures for Machine Translation (MT) and related language
modeling tasks are an active research field. The first part of our talk
introduces several architectural changes relative to the original work of
Bahdanau et al. (2014): we replace non-linearities with our novel gated
linear units, replace recurrent units with convolutions, and introduce
multi-hop attention to allow more complex attention patterns. These
changes improve generalization performance, training efficiency and
decoding speed. The second part of our talk analyzes the properties of
the distribution predicted by the model, examines how predictions differ
from their empirical counterparts, and discusses how this influences beam
search.
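
For readers unfamiliar with the gated linear units mentioned above: each
layer computes h = a * sigmoid(b), where a and b are the two halves of a
convolution's output, so a linear path carries the signal while the
sigmoid gates it. Below is a minimal PyTorch sketch of one causal gated
convolutional layer; it is our own illustration (the name GatedConv1d and
all sizes are made up), not the speakers' implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F

class GatedConv1d(nn.Module):
    # One gated convolutional layer in the style of Dauphin et al. 2017:
    # the convolution emits 2 * out_channels, and one half is gated by a
    # sigmoid of the other half (GLU: h = a * sigmoid(b)).
    def __init__(self, in_channels, out_channels, kernel_size):
        super().__init__()
        self.pad = kernel_size - 1  # left-pad only, keeping the layer causal
        self.conv = nn.Conv1d(in_channels, 2 * out_channels, kernel_size)

    def forward(self, x):                 # x: (batch, channels, time)
        x = F.pad(x, (self.pad, 0))       # pad the past, never the future
        a, b = self.conv(x).chunk(2, dim=1)
        return a * torch.sigmoid(b)

# Toy usage with made-up sizes: 4 sequences, 64 channels, 20 time steps.
layer = GatedConv1d(64, 128, kernel_size=3)
y = layer(torch.randn(4, 64, 20))         # -> shape (4, 128, 20)

The ungated half passes gradients linearly, without the downscaling of a
tanh, which the gated-convolution paper credits for easier optimization.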

Speaker biography:
David Grangier is a research scientist at Facebook AI Research, Menlo
Park, CA. David earned his PhD in Machine Learning from École
Polytechnique Fédérale de Lausanne, advised by Samy Bengio. He has worked
at several industrial labs, including NEC Labs America (2008-2011), AT&T
Research (2011-2012) and Microsoft Research (2012-2014). Currently, David
works on machine learning and its application to natural language
processing; he is particularly interested in text generation tasks.
http://david.grangier.info/

To prepare, you might read...

Convolutional Sequence to Sequence Learning
Jonas Gehring, Michael Auli, David Grangier, Denis Yarats, Yann N. Dauphin
- International Conference on Machine Learning (ICML). 2017.

Language Modeling with Gated Convolutional Networks
Yann N. Dauphin, Angela Fan, Michael Auli, David Grangier -
International Conference on Machine Learning (ICML). 2017.

Efficient softmax approximation for GPUs
Edouard Grave, Armand Joulin, Moustapha Cissé, David Grangier, Hervé
Jégou - International Conference on Machine Learning (ICML). 2017.

A Convolutional Encoder Model for Neural Machine Translation
Jonas Gehring, Michael Auli, David Grangier, Yann N. Dauphin - Conference
of the Association for Computational Linguistics (ACL). 2017.

Neural Text Generation from Structured Data with Application to the
Biography Domain
Rémi Lebret, David Grangier, Michael Auli - Conference on Empirical
Methods in Natural Language Processing (EMNLP). 2016.

Vocabulary Selection Strategies for Neural Machine Translation
Gurvan L'Hostis, David Grangier, Michael Auli - arXiv:1610.00072. 2016.

Strategies for Training Large Vocabulary Neural Language Models
Wenlin Chen, David Grangier, Michael Auli - Conference of the
Association for Computational Linguistics (ACL). 2016.