[Lisa_seminaires] [Tea Talk] Rosemary Ke (MILA) Nov 10 10:30AM AA6214

Michael Noukhovitch mnoukhov at gmail.com
Fri Nov 10 09:34:30 EST 2017


This is today in an hour!

On Tue, Nov 7, 2017, 13:43 Michael Noukhovitch, <mnoukhov at gmail.com> wrote:

> This week we have our very own *Rosemary Ke* giving a talk on *Friday Nov
> 10* at *10:30AM* in room *AA6214*.
>
> See you there!
> Michael
>
> *KEYWORDS* RNN, BPTT, Attention Mechanism
>
> *TITLE* Sparse Attentive Backtracking: Long-Range Credit Assignment in
> Recurrent Networks
>
> *ABSTRACT*
> A major drawback of backpropagation through time (BPTT) is the difficulty
> of learning long-term dependencies, which comes from having to propagate
> credit information backwards through every single step of the forward
> computation. This makes BPTT both computationally impractical and
> biologically implausible. For this reason, full backpropagation through
> time is rarely used on long sequences, and truncated backpropagation
> through time is used as a heuristic. However, this usually leads to biased
> estimates of the gradient in which longer-term dependencies are ignored.
> Addressing this issue, we propose an alternative algorithm, Sparse
> Attentive Backtracking, which may also be related to principles used by
> brains to learn long-term dependencies. Sparse Attentive Backtracking
> learns an attention mechanism over the hidden states of the past and
> selectively backpropagates through paths with high attention weights.
> This allows the model to learn long-term dependencies while backtracking
> through only a small number of time steps, not just from the recent past
> but also from attended, relevant past states.
>
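For anyone curious how the attention-guided, selective backpropagation described in the abstract could look in code, here is a minimal, hypothetical PyTorch sketch (class and parameter names are my own, not the authors' implementation): at each step the current hidden state attends over a memory of past hidden states, and gradients are allowed to flow back only into the k most-attended past states, with every other past state detached from the graph.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class SparseBacktrackRNN(nn.Module):
        """Hypothetical sketch: GRU cell plus attention over past hidden
        states, backpropagating only through the k most-attended states."""
        def __init__(self, input_size, hidden_size, k=3):
            super().__init__()
            self.cell = nn.GRUCell(input_size, hidden_size)
            self.attn = nn.Linear(hidden_size, hidden_size, bias=False)
            self.k = k

        def forward(self, x):                    # x: (seq_len, batch, input_size)
            h = x.new_zeros(x.size(1), self.cell.hidden_size)
            memory, outputs = [], []
            for t in range(x.size(0)):
                h = self.cell(x[t], h)
                if memory:
                    past = torch.stack(memory)                       # (t, batch, hidden)
                    scores = (self.attn(h) * past.detach()).sum(-1)  # (t, batch)
                    k = min(self.k, past.size(0))
                    topk = scores.topk(k, dim=0).indices             # (k, batch)
                    # Keep gradient paths only into the top-k attended past
                    # states; all other past states are detached.
                    keep = torch.zeros_like(scores).scatter_(0, topk, 1.0).unsqueeze(-1)
                    past = keep * past + (1.0 - keep) * past.detach()
                    weights = F.softmax(scores, dim=0).unsqueeze(-1)
                    h = h + (weights * past).sum(0)                  # mix in attended past
                memory.append(h)
                outputs.append(h)
            return torch.stack(outputs)                              # (seq_len, batch, hidden)

The hard top-k selection above is just one plausible way to make the backward pass sparse; the talk may describe a different sparsification or attention parameterization.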

