[Lisa_teatalk] Tea Talk Tomorrow!

Aaron Courville aaron.courville at gmail.com
Tue Nov 15 09:04:09 EST 2011


Hey Gang,

This week we have Guillaume Desjardins talking about a recent ICML paper on
the "enhanced gradient" for RBMs.

When: Nov 16th 14h00
Where: LISA Lab (AA3256)

Abstract:

In this tea talk, I will present recent work by KyungHyun Cho on an
"enhanced gradient" for RBMs. The motivation for this new gradient is
two-fold. First, it is easy to show that the typical maximum-likelihood
gradient on the weights is a function of the gradients on the biases.
Second, the RBM is over-parametrized: multiple (visible/hidden state,
parameter) configurations, related by bit-flip transformations of
individual units, can lead to the same energy function. The enhanced
gradient addresses both of these problems by being invariant to these
bit-flip transformations. This results in faster convergence, fewer
"dead" filters, and invariance to the particular binary representation
of the data (e.g. the ability to learn bit-flipped MNIST). We shall
also discuss links to the natural gradient and, time allowing, their
learning-rate adaptation schedule.
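
If you want a concrete picture before the talk, here is a rough numpy
sketch of the update as I understand it from the paper. The function
name, array layout, and the assumption that the negative-phase samples
come from your usual sampler (CD-1, PCD, ...) are mine, not Cho's:

    import numpy as np

    def enhanced_gradient(v_d, h_d, v_m, h_m):
        """Enhanced gradient for a binary RBM (after Cho et al., ICML 2011).

        v_d, h_d: visible/hidden states (or probabilities) under the data
                  distribution, shapes (batch, n_vis) and (batch, n_hid).
        v_m, h_m: the same quantities under the model distribution,
                  e.g. from a CD-1 or PCD negative phase.
        Returns gradients for the weights W, visible biases b, hidden biases c.
        """
        # Batch means under the data (d) and model (m) distributions.
        vd, hd = v_d.mean(axis=0), h_d.mean(axis=0)
        vm, hm = v_m.mean(axis=0), h_m.mean(axis=0)
        # <.>_dm: average of data and model means; using this midpoint is
        # what makes the update invariant to bit-flips of individual units.
        v_dm = 0.5 * (vd + vm)
        h_dm = 0.5 * (hd + hm)

        # Standard maximum-likelihood gradient pieces.
        grad_W = v_d.T @ h_d / len(v_d) - v_m.T @ h_m / len(v_m)
        grad_b = vd - vm
        grad_c = hd - hm

        # Enhanced gradient: remove the bias-gradient components from the
        # weight gradient, then recenter the bias gradients accordingly.
        eW = grad_W - np.outer(v_dm, grad_c) - np.outer(grad_b, h_dm)
        eb = grad_b - eW @ h_dm
        ec = grad_c - eW.T @ v_dm
        return eW, eb, ec

The only change relative to the usual CD update is the recentering by
the <.>_dm means, so it drops into an existing training loop at no
extra sampling cost.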


Cheers,
Aaron

-- 
Aaron C. Courville
Département d’Informatique et
de recherche opérationnelle
Université de Montréal
email: Aaron.Courville at gmail.com