[Lisa_teatalk] TeaTalk by Çağlar Gülçehre: Noisy Activation Functions

Jörg Bornschein j.bornschein at gmail.com
Wed Mar 30 10:05:40 EDT 2016


Hi,

This Friday we'll have a talk by Çağlar Gülçehre about Noisy Activation
Functions:

Who: Çağlar Gülçehre
When: Fri. April 1st, 14:30
Where: AA 3195

Common non-linear activation functions used in neural networks can cause
training difficulties due to the saturation behaviour of the activation
function, which may hide dependencies that are not visible to vanilla-SGD
(using first order gradients only). Gating mechanisms that use softly
saturating activation functions to emulate the discrete switching of
digital logic circuits are good examples of this. We propose to exploit the
injection of appropriate noise so that the gradients may flow easily, even
if the noiseless application of the activation function would yield zero
gradient.  Large noise will dominate the noise-free gradient and allow
stochastic gradient descent to explore more. By adding noise only to the
problematic parts of the activation function, we allow the optimization
procedure to explore the boundary between the degenerate (saturating) and
the well-behaved parts of the activation function. We also establish
connections to simulated annealing, when the amount of noise is annealed
down, making it easier to optimize hard objective functions. We find
experimentally that replacing such saturating activation functions by noisy
variants helps training in many contexts, yielding state-of-the-art or
competitive results on different datasets and tasks, especially when
training seems to be the most difficult, e.g., when curriculum learning is
necessary to obtain good results.
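
For those curious how this might look in practice, here is a minimal sketch
in plain Python/NumPy of the general idea from the abstract: inject zero-mean
noise only where a hard-saturating activation has zero gradient, with a
noise_scale knob that can be annealed toward zero during training. The names
hard_tanh, noisy_hard_tanh and noise_scale are illustrative assumptions, and
this is a simplification rather than the exact formulation from the paper.

    import numpy as np

    def hard_tanh(x):
        # Hard-saturating linearization of tanh: clips pre-activations to [-1, 1].
        return np.clip(x, -1.0, 1.0)

    def noisy_hard_tanh(x, noise_scale=1.0, rng=None):
        # Simplified illustration (an assumption, not the paper's exact recipe):
        # in the saturated regions (|x| > 1) the hard-tanh gradient is zero, so
        # add zero-mean Gaussian noise whose magnitude grows with how far x is
        # into saturation; inside [-1, 1] the unit stays deterministic.
        rng = np.random.default_rng() if rng is None else rng
        h = hard_tanh(x)
        saturation = np.abs(x - h)   # 0 inside [-1, 1], grows linearly outside
        noise = rng.standard_normal(size=np.shape(x))
        return h + noise_scale * saturation * noise

Annealing noise_scale from a large value toward zero over training mirrors,
in spirit, the simulated-annealing connection mentioned above: early on the
noise lets SGD escape the flat saturated regions, and later the unit
converges to the deterministic hard-tanh.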



Best, and looking forward to seeing you there,

   Jorg Bornschein