[Lisa_teatalk] Tea Talk Tomorrow!

Aaron Courville aaron.courville at gmail.com
Wed Mar 7 11:13:58 EST 2012


Greetings,

This week Razvan Pascanu will tell us about some of his recent work on
optimization methods. Hope to see you there.

When: 15h00 Tomorrow (Thursday March 8th 2012)
Where: LIDA Lab (AA3256)

 Title: Yet Another Optimization Technique for Machine Learning?

 Abstract:
 I've been trying to understand what Hessian-Free and related methods are doing,
how and why they can claim better generalization error, and why they work better
than SGD for RNNs in certain cases.

I don't have all the answers, but IMHO I found a set of misnomers, as well as
interesting connections between natural gradient and recently proposed methods.
I want to provide a different interpretation of how fancier optimization should
be done for complex non-linear models, and to conclude by outlining how one
would go about improving the proposed algorithms.
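
(As a concrete anchor before the talk, and not taken from Razvan's work: the toy
Python sketch below contrasts a plain gradient step with a natural-gradient step
that preconditions by a damped Fisher matrix, which is roughly the connection to
Hessian-Free methods mentioned above. The data, hyperparameters, and function
names are all illustrative assumptions.)

import numpy as np

rng = np.random.default_rng(0)

# Toy logistic-regression problem on random data (illustrative only).
X = rng.normal(size=(100, 5))
y = (rng.uniform(size=100) < 0.5).astype(float)

def grad_and_fisher(w):
    p = 1.0 / (1.0 + np.exp(-X @ w))            # predicted probabilities
    g = X.T @ (p - y) / len(y)                  # gradient of the log-loss
    # Fisher / Gauss-Newton matrix; for logistic loss it equals the Hessian.
    F = (X * (p * (1.0 - p))[:, None]).T @ X / len(y)
    return g, F

w_sgd = np.zeros(5)
w_nat = np.zeros(5)
lr, damping = 0.5, 1e-3

for _ in range(50):
    g, _ = grad_and_fisher(w_sgd)
    w_sgd -= lr * g                             # plain (full-batch) gradient step

    g, F = grad_and_fisher(w_nat)
    # Natural-gradient step: precondition by the damped Fisher. Hessian-Free
    # methods approximate the same solve with conjugate gradient instead of
    # forming F explicitly.
    w_nat -= lr * np.linalg.solve(F + damping * np.eye(5), g)

print("gradient-descent weights:", np.round(w_sgd, 3))
print("natural-gradient weights:", np.round(w_nat, 3))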


Cheers,
Aaron

-- 
Aaron C. Courville
Département d’Informatique et
de recherche opérationnelle
Université de Montréal
email: Aaron.Courville at gmail.com