*spoiler alert* 

Ian is correct in pointing this out; in fact, that's the main point of my talk.

Using the FFT for convolution is not new at all. Everybody knew about it.

It's just that nobody felt like revisiting the idea by actually implementing it and taking measurements.


On Tue, May 27, 2014 at 1:46 PM, Ian Goodfellow <goodfellow.ian@gmail.com> wrote:
FFTs have been used to speed up convolution since at least the 1960s. I'm not sure why this isn't done more in the neural nets community, but this year's ICLR paper is not the place where the idea was first proposed.
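(For context, the speedup rests on the convolution theorem: convolution in the spatial domain is pointwise multiplication in the frequency domain. Below is a minimal numpy sketch of that equivalence, not the paper's implementation; the array sizes are arbitrary and chosen only for illustration.)

# Minimal sketch: FFT-based 2D convolution matches direct convolution
# once both arrays are zero-padded to the full output size.
import numpy as np

rng = np.random.default_rng(0)
image = rng.standard_normal((64, 64))
kernel = rng.standard_normal((5, 5))

# Direct "full" 2D convolution, used here only as a reference.
out_shape = (image.shape[0] + kernel.shape[0] - 1,
             image.shape[1] + kernel.shape[1] - 1)
direct = np.zeros(out_shape)
for i in range(kernel.shape[0]):
    for j in range(kernel.shape[1]):
        direct[i:i + image.shape[0], j:j + image.shape[1]] += kernel[i, j] * image

# FFT-based convolution: pad both arrays to the output size,
# multiply their spectra, transform back.
F_image = np.fft.fft2(image, s=out_shape)
F_kernel = np.fft.fft2(kernel, s=out_shape)
fft_conv = np.real(np.fft.ifft2(F_image * F_kernel))

print(np.allclose(direct, fft_conv))  # True, up to floating-point error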


2014-05-27 13:21 GMT-04:00 Kyung Hyun Cho <cho.k.hyun@gmail.com>:
Dear all,

Tomorrow, Guillaume Alain will tell us about the recently proposed technique of using the FFT for fast convolution operations. See below for the details.

See you there!
- Cho

============

- Speaker: Guillaume Alain
- Date and Time: 28 May 2014 @ 13.00
- Place: AA3195
- Paper to be Discussed: Michael Mathieu, Mikael Henaff, Yann LeCun. Fast Training of Convolutional Networks through FFTs. ICLR 2014

_______________________________________________
Lisa_labo mailing list
Lisa_labo@iro.umontreal.ca
https://webmail.iro.umontreal.ca/mailman/listinfo/lisa_labo


