Dear all,
Guillaume Alain will tell us tomorrow about the recently proposed technique of using FFTs for fast convolution operations. See below for the details.
See you there!
- Cho
============
- Speaker: Guillaume Alain
- Date and Time: 28 May 2014 @ 13.00
- Place: AA3195
- Paper to be Discussed: Michael Mathieu, Mikael Henaff, Yann LeCun. Fast Training of Convolutional Networks through FFTs. ICLR 2014
FFTs have been used to speed up convolution since at least the 1960s. I'm not sure why this isn't done more in the neural nets community, but this year's ICLR paper is not the place where the idea was first proposed.
FYI, the paper titled "A Convolutional Neural Network for Modelling Sentences" (http://nal.co/papers/Kalchbrenner_DCNN_ACL14) also mentions using FFTs for convolution computation.
Thanks, Xiaohua
I'm planning a talk that lasts closer to 30 minutes than 60, so we'll have plenty of time afterwards to get the historical narrative right.
I will also try to give people some intuition for how the FFT manages to "cheat" at computing convolutions, but I'm not going to cram a full course on signal processing into a tea talk.
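In the meantime, here is a minimal numpy sketch of the convolution theorem the trick relies on (my own toy example, not code from the paper): multiplying the spectra pointwise and transforming back gives the same answer as direct convolution, at O(n log n) instead of O(n k).

    # Toy illustration of the convolution theorem (not the paper's code):
    # a pointwise product in the frequency domain is a circular convolution
    # in the signal domain, and zero-padding both operands to
    # len(x) + len(k) - 1 turns the circular convolution into the linear one.
    import numpy as np

    rng = np.random.RandomState(0)
    x = rng.randn(64)   # signal
    k = rng.randn(9)    # filter

    direct = np.convolve(x, k)    # O(len(x) * len(k)) multiply-adds

    n = len(x) + len(k) - 1       # full linear-convolution length
    via_fft = np.fft.irfft(np.fft.rfft(x, n) * np.fft.rfft(k, n), n)

    assert np.allclose(direct, via_fft)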
*spoiler alert*
Ian is correct in pointing this out; in fact, that's the main point of my talk.
The FFT is not new at all. Everybody knew about it.
It's just that nobody felt like revisiting the idea by actually implementing it and taking measurements.
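If anyone wants a taste before tomorrow, even a crude CPU benchmark with scipy shows the effect (a toy sketch of mine; the paper's GPU implementation is a different beast):

    # Crude CPU timing in the "implement it and take measurements" spirit
    # (a toy sketch; assumes scipy is installed, and says nothing about
    # the GPU implementation the paper actually benchmarks).
    import timeit
    import numpy as np
    from scipy import signal

    rng = np.random.RandomState(0)
    image = rng.randn(256, 256)
    kernel = rng.randn(11, 11)

    t_direct = timeit.timeit(
        lambda: signal.convolve2d(image, kernel, mode="valid"), number=10)
    t_fft = timeit.timeit(
        lambda: signal.fftconvolve(image, kernel, mode="valid"), number=10)

    print("direct: %.3f s   fft: %.3f s   (10 runs each)" % (t_direct, t_fft))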
Correction.
There is a good reason why it was not done before. When I was working on conv nets with Yann LeCun and Patrice Simard in the early '90s, some people had tried it, but there was no gain. The main reason is that we had very few channels in the lower layers, where most of the computation took place (like 1 channel in the input and 5 in the first layer). When the number of channels becomes large, the advantage of the FFT greatly increases, because the log(n) overhead of each transform is amortized over the NxN channel combinations that re-use it (N channels in the previous layer times N channels in the next). Also, I don't remember anyone at the time thinking about the advantage brought by this re-use, but I was not involved in this directly, so I am not sure.
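To make the arithmetic concrete, here is a back-of-the-envelope count (with made-up layer sizes and constant factors ignored):

    # Back-of-the-envelope operation counts for the channel re-use
    # argument (made-up sizes, constant factors ignored): each input-map
    # FFT is re-used by all N output maps, so the n^2 log(n) transform
    # overhead is amortized over the N*N pointwise products it enables.
    import math

    n, k, N = 32, 5, 96   # feature-map side, kernel side, channels (in = out)

    # Direct: N*N channel pairs, (n-k+1)^2 outputs each, k^2 MACs per output.
    direct = N * N * (n - k + 1) ** 2 * k ** 2

    # FFT route: transform N input maps and N*N kernels, do one n^2
    # pointwise product per channel pair, inverse-transform N output maps.
    per_fft = n * n * math.log(n, 2)
    fft_total = (N + N * N + N) * per_fft + N * N * n * n

    print("direct ~ %.2e ops, FFT ~ %.2e ops" % (direct, fft_total))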
-- Yoshua
Also, FFTs typically speed up convolutions only for large filters (receptive fields). For small (local) filters, you have to zero-pad them to the input size to use FFTs, so you could essentially increase the size of your filters for free, while a direct computation would take proportionally longer. I didn't read the paper, but I wouldn't be surprised if it rehashes some of those same ideas.
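Here is a toy sketch of the padding point (my own illustration, not from the paper): the transform size is essentially set by the image, not the filter, so the FFT route costs about the same for a 31x31 filter as for a 3x3 one, while the direct work grows with k^2.

    # Toy sketch of the zero-padding point (assumes scipy for the
    # correctness check): pad both operands to the full output size and
    # the FFT work barely changes with k, while direct convolution needs
    # k^2 multiply-adds per output pixel.
    import numpy as np
    from scipy import signal

    rng = np.random.RandomState(0)
    image = rng.randn(128, 128)

    for k in (3, 11, 31):
        kern = rng.randn(k, k)
        # Pad both operands to the full linear-convolution size.
        shape = (image.shape[0] + k - 1, image.shape[1] + k - 1)
        via_fft = np.fft.irfft2(
            np.fft.rfft2(image, shape) * np.fft.rfft2(kern, shape), shape)
        assert np.allclose(via_fft, signal.convolve2d(image, kern))
        print("k=%2d: FFT size %s; direct needs %d MACs per output pixel"
              % (k, shape, k * k))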
Nicolas