We have another round of ICLR tea-talks tomorrow at 2:00pm. Speakers and
papers below. See you there!
I will be looking for 2 more volunteers for the following week, so please
speak up if you haven't presented yet.
Presenter: Li Yao
"No More Pesky Learning Rates". Tom Schaul, Sixin Zhang, Yann LeCun
"Adaptive learning rates and parallelization for stochastic, sparse,
non-smooth gradients<http://openreview.net/document/c14f2204-fd66-4d91-bed4-153523694041#c14f220…>".
Tom Schaul, Yann LeCun
Presenter: Mehdi Mirza
"Stochastic Pooling for Regularization of Deep Convolutional Neural
Networks" <http://openreview.net/document/7230c92b-64e0-46f1-b6c8-d1c0018c5491#7230c92…>.
Matthew D. Zeiler, Rob Fergus
--
Guillaume
There is unfortunately no tea-talk scheduled for today.
On a related note, we cannot have tea-talks without volunteers. So if you
haven't yet presented an ICLR paper, please volunteer. We have open spots
for the next 2-3 weeks.
--
Guillaume
Sorry for the late notice. Please join us for today's tea-talk at 2pm.
Speakers and titles below.
On Wed, Mar 6, 2013 at 8:52 AM, Guillaume Desjardins <
guillaume.desjardins(a)gmail.com> wrote:
> This week's tea-talk will be postponed to next week (Thursday, March
> 14th), as several students are away for reading week and one of the
> speakers is feeling a bit under the weather.
>
> Next week's speakers will be:
> * Ian Goodfellow, who will discuss a recent ICML submission on "Maxout
> Networks" <http://arxiv.org/abs/1302.4389>
> * Yann Dauphin, who will discuss his ICLR submission, "Big Neural
> Networks Waste Capacity" <http://openreview.net/document/5563cdeb-e853-47ac-83b2-1b1b20cc1535#5563cde…>
>
> --
> Guillaume
>
This week's tea-talk will be postponed to next week (Thursday, March 14th),
as several students are away for reading week and one of the speakers is
feeling a bit under the weather.
Next week's speakers will be:
* Ian Goodfellow, who will discuss a recent ICML submission on "Maxout
Networks" <http://arxiv.org/abs/1302.4389>
* Yann Dauphin, who will discuss his ICLR submission, "Big Neural Networks
Waste Capacity" <http://openreview.net/document/5563cdeb-e853-47ac-83b2-1b1b20cc1535#5563cde…>
--
Guillaume