Hey everyone.
Just a tiny bit of background for the talk:
Bayesian optimization is a general-purpose black-box optimization technique.
The authors have been developing it as an approach to selecting/tuning hyperparameters for ML algorithms, including deep neural nets. They set the state of the art on CIFAR-10 with this basic approach in 2012.
This work extends their earlier approach by allowing experiments to be paused (frozen) and resumed (thawed).
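If it helps to picture the freeze-thaw idea before the talk, here is a toy Python sketch of the kind of loop involved: keep a pool of partially trained runs and, at each step, either continue (thaw) the most promising-looking run or freeze it and start a new hyperparameter sample. Everything here (train_some_epochs, predicted_final, the 0.7 thaw/explore split) is a made-up stand-in for illustration, not the paper's Gaussian-process model of training curves or its entropy-search acquisition.

import math
import random

def train_some_epochs(lr, start_epoch, n_epochs):
    # Stand-in for real training: loss decays toward an lr-dependent asymptote
    # (best near lr = 1e-3). Purely synthetic, for illustration only.
    asymptote = (math.log10(lr) + 3.0) ** 2
    return [asymptote + 2.0 * math.exp(-0.3 * (start_epoch + t + 1))
            for t in range(n_epochs)]

def predicted_final(run):
    # Crude extrapolation of where a run's loss curve is heading; the paper
    # instead models training curves with a GP built from exponential decays.
    losses = run["losses"]
    recent_gain = losses[-2] - losses[-1] if len(losses) > 1 else 0.0
    return losses[-1] - recent_gain

random.seed(0)
runs = []  # each run keeps its hyperparameter (lr) and its partial loss curve
for step in range(30):
    if runs and random.random() < 0.7:
        # "Thaw": resume the run whose extrapolated final loss looks best.
        cand = min(runs, key=predicted_final)
        cand["losses"] += train_some_epochs(cand["lr"], len(cand["losses"]), 3)
    else:
        # Leave existing runs frozen and start a fresh hyperparameter sample.
        lr = 10 ** random.uniform(-5, -1)
        runs.append({"lr": lr, "losses": train_some_epochs(lr, 0, 3)})

best = min(runs, key=lambda r: r["losses"][-1])
print("best lr ~ %.2e, loss %.3f, epochs spent on it: %d"
      % (best["lr"], best["losses"][-1], len(best["losses"])))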
On Tue, Oct 21, 2014 at 1:33 PM, Kyung Hyun Cho cho.k.hyun@gmail.com wrote:
Dear all,
Tomorrow David Krueger will tell us about a Bayesian optimization method based on a recent paper, "Freeze-Thaw Bayesian Optimization" (Swersky, Snoek and Adams, 2014), at the usual place, AA3195.
Sorry about the late announcement due to my travel!
- Cho
===
- Speaker: David Krueger
- Date and Place: 13.30 - 14.30, 22 Oct (Wed) at AA3195
- Title: Freeze-Thaw Bayesian Optimization (Swersky, Snoek and Adams, 2014)