If you want to book a 1-on-1 slot with Danny this afternoon, please put your name on the spreadsheet (2 slots left):
https://docs.google.com/spreadsheets/d/1kIXzQVdlv2_yPspJo6U9P_25TJgSvxs3Lnui...
Dima
On Fri, 9 Jun 2017 at 14:53 Dzmitry Bahdanau dimabgv@gmail.com wrote:
Hi all,
Thanks for coming, everyone; that was a record number of people! If you want to talk to Danny, he will be around on the 6th floor for some time.
Dima
On Fri, 9 Jun 2017 at 08:46 Dzmitry Bahdanau dimabgv@gmail.com wrote:
Hi all,
Just a friendly reminder about the talk :)
Dima
On Tue, 6 Jun 2017 at 15:22 Dzmitry Bahdanau dimabgv@gmail.com wrote:
Hi all,
You have just received an email about the tea-talk on June 13, but I have good news for you: we will have one more before that!
Our next speaker (really the next one) is *Danny Tarlow*, a Research Scientist at Google Brain Montreal. He will present on *June 9, 13:45, at AA5340* (our usual tea-talk slot). Please find detailed information below.
For those who are confused: the upcoming tea-talks will take place on June 9, 13, and 15 (the June 15 talk is yet to be announced).
*Title:* Learning to Code: Machine Learning for Program Induction
*Abstract:* I'll present two of our recent works on using machine learning ideas to induce computer programs from input-output examples. The first system is TerpreT, which casts program synthesis as a continuous optimization problem on which we perform gradient descent. It enables comparison of gradient-based program synthesis techniques to discrete search techniques that are popular in the programming languages community. Building on what we learned from TerpreT, we developed the second system, DeepCoder, which induces programs from input-output examples using a neural network to guide discrete search techniques. DeepCoder achieves an order of magnitude speedup over optimized search techniques, and it can solve problems comparable in difficulty to the simplest problems on programming competition websites.
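If you want a concrete feel for "inducing programs from input-output examples" before the talk, here is a minimal, hypothetical Python sketch (my own toy illustration, not Danny's TerpreT or DeepCoder code). It enumerates short compositions of primitives from a tiny made-up list-processing DSL, trying high-scoring primitives first and widening the set only when needed; the hard-coded GUIDE_SCORES dictionary stands in for the neural network that DeepCoder trains to predict which primitives are likely to appear.

# Toy sketch of program induction from input-output examples. NOT the actual
# TerpreT or DeepCoder code: a tiny list-processing DSL, brute-force
# enumeration of short programs, and hard-coded "guide" scores standing in
# for a learned prediction of which primitives are likely to appear.
from itertools import product

# Tiny DSL: each primitive maps a list of ints to a list of ints.
PRIMITIVES = {
    "reverse": lambda xs: xs[::-1],
    "sort": sorted,
    "double": lambda xs: [2 * x for x in xs],
    "drop_neg": lambda xs: [x for x in xs if x >= 0],
}

# Stand-in for the neural guide: assumed scores for each primitive.
GUIDE_SCORES = {"reverse": 0.4, "sort": 0.3, "double": 0.2, "drop_neg": 0.1}

def run(program, xs):
    # Apply the primitives left to right.
    for name in program:
        xs = PRIMITIVES[name](xs)
    return xs

def synthesize(examples, max_len=3):
    # Search with the highest-scoring primitives first, widening the allowed
    # set only if no consistent program is found (loosely in the spirit of
    # DeepCoder's guided enumeration).
    ranked = sorted(PRIMITIVES, key=GUIDE_SCORES.get, reverse=True)
    for k in range(1, len(ranked) + 1):
        allowed = ranked[:k]
        for length in range(1, max_len + 1):
            for program in product(allowed, repeat=length):
                if all(run(program, list(inp)) == out for inp, out in examples):
                    return program
    return None

examples = [([3, -1, 2], [4, 6]), ([0, 5, -2], [0, 10])]
print(synthesize(examples))  # a consistent program, e.g. ('sort', 'double', 'drop_neg')

Running this prints one program consistent with both examples; the systems in the talk of course use much richer DSLs and learned guidance rather than fixed scores.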
*Bio:* Danny Tarlow is a Research Scientist at Google Brain Montreal. His main research interests are in the application of machine learning to problems involving structured data, with a specific interest in the intersection of machine learning and programming languages. He is a co-editor of the recent MIT Press book on Perturbations, Optimization, and Statistics (2017). His work has won awards at UAI (Best Student Paper, Runner Up), the ICML Workshop on Constructive Machine Learning (Best Paper), the NIPS Workshop on Neural Abstract Machines and Program Induction (Best Paper), and NIPS (Best Paper). He holds a Ph.D. from the Machine Learning group at the University of Toronto (2013) and was previously a Researcher at Microsoft Research Cambridge, UK (2013 - 2017), with a Research Fellowship at Darwin College, University of Cambridge (2013 - 2016).
Dima