Sorry for the delay, but tomorrow we have *David Rolnick*, who is joining Konrad Kording's lab at *UPenn*, giving a talk on *August 24, 2018* at *10:30* in room *AA6214*.
**Note the different room: AA6214**
Will this talk be streamed? Yes: https://mila.bluejeans.com/809027115/webrtc

David will only be free until 1:30-2pm, so if you'd like to have lunch with him, stay after the talk and join in!
Be sure to come and hear the *profound* insights David has to share.

Michael
P.S. Since David was so kind as to step in on short notice, I'll be making (or buying) cookies for early attendees ;)
*TITLE* The impact of depth on expressivity and learning
*KEYWORDS* Deep Learning Theory
*ABSTRACT* Deeper networks are more powerful than shallow ones, but can be harder to train. In this talk, we will rigorously investigate why both of these statements are true. Specifically, we will prove that depth leads to an exponentially greater ability to express even simple polynomial functions. We will identify why some initializations and architectures impede learning in deeper networks, and demonstrate (both mathematically and empirically) several principles to bear in mind when designing an MLP/ConvNet/ResNet that will learn effectively.
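For anyone curious before the talk, here is a minimal sketch of the kind of depth separation the abstract alludes to. The product-monomial example and the pairwise-multiplication trick below are my own illustration, not necessarily the exact statement David will present:

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}

% Illustrative sketch only (my example; the precise theorem is David's):
% take the product monomial $p(x) = x_1 x_2 \cdots x_n$ as the "simple
% polynomial function". A deep network can multiply the inputs pairwise
% in a binary tree of depth $\lceil \log_2 n \rceil$:
\[
  x_1 x_2 \cdots x_n
  = \Bigl( \bigl( x_1 x_2 \bigr) \bigl( x_3 x_4 \bigr) \Bigr) \cdots
\]
% This uses $n - 1$ pairwise products in total, and each pairwise product
% can be realized with $O(1)$ neurons, for instance via the identity
\[
  xy = \tfrac{1}{4} \bigl( (x + y)^2 - (x - y)^2 \bigr),
\]
% so on the order of $n$ neurons suffice at depth $O(\log n)$. The
% matching claim -- that a single hidden layer needs exponentially many
% neurons for the same monomial -- is the lower bound the talk will prove;
% it is only asserted here, not derived.

\end{document}
```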
*BIO* David Rolnick completed his Ph.D. in Mathematics at MIT this year, co-advised by Nir Shavit, Max Tegmark, and Ed Boyden. His work focuses on the mathematical foundations of artificial and biological neural networks. A former NSF Graduate Research Fellow and Fulbright Scholar, David has also worked on machine learning research as an intern at Google and DeepMind. He will be joining Konrad Kording's group at UPenn as an NSF Mathematical Sciences Postdoctoral Research Fellow.