Sorry for the delay, but tomorrow we have *David Rolnick*, who is joining Konrad Kording's lab at *UPenn*, visiting to give a talk on *August 24, 2018* at *10:30* in room *AA6214*.
**Note the different room: AA6214**
Will this talk be streamed? Yes: https://mila.bluejeans.com/809027115/webrtc
David will only be free until about 1:30~2pm, so if you'd like to have lunch with him, stay after the talk and join in!
Be sure to come and hear the impactful, *profound* insights David has to share.
Michael
P.S. Since David was so kind as to step in on short notice, I'll be making (or buying) cookies for early attendees ;)
*TITLE* The impact of depth on expressivity and learning
*KEYWORDS* Deep Learning Theory
*ABSTRACT* Deeper networks are more powerful than shallow ones, but can be harder to train. In this talk, we will rigorously investigate why both of these statements are true. Specifically, we will prove that depth leads to an exponentially greater ability to express even simple polynomial functions. We will identify why some initializations and architectures impede learning in deeper networks, and demonstrate (both mathematically and empirically) several principles to bear in mind when designing an MLP/ConvNet/ResNet that will learn effectively.
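As a rough illustration of the abstract's point about initialization and depth (a minimal numpy sketch of my own, not material from the talk): with plain ReLU layers, weights drawn with variance 2/fan_in keep activation magnitudes roughly stable as depth grows, while slightly smaller or larger scales make them vanish or explode.

import numpy as np

# Hypothetical illustration: push a random input through many ReLU layers
# and see how the activation norm behaves for different weight scales.
def final_activation_norm(depth, width, weight_std, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(width)
    for _ in range(depth):
        w = rng.standard_normal((width, width)) * weight_std
        x = np.maximum(w @ x, 0.0)  # ReLU
    return np.linalg.norm(x)

width, depth = 256, 50
for label, std in [("too small (0.5/sqrt(n))", 0.5 / np.sqrt(width)),
                   ("He-style (sqrt(2/n))", np.sqrt(2.0 / width)),
                   ("too large (2/sqrt(n))", 2.0 / np.sqrt(width))]:
    norm = final_activation_norm(depth, width, std)
    print(f"{label:25s}: ||activation|| after {depth} layers ~ {norm:.2e}")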
*BIO* David Rolnick completed his Ph.D. in Mathematics at MIT this year, co-advised by Nir Shavit, Max Tegmark, and Ed Boyden. His work focuses on the mathematical foundations of artificial and biological neural networks. A former NSF Graduate Research Fellow and Fulbright Scholar, David has also worked on machine learning research as an intern at Google and DeepMind. He will be joining Konrad Kording's group at UPenn as an NSF Mathematical Sciences Postdoctoral Research Fellow.
Reminder: talk in 15
The door is currently locked; we are going to delay the talk 15 minutes while we figure this out.
Hi all,
For some reason the power outlets seem to be off, so we can't plug in the camera and stream. Sorry about that!
Xavier