This week we have *Guillaume Lajoie* from *DMS* at *UdeM* giving a talk on *Friday, April 13* at *10:30AM* in room *AA1360*.
Make sure to come to this dynamic talk and your research will spike!
Michael
*TITLE* Dynamics of high-dimensional recurrent networks: how chaos shapes computation in biological and artificial neural networks
*KEYWORDS* Dynamical Systems Analysis, Neuroscience, DL Theory
*ABSTRACT* Networks of neurons, either biological or artificial, are called recurrent if their connections are distributed and contain feedback loops. Such networks can perform remarkably complex computations, as evidenced by their ubiquity throughout the brain and their ever-increasing use in machine learning. They are, however, notoriously hard to control, and their dynamics are generally poorly understood, especially in the presence of external forcing. This is because recurrent networks are typically chaotic systems, meaning they have rich and sensitive dynamics leading to variable responses to inputs. How does this chaos manifest in the neural code of the brain? How might we tame this sensitivity to exploit complexity when training artificial recurrent networks for machine learning?
Understanding how the dynamics of large driven networks shape their capacity to encode and process information presents a sizeable challenge. In this talk, I will discuss the use of Random Dynamical Systems Theory as a framework to study information processing in high-dimensional, signal-driven networks. I will present an overview of recent results linking chaotic attractors to entropy production, dimensionality and input discrimination of dynamical observables. I will outline insights this theory provides on how cortex performs complex computations using sparsely connected inhibitory and excitatory neurons, as well as implications for gradient-based optimization methods for artificial networks.
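To make the abstract's claims concrete, here is a minimal sketch, assuming the classic random recurrent network h_{t+1} = tanh(g * W @ h_t) with Gaussian weights W_ij ~ N(0, 1/N), a standard model in this literature rather than necessarily the one used in the talk, and with gain values g chosen purely for illustration. It estimates the largest Lyapunov exponent from the divergence of two nearby trajectories and, alongside it, the average log growth rate of a backpropagated gradient vector: for g > 1 such networks are typically chaotic, trajectories diverge, and gradients explode; for g < 1 both contract.

```python
# Illustrative sketch only (not material from the talk): a standard random
# recurrent network, h_{t+1} = tanh(g * W @ h_t) with W_ij ~ N(0, 1/N),
# which is typically chaotic for gain g > 1. We estimate (1) the largest
# Lyapunov exponent via the divergence of two nearby trajectories and
# (2) the average log growth rate of a gradient vector pushed through the
# transposed Jacobians, as in backprop through time. In chaotic regimes
# both rates are positive: sensitive dynamics and exploding gradients are
# two faces of the same phenomenon.
import numpy as np

rng = np.random.default_rng(0)
N, T, eps = 500, 200, 1e-8
W = rng.normal(0.0, 1.0 / np.sqrt(N), size=(N, N))

def lyapunov_and_gradient_rates(g):
    h = rng.normal(size=N)
    h2 = h + eps * rng.normal(size=N)   # nearby trajectory
    v = rng.normal(size=N)              # stand-in for a loss gradient dL/dh
    v /= np.linalg.norm(v)
    lyap = grad = 0.0
    for _ in range(T):
        h = np.tanh(g * W @ h)
        h2 = np.tanh(g * W @ h2)
        d = np.linalg.norm(h2 - h)
        lyap += np.log(d / eps)
        h2 = h + eps * (h2 - h) / d     # renormalize the perturbation
        J = g * (1.0 - h**2)[:, None] * W   # Jacobian dh_{t+1}/dh_t
        v = J.T @ v                     # one Jacobian-transpose step (BPTT)
        n = np.linalg.norm(v)
        grad += np.log(n)
        v /= n
    return lyap / T, grad / T

for g in (0.5, 1.5, 3.0):
    lam, gr = lyapunov_and_gradient_rates(g)
    print(f"g = {g}: lambda_max ~ {lam:+.3f}, gradient log-growth ~ {gr:+.3f}")
# Expected: both rates negative for g < 1 (stable dynamics, vanishing
# gradients) and positive for g > 1 (chaos, exploding gradients).
```

This toy, input-free example does not capture the signal-driven, random-dynamical-systems setting the talk studies, but it shows why chaotic attractors and gradient-based optimization are so tightly linked.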
*BIO* Since January 2018, Guillaume Lajoie has been an Assistant Professor at the Math & Stats Dept. of UdeM. He obtained his PhD in applied mathematics from the University of Washington in 2013 and went on to conduct research as an independent Bernstein Fellow at the Max Planck Institute for Dynamics and Self-Organization, as a Visiting Scholar at the Courant Institute of Mathematical Sciences at NYU, and as a Washington Research Foundation Innovation Fellow at the University of Washington's Institute for Neuroengineering. His research interests lie at the intersection of applied mathematics, neuroscience, and AI, where he leverages tools from Dynamical Systems, Stochastic Processes, Information Theory, and Machine Learning to better understand how recurrent network dynamics, whether biological or artificial, support computations. Applications of his research range from contributions to a theory of computation in recurrent networks to the development of brain-computer interfaces.
You can access the recording at this link: https://bluejeans.com/s/b7lnV/
If you liked last week's tea talk and want to discuss research ideas further with Guillaume, email me and we'll set something up! And since Guillaume is just down the hall, we can schedule these meetings with a bit more flexibility and work around your schedule!
And if you weren't at the tea talk, then you missed out! Luckily, thanks to Xavier, you can still watch the full recording here: https://bluejeans.com/s/b7lnV/
Cheers!
Michael