We will have a talk today at 2pm. Our speaker is Michael Chang from MIT, who works with Josh Tenenbaum.
Date / Time: Wednesday, May 31st, 2pm-3pm
Location: AA3195
Title: *Learning Visual and Physical Models of the Environment*
Abstract: An intelligent agent can leverage a model of the environment as a prior to accelerate future learning and generalize beyond its own experience. I view building such models as learning simulator programs that infer latent properties and generate predictions from observation. How can the agent learn the primitives for these programs from observation? What are the means of combination, operating on these primitives, that allow zero-shot generalization to come naturally? In this talk, I discuss two recent pieces of my work that propose potential directions for tackling these questions in the context of learning visual concepts and intuitive physics. First, I present an algorithm for learning factorized symbolic representations from raw visual data. I show that these representations capture latent factors of variation that can be manipulated like a graphics code. Next, I present a framework for learning predictive models of intuitive physics. In two-dimensional worlds of bouncing balls, I demonstrate that this framework generalizes to variable object counts and variable scene configurations with only spatially and temporally local computation. I conclude by describing open research questions motivated by the results from these approaches.
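To make "manipulated like a graphics code" concrete, here is a minimal sketch of a factorized latent representation, with every name, factor, and dimension a hypothetical stand-in (the talk's actual method is not specified here): the latent splits into named slots, and grafting one image's slot into another's code yields a targeted edit through the latent alone.

```python
import numpy as np

rng = np.random.default_rng(1)
IMG_DIM, FACTOR_DIM = 64, 8
FACTORS = ["identity", "pose", "lighting"]  # assumed factors of variation
LATENT_DIM = FACTOR_DIM * len(FACTORS)

# Randomly initialized linear maps stand in for a learned encoder/decoder.
W_enc = rng.normal(scale=0.1, size=(IMG_DIM, LATENT_DIM))
W_dec = rng.normal(scale=0.1, size=(LATENT_DIM, IMG_DIM))

def encode(x):
    # Slice the flat latent into one slot per factor.
    z = x @ W_enc
    return {f: z[i * FACTOR_DIM:(i + 1) * FACTOR_DIM]
            for i, f in enumerate(FACTORS)}

def decode(slots):
    z = np.concatenate([slots[f] for f in FACTORS])
    return z @ W_dec

a, b = rng.normal(size=IMG_DIM), rng.normal(size=IMG_DIM)
za, zb = encode(a), encode(b)
za["lighting"] = zb["lighting"]  # graft b's lighting factor onto a's code
edited = decode(za)              # targeted edit made purely in latent space
```

Similarly, "spatially and temporally local computation" that generalizes to variable object counts can be illustrated by a pairwise, object-factorized predictor, again a sketch under assumed names and dimensions: each object's next velocity change is predicted from a sum of pairwise encodings with nearby objects, so the same learned function applies regardless of how many objects the scene contains.

```python
import numpy as np

rng = np.random.default_rng(0)
STATE_DIM = 4    # hypothetical per-object state: (x, y, vx, vy)
HIDDEN_DIM = 32  # hypothetical encoding width

# Randomly initialized weights stand in for learned parameters.
W_pair = rng.normal(scale=0.1, size=(2 * STATE_DIM, HIDDEN_DIM))
W_dec = rng.normal(scale=0.1, size=(HIDDEN_DIM, 2))  # predicts (dvx, dvy)

def predict_velocity_change(states, focus, radius=2.0):
    """Predict the focus object's velocity change from pairwise encodings.

    Only neighbors within `radius` contribute (spatial locality), and the
    pairwise encodings are summed, so the same function handles any
    number of objects.
    """
    f = states[focus]
    total = np.zeros(HIDDEN_DIM)
    for j, c in enumerate(states):
        if j == focus:
            continue
        if np.linalg.norm(c[:2] - f[:2]) > radius:  # outside neighborhood
            continue
        pair = np.concatenate([f, c])
        total += np.maximum(pair @ W_pair, 0.0)  # ReLU pair encoder, summed
    return total @ W_dec

# The same function works unchanged for scenes of different sizes.
print(predict_velocity_change(rng.normal(size=(5, STATE_DIM)), focus=0))
print(predict_velocity_change(rng.normal(size=(9, STATE_DIM)), focus=3))
```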
Bio: Michael Chang is a senior in Computer Science at MIT, doing research in Professor Joshua Tenenbaum's Computational Cognitive Science Group. He is interested in building algorithms for learning compositional programs and bridging the strengths of symbolic and neural representations. At MIT, he has worked on unsupervised learning of symbolic visual concepts and of physical dynamics. Michael has also spent time at the University of Michigan working with Professor Honglak Lee. He will begin a Ph.D. in Computer Science at U.C. Berkeley in fall 2017. Links to papers and code are here: http://mbchang.github.io.