Hi,
I would like to announce my own tea-talk for tomorrow -- starting at 2:30pm in AA3195. I will present some joint work with Asja and Samira:
Time: 2:30pm - 3:30pm
Where: AA3195
Title: Training deep generative (bidirectional) Helmholtz machines
== Abstract ==
Unsupervised training of deep generative models with latent variables, and performing inference in them, remain challenging problems. Various methods have been proposed; many of them train an auxiliary model that performs approximate inference for the generative model, which is itself fitted to the training data. The top-down generative model is typically a directed model that starts from a prior over latent variables at the top and proceeds down to a distribution over the observed variables at the bottom. The approximate inference model runs in the opposite direction and is typically trained to efficiently infer high-probability latent states given some observed data.
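To make the two directions concrete, here is a tiny numpy sketch of such a pair of models (purely illustrative, with made-up layer sizes and parameter names; this is not the code from the talk):

import numpy as np

rng = np.random.default_rng(0)

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

# Top-down generative model p(x,h) = p(h) p(x|h) over binary units:
# a prior over the latents h at the top, a conditional over the
# observations x at the bottom.
b_h = np.zeros(20)                  # logits of the prior p(h)
W = rng.normal(0.0, 0.1, (20, 50))  # weights of p(x|h)
b_x = np.zeros(50)                  # biases of p(x|h)

def sample_top_down():
    h = (rng.random(20) < sigmoid(b_h)).astype(float)          # h ~ p(h)
    x = (rng.random(50) < sigmoid(h @ W + b_x)).astype(float)  # x ~ p(x|h)
    return x, h

# Bottom-up approximate inference model q(h|x), running in the opposite
# direction: given an observed x, it proposes high-probability latent states.
V = rng.normal(0.0, 0.1, (50, 20))
c_h = np.zeros(20)

def infer_bottom_up(x):
    return sigmoid(x @ V + c_h)     # q(h=1|x)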
After presenting some of the well-known, state-of-the-art approaches (wake-sleep, NVIL, and the VAE), I will talk about a new method we have been investigating, called the bidirectional Helmholtz machine (BiHM). It is based on the idea that the generative model should stay close to the class of distributions that can be modeled by our approximate inference distribution. We achieve this by interpreting both the top-down and the bottom-up directed models as approximate inference distributions, and by defining the target distribution we fit to the training data to be the geometric mean of these two. We present a lower bound on the log-likelihood of this model and show that optimizing this bound pressures the model to stay close to the approximate inference distributions.

Hope to see you tomorrow!
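PS: for the curious, continuing the toy numpy sketch above (again hypothetical, not our actual code): sampling h from q(h|x) gives a simple importance-sampling estimate of 2 log E[sqrt(p(x,h)/q(h|x))]. Because the normalizer of the geometric-mean distribution is at most 1 (Cauchy-Schwarz), this quantity lower-bounds the model's log-likelihood.

def log_likelihood_bound(x, K=100):
    # Estimate 2 * log E_{h ~ q(h|x)} [ sqrt(p(x,h) / q(h|x)) ] with
    # K importance samples; maximizing it raises the likelihood and
    # pushes p and q towards each other.
    q = infer_bottom_up(x)                           # q(h=1|x)
    hs = (rng.random((K, 20)) < q).astype(float)     # h_k ~ q(h|x)
    log_q = (hs * np.log(q) + (1 - hs) * np.log1p(-q)).sum(axis=1)
    p_h = sigmoid(b_h)                               # prior p(h=1)
    log_p = (hs * np.log(p_h) + (1 - hs) * np.log1p(-p_h)).sum(axis=1)
    p_x = sigmoid(hs @ W + b_x)                      # p(x=1|h_k)
    log_p += (x * np.log(p_x) + (1 - x) * np.log1p(-p_x)).sum(axis=1)
    w = 0.5 * (log_p - log_q)                        # log sqrt(p/q) per sample
    m = w.max()                                      # log-sum-exp trick
    return 2.0 * (m + np.log(np.mean(np.exp(w - m))))

x, _ = sample_top_down()
print(log_likelihood_bound(x))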