[Lisa_teatalk] Thesis defence - Guillaume Desjardins Thursday 15:30 AA3195

Razvan Pascanu r.pascanu at gmail.com
Tue Feb 18 09:44:01 EST 2014


The time and room **have changed**.

Guillaume's defence will take place **Thursday at 15:00 in room Z-245,
Pavillon Claire McNicoll**.

Sorry for the mistake.

Hope to see many of you there.
Razvan


On Mon, Feb 17, 2014 at 7:45 PM, Razvan Pascanu <r.pascanu at gmail.com> wrote:

> Hi all,
>
>  It gives me great pleasure to announce Guillaume's defence. It is this
> Thursday at 15:30 in AA3195.
> Here is the title and abstract:
>
> Improving Sampling, Optimization and Feature Extraction in Boltzmann
> Machines
> Despite the current widespread success of deep learning in training
> large-scale hierarchical models through supervised learning, unsupervised
> learning promises to play a crucial role in solving general Artificial
> Intelligence, where agents are expected to learn with little to no
> supervision. The work presented in this thesis tackles the problem of
> unsupervised feature learning and density estimation, using a model family
> at the heart of the deep learning phenomenon: the Boltzmann Machine (BM).
>  We present contributions in the areas of sampling, partition function
> estimation, optimization and the more general topic of invariant feature
> learning.
>
> With regard to sampling, we present a novel adaptive parallel tempering
> method which dynamically adjusts the temperatures under simulation to
> maintain good mixing in the presence of complex multi-modal distributions.
> When used in the context of (stochastic) maximum likelihood (SML) training,
> the improved ergodicity of our sampler translates to increased robustness
> to learning rates and faster per-epoch convergence.  Though our application
> is limited to BMs, our method is general and applicable to sampling from
> arbitrary probabilistic models using Markov Chain Monte Carlo (MCMC)
> techniques.  While SML gradients can be estimated via sampling, computing
> data likelihoods requires an estimate of the partition function. Unlike
> previous approaches, which treat the model as a black box, we provide
> an efficient algorithm that instead tracks the change in log partition
> function incurred by successive parameter updates.  Our algorithm frames
> this estimation problem as one of filtering performed over a 2D lattice,
> with one dimension representing time and the other temperature.
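>
> To give a flavour of the sampling ideas (my own toy sketch in Python,
> not code from the thesis; the target swap rate and the adaptation rule
> below are assumptions):
>
>     import numpy as np
>
>     rng = np.random.default_rng(0)
>
>     def swap_step(states, energies, betas):
>         # Metropolis swap between chains at adjacent inverse
>         # temperatures; accept with probability
>         # min(1, exp((b_k - b_{k+1}) * (E_k - E_{k+1}))).
>         # `states` is a Python list of chain configurations.
>         accepted = np.zeros(len(betas) - 1)
>         for k in range(len(betas) - 1):
>             log_r = (betas[k] - betas[k + 1]) * (energies[k] - energies[k + 1])
>             if np.log(rng.random()) < log_r:
>                 states[k], states[k + 1] = states[k + 1], states[k]
>                 energies[k], energies[k + 1] = energies[k + 1], energies[k]
>                 accepted[k] = 1.0
>         return accepted
>
>     def adapt_betas(betas, swap_rates, target=0.4, lr=0.01):
>         # Widen gaps that swap too often and shrink gaps that swap
>         # too rarely (betas sorted increasing; betas[0] stays fixed).
>         gaps = np.diff(betas) * np.exp(lr * (swap_rates - target))
>         return np.concatenate(([betas[0]], betas[0] + np.cumsum(gaps)))
>
> The tracking idea can likewise be caricatured as a single
> importance-sampling estimate of the change in log Z per parameter
> update (the `energy_old` and `energy_new` callables are placeholders):
>
>     import numpy as np
>     from scipy.special import logsumexp
>
>     def delta_log_z(energy_old, energy_new, samples):
>         # log Z(new) - log Z(old) estimated from samples drawn under
>         # the old parameters; the thesis refines such noisy increments
>         # by filtering over a (time x temperature) lattice.
>         w = energy_old(samples) - energy_new(samples)
>         return logsumexp(w) - np.log(len(samples))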
>
> On the topic of optimization, our thesis presents a novel algorithm for
> applying the natural gradient to large-scale Boltzmann Machines. Until
> now, its application had been constrained by the computational and memory
> requirements of computing the Fisher Information Matrix (FIM), whose size
> is quadratic in the number of parameters. The Metric-Free Natural Gradient
> algorithm (MFNG) avoids computing the FIM altogether by combining a linear
> solver with an efficient matrix-vector operation. The method shows promise
> in that the resulting updates yield faster per-epoch convergence, despite
> being slower in terms of wall-time.
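>
> For intuition, the "metric-free" trick can be sketched as a conjugate
> gradient solve of F x = g that touches F only through matrix-vector
> products (my simplification; the thesis uses a different linear solver
> and Fisher estimator):
>
>     import numpy as np
>
>     def fisher_vector_product(G, v, damping=1e-4):
>         # F v with F approximated by (1/n) G^T G, where G holds one
>         # per-sample log-likelihood gradient per row. The full
>         # n_params x n_params FIM is never formed.
>         return G.T @ (G @ v) / G.shape[0] + damping * v
>
>     def natural_gradient(G, g, n_iters=20, tol=1e-8):
>         # Solve (F + damping I) x = g by linear conjugate gradient.
>         x = np.zeros_like(g)
>         r = g.copy()                  # residual for the start x = 0
>         p = r.copy()
>         rs = r @ r
>         for _ in range(n_iters):
>             Fp = fisher_vector_product(G, p)
>             alpha = rs / (p @ Fp)
>             x += alpha * p
>             r -= alpha * Fp
>             rs_new = r @ r
>             if np.sqrt(rs_new) < tol:
>                 break
>             p = r + (rs_new / rs) * p
>             rs = rs_new
>         return x
>
> Each update then costs a handful of matrix-vector products rather than
> building and inverting a matrix quadratic in the parameter count.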
>
> Finally, we explore how invariant features can be learnt through
> modifications to the BM energy function. We study the problem in the
> context of the spike & slab RBM, which we extend to handle both binary and
> sparse input distributions. By associating each spike with several slab
> variables, latent variables can be made invariant to a rich,
> high-dimensional subspace, resulting in increased invariance in the learnt
> representation. When using the expected model posterior as input to a
> classifier, increased invariance translates to improved classification
> accuracy in the low-label data regime. We conclude by showing a connection
> between invariance and the more powerful concept of disentangling factors
> of variation. While invariance can be achieved by pooling over subspaces,
> disentangling can be achieved by learning multiple complementary views of
> the same subspace. In particular, we show how this can be achieved using
> third-order BMs featuring multiplicative interactions between pairs of
> random variables.
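>
> As a toy illustration of the pooling idea (again my simplification, not
> the spike-and-slab energy function itself): giving one latent unit
> several filters and pooling their squared responses yields a feature
> that is invariant to directions within that filter subspace.
>
>     import numpy as np
>
>     def pooled_features(v, W, n_slabs):
>         # W: (n_visible, n_units * n_slabs), grouped so each latent
>         # unit owns n_slabs consecutive columns. Square-pooling over
>         # a group responds to the energy of v inside that subspace
>         # rather than to any single direction within it.
>         responses = (W.T @ v).reshape(-1, n_slabs)
>         return np.sqrt((responses ** 2).sum(axis=1))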
>
>
>
> Hope to see many of you there!
>
> Razvan
>