Please join us tomorrow morning for a talk by Joerg Bornschein. The talk will be held at 11:00 am in a room to be determined shortly (probably AA3195).
Title: Parallelized training of nonlinear sparse coding
Abstract:
I'd like to give a short presentation about our work on two closely related nonlinear sparse coding models. Both models are nonlinear in the sense that they assume active dictionary elements combine nonlinearly when explaining the observed data. One of the models assumes binary latent variables; the other assumes continuous latents governed by a spike-and-slab prior. We train both models on natural image data and analyze the learned receptive fields.
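The abstract doesn't spell out the generative equations, so here is a minimal, hypothetical Python sketch of the two latent-variable priors together with one illustrative choice of nonlinear combination (a pointwise maximum over active dictionary elements; the actual nonlinearity used in the talk may differ, and all sizes and parameters below are made up):

    import numpy as np

    rng = np.random.default_rng(0)
    D, H = 64, 16              # observed dimension, number of latents (hypothetical)
    W = rng.random((H, D))     # nonnegative dictionary, so the max combination is meaningful
    sigma = 0.1                # observation noise scale (Gaussian noise assumed)

    def sample_binary_latents(pi=0.1):
        # Model 1: binary latents, each s_h in {0, 1} active with probability pi
        return (rng.random(H) < pi).astype(float)

    def sample_spike_and_slab(pi=0.1, slab_scale=1.0):
        # Model 2: s_h = b_h * z_h with spike b_h ~ Bernoulli(pi) and a
        # continuous slab z_h (taken nonnegative here so the max is well-behaved)
        b = rng.random(H) < pi
        z = np.abs(rng.normal(0.0, slab_scale, H))
        return b * z

    def generate(s):
        # Active dictionary elements combine via a pointwise maximum rather
        # than the linear sum (s @ W) of standard sparse coding
        mean = np.max(s[:, None] * W, axis=0)
        return mean + rng.normal(0.0, sigma, D)

    x_binary = generate(sample_binary_latents())
    x_spike_slab = generate(sample_spike_and_slab())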
The models are trained using a parallelized EM implementation, which we typically run on ~300 CPU cores, but which has demonstrated reasonable weak-scaling behavior even when running on more than 1000 CPU cores.
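The abstract doesn't describe the implementation, but a common pattern for data-parallel EM at this scale is to shard the data across MPI ranks, compute local sufficient statistics in the E-step, and combine them with an allreduce. A minimal sketch using mpi4py follows; the placeholder statistics and all names are illustrative assumptions, not the speaker's actual code:

    # Run with, e.g.: mpiexec -n 4 python parallel_em_sketch.py
    from mpi4py import MPI
    import numpy as np

    comm = MPI.COMM_WORLD
    rank = comm.Get_rank()

    # Each rank holds its own shard of the data (synthetic here); weak scaling
    # means the per-rank shard size stays fixed as ranks are added.
    rng = np.random.default_rng(rank)
    X_local = rng.normal(size=(1000, 64))

    # All ranks start from identical parameters.
    theta = comm.bcast(np.zeros(64) if rank == 0 else None, root=0)

    for it in range(10):
        # E-step: sufficient statistics computed from the local shard only.
        # (A trivial placeholder statistic; a real E-step would evaluate
        # posteriors over the latent variables given theta.)
        local_sum = X_local.sum(axis=0)
        local_n = np.array(float(X_local.shape[0]))

        # Combine statistics across ranks; this allreduce is the only
        # communication per iteration, which keeps the communication cost
        # growing only slowly with the number of ranks.
        global_sum = np.empty_like(local_sum)
        global_n = np.empty_like(local_n)
        comm.Allreduce(local_sum, global_sum, op=MPI.SUM)
        comm.Allreduce(local_n, global_n, op=MPI.SUM)

        # M-step: every rank applies the same deterministic update, so the
        # parameters stay synchronized without further communication.
        theta = global_sum / global_n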