Hi all,
This Tuesday (March 7) we will have an extra tea-talk by *Martin Arjovsky*. Please note that the time is different (14:30), but the place is the same (AA6214). Hope to see many of you there!
*Title: *On Different Distances Between Distributions and Generative Adversarial Networks
*Abstract: *Generative adversarial networks (GANs) are notoriously difficult to train. At the core of it, we show that these problems arise naturally when trying to learn distributions whose support lies on low-dimensional manifolds. We show how these problems are consequences of trying to optimize the classical divergences (KL, JSD, etc.) between the model distribution and the real data distribution, and that they are symptoms of a more general phenomenon, pointing towards the inefficacy of the usual divergences in certain settings. After that, we bring into play the Wasserstein distance, which we prove does not suffer from the same behaviour, and provide a first step towards an algorithm that approximately optimizes this distance.
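(For those unfamiliar with the talk's central object: the Wasserstein-1, or earth mover's, distance between a real distribution P_r and a model distribution P_g is standardly defined as

W(P_r, P_g) = \inf_{\gamma \in \Pi(P_r, P_g)} \mathbb{E}_{(x, y) \sim \gamma} [\, \|x - y\| \,],

where \Pi(P_r, P_g) is the set of joint distributions with marginals P_r and P_g. By the Kantorovich-Rubinstein duality, this equals

\sup_{\|f\|_L \le 1} \mathbb{E}_{x \sim P_r}[f(x)] - \mathbb{E}_{x \sim P_g}[f(x)],

the supremum over 1-Lipschitz functions f, which is the form the algorithm discussed in the talk builds on.)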
Dima