[Lisa_seminaires] Alexandros Dimakis, June 15th 11AM, AA6214

Junyoung Chung elecegg at gmail.com
Thu Jun 15 12:20:13 EDT 2017


Hi all,

if you are interested in talking to Alexandros,
please fill in your available times in this sheet:
https://docs.google.com/spreadsheets/d/11doIbtfdPeL5E5M0L9ZG-J945j3M-0PkDXEvHgpxDQY/edit?usp=sharing

Best,
--Junyoung

On Thu, Jun 15, 2017 at 10:39 AM, Junyoung Chung <elecegg at gmail.com> wrote:

> Hi all,
>
> a friendly reminder: today's talk is happening shortly (11AM).
> The room will be AA6214.
>
> Title: Generative Models and Compressed Sensing
>
> Abstract: The goal of compressed sensing is to estimate a vector from an
> underdetermined system of noisy linear measurements by making use of prior
> knowledge of the structure of vectors in the relevant domain. For almost
> all results in this literature, the structure is represented by sparsity in
> a well-chosen basis. We show how to achieve guarantees similar to standard
> compressed sensing but without employing sparsity at all. Instead, we
> suppose that vectors lie near the range of a generative model, e.g., a GAN
> or a VAE. We show that the problems of image inpainting and super-resolution
> are special cases of our general framework.
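>
> For anyone who wants to experiment before the talk: the recovery step can
> be sketched in a few lines of PyTorch. This is only a toy illustration of
> the idea, with a small untrained network standing in for a trained GAN/VAE
> decoder and made-up dimensions; the authors' actual implementation is at
> the Code link below.
>
>     import torch
>
>     n, m, k = 784, 100, 20  # signal dim, #measurements, latent dim (made up)
>
>     # Stand-in generator G: R^k -> R^n (in practice, a trained decoder).
>     G = torch.nn.Sequential(
>         torch.nn.Linear(k, 128), torch.nn.ReLU(), torch.nn.Linear(128, n)
>     )
>
>     A = torch.randn(m, n) / m ** 0.5          # random Gaussian measurements
>     x_true = G(torch.randn(k)).detach()       # ground truth in range(G)
>     y = A @ x_true + 0.01 * torch.randn(m)    # noisy linear observations
>
>     # Recover by minimizing ||A G(z) - y||^2 over the latent code z.
>     z = torch.zeros(k, requires_grad=True)
>     opt = torch.optim.Adam([z], lr=0.05)
>     for _ in range(500):
>         opt.zero_grad()
>         loss = ((A @ G(z) - y) ** 2).sum()
>         loss.backward()
>         opt.step()
>
>     x_hat = G(z).detach()                     # reconstructed vector
>     print(((x_hat - x_true) ** 2).mean().item())
>
> Inpainting and super-resolution then correspond to particular choices of
> A (a pixel mask, or a downsampling operator).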
>
> We show how to generalize the RIP condition to generative models and show
> that random Gaussian measurement matrices satisfy this property with high
> probability. A Lipschitz condition on the generative neural network is a
> key technical condition. We will also discuss ongoing work on adding
> causality and distributed training to these models.
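>
> For reference, the generalized condition is (if I am remembering the paper
> correctly; please check the paper for the exact statement) a set-restricted
> eigenvalue condition: for all x_1, x_2 in the range of the generator G,
>
>     \| A (x_1 - x_2) \|_2 \ge \gamma \| x_1 - x_2 \|_2 - \delta,
>
> and a random Gaussian A with m = O(k \log L) rows satisfies this with high
> probability when G : \mathbb{R}^k \to \mathbb{R}^n is L-Lipschitz.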
>
> (based on joint work with Ashish Bora, Ajil Jalal and Eric Price)
> Code: https://github.com/AshishBora/csgm
> Homepage: users.ece.utexas.edu/~dimakis
>
> Bio: Alex Dimakis is an Associate Professor in the ECE department at the
> University of Texas at Austin. He received his Ph.D. in 2008 from UC
> Berkeley and the Diploma degree from the National Technical University of
> Athens in 2003. During 2009 he was a CMI postdoctoral scholar at Caltech.
> He received an NSF CAREER award, a Google Faculty Research Award and the
> Eli Jury dissertation award. He is a co-recipient of several best paper
> awards, including the joint Information Theory and Communications Society
> Best Paper Award in 2012. He is currently serving as an associate editor
> for the IEEE Transactions on Information Theory. His research interests
> include information theory, coding theory and machine learning.
>
> Best,
> --Junyoung
>
> On Thu, Jun 15, 2017 at 8:00 AM, Junyoung Chung <elecegg at gmail.com> wrote:
>
>> Hi all,
>>
>> we have a tea-talk today at *11AM*.
>> The room will be AA6214.
>> You can find more details in my previous email.
>>
>> Best,
>> --Junyoung
>>
>> On Sun, Jun 11, 2017 at 10:42 PM, Junyoung Chung <elecegg at gmail.com>
>> wrote:
>>
>>> Hi all,
>>>
>>> we have another tea-talk next week. This talk will be given by Professor
>>> Alexandros Dimakis from the University of Texas at Austin. The talk will
>>> take place at *AA6214* at *11AM* on June 15th (*Thursday*).
>>>
>>> Please NOTE that there is also a tea-talk on the 13th (*Tuesday*) by
>>> Jackie C. K. Cheung (McGill University) at *Z109* (Pavillon
>>> Claire-McNicoll) at *2PM*. Check Dima's email for more details about
>>> that talk.
>>>
>>> *Alexandros' talk.*
>>>
>>> *Title:* Generative Models and Compressed Sensing
>>>
>>> *Abstract:* The goal of compressed sensing is to estimate a vector from
>>> an underdetermined system of noisy linear measurements by making use of
>>> prior knowledge of the structure of vectors in the relevant domain. For
>>> almost all results in this literature, the structure is represented by
>>> sparsity in a well-chosen basis. We show how to achieve guarantees
>>> similar to standard compressed sensing but without employing sparsity at
>>> all. Instead, we suppose that vectors lie near the range of a generative
>>> model, e.g., a GAN or a VAE. We show that the problems of image
>>> inpainting and super-resolution are special cases of our general
>>> framework.
>>>
>>> We show how to generalize the RIP condition to generative models and show
>>> that random Gaussian measurement matrices satisfy this property with high
>>> probability. A Lipschitz condition on the generative neural network is a
>>> key technical condition. We will also discuss ongoing work on adding
>>> causality and distributed training to these models.
>>>
>>> (based on joint work with Ashish Bora, Ajil Jalal and Eric Price)
>>> Code: https://github.com/AshishBora/csgm
>>> Homepage: users.ece.utexas.edu/~dimakis
>>>
>>> *Bio:* Alex Dimakis is an Associate Professor in the ECE department at
>>> the University of Texas at Austin. He received his Ph.D. in 2008 from UC
>>> Berkeley and the Diploma degree from the National Technical University of
>>> Athens in 2003. During 2009 he was a CMI postdoctoral scholar at Caltech.
>>> He received an NSF CAREER award, a Google Faculty Research Award and the
>>> Eli Jury dissertation award. He is a co-recipient of several best paper
>>> awards, including the joint Information Theory and Communications Society
>>> Best Paper Award in 2012. He is currently serving as an associate editor
>>> for the IEEE Transactions on Information Theory. His research interests
>>> include information theory, coding theory and machine learning.
>>>
>>> --Junyoung
>>>
>>
>>
>


More information about the Lisa_seminaires mailing list