---------- Forwarded message ----------
From: CRM <crm@crm.umontreal.ca>
Date: Mon, Oct 26, 2015 at 2:36 PM
Subject: COLLOQUE DES SCIENCES MATHÉMATIQUES DU QUÉBEC (30/10/2015,
Emmanuel Candès)
To: activites@crm.umontreal.ca
******************************************************************
COLLOQUE DES SCIENCES MATHÉMATIQUES DU QUÉBEC - Montréal
http://www.crm.umontreal.ca/Colloques/index.html
******************************************************************
DATE :
Le vendredi 30 octobre 2015 / Friday, October 30, 2015
HEURE / TIME :
16 h / 4:00 p.m.
CONFÉRENCIER(S) / SPEAKER(S) :
Emmanuel Candès (Stanford University)
TITRE / TITLE :
A knockoff filter for controlling the false discovery rate
LIEU / PLACE :
CRM, UdeM, Pav. André-Aisenstadt, 2920, ch. de la Tour, salle 6254
RÉSUMÉ / ABSTRACT :
The big data era has created a new scientific paradigm: collect data first,
ask questions later. Imagine that we observe a response variable together
with a large number of potential explanatory variables, and would like to
be able to discover which variables are truly associated with the response.
At the same time, we need to know that the false discovery rate (FDR)---the
expected fraction of false discoveries among all discoveries---is not too
high, in order to assure the scientist that most of the discoveries are
indeed true and replicable. We introduce the knockoff filter, a new
variable selection procedure controlling the FDR in the statistical linear
model whenever there are at least as many observations as variables. This
method works by constructing fake variables, knockoffs, which can then be
used as controls for the true variables; the method achieves exact FDR
control in finite sample settings no matter the design or covariates, the
number of variables in the model, and the amplitudes of the unknown
regression coefficients, and does not require any knowledge of the noise
level. This is joint work with Rina Foygel Barber.
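For intuition, here is a toy numerical sketch of knockoff-style selection in
Python. The knockoff construction below (independently permuted columns) is a
deliberate simplification for illustration; the actual method builds knockoff
copies that match the design's correlation structure. The problem sizes, the
lasso penalty, and the target FDR level q are all illustrative assumptions,
not details from the talk.

import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, p, q = 200, 50, 0.2                 # observations, variables, target FDR
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:5] = 3.0                         # five truly relevant variables
y = X @ beta + rng.standard_normal(n)

# Stand-in "knockoffs": each column permuted independently, breaking any
# association with y (the real construction is more careful than this).
X_knock = rng.permuted(X, axis=0)

# Fit the lasso on [X, knockoffs]; W_j > 0 means the real variable beats
# its knockoff copy.
coef = Lasso(alpha=0.1).fit(np.hstack([X, X_knock]), y).coef_
W = np.abs(coef[:p]) - np.abs(coef[p:])

# Knockoff threshold: smallest t whose estimated false discovery
# proportion #{W_j <= -t} / #{W_j >= t} is at most q.
ts = np.sort(np.abs(W[W != 0]))
T = next((t for t in ts if np.sum(W <= -t) / max(np.sum(W >= t), 1) <= q),
         np.inf)
selected = np.where(W >= T)[0]
print("selected variables:", selected)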
******************************************************************
Responsable(s) :
Yvan Saint-Aubin (yvan.saint-aubin@umontreal.ca)
Iosif Polterovich (iosif.polterovich@umontreal.ca)
Henri Darmon (darmon@math.mcgill.ca)
David A. Stephens (dstephens@math.mcgill.ca)
====
Hi everyone,
This Friday at 2:30pm we will have a talk by Sarath Chandar about
Correlational Neural Networks.
Looking forward to the talk and hope to see you there,
Jorg
== Details & Abstract ==
Title: Correlational Neural Networks
Speaker: Sarath Chandar
Time: Friday, October 30th, 2:30pm
Location: AA3195
Abstract:
Common Representation Learning (CRL), wherein different descriptions (or
views) of the data are embedded in a common subspace, has been receiving a
lot of attention recently. Two popular paradigms here are Canonical
Correlation Analysis (CCA)-based approaches and Autoencoder (AE)-based
approaches. CCA-based approaches learn a joint representation by maximizing
the correlation of the views when projected to the common subspace. AE-based
methods learn a common representation by minimizing the error of
reconstructing the two views. Each of these approaches has its own
advantages and disadvantages. For example, while CCA-based approaches
outperform AE-based approaches on the task of transfer learning, they are
not as scalable as the latter. In this work we propose an AE-based approach
called Correlational Neural Network (CorrNet) that explicitly maximizes
correlation among the views when projected to the common subspace. Through a
series of experiments, we demonstrate that the proposed CorrNet is better
than the above-mentioned approaches with respect to its ability to learn
correlated common representations. Further, we employ CorrNet for several
cross-language tasks and show that the representations learned using CorrNet
perform better than those learned using other state-of-the-art approaches.
CorrNet can be easily extended to the case of more than two views. We
demonstrate this with two specific downstream applications: cross-language
document classification across 12 different languages and multilingual
multimodal retrieval. (A minimal sketch of the CorrNet objective follows the
links below.)
Links:
1. Correlational Neural Networks ( http://arxiv.org/abs/1504.07225 )
2. Bridge Correlational Neural Networks for Multilingual Multimodal
Representation Learning ( http://arxiv.org/abs/1510.03519 )
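To make the objective concrete, here is a minimal numpy sketch of the CorrNet
loss as I read it from the paper: reconstruct both views from the joint code
and from each single-view code, minus a correlation term between the two
single-view codes. The layer sizes, the tanh nonlinearity, and the weight lam
are illustrative assumptions, not details from the announcement.

import numpy as np

rng = np.random.default_rng(0)
n, d1, d2, k = 100, 20, 30, 10            # samples, view dims, common dim
X = rng.standard_normal((n, d1))          # view 1
Y = rng.standard_normal((n, d2))          # view 2
W1 = rng.standard_normal((d1, k)) * 0.1   # encoder weights, view 1
W2 = rng.standard_normal((d2, k)) * 0.1   # encoder weights, view 2
V1 = rng.standard_normal((k, d1)) * 0.1   # decoder back to view 1
V2 = rng.standard_normal((k, d2)) * 0.1   # decoder back to view 2
lam = 1.0                                 # correlation weight (assumption)

def encode(x, y):
    # common representation; a missing view enters as zeros
    return np.tanh(x @ W1 + y @ W2)

def recon_err(h):
    # reconstruct BOTH views from one common code
    return np.mean((h @ V1 - X) ** 2) + np.mean((h @ V2 - Y) ** 2)

def corr(a, b):
    # sum of per-dimension correlations between the two codes
    a, b = a - a.mean(0), b - b.mean(0)
    return np.sum((a * b).sum(0) /
                  (np.sqrt((a ** 2).sum(0) * (b ** 2).sum(0)) + 1e-8))

h_both = encode(X, Y)
h_x = encode(X, np.zeros_like(Y))         # view 2 hidden
h_y = encode(np.zeros_like(X), Y)         # view 1 hidden
loss = (recon_err(h_both) + recon_err(h_x) + recon_err(h_y)
        - lam * corr(h_x, h_y))           # maximize the correlation term
print("CorrNet objective:", loss)

Training would minimize this objective over the six weight matrices; the
subtracted correlation term is what distinguishes CorrNet from a plain
multi-view autoencoder.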
====
Dear all,
After a long pause we will have a tea talk this Friday at 2:30pm in AA3195.
We will have Galin Georgiev talk about
Universal neural nets: Gibbs machines and ACE
== Abstract ==
We study a class of neural nets, Gibbs machines, which are a type of
variational auto-encoder designed for gradual learning. They offer a
universal platform for incrementally adding newly learned features,
including physical symmetries in space/time. Combining them with
classifiers gives rise to a brand of universal generative neural nets:
stochastic auto-classifier-encoders (ACE). ACEs preserve the non-Gaussian
and clustering nature of real-life data and achieve state-of-the-art
performance, both for classification and for density estimation, on the
MNIST data set.
Link: http://arxiv.org/abs/1508.06585
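As a rough, hedged illustration of the auto-classifier-encoder idea, the
numpy sketch below combines a standard VAE objective (reconstruction plus KL)
with a classification term on the latent code. The Gaussian latent, the
linear layers, and the weight gamma are my assumptions for illustration, not
details from the abstract or the paper.

import numpy as np

rng = np.random.default_rng(0)
n, d, k, c = 64, 784, 20, 10              # batch, input dim, latent, classes
X = rng.random((n, d))                    # stand-in for MNIST pixels in [0,1]
labels = rng.integers(0, c, size=n)

We = rng.standard_normal((d, 2 * k)) * 0.01   # encoder -> (mu, log_var)
Wd = rng.standard_normal((k, d)) * 0.01       # decoder
Wc = rng.standard_normal((k, c)) * 0.01       # classifier head
gamma = 1.0                                   # classification weight

enc = X @ We
mu, log_var = enc[:, :k], enc[:, k:]
z = mu + np.exp(0.5 * log_var) * rng.standard_normal((n, k))  # reparam.

x_hat = 1.0 / (1.0 + np.exp(-(z @ Wd)))       # sigmoid reconstruction
recon = -np.mean(np.sum(X * np.log(x_hat + 1e-8)
                        + (1 - X) * np.log(1 - x_hat + 1e-8), axis=1))
kl = -0.5 * np.mean(np.sum(1 + log_var - mu ** 2 - np.exp(log_var), axis=1))

logits = z @ Wc
logits -= logits.max(axis=1, keepdims=True)   # numerically stable softmax
log_p = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
xent = -np.mean(log_p[np.arange(n), labels])

loss = recon + kl + gamma * xent              # ACE-style combined objective
print(f"recon={recon:.2f}  kl={kl:.2f}  xent={xent:.2f}  total={loss:.2f}")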
Hope to see many of you there,
Jorg