[Lisa_seminaires] [Lisa_labo] Fwd: Connectionists: The ReScience journal

Yoshua Bengio yoshua.umontreal at gmail.com
Tue Sep 8 07:26:43 EDT 2015


Unless
(a) the research is about things like the computational efficiency brought
by some distributed training scheme ;-)
or
(b) the result was obtained on a very large dataset and model, and would
take too much time to replicate in SlowpokeML.

-- Yoshua


2015-09-07 23:02 GMT-04:00 Guillaume Alain <guillaume.alain.umontreal at gmail.com>:

> I've read the thread and some of the links provided, and my two cents to
> add to this story would be the following.
>
> Replicability doesn't have to involve an algorithm that runs at the same
> speed. An experiment conducted with 4 GPUs, running for one hour, might be
> replicated with a CPU running for a week.
>
> That doesn't provide an easy solution to the replicability problem, but
> it's one aspect to keep in mind. Replicability doesn't mean that you *need*
> to duplicate the whole pipeline in an efficient way.
>
>
> You can read a paper about linked lists from 30 years ago, use the code
> (in C), and run it on data that's 100+ times bigger, and it'll still work.
> You can read the classic SICP course in Scheme, work through all the
> exercises, and ignore the fact that it ran insanely more slowly on the
> machines that MIT had at the time the course was created.
>
> If we had a machine learning toolkit called "snail" or "slowpoke" or
> "escargot" or whatever, and it was relatively easy to replicate experiments
> in that language, but it ran slow as hell, then it might satisfy the
> replicability criterion. No vectorization (unless you want to) for
> mini-batches. No cluster computation. No fancy GPU convolutions. Your code
> just has to be correct.
>
> I'm taking it as a hypothesis here that re-implementing your favorite
> algorithm in SlowpokeML would be easy; whether such a toolkit actually
> exists is a whole other issue.
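>
> To make this concrete, here is a minimal sketch of what a SlowpokeML-style
> reference implementation could look like (the toolkit itself is hypothetical,
> so this is just plain Python): logistic regression trained with SGD, no
> NumPy, no vectorization, no GPU, only loops that mirror the equations.
>
>     # Deliberately naive, unvectorized logistic regression trained with SGD.
>     # Slow, but every operation is explicit and easy to check against a paper.
>     import math
>     import random
>
>     def dot(w, x):
>         return sum(wi * xi for wi, xi in zip(w, x))
>
>     def sigmoid(z):
>         return 1.0 / (1.0 + math.exp(-z))
>
>     def train(data, n_features, lr=0.1, epochs=1000):
>         # data: list of (features, label) pairs with label in {0, 1}
>         w = [0.0] * n_features
>         b = 0.0
>         examples = list(data)
>         for _ in range(epochs):
>             random.shuffle(examples)
>             for x, y in examples:
>                 p = sigmoid(dot(w, x) + b)
>                 err = p - y  # gradient of the log loss w.r.t. the pre-activation
>                 w = [wi - lr * err * xi for wi, xi in zip(w, x)]
>                 b -= lr * err
>         return w, b
>
>     # Toy usage: learn OR on two binary inputs.
>     data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 1)]
>     w, b = train(data, n_features=2)
>     print([round(sigmoid(dot(w, x) + b)) for x, _ in data])  # [0, 1, 1, 1]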
>
> On Sun, Sep 6, 2015 at 1:45 AM, Kyle Kastner <kastnerkyle at gmail.com>
> wrote:
>
>> It is important to distinguish replication from reproduction. Replication is
>> (in my opinion) when experiments work in the same or a similar environment.
>> Reproduction is more independent: implementing the same ideas in a very
>> different setting. This uncovers much more subtle (but important) issues
>> than basic replication. We need both in ML, but replication doubly so.
>>
>> Also, dwf's point is a strong reason to minimize dependencies in research
>> and library development. It's a personal opinion, but one that has helped
>> greatly in the past.
>> On Sep 5, 2015 7:18 PM, "Jörg Bornschein" <bornj at iro.umontreal.ca> wrote:
>>
>>>
>>> I agree and I was basically thinking the same thing (+that Docker or
>>> other container virtualization techniques could ease the pain [*]).
>>>
>>> But for reproducibility in the scientific sense, it is already a huge win
>>> when an independent implementation has run even once or twice on the
>>> machines of the reproducing authors (and the editors). From that point of
>>> view, it would not be devastating if the code were no longer in a runnable
>>> state a few months later.
>>>
>>>
>>>    j
>>>
>>> [*] I actually wonder why they didn't make the dependencies more explicit
>>> in their submission format.
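>>>
>>> For instance (just a sketch, not anything the ReScience format actually
>>> specifies), a submission could ship a small script that declares the exact
>>> versions it was run with and fails loudly if the current environment
>>> differs; the package names and versions below are made up:
>>>
>>>     # Check that the installed packages match the versions the experiment
>>>     # was originally run with (pins below are illustrative only).
>>>     import pkg_resources
>>>
>>>     PINNED = {"numpy": "1.9.2", "scipy": "0.16.0", "theano": "0.7.0"}
>>>
>>>     problems = []
>>>     for name, wanted in PINNED.items():
>>>         try:
>>>             found = pkg_resources.get_distribution(name).version
>>>         except pkg_resources.DistributionNotFound:
>>>             problems.append("%s: not installed (need %s)" % (name, wanted))
>>>             continue
>>>         if found != wanted:
>>>             problems.append("%s: found %s, need %s" % (name, found, wanted))
>>>
>>>     if problems:
>>>         raise RuntimeError("environment mismatch:\n" + "\n".join(problems))
>>>     print("environment matches the declared pins")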
>>>
>>>
>>>
>>>
>>> On Sat, Sep 5, 2015 at 7:00 PM David Warde-Farley <
>>> d.warde.farley at gmail.com> wrote:
>>>
>>>> Very encouraging to see this happening and that other people are
>>>> concerned about it.
>>>>
>>>> I would add that reproducibility in machine learning looks simple
>>>> compared to other scientific domains, but looks are deceiving. Every
>>>> "simple Python script" is built upon a broad and deep tower of library
>>>> dependencies, leading to an exponential number of ways that your
>>>> computing environment can conspire against you (never mind hardware
>>>> differences...).
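>>>>
>>>> One cheap mitigation (a sketch, not an existing tool) is to dump a snapshot
>>>> of that tower next to every set of results, so that whoever attempts a
>>>> replication can at least see what the original environment looked like:
>>>>
>>>>     # Record the Python version, platform and every installed package
>>>>     # alongside the experiment's outputs.
>>>>     import json
>>>>     import platform
>>>>     import sys
>>>>     import pkg_resources
>>>>
>>>>     snapshot = {
>>>>         "python": sys.version,
>>>>         "platform": platform.platform(),
>>>>         "packages": sorted(
>>>>             "%s==%s" % (d.project_name, d.version)
>>>>             for d in pkg_resources.working_set
>>>>         ),
>>>>     }
>>>>
>>>>     with open("environment.json", "w") as f:
>>>>         json.dump(snapshot, f, indent=2)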
>>>>
>>>> On Sat, Sep 5, 2015 at 6:35 PM, Yoshua Bengio
>>>> <yoshua.umontreal at gmail.com> wrote:
>>>> >
>>>> > Very interesting!
>>>> > Reproducibility is VERY weak in the machine learning community, and needs
>>>> > to be improved.
>>>> >
>>>> > ---------- Forwarded message ----------
>>>> > From: Nicolas P. Rougier <Nicolas.Rougier at inria.fr>
>>>> > Date: 2015-09-03 8:57 GMT-04:00
>>>> > Subject: Connectionists: The ReScience journal
>>>> > To: Connectionists group <connectionists at cs.cmu.edu>
>>>> >
>>>> >
>>>> >
>>>> > It's our great pleasure to announce the creation of "ReScience", a
>>>> > peer-reviewed journal that targets computational research and encourages
>>>> > the explicit replication of already published research, promoting new,
>>>> > open-source implementations in order to ensure that the original research
>>>> > is reproducible.
>>>> >
>>>> > To achieve this goal, the whole editing chain is radically different from
>>>> > that of any traditional scientific journal. ReScience lives on GitHub,
>>>> > where each new implementation is made available together with comments,
>>>> > explanations and tests. Each submission takes the form of a pull request
>>>> > that is publicly reviewed and tested in order to guarantee that any
>>>> > researcher can re-use it.
>>>> >
>>>> > Students are strongly encouraged to submit to ReScience. Even if the
>>>> > publishing model is a bit different from that of other academic journals,
>>>> > it will give them a first experience of peer-reviewed publishing, where
>>>> > they have to use a rigorous and scientific approach.
>>>> >
>>>> >         • More on the journal website: https://github.com/ReScience/ReScience/wiki
>>>> >         • Current issue: https://github.com/ReScience/ReScience/wiki/Current-Issue
>>>> >         • FAQ: https://github.com/ReScience/ReScience/wiki/Frequently-Asked-Questions
>>>> >         • Follow us on Twitter (@ReScienceEds): https://twitter.com/rescienceeds
>>>> >
>>>> > And if you're familiar with Git and GitHub, you can also become a
>>>> > reviewer: just contact us.
>>>> >
>>>> >
>>>> > Konrad Hinsen & Nicolas Rougier
>>>> >
>>>> >