ICSE 2019
Sat 25 - Fri 31 May 2019 Montreal, QC, Canada

Recognizing and Rewarding Open Science in Software Engineering

The ROSE Festival is a worldwide salute to replication and reproducibility in SE (for definitions of these terms, see the end of this CFP).

Our aim is to create a venue where researchers can receive public credit for facilitating and participating in open science in SE (specifically, for creating replicated and reproduced results). ROSE is needed because most current conferences only evaluate the research artifacts generated by that venue’s own accepted papers. This makes it difficult for a research paper to earn credit for replications and reproductions by other researchers (since, at evaluation time, no other team of researchers has yet seen the new result).

Enter ROSE. ROSE is a 90-minute session comprising lightning talks by researchers presenting replicated and reproduced results, followed by a panel discussing issues of replication in software engineering. Presentations can be about any prior SE publication (they are not restricted to results from ICSE’19).

Note that:

  • ROSE is a non-archival forum. Material presented at ROSE may also be submitted to other forums.
  • Journal special issues are being planned to take the better ROSE results to archival publication. The venue is TBD but will, perhaps, be the Empirical Software Engineering journal.

Thu 30 May

Displayed time zone: Eastern Time (US & Canada)

12:30 - 14:00: ROSE Festival at Agora
Chair(s): Robert Feldt (Chalmers University of Technology), Tim Menzies (North Carolina State University), Thomas Zimmermann (Microsoft Research)

12:30 (3m) Talk: A Partial Reproduction of Malware Detection with RevealDroid
    Haipeng Cai (Washington State University, USA)

12:33 (3m) Talk: A Partial Replication of “Sentiment Analysis for Software Engineering: How Far Can We Go?”
    Gias Uddin (Polytechnique Montréal), Foutse Khomh (Polytechnique Montréal), Yann-Gaël Guéhéneuc (Concordia University and Polytechnique Montréal), Chanchal K. Roy (University of Saskatchewan)

12:36 (3m) Talk: A Partial Replication of "Decoding the Representation of Code in the Brain: An fMRI Study of Code Review and Expertise"
    Davide Fucci (University of Hamburg), Daniela Girardi, Nicole Novielli (University of Bari), Luigi Quaranta, Filippo Lanubile (University of Bari)

12:39 (3m) Talk: The Impact of Code Review Measures on Post-Release Defects: Replications and Bayesian Networks
    Andrey Krutauz, Tapajit Dey, Peter Rigby (Concordia University, Montreal, Canada), Audris Mockus (University of Tennessee, Knoxville)

12:42 (3m) Talk: An Eye Tracking Replication on How Developers Read and Summarize Java Methods
    Nahla Abid, Bonita Sharif (University of Nebraska-Lincoln, USA), Jonathan I. Maletic (Kent State University)

12:45 (3m) Talk: An Investigation of Routine Repetitiveness in Open-Source Projects: A Partial Reproduction of "A large-scale study on repetitiveness, containment, and composability of routines in open-source projects"
    Robert Dyer (Bowling Green State University)

12:48 (3m) Talk: Partial Replication of Seven Studies on Comparing the Stability of Clone and Non-clone Code
    Manishankar Mondal (Khulna University), Md Saidur Rahman, Chanchal K. Roy (University of Saskatchewan), Kevin Schneider (University of Saskatchewan)

12:51 (3m) Talk: A partial replication of "Automatic Summarization of Bug Reports"
    Akalanka Galappaththi (University of Lethbridge), John Anvik

12:54 (3m) Talk: Improving Source Code Readability: Theory and Practice
    Devjeet Roy, Sarah Fakhoury (Washington State University), Venera Arnaoudova (Washington State University)

12:57 (10m) Talk: Mobile-App Analysis and Instrumentation Techniques Reimagined with DECREE
    Yixue Zhao (University of Southern California, USA), Nenad Medvidović (University of Southern California)
    Pre-print available.

13:07 (53m) Panel: Ensuring the Success of Open Science: Practicalities, Tools, and Checklists

Accepted Papers

  • An Eye Tracking Replication on How Developers Read and Summarize Java Methods
  • An Investigation of Routine Repetitiveness in Open-Source Projects: A Partial Reproduction of "A large-scale study on repetitiveness, containment, and composability of routines in open-source projects"
  • A partial replication of "Automatic Summarization of Bug Reports"
  • A Partial Replication of "Decoding the Representation of Code in the Brain: An fMRI Study of Code Review and Expertise"
  • A Partial Replication of “Sentiment Analysis for Software Engineering: How Far Can We Go?”
  • A Partial Reproduction of Malware Detection with RevealDroid
  • Improving Source Code Readability: Theory and Practice
  • Mobile-App Analysis and Instrumentation Techniques Reimagined with DECREE (pre-print available)
  • (Panel) Ensuring the Success of Open Science: Practicalities, Tools, and Checklists
  • Partial Replication of Seven Studies on Comparing the Stability of Clone and Non-clone Code
  • The Impact of Code Review Measures on Post-Release Defects: Replications and Bayesian Networks

Call for Contributions

Due Date: Mar 8, 2019

Submit your proposal via EasyChair.

Submissions to ROSE are an abstract (one-page PDF, max) for a proposed lightning talk (2-5 mins). Each talk must cover two things:

  • A prior SE publication (Paper1) which has been replicated/reproduced.
  • Substantive evidence that parts of Paper1 have been replicated/reproduced; e.g.:
    • a recent SE research paper (Paper2), or
    • a URL linking to an as-yet-unpublished pre-print.
Note that Paper1 and Paper2 can come from any SE venue (ideally peer-reviewed; otherwise, reviewers will assess the paper on a case-by-case basis).

We also welcome methodological (meta) papers that help promote, facilitate, or increase understanding of open science, replication, and reproduction in software engineering research.

To facilitate easy reviewing, authors are encouraged to follow the format below for their abstracts (a sketch of a possible skeleton follows the list):

  • TITLE: “A [Partial] (Replication|Reproduction) of XYZ”. Please add the term “Partial” to your title if only some of the original work could be replicated/reproduced.
  • WHO: name the original authors (and paper) and the authors that performed the replication/reproduction.
  • WHAT: describe the “thing” being replicated/reproduced;
  • WHY: clearly state why that “thing” is interesting/important;
  • HOW: say how it was done originally;
  • WHERE: describe the replication/reproduction. If it was only partial, explain which parts could be achieved and which had to be omitted.
  • DISCUSSION: What aspects of this “thing” made it easier/harder to replicate/reproduce? What lessons learned from this work would enable more replication/reproduction in the future, for other kinds of tasks or other kinds of research? Naturally, meta papers might need a different structure, so if you are planning a meta talk, please contact the chairs.
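
For illustration only, here is a minimal LaTeX skeleton for such a one-page abstract. The document class, packages, and headings below are our assumptions, not a mandated template; any one-page PDF that covers the points above is acceptable.

    % Hypothetical skeleton for a one-page ROSE abstract.
    % The class, margins, and \paragraph headings are illustrative choices;
    % ROSE prescribes the content, not this particular layout.
    \documentclass[10pt]{article}
    \usepackage[margin=1in]{geometry}
    \begin{document}

    \begin{center}
      {\large\bfseries A Partial Replication of XYZ}\\[2pt] % drop ``Partial'' if the whole work was replicated
      Replicating Author One, Replicating Author Two
    \end{center}

    \paragraph{WHO} Original authors and paper; the team performing the replication/reproduction.
    \paragraph{WHAT} The ``thing'' being replicated/reproduced.
    \paragraph{WHY} Why that ``thing'' is interesting/important.
    \paragraph{HOW} How it was done originally.
    \paragraph{WHERE} The replication/reproduction itself; if partial, what was achieved and what was omitted.
    \paragraph{DISCUSSION} What made replication/reproduction easier/harder; lessons for future work.

    \end{document}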

EVALUATION:

  • Two PC members will review each abstract, possibly reaching out to the authors of the original Paper1. Abstracts will be ranked as follows:
  • If the PC members do not find sufficient substantive evidence of replication/reproduction, the abstract will be rejected.
  • Any abstract that is overly critical of prior work will be rejected (*).
  • The remaining abstracts will be sorted according to (a) interestingness and (b) correctness.
  • The top 10 abstracts (or more, if time allows) will be invited to give lightning talks.

(*) Our goal is to foster a positive environment that supports and rewards researchers for conducting replications and reproductions. To that end, we require that all ROSE abstracts and presentations pay due respect to the work they are reproducing/replicating. Criticism of prior work is acceptable only as part of a balanced and substantive discussion of prior accomplishments.

DEFINITIONS:

ROSE adopts the ACM artifact badging conventions, under which replicated and reproduced results are defined as follows:

  • Results Replicated: the main results of the paper have been obtained in a subsequent study by a person or team other than the authors, using, in part, artifacts provided by the author.
  • Results Reproduced: the main results of the paper have been independently obtained in a subsequent study by a person or team other than the authors, without the use of author-supplied artifacts.

IMPORTANT POINT: Replication is more than just “they downloaded my scripts and ran exactly those”. Something must have changed in the replication work (though perhaps that change is not very large).

  • Chairs and organizers:
    • Robert Feldt, Chalmers University of Technology, Sweden
    • Tim Menzies, NC State University, USA
    • Thomas Zimmermann, Microsoft Research, USA
  • Program Committee:
    • Neil Ernst, University of Victoria
    • Thomas Zimmermann, Microsoft
    • Chakkrit Tantithamthavorn, Monash University
    • Robert Feldt, Blekinge Institute of Technology
    • Martin Monperrus, University of Lille & INRIA
    • Daniel Graziotin, University of Stuttgart
    • Sira Vegas, Universidad Politécnica de Madrid