MSR 2021
Mon 17 - Wed 19 May 2021
co-located with ICSE 2021
Mon 17 May 2021 10:15 - 10:19 at MSR Room 2 - Testing and code review Chair(s): Jürgen Cito

Industrial reports indicate that flaky tests are one of the primary concerns of software testing, mainly due to the false signals they provide. To address this issue, researchers have developed tools and techniques for (automatically) identifying flaky tests, with encouraging results. However, to reach industrial adoption and practice, these techniques need to be replicated and evaluated extensively on multiple datasets, occasions, and settings. In view of this, we perform a replication study of a recently proposed method that predicts flaky tests based on their vocabulary. We replicate the original study along three dimensions. First, we replicate the approach on the same subjects as the original study but with a different evaluation methodology: we adopt a time-sensitive selection of training and test sets that better reflects the envisioned use case. Second, we consolidate the findings of the initial study by building a new dataset of 837 flaky tests from 9 projects in a different programming language (Python, whereas the original study used Java), thereby supporting the generalisability of the results. Third, we propose an extension to the original approach by experimenting with different features from the Code Under Test. Our results show that a more robust validation has a consistently negative impact on the reported results of the original study but, fortunately, does not invalidate its key conclusions. We also find, reassuringly, that vocabulary-based models can be used to predict test flakiness in Python, and that the information lying in the Code Under Test has a limited impact on the performance of the vocabulary-based models.
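To make the idea of a "vocabulary-based" predictor concrete, here is a minimal, hypothetical sketch: test source code is split into identifier tokens (breaking camelCase and lowercasing), and a classifier is trained on those token counts. The original study uses a random forest over token features; a tiny hand-rolled Naive Bayes stands in for it below, and all names and the toy training data are purely illustrative.

```python
import math
import re
from collections import Counter


def tokenize(source):
    """Split test source code into vocabulary tokens: extract alphabetic
    runs, break camelCase boundaries, and lowercase everything."""
    tokens = []
    for word in re.findall(r"[A-Za-z]+", source):
        for part in re.sub(r"([a-z])([A-Z])", r"\1 \2", word).split():
            tokens.append(part.lower())
    return tokens


class VocabularyClassifier:
    """Tiny multinomial Naive Bayes over token counts (Laplace smoothing).
    A stand-in for the random-forest model of the replicated study."""

    def fit(self, sources, labels):
        self.classes = sorted(set(labels))
        self.prior = {c: math.log(labels.count(c) / len(labels))
                      for c in self.classes}
        self.counts = {c: Counter() for c in self.classes}
        self.vocab = set()
        for src, lab in zip(sources, labels):
            toks = tokenize(src)
            self.counts[lab].update(toks)
            self.vocab.update(toks)
        self.totals = {c: sum(self.counts[c].values()) for c in self.classes}
        return self

    def predict(self, source):
        v = len(self.vocab)
        scores = {}
        for c in self.classes:
            score = self.prior[c]
            for t in tokenize(source):
                # Laplace-smoothed log-likelihood of each token
                score += math.log((self.counts[c][t] + 1) / (self.totals[c] + v))
            scores[c] = score
        return max(scores, key=scores.get)


# Toy training data (illustrative only): flaky tests tend to contain
# concurrency vocabulary (thread, sleep, timeout), stable ones plain asserts.
train_sources = [
    "public void testAsync() { Thread.sleep(1000); waitForTimeout(); }",
    "void testJob() { executor.submit(task); Thread.sleep(500); }",
    "void testAdd() { assertEquals(4, add(2, 2)); }",
    'void testName() { assertEquals("x", getName()); }',
]
train_labels = ["flaky", "flaky", "stable", "stable"]

model = VocabularyClassifier().fit(train_sources, train_labels)
prediction = model.predict(
    "void testPoll() { Thread.sleep(200); assertTrue(done); }")
```

On this toy data, the concurrency tokens (`thread`, `sleep`) push the unseen test towards the flaky class; the replicated study's point is that such lexical signals alone carry useful predictive power.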

Conference Day
Mon 17 May

Displayed time zone: Amsterdam, Berlin, Bern, Rome, Stockholm, Vienna

10:00 - 10:50
Testing and code review (Technical Papers / Data Showcase / Registered Reports) at MSR Room 2
Chair(s): Jürgen Cito (TU Wien and Facebook)
10:01
3m
Talk
A Traceability Dataset for Open Source Systems
Data Showcase
Mouna Hammoudi (Johannes Kepler University Linz), Christoph Mayr-Dorn (Johannes Kepler University Linz), Atif Mashkoor (Johannes Kepler University Linz), Alexander Egyed (Johannes Kepler University)
Media Attached
10:04
4m
Talk
How Java Programmers Test Exceptional Behavior
Technical Papers
Diego Marcilio (USI Università della Svizzera italiana), Carlo A. Furia (Università della Svizzera italiana (USI))
Pre-print
10:08
4m
Talk
An Exploratory Study of Log Placement Recommendation in an Enterprise System
Technical Papers
Jeanderson Cândido (Delft University of Technology), Jan Haesen (Adyen N.V.), Maurício Aniche (Delft University of Technology), Arie van Deursen (Delft University of Technology, Netherlands)
Pre-print Media Attached
10:12
3m
Talk
Does Code Review Promote Conformance? A Study of OpenStack Patches
Technical Papers
Panyawut Sri-iesaranusorn (Nara Institute of Science and Technology), Raula Gaikovina Kula (NAIST), Takashi Ishio (Nara Institute of Science and Technology)
Pre-print
10:15
4m
Talk
A Replication Study on the Usability of Code Vocabulary in Predicting Flaky Tests
Technical Papers
Guillaume Haben (University of Luxembourg), Sarra Habchi (University of Luxembourg), Mike Papadakis (University of Luxembourg), Maxime Cordy (University of Luxembourg), Yves Le Traon (University of Luxembourg)
Pre-print Media Attached
10:19
3m
Talk
On the Use of Mutation in Injecting Test Order-Dependency
Registered Reports
Sarra Habchi (University of Luxembourg), Maxime Cordy (University of Luxembourg), Mike Papadakis (University of Luxembourg), Yves Le Traon (University of Luxembourg)
Pre-print Media Attached
10:22
28m
Live Q&A
Discussions and Q&A
Technical Papers

