ICSE 2019
Sat 25 - Fri 31 May 2019 Montreal, QC, Canada
Fri 31 May 2019 14:30 - 14:50 at Viger - Crowdsourcing in Software Engineering Chair(s): Tayana Conte

Crowdsourced testing has been widely adopted to improve the quality of various software products. Crowdsourced workers typically perform testing tasks and report their experiences through test reports. While these crowdsourced test reports provide feedback from real usage scenarios, inspecting such a large number of reports is a time-consuming yet unavoidable task. To improve the efficiency of this task, widely used issue-tracking systems, such as JIRA, Bugzilla, and Mantis, provide keyword-search-based methods to help users identify duplicate test reports. However, on mobile devices (such as mobile phones), where crowdsourced test reports often contain sparse text descriptions but rich screenshots, these text-analysis-based methods become less effective because the nature of the data has fundamentally changed. In this paper, instead of focusing only on detecting duplicates from textual descriptions, we present CTRAS: a novel approach that leverages duplicates to enrich bug descriptions and improve the efficiency of inspecting these reports. CTRAS automatically aggregates duplicates based on both textual information and screenshots, and further summarizes the duplicate test reports into a comprehensive and comprehensible report. To validate CTRAS, we conducted quantitative studies on more than 5,000 test reports collected from 12 industrial crowdsourced projects. The experimental results show that CTRAS achieves an average accuracy of 0.87 in automatically detecting duplicate reports and outperforms the classic Max-Coverage-based and MMR summarization methods under the Jensen-Shannon divergence metric.
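For readers unfamiliar with the evaluation metric named in the abstract: Jensen-Shannon divergence compares the word distribution of a generated summary against that of the pooled duplicate reports, with lower divergence indicating that the summary better preserves the source content. The sketch below is only an illustrative implementation of that metric, not the authors' code; the helper names (word_distribution, js_divergence) and the example strings are assumptions for demonstration.

```python
from collections import Counter
import math

def word_distribution(text):
    # Assumed helper: bag-of-words probability distribution over whitespace tokens.
    tokens = text.lower().split()
    counts = Counter(tokens)
    total = sum(counts.values())
    return {w: c / total for w, c in counts.items()}

def js_divergence(p, q):
    # Jensen-Shannon divergence between two sparse distributions, base-2 logs,
    # so the result lies in [0, 1]; lower means the two texts are more similar.
    vocab = set(p) | set(q)
    m = {w: 0.5 * (p.get(w, 0.0) + q.get(w, 0.0)) for w in vocab}
    def kl(a):
        # KL(a || m), summed only over tokens with nonzero mass in a.
        return sum(prob * math.log2(prob / m[w]) for w, prob in a.items() if prob > 0.0)
    return 0.5 * kl(p) + 0.5 * kl(q)

# Hypothetical usage: compare a candidate summary against the pooled duplicate reports.
reports = "app crashes when uploading a screenshot on the login page after rotation"
summary = "crash on login page during screenshot upload"
print(js_divergence(word_distribution(summary), word_distribution(reports)))
```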

Fri 31 May

Displayed time zone: Eastern Time (US & Canada)

14:00 - 15:30
Crowdsourcing in Software Engineering (Papers / Software Engineering in Practice / Technical Track) at Viger
Chair(s): Tayana Conte Universidade Federal do Amazonas
14:00
30m
Talk
(SEIP Talk) Crowdsourcing in Software Engineering: Models, Motivations, and Challenges (SEIP, Industry Program)
Software Engineering in Practice
Thomas LaToza George Mason University
14:30
20m
Talk
CTRAS: Crowdsourced Test Report Aggregation and Summarization (Technical Track, Industry Program)
Technical Track
Rui Hao, Yang Feng (University of California, Irvine), James Jones (University of California, Irvine), Yuying Li (State Key Laboratory for Novel Software Technology, Nanjing University), Zhenyu Chen (Nanjing University)
14:50
20m
Talk
iSENSE: Completion-Aware Crowdtesting Management (ACM SIGSOFT Distinguished Paper Award, Technical Track, Industry Program)
Technical Track
Junjie Wang (Institute of Software, Chinese Academy of Sciences), Ye Yang (Stevens Institute of Technology), Rahul Krishna (NC State University), Tim Menzies (North Carolina State University), Qing Wang (Institute of Software, Chinese Academy of Sciences)
15:10
20m
Talk
Discussion Period
Papers