ICSE 2019
Sat 25 - Fri 31 May 2019 Montreal, QC, Canada
Wed 29 May 2019 15:00 - 15:20 at Mansfield / Sherbrooke - DevOps and Logging Chair(s): Diomidis Spinellis

Developers rely on software logs for a wide variety of tasks, such as debugging, testing, program comprehension, verification, and performance analysis. Despite the importance of logs, prior studies show that there is no industrial standard on how to write logging statements. Recent research on logs often considers the appropriateness of a log only as an individual item (e.g., one single logging statement), whereas logs are typically analyzed in tandem. In this paper, we focus on studying duplicate logging statements, which are logging statements that have the same static text message. Such duplications in the text message are potential indications of logging code smells, which may affect developers’ understanding of the dynamic view of the system. We manually studied over 3K duplicate logging statements and their surrounding code in four large-scale open source systems: Hadoop, CloudStack, ElasticSearch, and Cassandra. We uncovered five patterns of duplicate logging code smells. For each code smell instance, we further manually identified the problematic (i.e., requiring fixes) and justifiable (i.e., not requiring fixes) cases. We then contacted developers to verify our manual study results. We integrated our manual study results and the developers’ feedback into our automated static analysis tool, DLFinder, which automatically detects problematic duplicate logging code smells. We evaluated DLFinder on the four manually studied systems and two new systems: Camel and Wicket. In total, combining the results of DLFinder and our manual analysis, we reported 82 problematic code smell instances to developers, and all of them have been fixed.
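As a rough illustration of the "same static text message" criterion described in the abstract, the sketch below is a minimal, hypothetical detector, not the authors' DLFinder implementation: the regexes, the `find_duplicates` helper, and the `src/main/java` path are all illustrative assumptions. It scans Java files for common logging calls, keeps only the string-literal (static) part of each message, and reports messages that appear at more than one location.

```python
# Minimal sketch (NOT the authors' DLFinder tool): group Java logging calls by
# their static text message and report messages that occur at 2+ locations.
import re
from collections import defaultdict
from pathlib import Path

# Common Java logging calls, e.g. LOG.info("..."), logger.error("..." + e).
LOG_CALL = re.compile(r'\blog(?:ger)?\.(?:trace|debug|info|warn|error|fatal)\s*\(',
                      re.IGNORECASE)
# Java string literals, including escaped quotes.
STRING_LITERAL = re.compile(r'"((?:[^"\\]|\\.)*)"')


def static_message(line: str):
    """Return the concatenated string literals of a logging call, or None."""
    if not LOG_CALL.search(line):
        return None
    literals = STRING_LITERAL.findall(line)
    # Only the static fragments are kept; dynamic parts (variables) are ignored.
    return " ".join(lit.strip() for lit in literals) if literals else None


def find_duplicates(root: str):
    """Group logging statements under `root` by their static text message."""
    groups = defaultdict(list)
    for path in Path(root).rglob("*.java"):
        for lineno, line in enumerate(path.read_text(errors="ignore").splitlines(), 1):
            msg = static_message(line)
            if msg:
                groups[msg].append(f"{path}:{lineno}")
    # Duplicate logging statements: the same static message at multiple locations.
    return {msg: locs for msg, locs in groups.items() if len(locs) > 1}


if __name__ == "__main__":
    for msg, locs in find_duplicates("src/main/java").items():  # hypothetical path
        print(f'"{msg}" appears at: {", ".join(locs)}')
```

A line-based regex scan like this only approximates the idea; the paper's approach additionally examines the surrounding code to separate problematic duplicates from justifiable ones.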

Wed 29 May

Displayed time zone: Eastern Time (US & Canada)

14:00 - 15:30
DevOps and Logging (Software Engineering in Practice / Technical Track / Papers) at Mansfield / Sherbrooke
Chair(s): Diomidis Spinellis Athens University of Economics and Business
14:00
20m
Talk
An Empirical Investigation of Incident Triage for Online Service Systems (SEIP, Industry Program)
Software Engineering in Practice
Junjie Chen Peking University, Xiaoting He Microsoft, Qingwei Lin Microsoft Research, China, Yong Xu Microsoft, China, Hongyu Zhang The University of Newcastle, Dan Hao Peking University, Feng Gao Microsoft, Zhangwei Xu Microsoft, Yingnong Dang Microsoft Azure, Dongmei Zhang Microsoft Research, China
14:20
20m
Talk
Tools and Benchmarks for Automated Log Parsing (SEIP, Industry Program)
Software Engineering in Practice
Jieming Zhu Huawei Noah's Ark Lab, Shilin He Chinese University of Hong Kong, Jinyang Liu Sun Yat-Sen University, Pinjia He Computer Science and Engineering, The Chinese University of Hong Kong, Qi Xie Southwest Minzu University, Zibin Zheng School of Data and Computer Science, Sun Yat-sen University, Michael Lyu
14:40
20m
Talk
Mining Historical Test Logs to Predict Bugs and Localize Faults in the Test Logs (Technical Track, Industry Program)
Technical Track
Anunay Amar Concordia University, Peter Rigby Concordia University, Montreal, Canada
15:00
20m
Talk
DLFinder: Characterizing and Detecting Duplicate Logging Code Smells (Technical Track, Industry Program)
Technical Track
Zhenhao Li Concordia University, Tse-Hsun (Peter) Chen Concordia University, Jinqiu Yang, Weiyi Shang Concordia University, Canada
15:20
10m
Talk
Discussion Period
Papers