ASE 2019
Sun 10 - Fri 15 November 2019 San Diego, California, United States
Wed 13 Nov 2019 16:00 - 16:20 at Hillcrest (session: Performance). Chair(s): Tim Menzies

The performance of a software system plays a crucial role in user perception. Learning from the history of a software system's performance behavior not only helps discover and locate performance bugs, but also helps identify evolutionary performance patterns and general trends, such as technical debt accumulating as a slow but steady performance degradation. Exhaustive regression testing is usually impractical, because rigorous performance benchmarking requires executing a realistic workload per commit, which results in long execution times. In this paper, we propose a novel active revision sampling approach, which aims at tracking and understanding a system's performance history by approximating the performance behavior of a software system across all of its revisions. In a nutshell, we iteratively sample and measure the performance of specific revisions that help us build an accurate performance-evolution model, and we use Gaussian Process models to assess in which revision ranges our model is most uncertain, so as to sample further revisions for measurement. We have conducted an empirical analysis of the evolutionary performance behavior, modeled as a time series, of the history of 6 real-world software systems. Our evaluation demonstrates that Gaussian Process models are able to accurately estimate the performance-evolution history of real-world software systems with only a few measurements and to reveal interesting behaviors and trends.
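The abstract's core loop (measure a few revisions, fit a Gaussian Process over revision numbers, then measure wherever the posterior is most uncertain) can be illustrated with a minimal NumPy sketch. Everything here is an assumption for illustration, not the authors' implementation: the RBF kernel, its hyperparameters, the synthetic `measure` function with a hypothetical performance regression at revision 60, and the greedy max-variance acquisition rule.

```python
import numpy as np

def rbf_kernel(a, b, length=10.0, var=1.0):
    # Squared-exponential kernel over revision numbers (illustrative choice).
    d = a[:, None] - b[None, :]
    return var * np.exp(-0.5 * (d / length) ** 2)

def gp_posterior(x_train, y_train, x_test, noise=1e-4):
    # Standard GP regression posterior: mean and per-point variance.
    K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    Ks = rbf_kernel(x_train, x_test)
    Kss = rbf_kernel(x_test, x_test)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mean = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = np.diag(Kss) - np.sum(v * v, axis=0)
    return mean, np.maximum(var, 0.0)

def measure(rev):
    # Hypothetical benchmark: baseline 1.0s, regression (+0.5s) at rev 60,
    # plus small deterministic measurement noise.
    return 1.0 + 0.5 * (rev >= 60) + 0.01 * np.random.default_rng(rev).normal()

revisions = np.arange(100, dtype=float)
sampled = [0.0, 99.0]                       # seed with the endpoints
values = [measure(int(r)) for r in sampled]

for _ in range(10):
    _, var = gp_posterior(np.array(sampled), np.array(values), revisions)
    # Active step: benchmark the revision with the highest posterior variance.
    candidates = [r for r in revisions if r not in sampled]
    next_rev = max(candidates, key=lambda r: var[int(r)])
    sampled.append(next_rev)
    values.append(measure(int(next_rev)))
```

After a handful of iterations the sampled revisions spread out across the history, and the posterior variance shrinks everywhere, which is the stopping signal the abstract alludes to; a step change like the one at revision 60 shows up as a jump in the posterior mean between two sampled revisions.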

Conference Day
Wed 13 Nov

Displayed time zone: Tijuana, Baja California

16:00 - 17:50
Performance: Research Papers / Demonstrations at Hillcrest
Chair(s): Tim Menzies (North Carolina State University)
16:00
20m
Talk
Accurate Modeling of Performance Histories for Evolving Software Systems
Research Papers
Stefan Mühlbauer (Bauhaus-University Weimar), Sven Apel (Saarland University), Norbert Siegmund (Bauhaus-University Weimar)
Pre-print
16:20
20m
Talk
An Industrial Experience Report on Performance-Aware Refactoring on a Database-centric Web Application
Research Papers
Boyuan Chen (York University), Zhen Ming (Jack) Jiang (York University), Paul Matos (Copywell Inc.), Michael Lacaria (Copywell Inc.)
Authorizer link Pre-print
16:40
20m
Talk
An Experience Report of Generating Load Tests Using Log-recovered Workloads at Varying Granularities of User Behaviour
Research Papers
Jinfu Chen (Jiangsu University), Weiyi Shang (Concordia University, Canada), Ahmed E. Hassan (Queen's University), Yong Wang (Alibaba Group), Jiangbin Lin (Alibaba Group)
Pre-print
17:00
10m
Talk
How Do API Selections Affect the Runtime Performance of Data Analytics Tasks?
Research Papers
Yida Tao (Shenzhen University), Shan Tang (Shenzhen University), Yepang Liu (Southern University of Science and Technology), Zhiwu Xu (Shenzhen University), Shengchao Qin (University of Teesside)
17:10
10m
Talk
Demystifying Application Performance Management Libraries for Android
Research Papers
Yutian Tang (The Hong Kong Polytechnic University), Xian Zhan (The Hong Kong Polytechnic University), Hao Zhou (The Hong Kong Polytechnic University), Xiapu Luo (The Hong Kong Polytechnic University), Zhou Xu (Wuhan University), Yajin Zhou (Zhejiang University), Qiben Yan (Michigan State University)
17:20
10m
Demonstration
PeASS: A Tool for Identifying Performance Changes at Code Level
Demonstrations
David Georg Reichelt (Universität Leipzig), Stefan Kühne (Universität Leipzig), Wilhelm Hasselbring (Kiel University)
Pre-print Media Attached File Attached
17:30
20m
Talk
ReduKtor: How We Stopped Worrying About Bugs in Kotlin Compiler
Research Papers
Daniil Stepanov (Saint Petersburg Polytechnic University), Marat Akhin (Saint Petersburg Polytechnic University / JetBrains Research), Mikhail Belyaev (Saint Petersburg Polytechnic University)
Pre-print