Automating the Performance Deviation Analysis for Multiple System Releases: an Evolutionary Study

(This page is under development)

This page presents the full results of the study reported in the paper "Automating the Performance Deviation Analysis for Multiple System Releases: an Evolutionary Study".

Paper Abstract:

This paper presents a scenario-based approach for the evaluation of the quality attribute of performance, measured in terms of execution time (response time). The approach is implemented by a metadata-based framework that uses dynamic analysis and repository mining techniques to provide an automated way of revealing potential sources of performance degradation of scenarios between releases of a software system. The approach defines four phases: (i) preparation - choosing the scenarios and preparing the target releases; (ii) dynamic analysis - determining the performance of scenarios and methods by calculating their execution time; (iii) degradation analysis - processing and comparing the results of the dynamic analysis for different releases; and (iv) repository mining - identifying development issues and commits associated with performance deviation. The paper also describes an evolutionary study of applying the approach to multiple releases of the Netty, Wicket and Jetty network/web frameworks. The study analyzed seven releases of each system and addressed a total of 57 scenarios. Overall, we found 14 scenarios with significant performance deviation for Netty, 13 for Wicket, and 9 for Jetty, almost all of which could be attributed to a source code change. We also discuss feedback obtained from eight developers of Netty, Wicket and Jetty as a result of a questionnaire.
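To make phase (iii) concrete: the degradation analysis compares the execution-time samples collected for each scenario (or method) in two consecutive releases. The sketch below assumes a non-parametric two-sample test is behind the reported p-values and uses a hand-rolled Mann-Whitney U comparison with a normal approximation; the exact test and thresholds are defined in the paper, not here, and the timing samples are invented for illustration.

```python
from statistics import NormalDist, mean

def _ranks(values):
    """1-based average ranks with tie handling."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(values):
        j = i
        # Group tied values and give them their average rank.
        while j + 1 < len(values) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def mann_whitney_u(a, b):
    """Two-sided Mann-Whitney U test via the normal approximation
    (no continuity or tie correction in this sketch)."""
    n1, n2 = len(a), len(b)
    ranks = _ranks(list(a) + list(b))
    u1 = sum(ranks[:n1]) - n1 * (n1 + 1) / 2
    u = min(u1, n1 * n2 - u1)
    mu = n1 * n2 / 2
    sigma = (n1 * n2 * (n1 + n2 + 1) / 12) ** 0.5
    p = 2 * NormalDist().cdf((u - mu) / sigma)  # u <= mu, so this is the lower tail
    return u, min(1.0, p)

# Hypothetical execution-time samples (ms) for one scenario in two releases.
old = [12.1, 11.8, 12.4, 12.0, 11.9, 12.2]
new = [13.0, 13.4, 12.9, 13.1, 13.3, 12.8]
u, p = mann_whitney_u(old, new)
# Flag a degradation when the difference is significant and the new release is slower.
deviated = p < 0.05 and mean(new) > mean(old)
```

In practice a tool would run each scenario many times per release to obtain stable samples before applying such a test.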

Subjects:
  • Netty
  • Wicket
  • Jetty
Download the subjects overview.

PC Configuration:
  • AMD Phenom II processor
  • 8 GB of RAM
  • Windows 7; Java 7
Target Releases:
  • Netty: 4.0.0.Final, 4.0.6.Final, 4.0.10.Final, 4.0.15.Final, 4.0.17.Final, 4.0.18.Final and 4.0.21.Final
    • Evolution 1 (NE1): From 4.0.0.Final to 4.0.6.Final
    • Evolution 2 (NE2): From 4.0.6.Final to 4.0.10.Final
    • Evolution 3 (NE3): From 4.0.10.Final to 4.0.15.Final
    • Evolution 4 (NE4): From 4.0.15.Final to 4.0.17.Final
    • Evolution 5 (NE5): From 4.0.17.Final to 4.0.18.Final
    • Evolution 6 (NE6): From 4.0.18.Final to 4.0.21.Final
  • Wicket: 6.15.0, 6.16.0, 6.17.0, 6.18.0, 7.0.0-M1, 7.0.0-M2 and 7.0.0-M4
    • Evolution 1 (WE1): From 6.15.0 to 6.16.0
    • Evolution 2 (WE2): From 6.16.0 to 6.17.0
    • Evolution 3 (WE3): From 6.17.0 to 6.18.0
    • Evolution 4 (WE4): From 6.18.0 to 7.0.0-M1
    • Evolution 5 (WE5): From 7.0.0-M1 to 7.0.0-M2
    • Evolution 6 (WE6): From 7.0.0-M2 to 7.0.0-M4
  • Jetty: 9.2.6, 9.2.7, 9.2.8, 9.2.9, 9.2.10, 9.3.0.M0 and 9.3.0.M1
    • Evolution 1 (JE1): From 9.2.6 to 9.2.7
    • Evolution 2 (JE2): From 9.2.7 to 9.2.8
    • Evolution 3 (JE3): From 9.2.8 to 9.2.9
    • Evolution 4 (JE4): From 9.2.9 to 9.2.10
    • Evolution 5 (JE5): From 9.2.10 to 9.3.0.M0
    • Evolution 6 (JE6): From 9.3.0.M0 to 9.3.0.M1
Study Results:
  • Scenarios identified by the study as having significant performance deviation (download)
  • Overview of the potential sources of performance deviation identified by our study (download)
  • Packages and classes that contain sources of performance deviation (download)
  • Commits and type of issues related to sources of performance deviation (download)
Survey Answers:
Methods Information (Raw Data):
  • These tables contain the raw data for every method common to the analyzed releases, including the number of executions, the average execution time, and p-values (Netty, Wicket, Jetty).
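As an illustration of how these raw tables can be consumed, the sketch below models one row per common method and flags methods whose p-value indicates a significant slowdown between releases. The field names, threshold, and example rows are hypothetical, not taken from the actual tables.

```python
from dataclasses import dataclass

@dataclass
class MethodRow:
    """One row of the raw data: a method common to both releases."""
    method: str
    executions_old: int
    executions_new: int
    avg_time_old_ms: float
    avg_time_new_ms: float
    p_value: float

def flag_deviations(rows, alpha=0.05):
    """Methods whose execution time differs significantly between the two
    releases and got slower: candidate sources of performance degradation."""
    return [r for r in rows
            if r.p_value < alpha and r.avg_time_new_ms > r.avg_time_old_ms]

# Invented example rows for illustration only.
rows = [
    MethodRow("example.Allocator.newDirectBuffer",
              1000, 1000, 0.42, 0.63, 0.003),
    MethodRow("example.EventLoop.processSelectedKeys",
              1000, 1000, 1.10, 1.12, 0.440),
]
flagged = flag_deviations(rows)
```

A real pipeline would parse the downloadable tables into such rows before filtering.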