Empirical Studies of Performance Bugs & Performance Analysis Approaches for Large Scale Software Systems
Shahed Zaman
Supervisor: Dr. Ahmed E. Hassan
Software Analysis and Intelligence Lab (SAIL), School of Computing, Queen’s University
2
Performance
How fast and efficiently a system can perform
Performance Bug
Any bug related to performance: either a reported problem or an expectation of improvement
3
Bugs have a high impact on companies
482 bugs/week
Costly; affects reputation
4
Software Performance
• An important non-functional characteristic in a competitive market
• Considered a high priority for testing in practice
5
Most research treats all bugs equally. Does this make sense?
6
Bug categories: Performance, Security, Other bugs
7
Research Hypothesis
Performance bugs have different
characteristics than other bugs and should be
treated differently in software maintenance
research and practice.
8
Quantitative Study
Qualitative Study
User-centric performance analysis Study
9
Quantitative Study
Qualitative Study
User-centric performance analysis Study
10
Performance, security, and other bugs studied per project:
Firefox: 295,198 bugs (7,603 performance, 847 security)
Chrome®: 44,997 bugs (510 performance, 327 security)
11
Quantitative Study Dimensions
• Fix time
• Code changes
• People
12
Performance bugs are fixed by more experienced developers
Performance bugs take the longest time to fix
13
Quantitative Study Findings
• Performance bugs show different characteristics.
• Findings are not consistent across projects.
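A minimal sketch of how fix times can be mined from a bug tracker and compared across bug categories, as in the quantitative study; the records, dates, and field layout here are hypothetical, not the study's actual dataset:

```python
from datetime import datetime
from statistics import median

# Hypothetical bug records: (category, opened, closed) -- illustrative only.
bugs = [
    ("performance", "2011-01-01", "2011-03-15"),
    ("performance", "2011-02-01", "2011-05-20"),
    ("security",    "2011-01-10", "2011-01-25"),
    ("other",       "2011-01-05", "2011-02-01"),
]

def fix_days(opened, closed):
    """Days between a bug being opened and being closed."""
    fmt = "%Y-%m-%d"
    return (datetime.strptime(closed, fmt) - datetime.strptime(opened, fmt)).days

# Group fix times by category, then compare medians across categories.
by_cat = {}
for cat, opened, closed in bugs:
    by_cat.setdefault(cat, []).append(fix_days(opened, closed))

medians = {cat: median(times) for cat, times in by_cat.items()}
print(medians)  # {'performance': 90.5, 'security': 15, 'other': 27}
```

In the actual studies the category samples are compared with statistical tests rather than medians alone, but the grouping step is the same.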
14
Quantitative Study
Qualitative Study
User-centric performance analysis Study
15
Qualitative Study
From each project: 100 performance bugs + 100 non-performance bugs = 200 bugs per project (400 bugs in total)
16
Dimensions Used in this Study
• Impact on the stakeholders
• Available context of the bug
• The bug fix
• Fix validation
17
Impact on the stakeholders: Regression; Blocking; WFM after a long time; People talking about switching
Available context of the bug: Measurement used; Has test cases; Contains stacktrace; Has reproducible info; Problem in reproducing; Reported by a project member; Duplicate bug
The bug fix: Problem discussion in comments; Depends on other bug; Blocking other bugs; Reporter provides some hint about the fix; Patch uploaded by the reporter; Discussion about the patch
Fix validation: Review; Super-review
Findings (statistically significant differences between performance and non-performance bugs):
• Performance bugs are different.
• Findings are not consistent across projects.
18
Findings for Firefox performance bugs
Quantitative Study finding → related Qualitative Study findings
• Require more time to fix → 1. Problem in reproducing; 2. More dependencies between bugs; 3. Collaborative root-cause analysis process; 4. WFM/Fixed/Won’tFix after a long time
• Fixed by more experienced developers → 1. More release blocking; 2. People switch to other software systems
19
Findings
• Impact on the stakeholders: performance bugs have a high impact on stakeholders.
• Available context of the bug: performance bug reports contain more context about the bug.
• The bug fix: fixing performance bugs requires more collaborative effort.
20
Quantitative Study
Qualitative Study
User-centric performance analysis Study
21
Users threaten to switch to a competing product
22
User-centric vs. scenario-centric performance analysis
23
Problem with current practice
Users send 10 requests each to the software system (1,000 requests in total); 10 of those requests have a bad response time.
24
Scenario-Centric View
10 requests per user, 1,000 requests in total; 10 requests with a bad response time → 1% bad request instances.
25
User-Centric View
The same 1,000 requests (10 per user, 10 with bad response times): from the system’s perspective, 1% of request instances are bad; from the users’ perspective, an affected user may see 50% bad request instances while an unaffected user sees 0%.
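The two views can be sketched as follows; the request log is synthetic, with numbers chosen to mirror the slide (100 users, 10 requests each, 10 bad requests concentrated on two users):

```python
# Hypothetical request log: user id -> list of response-time flags
# (True = bad response time).
requests = {u: [False] * 10 for u in range(100)}
requests[0] = [True] * 5 + [False] * 5   # user 0: 50% bad
requests[1] = [True] * 5 + [False] * 5   # user 1: 50% bad

# Scenario-centric view: one rate over all request instances.
all_reqs = [r for reqs in requests.values() for r in reqs]
scenario_rate = sum(all_reqs) / len(all_reqs)

# User-centric view: one rate per user.
user_rates = {u: sum(reqs) / len(reqs) for u, reqs in requests.items()}

print(scenario_rate)                 # 0.01 -> "1% bad" overall
print(user_rates[0], user_rates[2])  # 0.5 for an affected user, 0.0 for an unaffected one
```

The scenario-centric rate hides the fact that two users experience a bad response half the time, which is exactly the gap the user-centric view exposes.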
26
Data used in this study: 3 systems, 13 use-case scenarios

Factor                   Enterprise System 1   Enterprise System 2   Dell DVD store
Functionality            Telecommunications    Telecommunications    E-commerce
Vendor’s business model  Commercial            Commercial            Open-source
Size                     Ultra large           Large                 Small
Complexity               Complex               Complex               Simple
27
Our Study Dimensions (scenario-centric vs. user-centric)
• Overall performance
• Performance trend
• Performance consistency
28
Performance Trend Over Time
[Figure: the scenario-centric view plots response time against running time; the user-centric view plots response time against instance # for a user; both compare the old and new versions.]
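A per-user performance trend of the kind plotted above can be sketched as a least-squares slope of response time over the instance number; the data here is synthetic and the helper is illustrative, not the study's implementation:

```python
# Minimal sketch: estimating a performance trend as the least-squares
# slope of response time against request instance number (0, 1, 2, ...).
def slope(ys):
    """Least-squares slope of ys against x = 0, 1, 2, ..."""
    n = len(ys)
    xs = range(n)
    mx = sum(xs) / n
    my = sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

# Hypothetical per-user response times (ms) for an old and a new version.
old_times = [40, 42, 44, 46, 48]   # steadily degrading over instances
new_times = [45, 45, 45, 45, 45]   # flat

print(slope(old_times))  # 2.0 -> response time grows per instance
print(slope(new_times))  # 0.0 -> stable trend
```

A positive slope for a user flags a degrading experience even when the scenario-wide average looks unchanged.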
29
Findings (scenario-centric vs. user-centric)
• Overall performance: 10 out of 13 use cases showed a complementary, different view.
• Performance trend: 8 out of 13 use cases showed a complementary, different view.
• Performance consistency: all 13 use cases showed a complementary, different view.
30
User-centric performance analysis study finding
The user-centric view is a useful, complementary view
31
Major Contributions
• The first empirical study on performance bugs
• Developed a taxonomy for the qualitative analysis of performance bug reports
• Proposed a new approach to analyze the performance of software systems
32
Publications: MSR 2011, MSR 2012, ICST 2012