Transcript of Mobile Reliability Challenges
mVerify: A Million Users in a Box® (www.mVerify.com™)
Mobile Reliability Challenges
Robert V. Binder
ISSRE 2004, November 2, 2004
© 2004 mVerify Corporation
Overview
21st Century IT Trends
Mobile Technology Crisis
Test Effectiveness Levels
Level 4 Case Study
Reliability Arithmetic
Test Performance Envelope
Conclusion
The New IT Reality
Last 25 years
Information driven society
New and better ways to do old things
Focused, skilled interaction
Tethered
Next 25 years
User population 100x
New things
Persistent partial attention
Mobile, ubiquitous
Very low visibility, very high failure impact
The New IT Reality: Ubiquity
Cheap fat pipes everywhere: optical backbone plus wireless MAN, LAN, PAN; WiMax (802.16, 802.20); ad-hoc mobile networks
Cycles and storage: Moore's law; watts per MIPS
No more shrink-wrap: application service provider/subscription model; web services and ultra-large databases; grid computing
Converging user device form factor: cell phone + PDA + pager + Pocket PC + ...; D-2-B interfaces in 5 years?
About 10x every five years!
Robert’s Afternoon: Seamless Mobility
courtesy Motorola, Inc.
This shows Motorola’s high-level vision.
It is not a product roadmap or indicative of any specific product/service offering.
The Unchanged IT Reality: Software
The Next Big Thing?
Extreme Programming?
Aspect-oriented languages?
Model-driven Architecture?
Still no Silver Bullet
Subtractive component reliability
Design limited to human ability and organization
Low-fidelity test suites aren’t effective
Bug barrier: ~5 defects/KLOC pre-test, in any language, under any process
Mobile Technology Challenges
Testing wired apps is already difficult and expensive:
20% to 50% of all software development dollars go to testing
Available test automation technology is 10+ years old
Annual cost to the U.S. of inadequate testing: $56 billion
Testing mobile apps is much harder:
Connectivity: “Can you hear me now?”
Mobility: location-based services
Scalability: at least 10x the web user population, and mobile
Security: always on, always hackable
PLUS assuring functionality, performance, and integration
Mobile App Fault Space Much Bigger
A True Crisis
Current software technology CANNOT achieve reliable mobile apps
What Can Testing Do?
Test Performance
Effectiveness (reliability/quality increase)
Efficiency (average cost per test)
Levels
1: Testing by poking around
2: Manual Testing
3: Automated Test Script
4: Model-based
5: Full Test Automation
Each Level 10x Improvement
Level 1: Testing by Poking Around
Manual “exploratory” testing
• Low coverage
• Not repeatable
• Can’t scale
• Inconsistent system under test
Level 2: Manual Testing
Manual test design/generation, test setup, test input, and test results evaluation, all performed by hand against the system under test.
• 1 test per hour
• Not repeatable
Level 3: Automated Test Script
Manual test design/generation and test setup; programmed test scripts drive the system under test; test results evaluation.
• 10+ tests per hour
• Repeatable
• High change cost
Level 4: Automated Model-based
Model-based test design/generation and test setup; automatic test execution against the system under test; test results evaluation.
• 1,000+ tests per hour
• High fidelity
• Case study
Level 4 Case Study
A leading financial market:
3 million transactions per hour
15 billion dollars per day
650 KLOC of Java, distributed services ...
System test process and environment:
Automated, model-based
Executable operational profile
Simulator generates realistic, unique test suites
3 years; version 1.0 live Q4 2001
1,000 to 750,000 unique tests per day
Model-based Testing
Extended Use Case
Mode Machine
Invariant Boundaries
Stealth Requirements Engineering
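The “mode machine” above is a state model of the system under test; model-based generation walks the machine to produce event sequences that are legal by construction. A minimal sketch, using a hypothetical two-mode machine (none of these mode or event names are from the talk):

```python
import random

# Hypothetical mode machine: each mode maps its accepted events to the
# resulting mode. Illustrative only.
MODES = {
    "Closed": {"open_session": "Open"},
    "Open":   {"submit_order": "Open",
               "cancel_order": "Open",
               "close_session": "Closed"},
}

def generate_test(length, seed=None):
    """Random walk over the mode machine, yielding one event sequence."""
    rng = random.Random(seed)
    mode, events = "Closed", []
    for _ in range(length):
        event = rng.choice(sorted(MODES[mode]))  # pick a legal event
        events.append(event)
        mode = MODES[mode][event]                # take the transition
    return events

print(generate_test(6, seed=1))
```

Every generated sequence is valid by construction, so each run yields a fresh, legal test; seeding the generator makes runs reproducible.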
Simulator
Discrete event simulation
Prolog implementation (50 KLOC)
Rule inversion
Load Profile
Time domain variation
Orthogonal to operational profile
Each event assigned a "port" and submit time
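A hedged sketch of that generation scheme in Python (the real simulator was ~50 KLOC of Prolog; the profiles, ports, and event names below are invented for illustration):

```python
import random

OPERATIONAL_PROFILE = {"new_order": 0.7, "cancel": 0.2, "query": 0.1}  # event mix
LOAD_PROFILE = [10, 50, 100, 50, 10]  # events per interval: time-domain variation
PORTS = ["port-1", "port-2", "port-3"]

def simulate(seed=None):
    """Generate events; each gets a type, a 'port', and a submit time."""
    rng = random.Random(seed)
    events = []
    for interval, count in enumerate(LOAD_PROFILE):
        for _ in range(count):
            kind, = rng.choices(list(OPERATIONAL_PROFILE),
                                weights=list(OPERATIONAL_PROFILE.values()))
            events.append({"type": kind,
                           "port": rng.choice(PORTS),
                           "time": interval + rng.random()})
    return sorted(events, key=lambda e: e["time"])

print(len(simulate(seed=42)))  # → 220 events, the sum of the load profile
```

The load profile varies the event rate over time while the operational profile fixes the event mix, so the two can be varied independently, matching the orthogonality point above.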
Automated Run Evaluation
Oracle accepts output of simulator
About 500 unique rules
Verification
Splainer – result/rule backtracking tool
Rule/Run coverage analyzer
Comparator
Extract transaction log
Post run database state
End-to-end invariant
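A toy sketch of that evaluation pipeline, with one invented rule standing in for the ~500 real ones: the oracle predicts each event's effect, a comparator checks the extracted transaction log against those predictions, and an end-to-end invariant checks post-run state.

```python
def oracle(events):
    """Predict the expected ledger effect of each event (one toy 'rule')."""
    return [{"id": e["id"],
             "delta": e["qty"] if e["type"] == "buy" else -e["qty"]}
            for e in events]

def compare(expected, transaction_log):
    """Return ids of transactions whose logged delta disagrees with the oracle."""
    logged = {t["id"]: t["delta"] for t in transaction_log}
    return [x["id"] for x in expected if logged.get(x["id"]) != x["delta"]]

def invariant_holds(events, post_run_balance, start_balance=0):
    """End-to-end invariant: final state equals start plus all event deltas."""
    net = sum(e["qty"] if e["type"] == "buy" else -e["qty"] for e in events)
    return start_balance + net == post_run_balance

events = [{"id": 1, "type": "buy", "qty": 10}, {"id": 2, "type": "sell", "qty": 4}]
log = [{"id": 1, "delta": 10}, {"id": 2, "delta": -4}]
print(compare(oracle(events), log))  # → [] (no discrepancies)
print(invariant_holds(events, 6))    # → True
```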
Results
Revealed about 1,500 bugs over two years
5% showstoppers
Five-person team; huge productivity increase
Achieved proven high reliability
Last pre-release test run: 500,000 events in two hours, no failures detected
No production failures
Level 5: Total Automation
Model-based test design/generation; automated test setup, automatic test execution, and automated test results evaluation against the system under test.
• 10,000 TPH
• Oracle problem
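The Level 5 loop, every stage automated, can be sketched as below. The tiny "model" and "SUT" here are hypothetical stand-ins, not anything from the talk; the oracle step is exactly where the hard "oracle problem" lives.

```python
class DoublerModel:
    """Toy model: test inputs are integers; expected output is 2 * x."""
    def generate(self, n):
        return range(n)                      # model-based design/generation
    def predict(self, test):
        return 2 * test                      # oracle: predicted result

class DoublerSUT:
    """Toy system under test, with one seeded bug."""
    def setup(self, test):
        pass                                 # automated test setup (no-op here)
    def execute(self, test):
        return 2 * test if test != 3 else 7  # bug at input 3

def run_level5(model, sut, n_tests):
    """Fully automated loop: generate, set up, execute, evaluate."""
    failures = []
    for test in model.generate(n_tests):
        sut.setup(test)
        actual = sut.execute(test)           # automatic test execution
        if actual != model.predict(test):    # automated results evaluation
            failures.append((test, actual))
    return failures

print(run_level5(DoublerModel(), DoublerSUT(), 10))  # → [(3, 7)]
```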
AMATE: Level 5 for Mobile Apps
Advanced Mobile Application Test Environment
NIST/ATP funded R&D
Highly realistic end-to-end mobile testing
Generate and control Signal variation related to mobility
User behavior related to mobility
Traffic related to mobility
Model-based
Trial Use Q2 2005
Reliability Arithmetic
Reliability: probability of non-failure; equivalently, a failure rate λ
MTTR: mean time to recover, repair, restart ...
Availability: percent up-time
Availability = 1 / (1 + MTTR × λ)
99.999% availability (“five nines”) ≈ 5 minutes of downtime per year
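The arithmetic in runnable form; a minimal sketch assuming the failure rate is given in failures per million hours and MTTR in hours, as in the data table on the next slide:

```python
def availability(failures_per_million_hours, mttr_hours):
    """Availability = 1 / (1 + MTTR × failure rate)."""
    rate = failures_per_million_hours / 1_000_000  # failures per hour
    return 1.0 / (1.0 + mttr_hours * rate)

# "Five nines" allows roughly five minutes of downtime per year:
downtime_min_per_year = (1 - 0.99999) * 365 * 24 * 60
print(round(downtime_min_per_year, 1))  # → 5.3

# e.g. 5 failures/million hours with a 6-minute (0.1 h) MTTR:
print(availability(5, 0.1))             # ≈ 0.9999995
```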
Some Reliability Data Points (availability assumes 6 min MTTR)

System                           Failures/million hours   Availability
NT 4.0 Desktop                   82,000                   0.999000000
Windows 2K Server                36,013                   0.999640000
Common Light Bulb                1,000                    0.999990000
Stepstone OO Framework           5                        0.999999500
Telelabs Digital Cross Connect   3                        0.999999842
Test Automation Envelope
[Chart: reliability in nines (effectiveness, 1 to 5 nines) versus productivity in tests per hour (efficiency, 1 to 10,000). L2 Manual and L3 Scripting sit at the lower left, the L4 case study higher, and Level 5 (AMATE) at the upper right.]
The Mobile Reliability Challenge
Mobile App Fault Space Bigger
Connectivity: “Can you hear me now?”
Mobility: location-based services
Scalability: large and mobile
Security: always on, always hackable
PLUS assure functionality, performance, and integration
What can be done? The test performance envelope is the same, and the test budget is the same.
Scenario: Manual Testing
Level 2 manual testing
Mobile app fault space 10x bigger
Reliability slips to Level 1
Scenario: Improve Efficiency 10x
L2 Manual improves 10x to L3 Scripted
L3 Scripted improves 10x to L4 Model-based
Expect same average reliability
Scenario: Mobile 5 Nines
Increase L4 efficiency 10x (AMATE):
Realistic mobile environment
Realistic loading
Realistic functional profile
Model-based and automated
Expect mobile app 5 Nines
Conclusion
Model-based mobile testing CAN achieve reliable mobile apps
Q & A