Make your Test Automation SMARTER
Jim Trentadue, Software Quality Consulting Director [email protected], Original Software
Agenda
Key advantages of Test Automation
Additional, often-overlooked advantages of Test Automation
Limitations of Test Automation
Topics for SMARTER Test Automation
Case Study
Session recap
Advantages of Test Automation
o Robust test automation
o Better distribution of testing resources
o Unlimited permutations of data usage
o 24-hour testing coverage during critical project timelines
Key advantages of Test Automation
Outside of the time and financial savings, some key points to consider:
o Better test objectives during requirements / user story definition
o Better test metrics of test effectiveness and data coverage
o Work with UAT earlier, incorporating customer tests
o Increased data scenarios for functional, negative, equivalence-class, boundary, etc.
Additional advantages of Test Automation
Going beyond automated testing through your SUT:
o Test data setup – not validation
o Automate your SQL editor for table query results
o Component-level tests (message queues, scripts, web services, stored procedures)
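The first two points can be combined: drive SQL directly to seed known test data before a run, then query the table the way a SQL editor would to confirm the setup. A minimal sketch, assuming an in-memory SQLite database and an invented `orders` table (the schema and names are illustrative, not from the talk):

```python
import sqlite3

def seed_orders(conn: sqlite3.Connection, rows: list[tuple[str, int]]) -> int:
    """Insert known test data before a run, instead of validating through the UI."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders (customer TEXT, quantity INTEGER)"
    )
    conn.executemany("INSERT INTO orders VALUES (?, ?)", rows)
    conn.commit()
    # Query the table directly to confirm the setup, as a SQL editor would.
    (count,) = conn.execute("SELECT COUNT(*) FROM orders").fetchone()
    return count

conn = sqlite3.connect(":memory:")
print(seed_orders(conn, [("acme", 5), ("globex", 2)]))  # 2
```

Seeding through SQL rather than the UI keeps data setup fast and deterministic, leaving the UI automation to exercise only the behavior under test.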
Limitations of Test Automation
Understanding its limitations helps test automation better serve its purpose.
What is SMARTER Test Automation?
Topics for SMARTER Test Automation
Defining topics to make your test automation SMARTER:
o STRATEGY for understanding application behavior
o METHODOLOGY for implementing the logical model
o ADAPTABLE across different platforms and browsers
o ROADMAP for outlining the sequence and priority of tests
o TOOLBOX to work with the technologies available
o EXPERIENCED personnel requirements for those who support changes
o REPEATABLE design for future success
STRATEGY
Strategy for understanding application behavior

Consider conditional test execution, with the logical Test Suite laid out as follows:

Test Suite
o Test Case 1
  - Module 1 (Scenario 1)
  - Module 2 (Scenario 2)
  (if PASS, execute Test Case 3; if FAIL, execute Test Case 2)
o Test Case 2
  - Module 3 (Scenario 3) (if PASS, go to Test Case 4)
  - Module 4 (Scenario 4) (if FAIL, stop the testing)
o Test Case 3
  - Module 5 (Scenario 5) (if PASS, iterate through Test Case 1 for data-driven tests)
  - Module 6 (Scenario 6) (if FAIL, execute Test Case 2)
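The Test Case 1 branch above can be sketched as a small driver. This is a minimal illustration with the module pass/fail results stubbed out; a real suite would get them from actually executing each module:

```python
# Stubbed pass/fail outcome of each module (a real suite would execute them).
results = {"Module 1": True, "Module 2": True, "Module 5": True}

def run_case(name: str, modules: list[str]) -> bool:
    """A test case passes only if every one of its modules passes."""
    return all(results.get(m, False) for m in modules)

order = ["Test Case 1"]
if run_case("Test Case 1", ["Module 1", "Module 2"]):
    order.append("Test Case 3")   # if PASS, execute Test Case 3
else:
    order.append("Test Case 2")   # if FAIL, execute Test Case 2
print(order)  # ['Test Case 1', 'Test Case 3']
```

Encoding the branching in the driver, rather than in each recorded script, keeps the conditional logic in one place.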
METHODOLOGY
Methodology for implementing the logical model

End-to-end tests created, divided by module, grouped in test cases.

END-TO-END: 24 full automation paths (recording or coding), one per combination of scenarios:

  LOGIN   MENU  ORDER  INVENTORY  PROCESSING  CONFIRMATION  REPORTS
  2 (3)*  3     2      2          1           1             1

  *1 negative scenario that doesn't proceed

MODULE BREAKDOWN:
  LOGIN TC: Module 1, Module 2
  MENU TC: Module 3, Module 4, Module 5
  ORDER TC: Module 6, Module 7
  INVENTORY TC: Module 8, Module 9
  PROCESSING TC: Module 10
  CONFIRMATION TC: Module 11
  REPORTS TC: Module 12

DATA-DRIVEN; only 1 module required
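The arithmetic behind the slide is worth making explicit: the end-to-end path count is the product of the per-area scenario counts, while the modular breakdown only needs one module per scenario, the sum. A quick check (the extra negative LOGIN scenario that doesn't proceed is excluded):

```python
from math import prod

# Scenario counts per functional area, from the breakdown above.
scenarios = {"LOGIN": 2, "MENU": 3, "ORDER": 2, "INVENTORY": 2,
             "PROCESSING": 1, "CONFIRMATION": 1, "REPORTS": 1}

end_to_end_paths = prod(scenarios.values())  # 2*3*2*2*1*1*1 = 24 full recordings
modules_to_build = sum(scenarios.values())   # one module per scenario = 12
print(end_to_end_paths, modules_to_build)    # 24 12
```

Combinations multiply but modules only add, which is why the modular approach scales as new scenarios are introduced.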
ADAPTABLE
Adaptable across different platforms and browsers

Using common structures for object recognition:
o DESKTOP: UI Automation (UIA), Microsoft Active Accessibility (MSAA)
o WEB: Document Object Model (DOM)
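One way to realize this is a platform-neutral object map: each logical control name resolves to the stable identifier the relevant technology exposes (a DOM `id` on the web, `AutomationId` for UIA, `accName` for MSAA). A minimal sketch; the control names and identifier values are invented for illustration:

```python
# Logical name -> per-platform (property, value) used to locate the control.
LOCATORS = {
    "submit_button": {
        "web_dom": ("id", "btnSubmit"),
        "desktop_uia": ("AutomationId", "SubmitButton"),
        "desktop_msaa": ("accName", "Submit"),
    },
}

def resolve(logical_name: str, platform: str) -> tuple[str, str]:
    """Return the (property, value) pair used to find the control on a platform."""
    return LOCATORS[logical_name][platform]

print(resolve("submit_button", "web_dom"))  # ('id', 'btnSubmit')
```

Scripts reference only the logical names, so a change of platform or browser touches the locator table, not every test.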
ROADMAP
Roadmap for outlining the sequence and priority of tests

Three different views for prioritizing, grading, and scaling:

Test Automation Prioritization
  TOPIC: Manual test case readiness | Automation difficulty | Automation effort | # of releases automated test case(s) will be used | TOTAL SCORE (1-10)

Functional (Manual) Testing Prioritization
  TOPIC: # of test cases for functional area | Traceability from requirement through to test | # of releases manual test case(s) will be used | Release frequency of functional change | TOTAL SCORE (1-10)

Business Prioritization
  TOPIC: Volume of defects in production | Volume of defects in test | # of defects opened in last six months | # of defects opened within the last 12 months | TOTAL SCORE (1-10)
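Each scorecard rolls its criteria up into a single 1-10 total. The talk does not prescribe a formula, so here is one hedged possibility: rate each criterion 1-10 (higher favoring automation) and take an equally weighted average, clamped to the 1-10 range. Criterion names and weights are illustrative:

```python
def total_score(ratings: dict[str, int]) -> int:
    """Average equally weighted 1-10 criterion ratings into one 1-10 score."""
    avg = sum(ratings.values()) / len(ratings)
    return max(1, min(10, round(avg)))

# Example ratings for the Test Automation Prioritization view (invented values).
automation_view = {
    "manual_test_case_readiness": 8,
    "automation_difficulty": 4,
    "automation_effort": 6,
    "releases_test_will_be_used": 9,
}
print(total_score(automation_view))  # 7
```

A weighted average would work just as well if, say, release longevity should dominate the ranking.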
TOOLBOX
Toolbox to work with the technologies available:
o Application(s) Under Test (AUT)
o Test Management
EXPERIENCED
Personnel requirements for those who support changes

TEST AUTOMATION PROFESSIONALS

System Testers:
o SME knowledge of the Application Under Test
o Write test plans and cases
o Execute system tests against the requirements
o Analyze test results; report accordingly

Developers:
o Create code modules
o Conduct code / design reviews with the QA team
o Document coding standards for functions and objects
o Execute unit testing

Understands:
o Object repositories
o Coding functions
o Debugging

Knowledge of:
o Testing process
o Test documents
o System integration
REPEATABLE
Repeatable design for future success

Points always to consider:
o Error-handling at every turn
o Use dynamic data; build with static conditions
o Elimination of a manual test/task
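The first two points can be sketched together: wrap every step so a failure is reported instead of crashing the suite, and generate data at run time so reruns never collide with leftovers from earlier runs. The step names and id format below are invented for illustration:

```python
from datetime import date

def unique_order_id() -> str:
    """Dynamic data: a fresh id per run, so repeated runs never collide."""
    return f"ORD-{date.today():%Y%m%d}"

def run_step(name, action):
    """Error-handling at every turn: a failing step reports, never aborts the suite."""
    try:
        action()
        return (name, "PASS")
    except Exception as exc:
        return (name, f"FAIL: {exc}")

steps = [
    ("create order", lambda: unique_order_id()),
    ("broken step", lambda: 1 / 0),   # deliberately fails, suite keeps going
]
print([run_step(n, a) for n, a in steps])
```

The static conditions live in the step definitions; only the data they consume changes between runs.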
Case Study
Case Study: Improving Automation
An example of how to improve automation continually with enhanced logic.

Background:
The decision to automate was made for the following reasons:
o Save time and speed up testing
o Without an independent test team, the project team had to do all of the testing; repetitive testing of the application was thought to be dull
o The tools existed and looked fun!

Software Under Test:
A large-scale sales & marketing automation program: a client/server application that ran on different operating systems and relational databases.

Automated solution used: SQA Robot

First Generation
o Initially tried using Capture / Replay
o Functions were written in SQA Basic to enable complex test activities in a single command
o Issues faced: basic use of the tool was limited; it required coding
o Lessons learned: the solution requires coding expertise; you need a champion; the solution lags behind new technologies

Second Generation
o 12 environments to support; management kept automation
o Automation was only for simple navigation and basic recording; data-driving tests
o Each component has a main GUI screen and main procedure
o Issues faced: object recognition and event actions were unreliable
o Lessons learned: it was difficult to build an automated region; tests had to be released to the test region

Third Generation
o Creation of a larger dedicated test team
o Designed a new automation infrastructure and mapped it into a test automation estimation matrix (more on next slide)
o Issues faced: the technology used outgrew the automated solution; there were object recognition issues
o Lessons learned: test automation managed as a project initiative; understanding of maintenance costs

Case study by Steve Allott, in Software Test Automation (Fewster & Graham)
Case Study: Levels of Prioritization
Characteristics of three levels of prioritization & estimation:

Level 1 tests exercise the simplest aspect of the functionality in a module:
o Usually straightforward to test manually
o Easy to automate
o The automated test is likely to work
o Unlikely to find a new bug

Level 2 tests explore all module aspects except interfaces to other components:
o Possible but time-consuming to test manually
o Looks easy to automate, but doesn't always turn out so
o The automated test is likely to have bugs
o Sometimes finds a bug

Level 3 tests exercise the deepest level of functionality in a module, including interfaces to other components:
o Difficult if not impossible to test manually
o Hard to automate
o Unlikely to run successfully, repeatedly
o Very likely to find a bug
Case Study: Key Points
Key points and recommendations from this case study:
o A champion must sell the ideas and benefits of automated testing and manage other people's expectations carefully
o Ensure your testing process is reasonably mature before you start to automate
o Before you buy an automated test tool, first consider your technical requirements
o Evaluate a test tool in your environment against target applications through a vendor proof of concept (POC)
o Build a tier of automation levels recording where each test currently sits and what level you want it to reach
Session Recap
Reviewing the main points of the presentation:
o Understand all of the advantages of test automation: the well-known key points, as well as the additional advantages
o Know the limitations of test automation for your initiative
o Build your own definitions using the SMARTER keywords
o State an objective for each keyword along with a real or anticipated scenario
o Reference this case study, or another that illustrates the evolution and improvement of a test automation journey
Thank you for attending!