Transcript of Metric Free Test Management by Joseph Ours
NOTICE: PROPRIETARY AND CONFIDENTIAL
This material is proprietary to and contains trade secrets and information which is solely the property of Centric Consulting, LLC. It is solely for the Client’s internal use and shall not be used, reproduced, copied, disclosed, or transmitted, in whole or in part, without the express consent of Centric Consulting, LLC. © 2016 Centric Consulting, LLC. All rights reserved.
Metric Free Test Management
Presented by: Joseph Ours
QUESTIONS? Contact Centric
To learn more about Centric Software Quality Assurance and Testing (SQA&T)
Services:
CentricConsulting.com
Joseph Ours
Email: [email protected]
Phone: 614.668.2306
Have you seen this before?
What do status reports communicate?
● Subjective Sentiment
● Good versus Bad
● Seen as Red, Yellow, Green status
● Bolstered by Defect Metrics
WHAT DO THEY COMMUNICATE? Given that status reports are used to drive a point, do we really understand what they communicate? What “things” do we communicate most often?
Evaluation
Progress
● Work Completion (against plan): what is done versus what is not done
● Seen as on, behind, or ahead of schedule (waterfall)
● Seen as stories “done done” or “in testing/verifying”
● Bolstered by tests run with pass/fail, and tests run versus not run
Coverage
● Scope of effort
● What will be covered, what will not be covered
● Seen as a Requirements Traceability Matrix or story acceptance criteria
● Bolstered by coverage metrics
PBR Status Reports Communicate
Progress
Barriers
Readiness
Who is the audience?
● What’s left to do?
● Anything that’ll get in my way?
Who is the Audience? Since status reports communicate a point, who are we communicating them to? More importantly, why them?
Other team members
Team Management (PM/SM)
Executives
● Are we on schedule or not?
● Does everything work?
● When can we release?
● What do we need to work around?
Why are they full of numbers?
WHY Do WE love Metrics?
Numbers are irrefutable.
We believe numbers are objective.
Numbers lend weight to our argument.
BECAUSE They Invoke Emotional Responses
Irrefutable. Objective. Argument.
Our industry is inherently critical. As such, we want people to believe our perspective, especially when people push back defensively. Numbers are our way of making irrefutable, objective arguments.
My Journey
The Early Years – Snippets of a Neophyte
Current Cycle Script Execution Status
Scripts | # | %
Total | 319 | 100%
Executed | 278 | 87%
  Passed | 268 | 96%
  Failed | 10 | 4%
Unexecuted | 41 | 13%
  Not Completed | 0 | 0%
  Blocked | 25 | 61%
  No Run | 3 | 7%
  N/A | 6 | 15%
  Deferred | 7 | 17%
Overall Project Defect Status by Severity
Defects | Total | 1-Critical | 2-High | 3-Medium | 4-Low
Total | 103 | 19 | 37 | 36 | 11
Total Active | 12 | 3 | 3 | 5 | 1
New | 1 | 0 | 0 | 1 | 0
Open | 0 | 0 | 0 | 0 | 0
In Progress | 5 | 2 | 0 | 2 | 1
Fixed | 0 | 0 | 0 | 0 | 0
Ready for Retest | 2 | 0 | 0 | 2 | 0
Re-Opened | 0 | 0 | 0 | 0 | 0
On Hold | 4 | 1 | 3 | 0 | 0
I wanted transparency, so I provided all the numbers I had.
The Toddler Years – Snippets of a Neophyte
I realized no one read the numbers, so I started using graphs.
[Chart: Test Case S-Curve – daily percentage of test cases Passed, Failed, Blocked, and Unexecuted over Days 1–15]
[Chart: Defects by severity – Critical, Major, Medium, Minor, Non-Essential]
I’d even experiment with metrics I didn’t understand, such as DRE (Defect Removal Efficiency). I found Excel’s nifty regression line tool, and hey, it looked awesome for predicting!
I moved from high level numbers to visualization of those same numbers.
Early on I focused on information being digestible.
The Teen Years – Snippets of an Initiate
Then I started to realize that people needed to know a finer level of detail. Graphs became too cluttered, so I went back to charts of numbers.
Functional Area | Test cases available for execution | Executed test cases | Passed | Failed | Not executed | Blocked | No longer applicable
Cutover testing | 66 | 54 | 54 | 0 | 0 | 0 | 12
Existing Devices (Copier, Vending, Laundry) | 66 | 63 | 63 | 0 | 0 | 0 | 3
MF4100 | 20 | 20 | 20 | 0 | 0 | 0 | 0
SW Configuration | 439 | 189 | 165 | 24 | 143 | 98 | 9
Swipe Tracking | 93 | 93 | 88 | 5 | 0 | 2 | 0
JSA | 6 | 6 | 6 | 0 | 1 | 0 | 0
Meals Auditor | 10 | 10 | 3 | 7 | 0 | 0 | 0
IDM | 20 | 20 | 14 | 6 | 0 | 0 | 0
TIA-Micros | 15 | 14 | 13 | 1 | 0 | 1 | 0
HERO | 9 | 9 | 6 | 3 | 0 | 0 | 0
TIA-Verifone | 3 | 3 | 3 | 0 | 0 | 0 | 0
Customer Conversion | 2 | 1 | 1 | 0 | 1 | 0 | 0
Grand Total | 749 | 482 (65%) | 436 (90%) | 46 (10%) | 145 (19%) | 101 (13%) | 24 (3%)
The Young Adult Years – Snippets of an Initiate
So I went full-on workbook: a dashboard with underlying detail.
[Pie chart: Test Case Execution – Executed 651 (91%), Done 65 (9%)]
[Pie chart: Execution Breakdown – Pass 639 (98%), Fail 12 (2%)]
[Pie chart: Bug Status – Closed 55 (66%), Open 23 (28%), Not a Bug 5 (6%)]
Overall Progress Status by System
Row Labels | Tests Written | Tests Executed | Tests Estimated
BAG MEAL | 0 | 0 | 8
BBTS Device | 85 | 85 | 75
BBTS Report/Query | 29 | 29 | 78
BBTS SW Config | 288 | 245 | 92
BBTS Swipe Tracking | 93 | 92 | 48
BBTS TIA | 23 | 23 | 22
HERO | 11 | 11 | 15
HUDS DW | 0 | 0 | 28
IDDM | 20 | 20 | 10
JSA | 10 | 10 | 20
Meals Auditor Replacement | 11 | 10 | 8
SharePoint | 0 | 0 | 48
Cutover | 65 | 65 | 40
UAT | 81 | 61 | 0
Grand Total | 716 | 651 | 492
Test Case Status
Row Labels | Done | Executed | Grand Total
BBTS Device | | 85 | 85
BBTS Report/Query | | 29 | 29
BBTS SW Config | 43 | 245 | 288
BBTS Swipe Tracking | 1 | 92 | 93
BBTS TIA | | 23 | 23
HERO | | 11 | 11
IDDM | | 20 | 20
JSA | | 10 | 10
Meals Auditor Replacement | 1 | 10 | 11
Cutover | | 65 | 65
UAT | 20 | 61 | 81
Grand Total | 65 | 651 | 716
Execution Status
Row Labels | Fail | Pass | Grand Total
BBTS Device | | 85 | 85
BBTS Report/Query | | 29 | 29
BBTS SW Config | 8 | 237 | 245
BBTS Swipe Tracking | 2 | 90 | 92
BBTS TIA | 2 | 21 | 23
HERO | | 11 | 11
IDDM | | 20 | 20
JSA | | 10 | 10
Meals Auditor Replacement | | 10 | 10
Cutover | | 65 | 65
UAT | | 61 | 61
Grand Total | 12 | 639 | 651
I realized that visualization of numbers was of paramount importance.
But detail was needed across multiple aspects of testing to ensure the information was informative.
Grown Up – Snippets of an Adept
Agile made much of prior approaches obsolete. Additionally, the data overload turned off many stakeholders. So I went contextual with some graphs.
Implementation Team Weekly Status Report
1. Author/Team: Joseph Ours, Quality Assurance and Testing
2. Status: Identify each status as green, yellow, or red, and give a short explanation. If status is yellow or red, provide a plan to get back to green.
Overall Status – Yellow
Story Completion Status | Sprint 2 | Sprint 3 | Sprint 4
Total | 16 | 9 | 30
Closed | 11 | 3 | 1
Open (Not Closed) | 5 | 5 | 2
Stories missing test cases | 0 | 0 | 18
We are still testing stories in Sprints 2 and 3, and just starting testing in Sprint 4
The testing team isn’t sufficiently staffed for the project to be tested successfully
  o We rolled off 1 testing resource without replacing them
  o One testing resource will be leaving the team, to be replaced by an internal resource
One Vitamix resource has been asked to join the testing team, but only has 50% utilization. This represents a 120 man-hour deficiency.
Reporting limitations in Jira are making it nearly impossible to manage testing efforts efficiently (e.g., no way to query stories with remote links).
Defect Status
Scope Status – Yellow
BAs are assigning stories back to testing that don’t pass their review. The original process had BAs conducting meetings with CA/QA to understand what was missed. This has not happened.
In scope or out of scope
  o The new process is better; still working out kinks.
Sprint 2 Status
  o Testing is done; awaiting defect fixes and retesting to close out the sprint
Sprint 3 Status
  o Testing is done; awaiting defect fixes and retesting to close out the sprint
Sprint 4 Status
  o Still need to write test cases for 18 stories. Only 3 stories are ready for testing.
No firm, well-communicated process for moving stories in or out of scope for a given sprint.
Schedule Status – Red
While we’re slowly catching up, the inability to test Sprint 4 with only 2 days left in the sprint is concerning.
Cost Status
Testing budget is not managed by the QA team.
1. Accomplishments: Completed tasks for the past week
Sprint 2 and 3 testing complete, aside from retesting defect fixes (awaiting)
Enhanced the defect process to use the Affects Version and Fix Version fields
Wrote test cases for Sprint 4
Began assigning Sprint 5 stories to testers
2. To Do’s: Planned tasks for next week
Facilitate resolution of outstanding defects and stories from Sprints 2 and 3
Test user stories in Sprint 3
Finish writing test cases for Sprint 4; test Sprint 4
Write test cases for Sprint 5
3. Risks: Identify any possible project risks for us to discuss at the next Program Leadership Meeting. List the date/milestone each risk is associated with.
Testing resources – lack of availability placing the testing schedule at risk
  o Understaffed for work assigned – have asked BAs to assist to help catch up
  o One resource rolled off – we are down at least 1 resource
  o New resource onboarding – didn’t happen, as the resource was not available; when available, the resource will be ramping up on process and procedures, and is not 100% allocated to Vitamix, placing testing at risk
  o New Vitamix resource – still not 100% available over the reporting period
Sprint scope documentation – placing testing effectiveness and efficiency at risk
  o The project team’s inability to document what is in and out of scope by sprint is a significant risk to the overall success of the project. This is getting better; we will need to run a few sprints to close out the risk.
4. Milestones: List upcoming milestones, estimated completion date, task owner, and status
N/A
Ultimately, I developed the philosophy of being an information broker, focusing on facilitating decision making.
New Way of Reporting… Understanding
● Testing has a Mission
● Testing leverages Assets to accomplish the mission
● Testing Evaluates and reports on observations
What is a Mission?
mis·sion /ˈmiSHən/ noun
• a specific task with which a person or a group is charged
• a pre-established and often self-imposed objective or purpose <statement of the company’s mission>
Testing Mission
•Business Area
•Story
•Requirement
•Feature
•Epic
•Function
•Performance Objective
•Security Concern
Basis for tracking coverage
Let’s see a possible way to make this
Digestible
Informative
Facilitative
Testing Missions – Example Mission Chart
Area | Happy Path/Functional | Edge, Corner, Negative Cases | Performance* | Security+
Login
Search
Checkout
Registration
Social Media Integration
Global Solution
*Performance – includes Peak, Normal Responsiveness, and Stress
+Security – includes Penetration testing, IAM/Roles, and Manipulation
Testing Missions – Example Mission Chart
Area | Happy Path/Functional | Edge, Corner, Negative Cases | Performance* | Security+
Login | X | X | |
Search | X | X | |
Checkout | X | X | |
Registration | X | X | |
Social Media Integration | X | X | |
Global Solution | | | X | X
*Performance – includes Peak, Normal Responsiveness, and Stress
+Security – includes Penetration testing, IAM/Roles, and Manipulation
Digestible
Informative
Facilitative
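A mission chart like the one above can be generated straight from whatever structure already tracks the missions. Below is a minimal sketch, assuming missions are kept as a plain mapping of area to planned mission types; the names mirror the example slide and are illustrative, not a real project’s data:

```python
# Illustrative only: area names and mission types are copied from the example
# chart above; the data structure itself is an assumption, not a real API.
MISSION_TYPES = ["Happy Path/Functional", "Edge/Corner/Negative", "Performance", "Security"]

missions = {
    "Login": {"Happy Path/Functional", "Edge/Corner/Negative"},
    "Search": {"Happy Path/Functional", "Edge/Corner/Negative"},
    "Checkout": {"Happy Path/Functional", "Edge/Corner/Negative"},
    "Registration": {"Happy Path/Functional", "Edge/Corner/Negative"},
    "Social Media Integration": {"Happy Path/Functional", "Edge/Corner/Negative"},
    "Global Solution": {"Performance", "Security"},
}

def mission_chart(missions):
    """One row per area: the area name, then an 'X' for each planned
    mission type and an empty cell for each unplanned one."""
    return [
        [area] + ["X" if t in planned else "" for t in MISSION_TYPES]
        for area, planned in missions.items()
    ]

for row in mission_chart(missions):
    print(" | ".join(row))
```

The point of the sketch is that the chart is derived from the mission list itself, not from counts of test cases, which is what keeps it metric free.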
What is an Asset?
as·set /ˈaset/ noun
a useful or valuable thing…
"quick reflexes were his chief asset"
What are some testing assets?
Let’s see a possible way to make asset status
Digestible
Informative
Facilitative
Testing Assets – Example Asset Chart
Asset | Readiness – Regression | Readiness – Current Work
People [Individual]
Unit Automation Framework
Functional Automation Framework
CI Framework
Manual Reporting Framework
Manual Test Cases
Automation Suite [Tagged]
Unit Suite [Tagged]
Security Scanning Tools
Performance Suite
Legend:
✓ Ready
◐ In Progress – On Track
◐ In Progress – Behind
◐ In Progress – Not Going to Make It
✗ Broken
Testing Assets – Example Asset Chart
Asset | Readiness – Regression | Readiness – Current Work
People [Individual] ✓ ✓
Unit Automation Framework ✓
Functional Automation Framework ✓
CI Framework ◐ ◐
Manual Reporting Framework ✓ ✓
Manual Test Cases ✓ ◐
Automation Suite [Tagged] ◐
Unit Suite [Tagged] ✓ ✓
Security Scanning Tools ✓ ✓
Performance Suite ✓ ✓
Digestible
Informative
Facilitative
What is an Assessment?
as·sess·ment /əˈsesmənt/ noun
the evaluation or estimation of the nature, quality, or ability of someone or something. "the assessment of educational needs"
Is Testing an assessment?
Let’s see a possible way to make our assessments
Digestible
Informative
Facilitative
Testing Assessment – Example Solution Assessment
Business Functions
Area | Happy Path | Edge Cases
Login
Search
Checkout
Registration
Social Media Integration
Performance (Solution): Normal | Peak | Stress
Security (Solution): Penetration | IAM/Roles | Manipulation
Legend:
✗ Unable to Fulfill Mission
◐ Degraded Capabilities
✓ Able to Fulfill Mission
– Not Evaluated
Testing Assessment – Example Solution Assessment
Business Functions
Area | Happy Path | Edge Cases
Login ✓ ✓
Search
Checkout ✓ ✓
Registration ✓ ✗
Social Media Integration ✓
Performance (Solution): Normal ✓ | Peak ✓ | Stress ✓
Security (Solution): Penetration – | IAM/Roles – | Manipulation –
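The four assessment states can be treated as a rollup of observations against each mission. Here is a minimal, hypothetical sketch; the rollup rule (any mission-blocking issue means unable, any issue at all means degraded) is an assumed example, not the author’s prescribed method:

```python
# Hypothetical rollup of observations into the four assessment states from
# the legend above. The rule itself is an assumed example for illustration.
ABLE, DEGRADED, UNABLE, NOT_EVALUATED = (
    "Able to Fulfill Mission",
    "Degraded Capabilities",
    "Unable to Fulfill Mission",
    "Not Evaluated",
)

def assess(issues):
    """issues: None if the mission was not evaluated, otherwise a list of
    booleans where True marks a mission-blocking issue."""
    if issues is None:
        return NOT_EVALUATED
    if any(issues):
        return UNABLE      # at least one mission-blocking issue observed
    if issues:
        return DEGRADED    # issues observed, none of them blocking
    return ABLE            # evaluated, no issues observed

print(assess([True]))  # Unable to Fulfill Mission
print(assess([]))      # Able to Fulfill Mission
```

Whatever the actual rule, the output is a judgment about the mission rather than a count, which is what makes the chart readable at a glance.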
Digestible
Informative
Facilitative
WALKTHROUGH
Real Project (Sanitized) – Multi-Page Status Report – Highlights
Test Area Pass Rate Pass Fail
Area A 99% 13208 13
Area B 99% 2378 5
Area C 100% 8709 0
Area D 100% 1968 0
Area E 100% 564 0
Area F 100% 1057 0
Area G 99% 5791 2
Area H 99% 28586 22
Area I 99% 199 1
Area J N/A 0 0
Area K 100% 138 0
Area L N/A 0 0
Area M N/A 0 0
Total 62598 43
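As an aside on how the Pass Rate column is presumably derived: the displayed values match pass/(pass+fail) with the percentage truncated rather than rounded (Area A’s 13208 of 13221 shows as 99%), and N/A where nothing ran. A small sketch under that assumption:

```python
# Minimal sketch, assuming the report truncates the percentage and shows
# N/A when nothing has run. Sample values are copied from the table above.
def pass_rate(passed, failed):
    total = passed + failed
    if total == 0:
        return "N/A"
    return f"{100 * passed // total}%"  # integer division truncates

print(pass_rate(13208, 13))  # 99%  (Area A)
print(pass_rate(8709, 0))    # 100% (Area C)
print(pass_rate(0, 0))       # N/A  (Area J)
```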
[Pie chart: Open Defect Categories – Critical, Major, Medium, Minor, Other]
[Chart: Defect Trends for the last 7 reporting periods – Total Defects, Closed, Remaining Open]
[Chart: Overall Test Pass/Fail Rate trend for the last 7 reporting periods, ranging from 93.0% to 99.5%]
Real Project
Key Findings:
• 5 new defects logged, 2 critical
• Area B loses key attributes under a single identified common scenario
• Area G will drop the sign indicator under 2 common scenarios
Key Risks:
• Automation suite is behind in the refactoring schedule due to the pull of existing team work
• Automation reporting framework needs to be deployed to the internal Docker publisher – currently running in a user’s VM
Key Issues:
• Data is being consistently dropped in a random fashion
Multi-Page Status Report – Highlights
What are some of the challenges with this report?
Progress
Barriers
Readiness
Digestible
Informative
Facilitative
REVAMPED
Revamped Report
Area Functionality Localization Security Performance
Area A X X
Area B X
Area C X
Area D X X
Area E X X
Area F X X
Area G X
Area H X
Area I X X
Area J X
Area K X
Area L X
Area M X
Global Solution X X
Mission Coverage
Revamped Report – Asset Readiness
Asset | Regression | Current Sprint
People [Individual] ✓ ✓
Manual Test Suite ✓ ◐
Automation Suite [Tagged] ✓
Exploratory Charters ◐ ◐
Localization Test Suite ✓ ✓
Old Functional Automation Framework ✓ ◐
New Functional Automation Framework ◐
CI/Automation Integration ✓ –
Manual Reporting Framework ✓ –
Automation Reporting Framework ✓ –
Security Scanning Tools – –
Performance Suite – –
Context
• Automation suite is behind in the refactoring schedule due to the pull of existing team work
• Automation reporting framework needs to be deployed to the internal Docker publisher – currently running in a user’s VM
Revamped Report – Testing Assessment
Area | Regression | Current Sprint | Localization
Area A ✓ ✓ ✓
Area B ✗
Area C ✓ ✓ ✓
Area D ✓ ✓ ✓
Area E ✓ ✓ ✓
Area F ✓ ✓
Area G ✓
Area H ✓ ✓ ✓
Area I ✓ ✓
Area J ✓ –
Area K ✓ ✓
Area L – –
Area M – –
Context
• Area B loses key attributes under a single identified common scenario
• Area G will drop the sign indicator under 2 common scenarios
• Data is still being consistently dropped in a random fashion
Another Example
SaaS with Custom Configuration
Row Labels | Tests Written | Tests Executed
BAG MEAL | 0 | 0
BBTS Device | 85 | 85
BBTS Report/Query | 29 | 29
BBTS SW Config | 288 | 245
BBTS Swipe Tracking | 93 | 92
BBTS TIA | 23 | 23
HERO | 11 | 11
HUDS DW | 0 | 0
IDDM | 20 | 20
JSA | 10 | 10
Meals Auditor Replacement | 11 | 10
SharePoint | 0 | 0
Cutover | 65 | 65
UAT | 81 | 61
Grand Total | 716 | 651
Execution Status
Row Labels | Fail | Pass | Grand Total
BBTS Device | | 85 | 85
BBTS Report/Query | | 29 | 29
BBTS SW Config | 8 | 237 | 245
BBTS Swipe Tracking | 2 | 90 | 92
BBTS TIA | 2 | 21 | 23
HERO | | 11 | 11
IDDM | | 20 | 20
JSA | | 10 | 10
Meals Auditor Replacement | | 10 | 10
Cutover | | 65 | 65
UAT | | 61 | 61
Grand Total | 12 | 639 | 651
Team Velocity (Weeks 2–9)
Week | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9
Test Cases Written | 499 | 533 | 540 | 645 | 678 | 736 | 732 | 720
Test Cases Executed | 86 | 293 | 366 | 454 | 488 | 558 | 562 | 648
Total Bugs | 9 | 21 | 28 | 36 | 49 | 62 | 68 | 79
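As a side note, the cumulative "Test Cases Executed" series in the velocity table can be differenced into week-over-week throughput; a small sketch using the transcribed values:

```python
# Week-over-week execution throughput derived from the cumulative
# "Test Cases Executed" series (Weeks 2-9) transcribed from the chart above.
executed = [86, 293, 366, 454, 488, 558, 562, 648]

throughput = [later - earlier for earlier, later in zip(executed, executed[1:])]
print(throughput)  # [207, 73, 88, 34, 70, 4, 86]
```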
[Pie chart: Bug Status – Open 23 (28%), Closed 55 (66%), Not a Bug 5 (6%)]
SaaS with Custom Configuration – Formatted for PPTX
Assets
Asset | Readiness
Manual Test Cases ✓
QA Environment ✓
TCM System ✓
Test Data ✓
Area | Assessment
BAG MEAL –
BBTS Device ✓
BBTS Report/Query ✓
BBTS SW Config ✗
BBTS Swipe Tracking
BBTS TIA
HERO ✓
HUDS DW –
IDDM ✓
JSA ✓
Meals Auditor Replacement ✓
SharePoint –
Cutover ✓
UAT ✓
Issues:
• Configuration IDE will not currently compile due to partial config manually overwriting the baseline config. Awaiting vendor assistance.
• Swipes will not track in failover mode
• Interfaces will not fail over to the alternate IP on record
Other Notes:
• Team velocity is on schedule
QUESTIONS? Contact Centric